WorldWideScience

Sample records for hx-20 notebook computer

  1. Worksheets for computing recommended notebook computer and workstation adjustments.

    Science.gov (United States)

    Nanthavanij, Suebsak; Udomratana, Chatkate; Hansawad, Saowalak; Thepkanjana, Jayaporn; Tantasuwan, Wanchalerm

    2013-01-01

    This paper discusses the design and development of worksheets for helping notebook computer (NBC) users to compute NBC and workstation adjustments so as to assume an appropriate seated posture. The worksheets (one for male users, the other for female ones) require the following information: body height, NBC screen size, work surface height, and seat height. The worksheets contain tables for estimating recommended NBC base angle, NBC screen angle, body-NBC distance, work surface height, and seat height. Additionally, they include flow charts to help NBC users to determine necessary adjustment accessories and their settings.

  2. seismo-live: Training in Computational Seismology using Jupyter Notebooks

    Science.gov (United States)

    Igel, H.; Krischer, L.; van Driel, M.; Tape, C.

    2016-12-01

    Practical training in computational methodologies is still underrepresented in Earth science curricula despite the increasing use of sometimes highly sophisticated simulation technologies in research projects. At the same time, well-engineered community codes make it easy to produce simulation-based results, with the danger that the inherent traps of numerical solutions are not well understood. It is our belief that training with highly simplified numerical solutions (here, to the equations describing elastic wave propagation) built from carefully chosen elementary ingredients of simulation technology (e.g., finite differencing, function interpolation, spectral derivatives, numerical integration) could substantially improve this situation. For this purpose we have initiated a community platform (www.seismo-live.org) where Python-based Jupyter notebooks can be accessed and run without any downloads or local software installation. The increasingly popular Jupyter notebooks allow combining markup text, graphics, and equations with interactive, executable Python code. We demonstrate the potential with training notebooks for the finite-difference method, pseudospectral methods, finite/spectral element methods, the finite-volume method and the discontinuous Galerkin method. The platform already includes general Python training, an introduction to the ObsPy library for seismology, as well as seismic data processing and noise analysis. Submission of Jupyter notebooks on general seismology is encouraged. The platform can be used for complementary teaching in Earth Science courses on compute-intensive research areas.
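
    The notebooks described above build simulations from very small numerical kernels. As a purely illustrative sketch of the kind of elementary finite-difference exercise such a training notebook contains (not code taken from seismo-live; the grid size, velocity and source parameters are arbitrary choices), a 1D acoustic wave solver in Python fits in a single cell:

        import numpy as np

        # 1D acoustic wave equation u_tt = c^2 u_xx, explicit finite differences.
        # All parameters below are illustrative, not taken from seismo-live.
        nx, nt = 500, 1000          # grid points in space and time steps
        dx, c = 1.0, 333.0          # grid spacing (m) and velocity (m/s)
        dt = 0.5 * dx / c           # time step satisfying the CFL condition
        isrc = nx // 2              # source location (grid index)

        u = np.zeros(nx)            # wavefield at time n
        uold = np.zeros(nx)         # wavefield at time n-1

        # Source time function: a simple Gaussian derivative wavelet
        t0, f0 = 20 * dt, 25.0
        t = np.arange(nt) * dt
        src = -8.0 * (t - t0) * f0 * np.exp(-((t - t0) * 4 * f0) ** 2)

        for n in range(nt):
            # second spatial derivative (interior points only; fixed boundaries)
            d2u = np.zeros(nx)
            d2u[1:-1] = (u[2:] - 2 * u[1:-1] + u[:-2]) / dx ** 2
            # time extrapolation and source injection
            unew = 2 * u - uold + (c * dt) ** 2 * d2u
            unew[isrc] += src[n] * dt ** 2
            uold, u = u, unew

        print("max |u| at final time:", np.abs(u).max())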

  3. Analysis of conditions and organization of work of notebook computer users.

    Science.gov (United States)

    Malińska, Marzena; Bugajska, Joanna; Kamińska, Joanna; Jędryka-Góral, Anna

    2012-01-01

    The aim of this study was to evaluate working conditions with a notebook computer (notebook) as a potential cause of musculoskeletal disorders. The study had 2 stages. The first one was a questionnaire survey among 300 notebook users. The next stage was an expert analysis of 53 randomly selected workstations. The questionnaire survey included questions about the participants, their working conditions, work organization and also duration of work with a notebook. The results of the research showed that most examined operators used a notebook as a basic working tool. The most important irregularities included an unadjustable working surface, unadjustable height of the seat pan and backrest, unadjustable height and distance between the armrests and no additional ergonomic devices (external keyboard, docking station, notebook stand or footstool).

  4. Ergonomic intervention for improving work postures during notebook computer operation.

    Science.gov (United States)

    Jamjumrus, Nuchrawee; Nanthavanij, Suebsak

    2008-06-01

    This paper discusses the application of analytical algorithms to determine necessary adjustments for operating notebook computers (NBCs) and workstations so that NBC users can assume correct work postures during NBC operation. Twenty-two NBC users (eleven males and eleven females) were asked to operate their NBCs according to their normal work practice. Photographs of their work postures were taken and analyzed using the Rapid Upper Limb Assessment (RULA) technique. The algorithms were then employed to determine recommended adjustments for their NBCs and workstations. After implementing the necessary adjustments, the NBC users were then re-seated at their workstations, and photographs of their work postures were re-taken, to perform the posture analysis. The results show that the NBC users' work postures are improved when their NBCs and workstations are adjusted according to the recommendations. The effectiveness of ergonomic intervention is verified both visually and objectively.

  5. University students' notebook computer use: lessons learned using e-diaries to report musculoskeletal discomfort.

    Science.gov (United States)

    Jacobs, K; Foley, G; Punnett, L; Hall, V; Gore, R; Brownson, E; Ansong, E; Markowitz, J; McKinnon, M; Steinberg, S; Ing, A; Wuest, Ellen; Dibiccari, Leah

    2011-02-01

    The objective of this pilot study was to identify whether notebook accessories (ergonomic chair, desktop monitor and notebook riser) combined with a wireless keyboard, mouse and participatory ergonomics training would have the greatest impact on reducing self-reported upper extremity musculoskeletal discomfort in university students. In addition to pre-post computing and health surveys, the Ecological Momentary Assessment was used to capture change in discomfort over time using a personal digital assistant (PDA) as the e-diary. The PDA was programmed with a survey containing 45 questions. Four groups of university students were randomised to either intervention (three external computer accessories) or to control. Participants reported less discomfort with the ergonomic chair and notebook riser based on the pre-post survey data and the e-diary/PDA ANOVA analysis. However, the PDA data, adjusted for the effect of hours per day of computer use, showed no benefit of the chair and limited benefit from the riser. Statement of Relevance: University students' use of notebook computers has increased. This study found evidence of a positive effect of an adjustable chair or notebook riser when combined with ergonomic training on reducing discomfort. Daily notebook computer use of 4 h was confirmed as a risk factor. Without some form of ergonomic intervention, these students are likely to enter the workforce with poor computing habits, which places them on the road to future injuries as technology continues to play a dominant role in their lives.

  6. Notebook computer use on a desk, lap and lap support: effects on posture, performance and comfort.

    Science.gov (United States)

    Asundi, Krishna; Odell, Dan; Luce, Adam; Dennerlein, Jack T

    2010-01-01

    This study quantified postures of users working on a notebook computer situated in their lap and tested the effect of using a device designed to increase the height of the notebook when placed on the lap. A motion analysis system measured head, neck and upper extremity postures of 15 adults as they worked on a notebook computer placed on a desk (DESK), the lap (LAP) and a commercially available lapdesk (LAPDESK). Compared with the DESK, the LAP increased downwards head tilt 6 degrees and wrist extension 8 degrees. Shoulder flexion and ulnar deviation decreased 13 degrees and 9 degrees, respectively. Compared with the LAP, the LAPDESK decreased downwards head tilt 4 degrees, neck flexion 2 degrees, and wrist extension 9 degrees. Users reported less discomfort and difficulty in the DESK configuration. Use of the lapdesk improved postures compared with the lap; however, all configurations resulted in high values of wrist extension, wrist deviation and downwards head tilt. STATEMENT OF RELEVANCE: This study quantifies postures of users working with a notebook computer in typical portable configurations. A better understanding of the postures assumed during notebook computer use can improve usage guidelines to reduce the risk of musculoskeletal injuries.

  7. Usability Evaluation of Notebook Computers and Cellular Telephones Among Users with Visual and Upper Extremity Disabilities

    OpenAIRE

    Mooney, Aaron Michael

    2002-01-01

    Information appliances such as notebook computers and cellular telephones are becoming integral to the lives of many. These devices facilitate a variety of communication tasks, and are used for employment, education, and entertainment. Those with disabilities, however, have limited access to these devices, due in part to product designs that do not consider their special needs. A usability evaluation can help identify the needs and difficulties those with disabilities have when using a pro...

  8. A Laboratory Notebook System

    OpenAIRE

    Schreiber, Andreas

    2012-01-01

    Many scientists use a laboratory notebook when conducting experiments. The scientist documents each step, either taken in the experiment or afterwards when processing data. Due to computerized research systems, acquired data increases in volume and becomes more elaborate. This increases the need to migrate from originally paper-based to electronic notebooks with data storage, computational features and reliable electronic documentation. This talk describes a laboratory notebook bas...

  9. Air-borne noise of thermal module and system for notebook personal computers: experimental study

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Thermal performance is the most important issue to be considered when a thermal module is designed for a notebook personal computer (PC). Because the fan causes air-borne noise and affects the user's comfort, the acoustic characteristics of the module attract more attention. Experiments were conducted to study the noise sources, the noise characteristics and the main factors influencing the noise level. The difference between the air-borne noise of the thermal module and the whole computer system was analyzed and its propagating characteristics were derived. The influence of I/O ports on the air-borne noise was also studied experimentally.

  10. Vocabulary notebooks

    OpenAIRE

    KOZETA HYSO

    2012-01-01

    Vocabulary notebooks are one way of promoting learner independence. Introducing vocabulary notebooks provides learners with an area of language learning where they can be given a relatively high level of independence, which builds their confidence in their ability to act independently in vocabulary learning. This article focuses on the effectiveness of keeping vocabulary notebooks to empower learners' independence in their foreign language learning and also to e...

  11. IPython notebook essentials

    CERN Document Server

    Martins, L Felipe

    2014-01-01

    If you are a professional, student, or educator who wants to learn to use IPython Notebook as a tool for technical and scientific computing, visualization, and data analysis, this is the book for you. This book will prove valuable for anyone who needs to do computations in an agile environment.

  12. Changes in posture through the use of simple inclines with notebook computers placed on a standard desk.

    Science.gov (United States)

    Asundi, Krishna; Odell, Dan; Luce, Adam; Dennerlein, Jack T

    2012-03-01

    This study evaluated the use of simple inclines as a portable peripheral for improving head and neck postures during notebook computer use on tables in portable environments such as hotel rooms, cafés, and airport lounges. A 3D motion analysis system measured head, neck and right upper extremity postures of 15 participants as they completed a 10 min computer task in six different configurations, all on a fixed height desk: no-incline, 12° incline, 25° incline, no-incline with external mouse, 25° incline with an external mouse, and a commercially available riser with external mouse and keyboard. After completion of the task, subjects rated the configuration for comfort and ease of use and indicated perceived discomfort in several body segments. Compared to the no-incline configuration, use of the 12° incline reduced forward head tilt and neck flexion while increasing wrist extension. The 25° incline further reduced head tilt and neck flexion while further increasing wrist extension. The 25° incline received the lowest comfort and ease of use ratings and the highest perceived discomfort score. For portable, temporary computing environments where the internal input devices are used, a 12° incline may give users improved head and neck postures while keeping wrist extension acceptable.

  13. Development and Validation of a Portable Hearing Self-Testing System Based on a Notebook Personal Computer.

    Science.gov (United States)

    Liu, Yan; Yang, Dong; Xiong, Fen; Yu, Lan; Ji, Fei; Wang, Qiu-Ju

    2015-09-01

    Hearing loss affects more than 27 million people in mainland China. It would be helpful to develop a portable and self-testing audiometer for the timely detection of hearing loss so that the optimal clinical therapeutic schedule can be determined. The objective of this study was to develop a software-based hearing self-testing system. The software-based self-testing system consisted of a notebook computer, an external sound card, and a pair of 10-Ω insert earphones. The system could be used to test the hearing thresholds by individuals themselves in an interactive manner using software. The reliability and validity of the system at octave frequencies of 0.25 to 8.0 kHz were analyzed in three series of experiments. Thirty-seven normal-hearing participants (74 ears) were enrolled in experiment 1. Forty individuals (80 ears) with sensorineural hearing loss (SNHL) participated in experiment 2. Thirteen normal-hearing participants (26 ears) and 37 participants (74 ears) with SNHL were enrolled in experiment 3. Each participant was enrolled in only one of the three experiments. In all experiments, pure-tone audiometry in a sound insulation room (standard test) was regarded as the gold standard. SPSS for Windows, version 17.0, was used for statistical analysis. The paired t-test was used to compare the hearing thresholds between the standard test and software-based self-testing (self-test) in experiments 1 and 2. In experiment 3 (main study), one-way analysis of variance and post hoc comparisons were used to compare the hearing thresholds among the standard test and two rounds of the self-test. Linear correlation analysis was carried out for the self-tests performed twice. The concordance was analyzed between the standard test and the self-test using the kappa method. Hearing thresholds from the standard test and the self-test were not significantly different at most frequencies (p > 0.05) but were significantly different at frequencies of 1000, 2000, and 4000 Hz (p < 0.05). The overall sensitivity of the self-test method was 97.6%, and the specificity was 98.3%. The sensitivity was
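
    At its core, a software-based audiometer of this kind has to synthesize calibrated pure tones and play them through the external sound card. The Python sketch below is purely illustrative and not taken from the paper; the sampling rate, duration, ramp length and the dBFS reference are assumptions, and real calibration against the insert earphones would still be required.

        import numpy as np
        from scipy.io import wavfile

        def pure_tone(freq_hz=1000.0, level_db=-20.0, duration_s=1.0, fs=44100):
            """Generate a pure tone at level_db relative to full scale (0 dBFS)."""
            t = np.arange(int(duration_s * fs)) / fs
            amplitude = 10.0 ** (level_db / 20.0)            # dB -> linear amplitude
            tone = amplitude * np.sin(2.0 * np.pi * freq_hz * t)
            # short raised-cosine ramps to avoid audible clicks at onset/offset
            ramp = int(0.01 * fs)
            window = np.ones_like(tone)
            window[:ramp] = 0.5 * (1 - np.cos(np.pi * np.arange(ramp) / ramp))
            window[-ramp:] = window[:ramp][::-1]
            return (tone * window).astype(np.float32)

        # Example: a 1 kHz test tone at -20 dBFS, written to a WAV file
        wavfile.write("tone_1khz.wav", 44100, pure_tone())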

  14. Thin-Wall Aluminum Die-Casting Technology for Development of Notebook Computer Housing

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Silicon-based aluminum casting alloys are known to be one of the most widely used alloy systems, mainly due to their superior casting characteristics and unique combination of mechanical and physical properties. However, manufacturing thin-walled aluminum die-casting components, less than 1.0 mm in thickness, is generally known to be a very difficult task to achieve, even with aluminum casting alloys of high fluidity. Therefore, in this study, the optimal die-casting conditions for producing a 297 mm × 210 mm × 0.7 mm thin-walled aluminum component were examined experimentally by using two different gating systems, tangential and split type, together with the vent design. Furthermore, a computational solidification simulation was also conducted. The results showed that the split-type gating system was a preferable gating design to the tangential type from the point of view of casting soundness and of distortion generated after solidification. It was also found that proper vent design is one of the most important factors for producing thin-wall casting components, because it is important for the filling of the thin-wall cavity and the minimization of casting distortion.

  15. Analysis of Discomfort Factors for Using Notebook Computer%基于笔记本电脑使用的不舒适因素分析

    Institute of Scientific and Technical Information of China (English)

    王丽君; 李黎; 张帆

    2013-01-01

    Against the background of notebook computers commonly replacing desktop PCs in daily work, and taking sitting posture, the office table and chair, and the notebook's structure as the starting points, this paper analyzes the effect of notebook computer use on sitting comfort. It then examines the main factors affecting health and work efficiency: poor sitting posture can lead to low back pain, unreasonable height design of office furniture increases the activity of local muscle groups, and the compact form of the notebook is not suited to prolonged use. To reduce bodily injury caused by unreasonable notebook use, poor posture and prolonged sitting, the paper suggests improving computer use by strengthening users' awareness of the importance of maintaining a healthy sitting posture and changing posture regularly, and by adding auxiliary input devices and partial body support.

  16. Design of Injection Mold for the Notebook Computer Monitor Shell%笔记本电脑显示屏外壳注塑模设计

    Institute of Scientific and Technical Information of China (English)

    关琳

    2012-01-01

    Based on an analysis of the structure of a notebook computer monitor shell, a mold structure that is convenient to machine and cost-effective was designed. A hot runner system, outer sliders, inner core-pulling and venting grooves are used to solve problems such as poor melt flow in the thin, large part, poor venting, and the internal and external snap hooks. The design provides a reference for similar mold designs.

  17. IPython/Jupyter Notebooks

    CERN Document Server

    CERN. Geneva

    2015-01-01

    Jupyter notebooks are pretty amazing. They can run code and keep in one place visualizations, equations and pretty-formatted text as well. With notebooks it's extremely easy to produce and share results in a comprehensible format and so they make the perfect tool for data analysis. I'll give a sneak peek at their wide range of uses and what we are doing at Indico to help their adoption at CERN.

  18. Views of the first cohort of HKU notebook programme participants

    OpenAIRE

    Blurton, CG; Lee, ACK

    2002-01-01

    In 1998, the University of Hong Kong was the first tertiary institution in Asia to implement a campuswide notebook computer programme, where each incoming student is enabled to own a personal notebook computer. At the beginning of each year, incoming students are surveyed to collect baseline data about their self-reported computer skills and their attitudes about computer use in education. In mid-May 2001, at the end of their undergraduate education, the 1998 cohort was surveyed again to gaug...

  19. Reliability of visual acuity measurements taken with a notebook and a tablet computer in participants who were illiterate to Roman characters.

    Science.gov (United States)

    Ruamviboonsuk, Paisan; Sudsakorn, Napitchareeya; Somkijrungroj, Thanapong; Engkagul, Chayanee; Tiensuwan, Montip

    2012-03-01

    Electronic measurement of visual acuity (VA) has been proposed and adopted as a method of determining VA scores in clinical research. Characters (optotypes) are displayed on a monitor screen and the examinee selects a match and inputs his choice to another electronic device. Unfortunately, the optotypes, called Sloan letters, in the standard protocol are 10 Roman characters. This limits their practicability for measuring VA of patients who are illiterate to these characters. The authors introduced a method of displaying the Sloan letters one by one on a notebook and all 10 Sloan letters on a tablet computer screen. The former is for testing the patients whereas the latter is for them to input their responses by tapping on a letter that matches the one on the notebook screen. The aim was to assess test-retest reliability of VA scores determined with this method. Participants without ocular abnormality were recruited to have their right eyes measured with the same VA measurement method twice, one week apart. Those who were illiterate to Roman characters were enrolled for the aforementioned method for measuring their VA (Tablet group). A 15-inch display notebook computer and a 9-inch display tablet computer (iPad) communicated via a local wireless data network provided by a Wi-Fi router. Those who understood Roman characters were enrolled to have measurements with a 17-inch desktop computer and an infrared wireless keyboard (Keyboard group). Both methods used the same protocols and software for VA measurements. Reliability of VA scores obtained from each group was assessed by the confidence interval (CI) of the difference of the scores from the test and retest. The t test was used to analyze differences in mean VA scores between the test and retest in each group with p < 0.05 determined as statistically significant. There were 49 and 50 participants in the Tablet and Keyboard group respectively. The 95% CI of the difference between the scores from the test and retest in each group
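
    The notebook/tablet setup is essentially a small client-server exchange over the local Wi-Fi network: the notebook displays one optotype and the tablet reports the letter the participant taps. The Python sketch below only illustrates that message flow; the port number, message format and matching logic are our assumptions, not details from the study software.

        import socket

        HOST, PORT = "0.0.0.0", 50007   # illustrative address/port, not from the study

        def notebook_server():
            """Runs on the notebook: shows one letter, waits for the tablet's response."""
            displayed = "N"              # Sloan letter currently shown on the notebook screen
            with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
                srv.bind((HOST, PORT))
                srv.listen(1)
                conn, _ = srv.accept()
                with conn:
                    answer = conn.recv(16).decode().strip()
                    print("correct" if answer == displayed else "incorrect")

        def tablet_client(chosen_letter, notebook_ip):
            """Runs on the tablet: sends the letter the participant tapped."""
            with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
                cli.connect((notebook_ip, PORT))
                cli.sendall(chosen_letter.encode())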

  20. Hibernate A Developer's Notebook

    CERN Document Server

    Elliott, James

    2004-01-01

    Do you enjoy writing software, except for the database code? Hibernate: A Developer's Notebook is for you. Database experts may enjoy fiddling with SQL, but you don't have to--the rest of the application is the fun part. And even database experts dread the tedious plumbing and typographical spaghetti needed to put their SQL into a Java program. Hibernate: A Developer's Notebook shows you how to use Hibernate to automate persistence: you write natural Java objects and some simple configuration files, and Hibernate automates all the interaction between your objects and the database. You don't

  1. Spring A Developer's Notebook

    CERN Document Server

    Tate, Bruce A

    2009-01-01

    This no-nonsense book quickly gets you up to speed on the new Spring open source framework. Favoring examples and practical application over theory, Spring: A Developer's Notebook features 10 code-intensive labs that'll reveal the many assets of this revolutionary, lightweight architecture. In the end, you'll understand how to produce simple, clean, and effective applications.

  2. Reuse That Notebook!

    Science.gov (United States)

    Lener, Elizabeth

    2010-01-01

    How many scientists throw out their notebooks at the end of each year and start over no matter how many empty pages remain? How many of them approach a new research question or experiment without using knowledge gained from the previous years? The answer to each of these questions is of course, few if any, yet we ask our students to do that every…

  3. Rethinking Laboratory Notebooks

    DEFF Research Database (Denmark)

    Klokmose, Clemens Nylandsted; Zander, Pär-Ola

    2010-01-01

    with our study is to produce design relevant knowledge that can envisage an ICT solution that keeps as many advantages of paper as possible, but with the strength of electronic laboratory notebooks as well. Rather than assuming that users are technophobic and unable to appropriate state of the art software...

  4. Porting AIX onto the Student Electronic Notebook

    OpenAIRE

    Ioannidis, John; Gerald Q. Maguire Jr.; Ben-Shaul, Israel; Levedopoulos, Marios; Liu, Micky

    1990-01-01

    We describe the Student Electronic Notebook and the process of porting IBM’s AIX 1.1 to run on it. We believe that portable workstation-class machines connected by wireless networks and dependent on a computational and informational infrastructure raise a number of important issues in operating systems and distributed computation (e.g., the partitioning of tasks between workstations and infrastructure), and therefore the development of such machines and their software is important. We conclud...

  5. Optimising JS visualisation for notebooks

    CERN Document Server

    She, Harry

    2017-01-01

    In many of the notebooks created and used at CERN, important components are those related to graphical visualization. These include, but are not limited to, graphs and histograms of data and events from physical detectors. However, the information used to display the ROOT graphical primitives is too comprehensive and hence causes the physical space requirements of the notebooks to grow drastically as the number of graphs and plots increases. Analysis has been performed to trim redundant information from the notebooks, as well as to provide insight into browsing ROOT objects within a notebook.

  6. A Review of New Brunswick's Dedicated Notebook Research Project: One-to-One Computing--A Compelling Classroom-Change Intervention

    Science.gov (United States)

    Milton, Penny

    2008-01-01

    The Canadian Education Association (CEA) was commissioned by Hewlett-Packard Canada to create a case study describing the development, implementation and outcomes of New Brunswick's Dedicated Notebook Research Project. The New Brunswick Department of Education designed its research project to assess impacts on teaching and learning of dedicated…

  7. Analysis of Samsung notebook strategy

    OpenAIRE

    Xu, Rui

    2009-01-01

    Against a fast-growing market background, the notebook industry draws a lot of attention from IT companies. It is expected that by 2010 notebook sales will overtake desktop sales. After a SWOT and SPACE analysis, we recommend that Samsung improve from two different perspectives: first, set up a clear and detailed marketing goal and plan, and ensure that such strategies are enforced; secondly, manage the distribution channels effectively and efficiently. There are two trends Samsu...

  8. Python for signal processing featuring IPython notebooks

    CERN Document Server

    Unpingco, José

    2013-01-01

    This book covers the fundamental concepts in signal processing illustrated with Python code and made available via IPython Notebooks, which are live, interactive, browser-based documents that allow one to change parameters, redraw plots, and tinker with the ideas presented in the text. Everything in the text is computable in this format and thereby invites readers to "experiment and learn" as they read. The book focuses on the core, fundamental principles of signal processing. The code corresponding to this book uses the core functionality of the scientific Python toolchain that should remai
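
    As a flavour of the "experiment and learn" workflow the book describes, a notebook cell typically synthesizes a signal and inspects its spectrum interactively. The snippet below is a generic illustration using the scientific Python toolchain, not an excerpt from the book; the sampling rate and test frequency are arbitrary.

        import numpy as np
        import matplotlib.pyplot as plt

        fs = 1000                                  # sampling rate (Hz)
        t = np.arange(0, 1.0, 1.0 / fs)            # one second of samples
        x = np.sin(2 * np.pi * 50 * t) + 0.5 * np.random.randn(t.size)   # 50 Hz tone plus noise

        X = np.fft.rfft(x)                         # one-sided spectrum
        f = np.fft.rfftfreq(t.size, d=1.0 / fs)

        plt.plot(f, np.abs(X) / t.size)
        plt.xlabel("frequency (Hz)")
        plt.ylabel("amplitude")
        plt.show()                                 # in a notebook, the figure renders inline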

  9. Doing physics with scientific notebook a problem solving approach

    CERN Document Server

    Gallant, Joseph

    2012-01-01

    The goal of this book is to teach undergraduate students how to use Scientific Notebook (SNB) to solve physics problems. SNB software combines word processing and mathematics in standard notation with the power of symbolic computation. As its name implies, SNB can be used as a notebook in which students set up a math or science problem, write and solve equations, and analyze and discuss their results. Written by a physics teacher with over 20 years' experience, this text includes topics that have educational value, fit within the typical physics curriculum, and show the benefits of using SNB.

  10. 碳纤维复合材料在笔记本电脑外壳上的应用%Application of carbon fiber composite in forming notebook computer case

    Institute of Scientific and Technical Information of China (English)

    边彬辉; 尹高喜; 张赛军; 练绪波; 汪智勇

    2011-01-01

    Carbon fiber has good resistance to weathering and corrosion and shows excellent performance: it is light and strong. Under identical strength conditions, the thickness and weight of a notebook computer case made of carbon fiber combined with engineering plastic, epoxy resin or PMMA can be reduced by 50%-60% and 40%-50%, respectively, compared with a case made of traditional engineering plastics. This paper introduces the application of four kinds of carbon fiber composites in notebook computer cases: short carbon fiber reinforced plastic, thermosetting continuous carbon fiber composite sheet, thermoplastic continuous carbon fiber composite sheet, and carbon fiber composite film. It is pointed out that the rapid development of carbon fiber composite research will drive major changes in the notebook computer market, pushing notebook computers to become lighter, thinner and stronger.

  11. The GenePattern Notebook Environment.

    Science.gov (United States)

    Reich, Michael; Tabor, Thorin; Liefeld, Ted; Thorvaldsdóttir, Helga; Hill, Barbara; Tamayo, Pablo; Mesirov, Jill P

    2017-08-23

    Interactive analysis notebook environments promise to streamline genomics research through interleaving text, multimedia, and executable code into unified, sharable, reproducible "research narratives." However, current notebook systems require programming knowledge, limiting their wider adoption by the research community. We have developed the GenePattern Notebook environment (http://www.genepattern-notebook.org), to our knowledge the first system to integrate the dynamic capabilities of notebook systems with an investigator-focused, easy-to-use interface that provides access to hundreds of genomic tools without the need to write code. Copyright © 2017 Elsevier Inc. All rights reserved.

  12. The Notebook of My Childhood

    Institute of Scientific and Technical Information of China (English)

    王潇涵

    2011-01-01

    About three years ago, I wrote this small poem in a beautiful notebook in memory of my happy childhood. It was missing for a long time. But luckily, during my winter holiday when I was cleaning my room I found it at the bottom of a pile of books. I was so exc

  13. seismo-live: Training in Seismology using Jupyter Notebooks

    Science.gov (United States)

    Igel, Heiner; Krischer, Lion; van Driel, Martin; Tape, Carl

    2017-04-01

    Practical training in computational methodologies is still underrepresented in Earth science curricula despite the increasing use of sometimes highly sophisticated simulation and data processing technologies in research projects. At the same time, well-engineered community codes make it easy to produce results, with the danger that the inherent traps of black-box solutions are not well understood. For this purpose we have initiated a community platform (www.seismo-live.org) where Python-based Jupyter notebooks can be accessed and run without any downloads or local software installations. The increasingly popular Jupyter notebooks allow combining markup text, graphics, and equations with interactive, executable Python code. The platform already includes general Python training, an introduction to the ObsPy library for seismology, as well as seismic data processing, noise analysis, and a variety of forward solvers for seismic wave propagation. In addition, an example shows how Jupyter notebooks can be used to increase the reproducibility of published results. Submission of Jupyter notebooks on general seismology is encouraged. The platform can be used for complementary teaching in Earth Science courses on compute-intensive research areas. We present recent developments and new features.
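
    For example, the ObsPy introduction on the platform revolves around loading, filtering and plotting waveform data. A minimal cell in that spirit, using ObsPy's bundled example waveforms rather than material from any specific seismo-live notebook, might look like this:

        from obspy import read

        st = read()                       # with no argument, ObsPy loads its example waveforms
        st.filter("bandpass", freqmin=1.0, freqmax=10.0)   # simple bandpass preprocessing
        print(st)                         # summary of the traces in the stream
        st.plot()                         # in a Jupyter notebook the plot appears inline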

  14. Seismo-Live: Training in Seismology with Jupyter Notebooks

    Science.gov (United States)

    Krischer, Lion; Tape, Carl; Igel, Heiner

    2016-04-01

    Seismological training tends to occur within the isolation of a particular institution with a limited set of tools (codes, libraries) that are often not transferable outside. Here, we propose to overcome these limitations with a community-driven library of Jupyter notebooks dedicated to training on any aspect of seismology for purposes of education and outreach, on-site or archived tutorials for codes, classroom instruction, and research. A Jupyter notebook (jupyter.org) is an open-source interactive computational environment that allows combining code execution, rich text, mathematics, and plotting. It can be considered a platform that supports reproducible research, as all inputs and outputs may be stored. Text, external graphics, and equations can be handled using Markdown (incl. LaTeX) format. Jupyter notebooks are driven by standard web browsers, can be easily exchanged in text format, or converted to other documents (e.g. PDF, slide shows). They provide an ideal format for practical training in seismology. A pilot platform was set up with a dedicated server such that the Jupyter notebooks can be run in any browser (PC, notepad, smartphone). We show the functionalities of the Seismo-Live platform with examples from computational seismology, seismic data access and processing using the ObsPy library, seismic inverse problems, and others. The current examples all use the Python programming language, but any free language can be used. Potentially, such community platforms could be integrated with the EPOS-IT infrastructure and extended to other fields of Earth sciences.

  15. Integration of ROOT Notebooks as a Web-based ATLAS Analysis tool for public data releases and outreach

    CERN Document Server

    Banda, Tea; CERN. Geneva. EP Department

    2016-01-01

    The project consists of the initial development of ROOT notebooks for a Z boson analysis in the C++ programming language that will allow students and researchers to perform fast and very useful data analysis, using ATLAS public data and Monte Carlo simulations. Several tools are considered: the ROOT Data Analysis Framework, Jupyter Notebook technology, and the ROOT-based CERN computing service SWAN.
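
    A cell in such a ROOT notebook mixes ROOT calls with inline plots. The PyROOT fragment below is only a generic illustration (a dummy Gaussian-filled invariant-mass histogram with made-up binning and parameters), not the project's actual Z boson analysis code:

        import ROOT

        # Dummy dilepton invariant-mass histogram; binning and parameters are placeholders.
        h_mll = ROOT.TH1F("h_mll", "Dilepton invariant mass;m_{ll} [GeV];Events", 60, 60.0, 120.0)

        f_sig = ROOT.TF1("f_sig", "gaus", 60.0, 120.0)
        f_sig.SetParameters(1.0, 91.2, 2.5)      # amplitude, mean near m_Z, width (illustrative)
        h_mll.FillRandom("f_sig", 10000)         # stand-in for filling from ATLAS open data

        c = ROOT.TCanvas("c", "c", 800, 600)
        h_mll.Draw()
        c.Draw()                                 # in a notebook (e.g. on SWAN) the canvas renders inline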

  16. Monitoring and Technical Assistance Review System Notebook

    Science.gov (United States)

    Administration for Children & Families, 2008

    2008-01-01

    This notebook provides guidance on the Monitoring and Technical Assistance Review System (MTARS). The manual is intended for use by Administration on Developmental Disabilities (ADD) staff who manage MTARS and by MTARS reviewers who conduct site visit activities. The notebook is also designed to help Councils, Protection and Advocacy Systems, and…

  17. Notebooks as didactic tool in design education

    NARCIS (Netherlands)

    Van den Toorn, M.W.M.; Have, R.

    2012-01-01

    Notebooks are an important didactic tool both for students and teaching staff. The idea of notebooks is that daily work and thinking are reflected in notes, drawings, sketches, and diagrams. Keeping track of the content of daily work can give an idea of the evolution and development of ideas. Especia

  18. An electronic laboratory notebook based on HTML forms

    Energy Technology Data Exchange (ETDEWEB)

    Marstaller, J.E.; Zorn, M.D.

    1995-10-01

    The electronic notebook records information that has traditionally been kept in handwritten laboratory notebooks. It keeps detailed information about the progress of the research, such as the optimization of primers, the screening of the primers and, finally, the mapping of the probes. The notebook provides two areas of services: data entry, and reviewing of data at all stages. World Wide Web browsers, with HTML-based forms, provide a fast and easy mechanism to create forms-based user interfaces. The computer scientist can sit down with the biologist and rapidly make changes in response to the user's comments. Furthermore, the HTML forms work equally well on a number of different hardware platforms; thus the biologists may continue using their Macintosh computers and find a familiar interface if they have to work on a Unix workstation. The web browser can be run from any machine connected to the Internet; thus the users are free to enter or view information even away from their labs, at home or while on travel. Access can be restricted by password and other means to secure the confidentiality of the data. A bonus that is hard to implement otherwise is the facile connection to outside resources. Linking local information to data in public databases is only a hypertext link away, with little or no additional programming effort.

  19. Integration of TMVA Output into Jupyter notebooks

    CERN Document Server

    Saliji, Albulena

    2016-01-01

    The purpose of this report is to describe the work that I have been doing during these past eight weeks as a Summer Student at CERN. The task which was assigned to me had to do with the integration of TMVA Output into Jupyter notebooks. In order to integrate the TMVA Output into the Jupyter notebook, first, improvement of the TMVA Output in the terminal was required. Once the output was improved, it needed to be transformed into HTML output and at the end it would be possible to integrate that output into the Jupyter notebook.

  20. Using Jupyter Notebooks for Interactive Space Science Simulations

    Science.gov (United States)

    Schmidt, Albrecht

    2016-04-01

    Jupyter Notebooks can be used as an effective means to communicate scientific ideas through Web-based visualisations and, at the same time, give a user more than a pre-defined set of options to manipulate the visualisations. To some degree, even computations can be done without too much knowledge of the underlying data structures and infrastructure, to discover novel aspects of the data or tailor views to users' needs. Here, we show how to combine Jupyter Notebooks with other open-source tools to provide rich and interactive views on space data, especially the visualisation of spacecraft operations. Topics covered are orbit visualisation, spacecraft orientation, and instrument timelines, as well as performance analysis of mission segments. Technically, the reuse and integration of existing components will also be shown, both at the code level and at the visualisation level, so that the effort put into the development of new components can be reduced. Another important aspect is the bridging of the gap between operational data and the scientific exploitation of the payload data, for which a way forward will also be shown. A lesson learned from the implementation and use of a prototype is the synergy between the team who provisions the notebooks and the consumers, who both share access to the same code base, if not resources; this often simplifies communication and deployment.
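
    Much of the orbit visualisation described above reduces to parameterising a trajectory and rendering it in a notebook cell. The sketch below is a generic Keplerian-ellipse plot in Python with made-up orbital elements, not code from the prototype discussed in the abstract:

        import numpy as np
        import matplotlib.pyplot as plt

        # Illustrative orbital elements (not from any real mission)
        a, e = 7000.0, 0.1            # semi-major axis (km), eccentricity
        nu = np.linspace(0, 2 * np.pi, 500)            # true anomaly
        r = a * (1 - e ** 2) / (1 + e * np.cos(nu))    # orbit equation (focus at origin)

        plt.plot(r * np.cos(nu), r * np.sin(nu), label="spacecraft orbit")
        plt.plot(0, 0, "o", label="central body")
        plt.axis("equal")
        plt.legend()
        plt.show()       # rendered inline when run in a Jupyter notebook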

  1. Jupyter Notebooks as tools for interactive learning of Concepts in Structural Geology and efficient grading of exercises.

    Science.gov (United States)

    Niederau, Jan; Wellmann, Florian; Maersch, Jannik; Urai, Janos

    2017-04-01

    Programming is increasingly recognised as an important skill for geoscientists; however, the hurdle to jump into programming for students with little or no experience can be high. We present here teaching concepts on the basis of Jupyter notebooks that combine, in an intuitive way, formatted instruction text with code cells in a single environment. This integration allows for an exposure to programming on several levels: from a complete interactive presentation of content, where students require no or very limited programming experience, to highly complex geoscientific computations. We therefore consider these notebooks an ideal medium to present computational content to students in the field of geosciences. We show here how we use these notebooks to develop digital documents in Python for undergraduate students, who can then learn about basic concepts in structural geology via self-assessment. Such notebooks comprise concepts such as the stress tensor, the strain ellipse, or the Mohr circle. Students can interactively change parameters, e.g. by using sliders, and immediately see the results. They can further experiment and extend the notebook by writing their own code within the notebook. Jupyter Notebooks for teaching purposes can be provided ready-to-use via online services. That is, students do not need to install additional software on their devices in order to work with the notebooks. We also use Jupyter Notebooks for automatic grading of programming assignments in multiple lectures. An implemented workflow facilitates the generation and distribution of assignments, as well as the final grading. Compared to previous grading methods with a high percentage of repetitive manual grading, the implemented workflow proves to be much more time efficient.
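
    As an illustration of the slider-driven concepts mentioned (stress tensor, strain ellipse, Mohr circle), a self-contained cell in this spirit could use ipywidgets to redraw a Mohr circle as the principal stresses change. The code below is our own sketch, not material from the course notebooks:

        import numpy as np
        import matplotlib.pyplot as plt
        from ipywidgets import interact

        def mohr_circle(sigma1=100.0, sigma3=40.0):
            """Plot the 2D Mohr circle for principal stresses sigma1 >= sigma3."""
            center = 0.5 * (sigma1 + sigma3)
            radius = 0.5 * (sigma1 - sigma3)
            theta = np.linspace(0, 2 * np.pi, 200)
            plt.plot(center + radius * np.cos(theta), radius * np.sin(theta))
            plt.axhline(0, color="k", linewidth=0.5)
            plt.xlabel("normal stress")
            plt.ylabel("shear stress")
            plt.axis("equal")
            plt.show()

        # Sliders let students vary the principal stresses and see the circle update.
        interact(mohr_circle, sigma1=(0.0, 200.0), sigma3=(0.0, 100.0))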

  2. The Lenovo X-60 Convertible Notebook Tablet PC: An Assistive Technology Tool Review

    Science.gov (United States)

    Harvey-Carter, Liz

    2007-01-01

    The purpose of this paper is to examine the suitability of the newest generation of Lenovo X60 tablet personal computers (PCs) as assistive technology (AT) devices for students with disabilities. Because of the vast selection of tablet PCs and convertible notebooks currently available on the market, this paper will confine itself to assessing one…

  3. Nature's Notebook 2010: Data & participant summary

    Science.gov (United States)

    Crimmins, Theresa M.; Rosemartin, Alyssa H.; Marsh, R. Lee; Denny, Ellen G.; Enquist, Carolyn A.F.; Weltzin, Jake F.

    2011-01-01

    The USA National Phenology Network (USA‐NPN) seeks to engage volunteer observers to collect phenology observations of plants and animals using consistent standards and to contribute to the USA‐NPN National Phenology Database (NPDb). The commencement of 2010 marked the second functional year of Nature’s Notebook, the online phenology observation program developed by the National Coordinating Office (NCO) of the USA‐NPN. The addition of animal species for monitoring was a major enhancement to Nature’s Notebook in 2010.

  4. Analysis of Notebook Computer Characteristic Indicators Based on Principal Component Analysis%基于主成份分析法的笔记本电脑特征性指标分析

    Institute of Scientific and Technical Information of China (English)

    业崇凡

    2013-01-01

    Based on a statistical survey of the 14 best-selling notebook computer brands on the market and the nine indicators of greatest concern to consumers, this paper uses principal component analysis to classify their characteristics. The results show that memory capacity, hard drive capacity, reference price and video memory capacity can be grouped into a first category; screen size, weight, maximum memory capacity and warranty period form a second category; and camera resolution forms a category of its own. Starting from these results, the indicators that mainly influence price are explored and combined with an objective analysis. This can first give consumers a relatively clear understanding of notebook computers, and secondly provide some professional help at the time of purchase, so that consumers are not misled by vendors' hype and gimmicks. It should be pointed out that, because notebook computers are technologically sophisticated, involve many parameters, and also include indicators such as brand effect that cannot be quantified, the paper can only aim to give general customers some guidance and help in choosing a notebook.
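
    The grouping described above is a standard principal component analysis. A minimal, self-contained Python illustration of the technique is given below; it runs on synthetic stand-in data (random values for 14 models and 9 indicators), not on the survey data used in the paper:

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.preprocessing import StandardScaler

        # Synthetic stand-in for the survey: rows = laptop models, columns = indicators.
        rng = np.random.default_rng(0)
        X = rng.normal(size=(14, 9))        # 14 models, 9 indicators (illustrative values only)

        X_std = StandardScaler().fit_transform(X)   # indicators have different units, so standardise
        pca = PCA()
        scores = pca.fit_transform(X_std)

        print("explained variance ratio:", np.round(pca.explained_variance_ratio_, 2))
        print("loadings of the first component:", np.round(pca.components_[0], 2))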

  5. Thar's gold in them thar notebooks: benefits of laboratory notebooks in the government archive

    Energy Technology Data Exchange (ETDEWEB)

    O'Canna, M.

    1996-01-01

    As Archive Coordinator for Sandia National Laboratories Corporate Archives, I am responsible for promoting the preservation and value of Sandia's history. Today I will talk about one important part of Sandia's historical record--the laboratory notebook. I will start with some brief background on Sandia National Laboratories, including the Laboratories' mission and an example of how the gold in one lab notebook helped to give a picture of Sandia's early history. Next, I will talk about the use of notebooks at Sandia Labs, how they represent technology developed at Sandia, and include noteworthy examples of how patent information has been collected, used, and released to the public. Then, I will discuss how the National Competitiveness Technology Transfer Act of 1989 authorized technology transfer initiatives and the exclusive use of patented information, resulting in many golden opportunities for the national laboratories to work with private industry to further technology. I will briefly discuss laboratory notebook retention schedules and mention a new initiative to better utilize laboratory notebooks. And, finally, I will summarize how the 'gold' in laboratory notebooks in government archives is a reflection of the valuable and extensive research authorized and funded by the government to benefit the public.

  6. Nonlinear science an interactive Mathematica notebook

    CERN Document Server

    Campbell, David K; Tanury, Thomas A

    2012-01-01

    This interactive Mathematica(TM) notebook provides a ready-made tool by which users can undertake their own mathematical experiments and explore the behavior of non-linear systems, from chaos in low-dimensional maps and coupled ordinary differential equations to solitons and coherent structures in nonlinear partial differential equations and "intrinsic localized modes" and "discrete breathers" in extended lattice systems.
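
    The original tool is a Mathematica notebook, but the same kind of low-dimensional-map experiment is easy to reproduce in an interactive Python session. The bifurcation-diagram sketch below (logistic map, with arbitrarily chosen parameter range and iteration counts) is our own illustration, not part of the Mathematica notebook:

        import numpy as np
        import matplotlib.pyplot as plt

        # Logistic map x_{n+1} = r * x_n * (1 - x_n): iterate, discard transients, plot attractor.
        r_values = np.linspace(2.5, 4.0, 800)
        x = 0.5 * np.ones_like(r_values)

        for _ in range(500):          # let transients die out
            x = r_values * x * (1 - x)

        points_r, points_x = [], []
        for _ in range(100):          # record the long-term behaviour
            x = r_values * x * (1 - x)
            points_r.append(r_values)
            points_x.append(x.copy())

        plt.plot(np.concatenate(points_r), np.concatenate(points_x), ",k", alpha=0.3)
        plt.xlabel("r")
        plt.ylabel("x")
        plt.show()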

  7. Aspen Notebook: Cable and Continuing Education.

    Science.gov (United States)

    Adler, Richard; Baer, Walter S.

    This is the first of a planned series of Aspen Notebooks on cable television (CATV). Part I reports on research conducted by the Aspen Workshop on Uses of the Cable. It describes the status of continuing education and the history of educational television and explores the prospects created by cable's development for extending access to continuing…

  8. Smaller external notebook mice have different effects on posture and muscle activity.

    Science.gov (United States)

    Oude Hengel, Karen M; Houwink, Annemieke; Odell, Dan; van Dieën, Jaap H; Dennerlein, Jack T

    2008-07-01

    Extensive computer mouse use is an identified risk factor for computer work-related musculoskeletal disorders; however, notebook computer mouse designs of varying sizes have not been formally evaluated but may affect biomechanical risk factors. Thirty adults performed a set of mouse tasks with five notebook mice, ranging in length from 75 to 105 mm and in width from 35 to 65 mm, and a reference desktop mouse. An electro-magnetic motion analysis system measured index finger (metacarpophalangeal joint), wrist and forearm postures, and surface electromyography measured muscle activity of three extensor muscles in the forearm and the first dorsal interosseus. The smallest notebook mice were found to promote less neutral postures (up to 3.2 degrees higher metacarpophalangeal joint adduction; 6.5 degrees higher metacarpophalangeal joint flexion; 2.3 degrees higher wrist extension) and higher muscle activity (up to 4.1% of maximum voluntary contraction higher wrist extensor muscle activity). Participants with smaller hands had overall more non-neutral postures than participants with larger hands (up to 5.6 degrees higher wrist extension and 5.9 degrees higher pronation), while participants with larger hands were more influenced by the smallest notebook mice (up to 3.6 degrees higher wrist extension and 5.5% of maximum voluntary contraction higher wrist extensor values). Self-reported ratings showed that while participants preferred smaller mice for portability, larger mice scored higher on comfort and usability. The smallest notebook mice increased the intensity of biomechanical exposures. Longer term mouse use could enhance these differences, having a potential impact on the prevention of work-related musculoskeletal disorders.

  9. An automated and reproducible workflow for running and analyzing neural simulations using Lancet and IPython Notebook.

    Science.gov (United States)

    Stevens, Jean-Luc R; Elver, Marco; Bednar, James A

    2013-01-01

    Lancet is a new, simulator-independent Python utility for succinctly specifying, launching, and collating results from large batches of interrelated computationally demanding program runs. This paper demonstrates how to combine Lancet with IPython Notebook to provide a flexible, lightweight, and agile workflow for fully reproducible scientific research. This informal and pragmatic approach uses IPython Notebook to capture the steps in a scientific computation as it is gradually automated and made ready for publication, without mandating the use of any separate application that can constrain scientific exploration and innovation. The resulting notebook concisely records each step involved in even very complex computational processes that led to a particular figure or numerical result, allowing the complete chain of events to be replicated automatically. Lancet was originally designed to help solve problems in computational neuroscience, such as analyzing the sensitivity of a complex simulation to various parameters, or collecting the results from multiple runs with different random starting points. However, because it is never possible to know in advance what tools might be required in future tasks, Lancet has been designed to be completely general, supporting any type of program as long as it can be launched as a process and can return output in the form of files. For instance, Lancet is also heavily used by one of the authors in a separate research group for launching batches of microprocessor simulations. This general design will allow Lancet to continue supporting a given research project even as the underlying approaches and tools change.

  10. Generalization of an Active Electronic Notebook for Teaching Multiple Programming Languages

    OpenAIRE

    Torabzadeh-Tari, Mohsen; Fritzson, Peter; Pop, Adrian; Sjölund, Martin

    2010-01-01

    In this paper we present a generalization of the active electronic notebook, OMNotebook, for handling multiple programming languages for educational purposes. OMNotebook can be an alternative or complementary tool to the traditional teaching method with lecturing and reading textbooks. Experience shows that using such an electronic book will lead to more engagement from the students. OMNotebook can contain technical computations and text, as well as graphics. Hence it is a suitable tool for t...

  11. Methods and Strategies: Digital Notebooks for Digital Natives

    Science.gov (United States)

    Miller, Bridget; Martin, Christie

    2016-01-01

    The idea of notebooking is not new in the science classroom. Since the mid-1970s, writing has been found to facilitate students' critical thinking and learning across a variety of content areas. For science educators, notebooks have become an essential tool for supporting students' scientific inquiry in and across concepts. Scientific notebooks…

  12. Using scientists' notebooks to foster authentic scientific practices

    Science.gov (United States)

    Atkins, Leslie J.; Salter, Irene Y.

    2013-01-01

    Scientific Inquiry is an introductory undergraduate course for preservice elementary teachers that aims to engage students in authentic scientific practices where these practices are not viewed as a mere course requirement but are understood as essential practices for constructing knowledge in the discipline. Many of these practices (e.g., representational practices, control-of-variables) evolve over the course of the semester as we work to answer complex questions. However, we hoped to have students, from the start of the term, keep detailed scientific notebooks. We describe an activity designed to foster practices related to the use of scientific notebooks, detail how we use images from scientists' notebooks, discuss the rubrics students create for their own notebooks, and share outcomes, including images of students' notebooks and students' reactions to the activity. Funding provided by NSF #0837058.

  13. Notebook-Klassen an einer Hauptschule : Eine Einzelfallstudie zur Wirkung eines Notebook-Einsatzes auf Unterricht, Schüler und Schule

    OpenAIRE

    Häuptle, Eva

    2007-01-01

    The subjects of this research are 3 notebook classes in the Mittlere-Reife track of a Bavarian Hauptschule: a 7th and a 9th class in their first notebook year, as well as a 10th class in its second notebook year. The notebook distribution corresponds to a "1-to-1" model, i.e. each pupil works with his or her own notebook, which was completely financed by the parents and is the pupil's personal property. The notebooks are used in the lessons of the class teacher, who teaches about 15 lessons in his...

  15. Feminist Interpretation of Doris Lessing's The Golden Notebook

    Institute of Scientific and Technical Information of China (English)

    王彦

    2010-01-01

    Doris Lessing is undoubtedly one of the most influential women writers of the 20th century. In 1962, her masterpiece The Golden Notebook was published. It is regarded as the companion volume to Simone de Beauvoir's The Second Sex. The novel soon became popular among feminists because of its realistic description of women's independent consciousness and their living conditions. This thesis has been written with the aim of interpreting The Golden Notebook from the perspective of feminism. The novel's theme, structure, characters and narrative style serve the aim of feminist interpretation well. The thesis also innovatively discusses the challenges to feminism reflected in The Golden Notebook.

  16. Using the Ginga Science Viewer in Jupyter Notebooks

    Science.gov (United States)

    Jeschke, Eric

    2016-03-01

    We present a new capability for the Ginga science viewer toolkit: the ability to create web browser-based image viewers. Full-featured mini-viewers can be created easily in Jupyter/IPython notebooks. Viewers can be manipulated from the notebook and two-way interactions are facilitated. Viewers can be shared between users by connecting to the same viewer URL. This is potentially a good solution for remote data reduction/exploration workflows and collaboration. An example notebook can be found here: https://gist.github.com/ejeschke/6067409

  17. Electronic Engineering Notebook: A software environment for research execution, documentation and dissemination

    Science.gov (United States)

    Moerder, Dan

    1994-01-01

    The electronic engineering notebook (EEN) consists of a free-form research notebook, implemented in a commercial package for distributed hypermedia, which includes utilities for graphics capture, formatting and display of LaTeX constructs, and interfaces to the host operating system. The latter capability consists of an information computer-aided software engineering (CASE) tool and a means to associate executable scripts with source objects. The EEN runs on Sun and HP workstations. The EEN, in day-to-day use, can be used in much the same manner as the sort of research notes most researchers keep during development of projects. Graphics can be pasted in, equations can be entered via LaTeX, etc. In addition, the fact that the EEN is hypermedia permits easy management of 'context', e.g., derivations and data can contain easily formed links to other supporting derivations and data. The CASE tool also permits development and maintenance of source code directly in the notebook, with access to its derivations and data.

  18. Electronic Laboratory Notebook on Web2py Framework

    Directory of Open Access Journals (Sweden)

    2010-09-01

    Full Text Available Proper experimental record-keeping is an important cornerstone of research and development for the purpose of auditing. The gold standard of record-keeping is based on the judicious use of physical, permanent notebooks. However, advances in technology have resulted in large amounts of electronic records, making it virtually impossible to maintain a full set of records in physical notebooks. Electronic laboratory notebook systems aim to meet the stringency required for keeping records electronically. This manuscript describes CyNote, an electronic laboratory notebook system that is compliant with 21 CFR Part 11 controls on electronic records, the requirements set by the US Food and Drug Administration for electronic records. CyNote is implemented on the web2py framework and adheres to the architectural paradigm of model-view-controller (MVC), allowing extension modules to be built for CyNote. CyNote is available at http://cynote.sf.net.
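
    As a rough, hedged illustration of the model-view-controller idea on which CyNote is said to be built, the sketch below uses the standalone pydal package (the database abstraction layer extracted from web2py) for the model, with plain functions standing in for controllers; the `entry` table and its fields are hypothetical and are not CyNote's actual schema.

```python
# Minimal sketch of an MVC-style "model" for lab-notebook entries, using
# pydal (the database layer web2py is built on). The table and field
# names are illustrative only, not CyNote's actual schema.
from datetime import datetime

from pydal import DAL, Field

# Model: define the storage schema (here a local SQLite file).
db = DAL("sqlite://notebook.sqlite")
db.define_table(
    "entry",
    Field("title", "string", required=True),
    Field("body", "text"),
    Field("created_on", "datetime", default=datetime.utcnow),
)

# Controller-style functions: plain callables mediating between the
# model and whatever view renders the result.
def add_entry(title, body):
    entry_id = db.entry.insert(title=title, body=body)
    db.commit()
    return entry_id

def list_entries():
    return db(db.entry).select(orderby=~db.entry.created_on)

if __name__ == "__main__":
    add_entry("Buffer prep", "Prepared 1 L of 1x PBS, pH 7.4.")
    for row in list_entries():
        print(row.created_on, row.title)
```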

  19. Smart Electronic Laboratory Notebooks for the NIST Research Environment.

    Science.gov (United States)

    Gates, Richard S; McLean, Mark J; Osborn, William A

    2015-01-01

    Laboratory notebooks have been a staple of scientific research for centuries for organizing and documenting ideas and experiments. Modern laboratories are increasingly reliant on electronic data collection and analysis, so it seems inevitable that the digital revolution should come to the ordinary laboratory notebook. The most important aspect of this transition is to make the shift as comfortable and intuitive as possible, so that the creative process that is the hallmark of scientific investigation and engineering achievement is maintained, and ideally enhanced. The smart electronic laboratory notebooks described in this paper represent a paradigm shift from the old pen and paper style notebooks and provide a host of powerful operational and documentation capabilities in an intuitive format that is available anywhere at any time.

  20. Electronic laboratory notebook: the academic point of view.

    Science.gov (United States)

    Rudolphi, Felix; Goossen, Lukas J

    2012-02-27

    Based on a requirement analysis and alternative design considerations, a platform-independent electronic laboratory notebook (ELN) has been developed that specifically targets academic users. Its intuitive design and numerous productivity features motivate chemical researchers and students to record their data electronically. The data are stored in a highly structured form that offers substantial benefits over laboratory notebooks written on paper with regard to data retrieval, data mining, and exchange of results.

  1. The effect of vocabulary notebooks on vocabulary acquisition

    OpenAIRE

    Bozkurt, Neval

    2007-01-01

    Ankara: The Department of Teaching English as a Foreign Language, Bilkent University, 2007. Thesis (Master's) -- Bilkent University, 2007. Includes bibliographical references (leaves 82-87). This study investigated the effectiveness of vocabulary notebooks on vocabulary acquisition, and the attitudes of teachers and learners towards keeping vocabulary notebooks. The study was conducted with the participation of 60 pre-intermediate level students, divided into one treatment ...

  2. Interactive model evaluation tool based on IPython notebook

    Science.gov (United States)

    Balemans, Sophie; Van Hoey, Stijn; Nopens, Ingmar; Seuntjes, Piet

    2015-04-01

    remaining parameter sets. As such, by interactively changing the settings and interpreting the graph, the user gains insight into the model's structural behaviour. Moreover, a more deliberate choice of objective function can be made and periods of high information content can be identified. The environment is written in an IPython notebook and uses the interactive functions provided by the IPython community. As such, the power of the IPython notebook as a development environment for scientific computing is illustrated (Shen, 2014).
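
    The abstract above describes wiring model settings to the notebook's interactive functions. The following sketch is a hedged illustration of that pattern using the ipywidgets `interact` helper (the modern packaging of IPython's interactive widgets) with a toy exponential model and a sum-of-squares objective; the model, parameter names, and data are stand-ins, not the authors' code.

```python
# Minimal sketch of interactive parameter exploration in a notebook cell;
# the exponential "model" is a stand-in for a real simulator.
import numpy as np
import matplotlib.pyplot as plt
from ipywidgets import interact, FloatSlider

t = np.linspace(0, 10, 200)
observed = 2.0 * np.exp(-0.5 * t) + np.random.normal(0, 0.05, t.size)

def run_and_plot(amplitude=2.0, decay=0.5):
    simulated = amplitude * np.exp(-decay * t)
    sse = np.sum((simulated - observed) ** 2)  # simple objective function
    plt.figure(figsize=(6, 3))
    plt.plot(t, observed, ".", label="observed")
    plt.plot(t, simulated, "-", label=f"simulated (SSE={sse:.2f})")
    plt.legend()
    plt.show()

# Each slider change re-runs the model, so the user can see directly how
# the objective function responds to each parameter.
interact(run_and_plot,
         amplitude=FloatSlider(min=0.5, max=4.0, step=0.1, value=2.0),
         decay=FloatSlider(min=0.1, max=2.0, step=0.05, value=0.5))
```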

  3. An Analysis of Notebook Writing in Elementary Science Classrooms. CSE Technical Report.

    Science.gov (United States)

    Baxter, Gail P.; Bass, Kristin M.; Glaser, Robert

    This study examined the use of student notebooks in three fifth-grade science classrooms during a unit on electric circuits to determine the extent to which notebooks might serve as a tool for monitoring teaching and learning. Analyses of classroom contexts indicated that teachers promoted notebook writing through explicit instructions and…

  4. Finding a Meaning: Reading, Writing, Thinking Applications: Double-Entry Notebooks, Literature Logs, Process Journals.

    Science.gov (United States)

    Wrobleski, Diane

    Three different ways of integrating writing and thinking into the classroom are using double-entry notebooks, literature logs, and process journals. In a double-entry notebook, the writer takes notes on the reading, collects direct quotations, makes observational notes, and writes fragments, lists, and images on the left side of the notebook. On…

  5. Augmented notebooks for pervasive learning in medical practice.

    Science.gov (United States)

    Bricon-Souf, Nathalie; Leroy, Nicolas; Renard, Jean-Marie

    2010-01-01

    Medical e-learning can benefit from new technologies, and pervasive learning resources and tools are worth introducing into the medical context. Micro-learning seems to be an interesting approach to pervasive learning, but it is still difficult to propose pedagogical resources that are built by learners from meaningful experiments. We analysed the exchanges performed by health care professionals in the hospital in order to understand where and when educational exchanges appear, and we analysed the types of documents exchanged. The residents' paper notebooks caught our attention, first because they answer some clinician needs and second because computerizing such a notebook could add a collaborative dimension to the pedagogical resources. We propose a model of an augmented resident's notebook and briefly describe an implementation using a content management system and a wiki, before the discussion and conclusion sections.

  6. Trend Analysis on the Automation of the Notebook PC Production Process

    OpenAIRE

    Chin-Ching Yeh

    2012-01-01

    Notebook PCs are among the Taiwanese electronic products that generate the highest production value and market share. According to ITRI IEK statistics, the domestic notebook PC production value in 2011 was about NT$2.3 trillion. Of the roughly 200 million notebook PCs sold in global markets in 2011, Taiwan's notebook PC output accounted for more than 90%, meaning that nine out of every ten notebook PCs in the world are manufactured by Taiwanese companies. For such a large industry with its...

  7. An automated and reproducible workflow for running and analyzing neural simulations using Lancet and IPython Notebook

    Directory of Open Access Journals (Sweden)

    Jean-Luc Richard Stevens

    2013-12-01

    Full Text Available Lancet is a new, simulator-independent Python utility for succinctly specifying, launching, and collating results from large batches of interrelated, computationally demanding program runs. This paper demonstrates how to combine Lancet with IPython Notebook to provide a flexible, lightweight, and agile workflow for fully reproducible scientific research. This informal and pragmatic approach uses IPython Notebook to capture the steps in a scientific computation as it is gradually automated and made ready for publication, without mandating the use of any separate application that can constrain scientific exploration and innovation. The resulting notebook concisely records each step involved in even very complex computational processes that led to a particular figure or numerical result, allowing the complete chain of events to be replicated automatically. Lancet was originally designed to help solve problems in computational neuroscience, such as analyzing the sensitivity of a complex simulation to various parameters, or collecting the results from multiple runs with different random starting points. However, because it is never possible to know in advance what tools might be required in future tasks, Lancet has been designed to be completely general, supporting any type of program as long as it can be launched as a process and can return output in the form of files. For instance, Lancet is also heavily used by one of the authors in a separate research group for launching batches of microprocessor simulations. This general design will allow Lancet to continue supporting a given research project even as the underlying approaches and tools change.
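
    Lancet's own API is not reproduced here; the hedged sketch below uses only the Python standard library to illustrate the general pattern the paper describes, namely launching a batch of parameterized runs as separate processes and collating their file output. The `simulate.py` program, its flags, and the parameter names are hypothetical.

```python
# Generic batch-launch-and-collate sketch (standard library only). This
# illustrates the workflow pattern described above; it is NOT Lancet's
# API. "simulate.py" is a hypothetical command-line program that writes
# one CSV result file per run.
import csv
import itertools
import subprocess
from pathlib import Path

outdir = Path("runs")
outdir.mkdir(exist_ok=True)

# The cross product of parameter values defines the batch.
rates = [0.1, 0.2, 0.4]
seeds = [1, 2, 3]

for rate, seed in itertools.product(rates, seeds):
    outfile = outdir / f"rate{rate}_seed{seed}.csv"
    subprocess.run(
        ["python", "simulate.py", "--rate", str(rate),
         "--seed", str(seed), "--out", str(outfile)],
        check=True,
    )

# Collate: read every result file back into one table for analysis in
# the notebook.
rows = []
for path in sorted(outdir.glob("*.csv")):
    with path.open() as fh:
        rows.extend(csv.DictReader(fh))
print(f"collected {len(rows)} result rows from {outdir}")
```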

  8. An analysis of Simpson's notebook data on the wet nurse.

    Science.gov (United States)

    Mander, Rosemary

    2003-03-01

    To understand the meaning of the qualitative data included in the Notebook of wet nurses kept by James Young Simpson. Quantitative and qualitative analysis of data in an historical document. A list of wet nurses kept by a 'Professor of Midwifery' in mid-19th century Edinburgh. The Notebook lists the names and other details of 749 women. The Notebook indicates how the wet nurse was recruited, the implications for her baby, how she negotiated her role and the decision-making around her recruitment. The ambiguity of this medical pioneer's decision-making is demonstrated. Simpson's scientific credentials may have featured much rhetoric. While in the forefront of many obstetric and medical developments, Simpson was regressive in his support for wet nursing. The social input into the selection of the wet nurse has not been identified previously. The woman's ability to negotiate her terms of employment emerges. The social determinants of baby feeding decisions, identified in this document, have assumed greater significance since the time that this Notebook was written.

  9. Notebooks in der Hochschullehre. Didaktische und strukturelle Implikationen

    Directory of Open Access Journals (Sweden)

    Marco Kalz

    2017-08-01

    Full Text Available As part of the "Notebook-Universitäten" (notebook universities) project funded by the bmb+f, a variety of activities on the use of notebooks in higher education have been launched at 25 universities in Germany since July 2002. These activities began with the construction of a wireless network (WLAN: wireless local area network) and the provision of notebooks to students. The thematic focus differs between the individual universities. Some universities concentrate on the production of content; new learning programs and learning arrangements are developed to supplement or partially replace face-to-face teaching. At other universities, the development of a learning platform or a portal is in the foreground. eCampus Duisburg is a strategic initiative of the Universität Duisburg-Essen to consistently organize digitally representable services in teaching and administration via the Inter-/Intranet and to enable the use of notebooks in courses. The initiative is supported jointly by researchers, central facilities, and the university administration. It aims at an intelligent transition between wired and wireless services on the one hand and the linking of previously separate services on the other. The eCampus project comprises a number of components that must not be viewed in isolation from one another.

  10. Identifying Non-Volatile Data Storage Areas: Unique Notebook Identification Information as Digital Evidence

    Directory of Open Access Journals (Sweden)

    Nikica Budimir

    2007-03-01

    Full Text Available The research reported in this paper introduces new techniques to aid in the identification of recovered notebook computers so they may be returned to the rightful owner. We identify non-volatile data storage areas as a means of facilitating the safe storing of computer identification information. A forensic proof of concept tool has been designed to test the feasibility of several storage locations identified within this work to hold the data needed to uniquely identify a computer. The tool was used to perform the creation and extraction of created information in order to allow the analysis of the non-volatile storage locations as valid storage areas capable of holding and preserving the data created within them.  While the format of the information used to identify the machine itself is important, this research only discusses the insertion, storage and ability to retain such information.

  11. Neurophysiological analytics for all! Free open-source software tools for documenting, analyzing, visualizing, and sharing using electronic notebooks.

    Science.gov (United States)

    Rosenberg, David M; Horn, Charles C

    2016-08-01

    Neurophysiology requires an extensive workflow of information analysis routines, which often includes incompatible proprietary software, introducing limitations based on financial costs, transfer of data between platforms, and the ability to share. An ecosystem of free open-source software exists to fill these gaps, including thousands of analysis and plotting packages written in Python and R, which can be implemented in a sharable and reproducible format, such as the Jupyter electronic notebook. This tool chain can largely replace current routines by importing data, producing analyses, and generating publication-quality graphics. An electronic notebook like Jupyter allows these analyses, along with documentation of procedures, to display locally or remotely in an internet browser, which can be saved as an HTML, PDF, or other file format for sharing with team members and the scientific community. The present report illustrates these methods using data from electrophysiological recordings of the musk shrew vagus-a model system to investigate gut-brain communication, for example, in cancer chemotherapy-induced emesis. We show methods for spike sorting (including statistical validation), spike train analysis, and analysis of compound action potentials in notebooks. Raw data and code are available from notebooks in data supplements or from an executable online version, which replicates all analyses without installing software-an implementation of reproducible research. This demonstrates the promise of combining disparate analyses into one platform, along with the ease of sharing this work. In an age of diverse, high-throughput computational workflows, this methodology can increase efficiency, transparency, and the collaborative potential of neurophysiological research.
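
    The following sketch gives a flavour of the kind of analysis step such a notebook can document: simple threshold-based spike detection on a synthetic voltage trace using NumPy and SciPy. It is illustrative only; the sampling rate, threshold rule, and synthetic data are assumptions, and it is not the authors' spike-sorting pipeline.

```python
# Illustrative threshold-based spike detection in a notebook cell;
# synthetic data stand in for an extracellular recording.
import numpy as np
from scipy.signal import find_peaks

fs = 20_000                          # sampling rate (Hz), assumed
t = np.arange(0, 2.0, 1 / fs)        # 2 s of data
rng = np.random.default_rng(0)
trace = rng.normal(0, 5e-6, t.size)  # baseline noise (volts)

# Inject a few artificial "spikes" for the demo.
for spike_time in (0.25, 0.8, 1.4):
    idx = int(spike_time * fs)
    trace[idx:idx + 20] += 60e-6 * np.exp(-np.arange(20) / 5.0)

# Detect events exceeding 5x the estimated noise SD, at least 1 ms apart.
threshold = 5 * np.median(np.abs(trace)) / 0.6745  # robust SD estimate
peaks, _ = find_peaks(trace, height=threshold, distance=int(0.001 * fs))

print(f"detected {peaks.size} spikes at t = {t[peaks]} s")
```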

  12. Examining Elementary Teachers' Identities through Analysis of Student Science Notebooks

    Science.gov (United States)

    Madden, Lauren

    2011-12-01

    mixed-methods analysis of the science notebook entries created by each of the students in this second grade class over the course of the school year. Every entry of every notebook was photographed and coded for: unit (and therefore teacher), inquiry phase (pre-, during-, or post-investigation), and driving force (teacher-driven, student-driven, or balanced). In addition, missing and incomplete notebook entries were also documented. Quantitative analysis looked at the frequency of entries based on these codes. Qualitative data included thematic descriptions of how each teacher used the notebooks, teacher interviews, student interviews of their notebook use and classroom observations. All three teachers used similar curricular materials (kits) and received training from the school district on using science notebooks, suggesting that they would likely use the science notebooks in a similar way. However, quantitative differences were found across all three areas (inquiry phases, driving force, and missing entries), and qualitative analysis also indicated each teacher used the notebooks in a very different way. The teacher identity framework provided a useful way of interpreting these differences. These findings suggest that student science notebook analysis can be used in concert with other data sources through an identity framework to provide information about instruction over the course of a unit or school year, providing more robust analysis than classroom observations and interviews alone.

  13. Lab notebooks as scientific communication: investigating development from undergraduate courses to graduate research

    CERN Document Server

    Stanley, Jacob T

    2016-01-01

    In experimental physics, lab notebooks play an essential role in the research process. For all of the ubiquity of lab notebooks, little formal attention has been paid to addressing what is considered `best practice' for scientific documentation and how researchers come to learn these practices in experimental physics. Using interviews with practicing researchers, namely physics graduate students, we explore the different experiences researchers had in learning how to effectively use a notebook for scientific documentation. We find that very few of those interviewed thought that their undergraduate lab classes successfully taught them the benefit of maintaining a lab notebook. Most described training in lab notebook use as either ineffective or outright missing from their undergraduate lab course experience. Furthermore, a large majority of those interviewed explained that they did not receive any formal training in maintaining a lab notebook during their graduate school experience and received little to no fe...

  14. Cloud hosting of the IPython Notebook to Provide Collaborative Research Environments for Big Data Analysis

    Science.gov (United States)

    Kershaw, Philip; Lawrence, Bryan; Gomez-Dans, Jose; Holt, John

    2015-04-01

    We explore how the popular IPython Notebook computing system can be hosted on a cloud platform to provide a flexible virtual research hosting environment for Earth Observation data processing and analysis and how this approach can be expanded more broadly into a generic SaaS (Software as a Service) offering for the environmental sciences. OPTIRAD (OPTImisation environment for joint retrieval of multi-sensor RADiances) is a project funded by the European Space Agency to develop a collaborative research environment for Data Assimilation of Earth Observation products for land surface applications. Data Assimilation provides a powerful means to combine multiple sources of data and derive new products for this application domain. To be most effective, it requires close collaboration between specialists in this field, land surface modellers and end users of data generated. A goal of OPTIRAD then is to develop a collaborative research environment to engender shared working. Another significant challenge is that of data volume and complexity. Study of land surface requires high spatial and temporal resolutions, a relatively large number of variables and the application of algorithms which are computationally expensive. These problems can be addressed with the application of parallel processing techniques on specialist compute clusters. However, scientific users are often deterred by the time investment required to port their codes to these environments. Even when successfully achieved, it may be difficult to readily change or update. This runs counter to the scientific process of continuous experimentation, analysis and validation. The IPython Notebook provides users with a web-based interface to multiple interactive shells for the Python programming language. Code, documentation and graphical content can be saved and shared making it directly applicable to OPTIRAD's requirements for a shared working environment. Given the web interface it can be readily made into a hosted

  15. On the Absence of the English Learners’ Vocabulary Notebooks

    Institute of Scientific and Technical Information of China (English)

    曲囡囡

    2009-01-01

    The present study was conducted by looking into students' vocabulary notebooks. The subjects of the research are 44 sophomores from the Foreign Languages Department, Beijing Wuzi University. The vocabulary notebooks they kept during an entire semester serve as the data of the research. The analysis detected an absence of the meaningful collocations that would support learning a word in its fullest sense, together with a lack of variety in the sources from which listed words were chosen and a failure to identify high-frequency words. The study provides useful insight into individual learning styles, and its pedagogical implications are also discussed. The findings of the research might seem pessimistic, but they otherwise point to a field still worth working in.

  16. Developing a notebook protocol for the high school chemistry classroom

    Science.gov (United States)

    Rensing, Roselyn I.

    The focus of this project is to increase science literacy in high school students using a protocol that emphasizes writing. A protocol or lesson sequence comprises a code of behavior to encourage learning through reflection, writing, and self-assessment. A basic protocol may have a sequence of writing elements or tasks which checks for prior knowledge, looks at lesson standards, studies content, and summarizes learning. Using the protocol, students will demonstrate evidence of their learning through writing. The project will identify a progression of tasks which enable students to master content and express mastery through writing. The student's interactive notebook will record evidence of their learning. Several styles of writing and reporting tasks will be explored using the notebook. Students will help implement and identify tasks that demonstrate their knowledge and understanding of science content.

  17. Security studies and Antonio Gramsci’s Prison Notebooks

    OpenAIRE

    Dominik Smyrgała

    2014-01-01

    The article argues that although Antonio Gramsci did not define a new field of research that we could call security studies, his views and ideas on international relations presented in the Prison Notebooks focused around security issues. It may even be stated that his writings anticipated, to some extent, the birth of security studies after the Second World War, and even modern theorizing on economic and cultural security.

  18. Security studies and Antonio Gramsci’s Prison Notebooks

    Directory of Open Access Journals (Sweden)

    Dominik Smyrgała

    2014-12-01

    Full Text Available The article argues that although Antonio Gramsci did not define a new field of research that we could call security studies, his views and ideas on international relations presented in the Prison Notebooks focused around security issues. It may even be stated that his writings anticipated, to some extent, the birth of security studies after the Second World War, and even modern theorizing on economic and cultural security.

  19. Adding Graphical Interactive FITS Image Interaction to Data Analysis in IPython Notebooks

    Science.gov (United States)

    Jeschke, E.

    2014-05-01

    IPython notebooks are becoming a popular and viable approach for documenting data analysis procedures and helping to produce open, reproducible science. Recent developments in the IPython project allow notebooks to be published and viewed on the web, providing a nearly seamless transition from data analysis to publication. In this talk we review and demonstrate the IPython notebook as a data analysis tool, and show how graphical FITS image interaction can be integrated into the workflow to simplify some cumbersome tasks.
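
    For comparison with the interactive Ginga-based workflow described above, the hedged sketch below shows the baseline, non-interactive way of displaying a FITS image inline in a notebook with Astropy and Matplotlib; the file name 'example.fits' and the percentile stretch are placeholders.

```python
# Baseline (non-interactive) FITS display in a notebook cell using
# astropy + matplotlib; 'example.fits' is a placeholder file name.
import matplotlib.pyplot as plt
import numpy as np
from astropy.io import fits

with fits.open("example.fits") as hdul:
    data = hdul[0].data.astype(float)

# Simple percentile stretch so faint structure is visible.
vmin, vmax = np.percentile(data, [5, 99])
plt.figure(figsize=(5, 5))
plt.imshow(data, cmap="gray", origin="lower", vmin=vmin, vmax=vmax)
plt.colorbar(label="counts")
plt.title("FITS image (static display)")
plt.show()
```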

  20. Technology Versus Health: The Occurrence of Muscle-Skeleton Lesions in Undergraduates Using Notebooks

    OpenAIRE

    Vilela Junio, Juscelino Francisco; Associação caruaruense de ensino superior- ASCES; Santos, Jessica Marques dos; ASSOCIAÇÃO CARUARUENSE DE ENSINO SUPERIOR-ASCES; Silva, Rayssa Iracy da; Associação caruaruense de ensino superior-ASCES; Vilela, Juceluce da Silva; FABEJA; Araujo, Evanisia Assis Goes de; Faculdade Associação Caruaruense de Ensino Superior (Faculdade ASCES)

    2015-01-01

    The notebook reached the peak of technological inventions in the early 21st century, featuring compactness and portability. However, excessive use, poor body posture and the ergonomic limitations of the notebook may trigger muscle-skeleton lesions. The current paper investigates, through a descriptive, exploratory, transversal and quantitative study, the occurrence of muscle-skeleton lesions in university students using the notebook. The sample comprised 246 students from a private institution for hig...

  1. A battery-powered notebook thermal cycler for rapid multiplex real-time PCR analysis.

    Science.gov (United States)

    Belgrader, P; Young, S; Yuan, B; Primeau, M; Christel, L A; Pourahmadi, F; Northrup, M A

    2001-01-15

    A compact, real-time PCR instrument was developed for rapid, multiplex analysis of nucleic acids in an inexpensive, portable format. The instrument consists of a notebook computer, two reaction modules with integrated optics for four-color fluorescence detection, batteries, and a battery-charging system. The instrument weighs 3.3 kg, measures 26 x 22 x 7.5 cm, and can run continuously on the internal batteries for 4 h. Independent control of the modules allows differing temperature profiles and detection schemes to be run simultaneously. Results are presented that demonstrate rapid (1) detection and identification of Bacillus subtilis and Bacillus thuringiensis spores and (2) characterization of a single nucleotide polymorphism for the hereditary hemochromatosis gene.

  2. Using Student Notebooks to Measure Opportunity to Learn in Botswana and South African Classrooms

    Science.gov (United States)

    Reeves, Cheryl; Major, Thenjiwe

    2012-01-01

    This article describes how a detailed notebook analysis was used to assess and compare the opportunity to learn of a sample of grade 6 students from 126 classes in South East Botswana and North West Province, South Africa. Students' mathematics notebooks provided the main data source for estimating how much time is spent on the subject during the…

  3. Vocabulary Notebook: A Digital Solution to General and Specific Vocabulary Learning Problems in a CLIL Context

    Science.gov (United States)

    Bazo, Plácido; Rodríguez, Romén; Fumero, Dácil

    2016-01-01

    In this paper, we will introduce an innovative software platform that can be especially useful in a Content and Language Integrated Learning (CLIL) context. This tool is called Vocabulary Notebook, and has been developed to solve all the problems that traditional (paper) vocabulary notebooks have. This tool keeps focus on the personalisation of…

  4. Using Academic Notebooks to Support Achievement and Promote Positive Classroom Environments

    Science.gov (United States)

    Rheingold, Alison; LeClair, Caitlin; Seaman, Jayson

    2013-01-01

    Notebooks are commonly used in middle school classrooms as a place for students to record information delivered via lecture, classroom discussion, or independent work. A primary reason teachers ask students to use notebooks is to capture and organize information. In many cases, students are expected to use these tools with little direction,…

  5. LDUA software custodian's notebook

    Energy Technology Data Exchange (ETDEWEB)

    Aftanas, B.L.

    1998-08-20

    This plan describes the activities to be performed and controls to be applied to the process of specifying, obtaining, and qualifying the control and data acquisition software for the Light Duty Utility Arm (LDUA) System. It serves the purpose of a software quality assurance plan, a verification and validation plan, and a configuration management plan. This plan applies to all software that is an integral part of the LDUA control and data acquisition system, that is, software that is installed in the computers that are part of the LDUA system as it is deployed in the field. This plan applies to the entire development process, including: requirements; design; implementation; and operations and maintenance. This plan does not apply to any software that is not integral with the LDUA system. This plan has been prepared in accordance with WHC-CM-6-1 Engineering Practices, EP-2.1; WHC-CM-3-10 Software Practices; and WHC-CM-4-2, QR 19.0, Software Quality Assurance Requirements.

  6. ANALYSIS POSITIONING OF NOTEBOOK PRODUCED BY HEWLETT PACKARD (HP IN BALI PROVINCE

    Directory of Open Access Journals (Sweden)

    Ni Wayan Sri Ariyani

    2009-12-01

    Full Text Available Nowadays the notebook is no longer a luxury item, but has become one of the things that high school students, university students, and practitioners feel obliged to own. Competition among notebooks has been getting tight, because they are produced with almost similar attributes, quality, and design but at different prices. This study aims at (1) identifying the similarity between the notebooks produced by Hewlett Packard (HP) and those produced by its competitors, such as IBM, Sony, Acer and Toshiba, (2) identifying consumers' perceptions of the notebooks produced by Hewlett Packard (HP) and of those produced by its competitors, (3) identifying the superiority of the notebooks produced by Hewlett Packard (HP) compared with those produced by its competitors, and (4) identifying what strategy is relevant for strengthening the positioning of the notebooks produced by Hewlett Packard (HP). The samples were determined by purposive sampling, and the 100 respondents were spread over Denpasar City. To achieve maximum results, Multi Dimensional Scaling (MDS) was applied as the instrument of analysis to identify the similarity among the notebooks, and Correspondence Analysis (CA) was employed to identify the superiority of each variable of every notebook. The results of the study show that the notebook produced by Hewlett Packard (HP) is perceived to resemble those produced by IBM, Sony, and Toshiba. This means that the notebook produced by Hewlett Packard (HP) competes against those produced by IBM, Sony, and Toshiba, and does not compete against that produced by Acer. The notebook produced by Hewlett Packard (HP) is superior in the variables of product variety, after-sales service, distribution channels, and promotion. The notebook produced by IBM is superior in the variable of product quality, while that produced by Sony is superior in the variable of product design.

  7. TECHNICAL BASIS DOCUMENT FOR AT-POWER SIGNIFICANCE DETERMINATION PROCESS (SDP) NOTEBOOKS.

    Energy Technology Data Exchange (ETDEWEB)

    Azarm, M. A.; Martinez-Guridi, G.; Higgins, J.

    2004-06-01

    To support the assessment of inspection findings as part of the risk-informed inspection in the United States Nuclear Regulatory Commission's (USNRC's) Reactor Oversight Process (ROP), risk inspection notebooks, also called significance determination process (SDP) notebooks, have been developed for each of the operating plants in the United States. These notebooks serve as a tool for assessing risk significance of inspection findings along with providing an engineering understanding of the significance. Plant-specific notebooks are developed to capture plant-specific features, characteristics, and analyses that influence the risk profile of the plant. At the same time, the notebooks follow a consistent set of assumptions and guidelines to assure consistent treatment of inspection findings across the plants. To achieve these objectives, notebooks are designed to provide specific information that are unique both in the manner in which the information is provided and in the way the screening risk assessment is carried out using the information provided. The unique features of the SDP notebooks, the approaches used to present the information for assessment of inspection findings, the assumptions used in consistent modeling across different plants with due credit to plant-specific features and analyses form the technical basis of the SDP notebooks. In this document, the unique features and the technical basis for the notebooks are presented. The types of information that are included and the reasoning/basis for including that information are discussed. The rules and basis for developing the worksheets that are used by the inspectors in the assessment of inspection findings are presented. The approach to modeling plants' responses to different initiating events and specific assumptions/considerations used for each of the reactor types are also discussed.

  8. The pupils’ notebooks as a tool in long-term research and in education

    OpenAIRE

    Studená, Kristina

    2011-01-01

    Czech language education includes literary education, whose function lies mainly in making works of literature accessible to pupils via literary excerpts and their interpretations. The interpretation of literary texts can be approached in several ways, including the use of pupils' notebooks. The paper also shows how notebooks can be used in literary education, what roles they can fulfill, what forms they can take, and what the teacher has to make sure of to make the notebook meth...

  9. Challenges and opportunities in understanding microbial communities with metagenome assembly (accompanied by IPython Notebook tutorial

    Directory of Open Access Journals (Sweden)

    Adina eHowe

    2015-07-01

    Full Text Available Metagenomic investigations hold great promise for informing the genetics, physiology, and ecology of environmental microorganisms. Current challenges for metagenomic analysis are related to our ability to connect the dots between sequencing reads, their population of origin, and the functions they encode. Assembly-based methods reduce dataset size by extending overlapping reads into larger contiguous sequences (contigs), providing contextual information for genetic sequences that does not rely on existing references. These methods, however, tend to be computationally intensive and are again challenged by sequencing errors as well as by genomic repeats. While numerous tools have been developed based on these methodological concepts, they present confounding choices and training requirements to metagenomic investigators. To help with the accessibility of assembly tools, this review also includes an IPython Notebook metagenomic assembly tutorial. This tutorial has instructions for execution on any operating system using Amazon Elastic Compute Cloud and guides users through downloading, assembly, and mapping reads to contigs of a mock microbiome metagenome. Despite its challenges, metagenomic analysis has already revealed novel insights into many environments on Earth. As software, training, and data continue to emerge, metagenomic data access and its discoveries will continue to grow.

  10. Challenges and opportunities in understanding microbial communities with metagenome assembly (accompanied by IPython Notebook tutorial).

    Science.gov (United States)

    Howe, Adina; Chain, Patrick S G

    2015-01-01

    Metagenomic investigations hold great promise for informing the genetics, physiology, and ecology of environmental microorganisms. Current challenges for metagenomic analysis are related to our ability to connect the dots between sequencing reads, their population of origin, and the functions they encode. Assembly-based methods reduce dataset size by extending overlapping reads into larger contiguous sequences (contigs), providing contextual information for genetic sequences that does not rely on existing references. These methods, however, tend to be computationally intensive and are again challenged by sequencing errors as well as by genomic repeats. While numerous tools have been developed based on these methodological concepts, they present confounding choices and training requirements to metagenomic investigators. To help with the accessibility of assembly tools, this review also includes an IPython Notebook metagenomic assembly tutorial. This tutorial has instructions for execution on any operating system using Amazon Elastic Compute Cloud and guides users through downloading, assembly, and mapping reads to contigs of a mock microbiome metagenome. Despite its challenges, metagenomic analysis has already revealed novel insights into many environments on Earth. As software, training, and data continue to emerge, metagenomic data access and its discoveries will continue to grow.

  11. Developing and Validating a Science Notebook Rubric for Fifth-Grade Non-Mainstream Students

    Science.gov (United States)

    Huerta, Margarita; Lara-Alecio, Rafael; Tong, Fuhui; Irby, Beverly J.

    2014-07-01

    We present the development and validation of a science notebook rubric intended to measure the academic language and conceptual understanding of non-mainstream students, specifically fifth-grade male and female economically disadvantaged Hispanic English language learner (ELL) and African-American or Hispanic native English-speaking students. The science notebook rubric is based on two main constructs: academic language and conceptual understanding. The constructs are grounded in second-language acquisition theory and theories of writing and conceptual understanding. We established content validity and calculated reliability measures using G theory and percent agreement (for comparison) with a sample of approximately 144 unique science notebook entries and 432 data points. Results reveal sufficient reliability estimates, indicating that the instrument is promising for use in future research studies including science notebooks in classrooms with populations of economically disadvantaged Hispanic ELL and African-American or Hispanic native English-speaking students.

  12. Exercising autonomous learning approaches through interactive notebooks: a qualitative case study

    National Research Council Canada - National Science Library

    Jaladanki, Vani S; Bhattacharya, Kakali

    2014-01-01

    ... using interactive notebooks to inform students' understanding of physics concepts. The participant for the study was purposefully selected with an intention to gain an in-depth understanding of the experiences...

  13. Lab notebooks as scientific communication: Investigating development from undergraduate courses to graduate research

    Science.gov (United States)

    Stanley, Jacob T.; Lewandowski, H. J.

    2016-12-01

    In experimental physics, lab notebooks play an essential role in the research process. For all of the ubiquity of lab notebooks, little formal attention has been paid to addressing what is considered "best practice" for scientific documentation and how researchers come to learn these practices in experimental physics. Using interviews with practicing researchers, namely, physics graduate students, we explore the different experiences researchers had in learning how to effectively use a notebook for scientific documentation. We find that very few of those interviewed thought that their undergraduate lab classes successfully taught them the benefit of maintaining a lab notebook. Most described training in lab notebook use as either ineffective or outright missing from their undergraduate lab course experience. Furthermore, a large majority of those interviewed explained that they did not receive any formal training in maintaining a lab notebook during their graduate school experience and received little to no feedback from their advisors on these records. Many of the interviewees describe learning the purpose of, and how to maintain, these kinds of lab records only after having a period of trial and error, having already started doing research in their graduate program. Despite the central role of scientific documentation in the research enterprise, these physics graduate students did not gain skills in documentation through formal instruction, but rather through informal hands-on practice.

  14. Integration of ROOT notebook as an ATLAS analysis web-based tool in outreach and public data release projects

    CERN Document Server

    Sanchez Pineda, Arturo; The ATLAS collaboration

    2016-01-01

    Integration of the ROOT data analysis framework with the Jupyter Notebook technology offers the potential to enhance and expand educational and training programs. It can be beneficial for university students in their early years, new PhD students and post-doctoral researchers, as well as for senior researchers and teachers who want to refresh their data analysis skills or to introduce a friendly and yet very powerful open-source tool in the classroom. Such tools have already been tested in several environments. A fully web-based integration of the tools and the Open Access Data repositories brings the possibility to go a step further in the ATLAS effort to make use of several CERN projects in the field of education and training, developing new computing solutions along the way.

  15. Integration of ROOT Notebooks as an ATLAS analysis web-based tool in outreach and public data release

    CERN Document Server

    Sanchez, Arturo; The ATLAS collaboration

    2016-01-01

    The integration of the ROOT data analysis framework with the Jupyter Notebook technology presents great potential for enhancing and expanding educational and training programs: starting from university students in their early years, passing to new ATLAS PhD students and post-doctoral researchers, to senior analysers and professors who want to renew their contact with data analysis or to include a friendly but very powerful open-source tool in the classroom. Such tools have already been tested in several environments, and a fully web-based integration together with Open Access Data repositories brings the possibility to go a step further in ATLAS's search for integration between several CERN projects in the field of education and training, developing new computing solutions along the way.
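
    To give a flavour of what a ROOT-in-notebook cell can look like, the minimal PyROOT sketch below books and draws a toy histogram. It assumes a ROOT build with Python bindings and Jupyter integration; the histogram contents are synthetic and are not ATLAS data.

```python
# Minimal PyROOT example of the kind of cell a ROOT-based notebook
# tutorial might contain; the data here are synthetic, not ATLAS data.
import ROOT

# Book a 1D histogram: name, title (with axis labels), bins, range.
h = ROOT.TH1F("h_mass", "Toy invariant mass;m [GeV];events", 100, 0.0, 200.0)
h.FillRandom("gaus", 10_000)   # fill with ROOT's built-in Gaussian samples

c = ROOT.TCanvas("c", "demo", 600, 400)
h.Draw("HIST")
c.Draw()   # in a Jupyter notebook this renders the canvas inline
```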

  16. The two dimensional fold test in paleomagnetism using ipython notebook

    Science.gov (United States)

    Setiabudidaya, Dedi; Piper, John D. A.

    2016-01-01

    One aspect of paleomagnetic analysis prone to controversy is the result of the fold test used to evaluate the age of a magnetisation component relative to the age of a structural event. Initially, the fold test was conducted by comparing the Fisherian precision parameter (k) computed from results on different limbs of a fold structure before and after tilt adjustment. To accommodate synfolding magnetisation, the tilt correction can be performed in stepwise fashion on both limbs simultaneously, here called the one-dimensional (1D) fold test. The two-dimensional (2D) fold test described in this paper is carried out by applying stepwise tilt adjustment to each limb of the fold separately. The rationale is that tilts observed on contrasting limbs of a deformed structure may not be synchronous or may not even belong to the same episode of deformation. A program for the procedure is presented which generates two-dimensional values of the k parameter, presented visually in contoured form. The use of the IPython notebook enables this 2D fold test to be performed interactively and to yield a more precise evaluation than the basic 1D fold test.
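
    The authors' notebook program is not reproduced here. As a simplified sketch of the 2D idea under stated assumptions (plain rotation about a horizontal strike axis, hypothetical directions and bedding attitudes), the code below untilts each limb by an independent fraction of its dip and maps Fisher's precision parameter k = (N - 1)/(N - R) over the resulting grid, which could then be contoured.

```python
# Simplified 2D fold-test sketch: untilt each limb by an independent
# percentage of its dip and map Fisher's precision parameter k over the
# grid. Illustrative only, not the authors' program; it assumes simple
# rotation about a horizontal strike axis.
import numpy as np

def dir2cart(dec, inc):
    """Declination/inclination (degrees) -> unit vectors (N, E, Down)."""
    dec, inc = np.radians(dec), np.radians(inc)
    return np.column_stack((np.cos(inc) * np.cos(dec),
                            np.cos(inc) * np.sin(dec),
                            np.sin(inc)))

def untilt(dec, inc, strike, dip, fraction):
    """Rotate directions about the horizontal strike axis by -fraction*dip."""
    v = dir2cart(dec, inc)
    axis = dir2cart(np.array([strike]), np.array([0.0]))[0]
    ang = -np.radians(dip) * fraction
    cos_a, sin_a = np.cos(ang), np.sin(ang)
    # Rodrigues' rotation formula, applied row-wise.
    return (v * cos_a
            + np.cross(axis, v) * sin_a
            + np.outer(v @ axis, axis) * (1 - cos_a))

def fisher_k(vectors):
    """Fisher precision parameter k = (N - 1) / (N - R)."""
    n = len(vectors)
    r = np.linalg.norm(vectors.sum(axis=0))
    return (n - 1) / (n - r)

# Hypothetical directions (dec, inc) from two limbs and their bedding.
limb1 = (np.array([10., 15., 5., 12.]), np.array([40., 45., 38., 42.]))
limb2 = (np.array([350., 355., 345., 352.]), np.array([-20., -15., -25., -18.]))
bed1 = (90.0, 50.0)    # strike, dip of limb 1
bed2 = (270.0, 40.0)   # strike, dip of limb 2

steps = np.linspace(0.0, 1.0, 21)          # 0% .. 100% untilting
k_grid = np.empty((steps.size, steps.size))
for i, f1 in enumerate(steps):
    for j, f2 in enumerate(steps):
        combined = np.vstack((untilt(*limb1, *bed1, f1),
                              untilt(*limb2, *bed2, f2)))
        k_grid[i, j] = fisher_k(combined)

best = np.unravel_index(k_grid.argmax(), k_grid.shape)
print(f"k is maximised at {steps[best[0]]:.0%} / {steps[best[1]]:.0%} untilting")
```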

  17. Electronic laboratory notebooks in a public–private partnership

    Directory of Open Access Journals (Sweden)

    Lea A.I. Vaas

    2016-09-01

    Full Text Available This report shares experience from the selection, implementation and maintenance phases of an electronic laboratory notebook (ELN) in a public–private partnership project and comments on user feedback. In particular, we address the time constraints that exist for the roll-out of an ELN in granted projects and the benefits and/or restrictions that come with out-of-the-box solutions. We discuss several options for the implementation of support functions and the potential advantages of open-access solutions. Connected to that, we identified willingness to share data, and a vivid culture of data sharing, as the major factor in the success or failure of collaborative research activities. The feedback from users turned out to be the only angle for driving technical improvements, but it also proved highly effective. Based on these experiences, we describe best practices for future projects on the implementation and support of an ELN supporting a diverse, multidisciplinary user group based in academia, NGOs, and/or for-profit corporations located in multiple time zones.

  18. PDS MSL Analyst's Notebook: Supporting Active Rover Missions and Adding Value to Planetary Data Archives

    Science.gov (United States)

    Stein, Thomas

    Planetary data archives of surface missions contain data from numerous hosted instruments. Because of the nondeterministic nature of surface missions, it is not possible to assess the data without understanding the context in which they were collected. The PDS Analyst's Notebook (http://an.rsl.wustl.edu) provides access to Mars Science Laboratory (MSL) data archives by integrating sequence information, engineering and science data, observation planning and targeting, and documentation into web-accessible pages to facilitate "mission replay." In addition, Mars Exploration Rover (MER), Mars Phoenix Lander, Lunar Apollo surface mission, and LCROSS mission data are available in the Analyst's Notebook concept, and a Notebook is planned for the Insight mission. The MSL Analyst's Notebook contains data, documentation, and support files for the Curiosity rover. The inputs are incorporated on a daily basis into a science team version of the Notebook. The public version of the Analyst's Notebook comprises peer-reviewed, released data and is updated coincident with PDS data releases as defined in mission archive plans. The data are provided by the instrument teams and are supported by documentation describing data format, content, and calibration. Both operations and science data products are included. The operations versions are generated to support mission planning and operations on a daily basis. They are geared toward researchers working on machine vision and engineering operations. Science versions of observations from some instruments are provided for those interested in radiometric and photometric analyses. Both data set documentation and sol (i.e., Mars day) documents are included in the Notebook. The sol documents are the mission manager and documentarian reports that provide a view into science operations, including insight into why and how particular observations were made. Data set documents contain detailed information regarding the mission, spacecraft

  19. Blog-based research notebook: Personal informatics workbench for high-throughput experimentation

    Science.gov (United States)

    Todoroki, Shin-ichi; Konishi, Tomoya; Inoue, Satoru

    2006-01-01

    In this age of information technology, many researchers still conservatively keep a log of their activities in paper-based notebooks. This style of log-keeping leads to a situation in which our experimental data are recorded on hard disks while their descriptions are recorded on paper. Such a separation of data is likely to be a serious rate-limiting factor in high-throughput experimentation, from the viewpoint of each researcher getting feedback from what he or she has already done. We propose to utilize a blog (weblog) as an electronic research notebook and discuss the technical requirements for maintaining it, on the basis of four years of blogging experience by one of the authors. We need a user-installed blog server with an authentication function for personalization, and network infrastructure enabling us to "blog anytime, anywhere". Although some knowledge-sharing systems have similar electronic notebooks as their front-end, the present blog system is different from these because it stores personal information which is not meant to be shared with others. This blog-based notebook cooperates with these e-notebooks by promoting hyperlinks among their contents, and acts as a personal informatics workbench providing connections to all the resources needed.

  20. Developing an Audiovisual Notebook as a Self-Learning Tool in Histology: Perceptions of Teachers and Students

    Science.gov (United States)

    Campos-Sánchez, Antonio; López-Núñez, Juan-Antonio; Scionti, Giuseppe; Garzón, Ingrid; González-Andrades, Miguel; Alaminos, Miguel; Sola, Tomás

    2014-01-01

    Videos can be used as didactic tools for self-learning under several circumstances, including those cases in which students are responsible for the development of this resource as an audiovisual notebook. We compared students' and teachers' perceptions regarding the main features that an audiovisual notebook should include. Four…

  1. Developing an Audiovisual Notebook as a Self-Learning Tool in Histology: Perceptions of Teachers and Students

    Science.gov (United States)

    Campos-Sánchez, Antonio; López-Núñez, Juan-Antonio; Scionti, Giuseppe; Garzón, Ingrid; González-Andrades, Miguel; Alaminos, Miguel; Sola, Tomás

    2014-01-01

    Videos can be used as didactic tools for self-learning under several circumstances, including those cases in which students are responsible for the development of this resource as an audiovisual notebook. We compared students' and teachers' perceptions regarding the main features that an audiovisual notebook should include. Four…

  2. Une lecture, mille et une reflexions: Cahier de reflexion sur le processus de lecture. (One Reading, 1001 Reflections: Notebook of Reflection on the Reading Process.)

    Science.gov (United States)

    Alberta Dept. of Education, Edmonton.

    This activity notebook is intended to help French-speaking students in Alberta, Canada, develop reflective reading practices. Following an introduction and information (with graphics) on the notebook's organization, the notebook is divided into three sections of reading strategies: the first section contains three activities, the second section…

  3. Teaching numerical methods with IPython notebooks and inquiry-based learning

    KAUST Repository

    Ketcheson, David I.

    2014-01-01

    A course in numerical methods should teach both the mathematical theory of numerical analysis and the craft of implementing numerical algorithms. The IPython notebook provides a single medium in which mathematics, explanations, executable code, and visualizations can be combined, and with which the student can interact in order to learn both the theory and the craft of numerical methods. The use of notebooks also lends itself naturally to inquiry-based learning methods. I discuss the motivation and practice of teaching a course based on the use of IPython notebooks and inquiry-based learning, including some specific practical aspects. The discussion is based on my experience teaching a Masters-level course in numerical analysis at King Abdullah University of Science and Technology (KAUST), but is intended to be useful for those who teach at other levels or in industry.
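
    As an illustration of the kind of cell such a course might pair with its mathematical narrative (and not an excerpt from the KAUST course materials), the sketch below applies the forward Euler method to y' = -2y and checks the expected first-order convergence empirically.

```python
# Illustrative numerical-methods notebook cell (not from the KAUST course):
# forward Euler for y' = -2y, y(0) = 1, with an empirical convergence check.
import numpy as np

def euler(f, y0, t0, t1, n):
    """Forward Euler with n steps on [t0, t1]; returns the value at t1."""
    h = (t1 - t0) / n
    y = y0
    for k in range(n):
        y = y + h * f(t0 + k * h, y)
    return y

f = lambda t, y: -2.0 * y
exact = np.exp(-2.0)          # exact y(1) for y(0) = 1

errors = []
for n in (10, 20, 40, 80, 160):
    err = abs(euler(f, 1.0, 0.0, 1.0, n) - exact)
    errors.append(err)
    print(f"n = {n:4d}   error = {err:.3e}")

# Successive error ratios should approach 2 for a first-order method.
ratios = [errors[i] / errors[i + 1] for i in range(len(errors) - 1)]
print("error ratios:", [f"{r:.2f}" for r in ratios])
```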

  4. SMART NOTEBOOK AS AN ICT WAY FОR DEVELOPMENT OF RESEARCH COMPETENCE

    Directory of Open Access Journals (Sweden)

    Svitlana V. Vasylenko

    2014-05-01

    Full Text Available The article discusses the benefits of the educational process for general and higher education through the development of research competence of students, information and communication technology training. These technologies are used in many areas of activity, including updated content of education, implementing of distance learning, introducing new forms of collaboration. Attention is accented on the features using SMART Notebook software for organizing the learning process in the form of interactive sessions, clarified a basic arguments to use SMART Notebook for creation an author's educational resources, to orient teachers to construct their personal innovative methodical system.

  5. Benefits briefing notebook: The secondary application of aerospace technology in other sectors of the economy

    Science.gov (United States)

    1976-01-01

    Resource information on the transfer of aerospace technology to other sectors of the U.S. economy is presented. The contents of this notebook are divided into three sections: (1) benefit cases, (2) transfer overview, and (3) indexes. Transfer examples relevant to each subject area are presented. Pertinent transfer data are given. The Transfer Overview section provides a general perspective for technology transfer from NASA to other organizations. In addition to a description of the basic transfer modes, the selection criteria for notebook examples and the kinds of benefit data they contain are also presented.

  6. A Performance Evaluation of a Notebook PC under a High Dose-Rate Gamma Ray Irradiation Test

    Directory of Open Access Journals (Sweden)

    Jai Wan Cho

    2014-01-01

    Full Text Available We describe the performance of a notebook PC under a high dose-rate gamma ray irradiation test. A notebook PC, which is small and light weight, is generally used as the control unit of a robot system and loaded onto the robot body. Using TEPCO’s CAMS (containment atmospheric monitoring system data, the gamma ray dose rate before and after a hydrogen explosion in reactor units 1–3 of the Fukushima nuclear power plant was more than 150 Gy/h. To use a notebook PC as the control unit of a robot system entering a reactor building to mitigate the severe accident situation of a nuclear power plant, the performance of the notebook PC under such intense gamma-irradiation fields should be evaluated. Under a similar dose-rate (150 Gy/h gamma ray environment, the performances of different notebook PCs were evaluated. In addition, a simple method for a performance evaluation of a notebook PC under a high dose-rate gamma ray irradiation test is proposed. Three notebook PCs were tested to verify the method proposed in this paper.

  7. The Roles of Engineering Notebooks in Shaping Elementary Engineering Student Discourse and Practice

    Science.gov (United States)

    Hertel, Jonathan D.; Cunningham, Christine M.; Kelly, Gregory J.

    2017-01-01

    Engineering design challenges offer important opportunities for students to learn science and engineering knowledge and practices. This study examines how students' engineering notebooks across four units of the curriculum "Engineering is Elementary" (EiE) support student work during design challenges. Through educational ethnography and…

  8. Writing Material in Chemical Physics Research: The Laboratory Notebook as Locus of Technical and Textual Integration

    Science.gov (United States)

    Wickman, Chad

    2010-01-01

    This article, drawing on ethnographic study in a chemical physics research facility, explores how notebooks are used and produced in the conduct of laboratory science. Data include written field notes of laboratory activity; visual documentation of "in situ" writing processes; analysis of inscriptions, texts, and material artifacts produced in the…

  9. Impact of the implementation of a well-designed electronic laboratory notebook on bioanalytical laboratory function.

    Science.gov (United States)

    Zeng, Jianing; Hillman, Mark; Arnold, Mark

    2011-07-01

    This paper shares experiences of the Bristol-Myers Squibb Company during the design, validation and implementation of an electronic laboratory notebook (ELN) into the GLP/regulated bioanalytical analysis area, as well as addresses the impact on bioanalytical laboratory functions with the implementation of the electronic notebook. Some of the key points covered are: knowledge management - the project-based electronic notebook takes full advantage of the available technology that focuses on data organization and sharing so that scientific data generated by individual scientists became department knowledge; bioanalytical workflows in the ELN - the custom-built workflows that include data entry templates, validated calculation processes, integration with laboratory information management systems/laboratory instruments, and reporting capability improve the data quality and overall workflow efficiency; regulatory compliance - carefully designed notebook reviewing processes, cross referencing of distributed information, audit trail and software validation reduce compliance risks. By taking into consideration both data generation and project documentation needs, a well-designed ELN can deliver significant improvements in laboratory efficiency, work productivity, and regulatory compliance.

  11. Jupyter meets Earth: Creating Comprehensible and Reproducible Scientific Workflows with Jupyter Notebooks and Google Earth Engine

    Science.gov (United States)

    Erickson, T.

    2016-12-01

    Deriving actionable information from Earth observation data obtained from sensors or models can be quite complicated, and sharing those insights with others in a form that they can understand, reproduce, and improve upon is equally difficult. Journal articles, even if digital, commonly present just a summary of an analysis that cannot be understood in depth or reproduced without major effort on the part of the reader. Here we show a method of improving scientific literacy by pairing a recently developed scientific presentation technology (Jupyter Notebooks) with a petabyte-scale platform for accessing and analyzing Earth observation and model data (Google Earth Engine). Jupyter Notebooks are interactive web documents that mix live code with annotations such as rich-text markup, equations, images, videos, hyperlinks and dynamic output. Notebooks were first introduced as part of the IPython project in 2011, and have since gained wide acceptance in the scientific programming community, initially among Python programmers but later across a wide range of scientific programming languages. While Jupyter Notebooks have been widely adopted for general data analysis, data visualization, and machine learning, to date there have been relatively few examples of using Jupyter Notebooks to analyze geospatial datasets. Google Earth Engine is a cloud-based platform for analyzing geospatial data, such as satellite remote sensing imagery and/or Earth system model output. Through its Python API, Earth Engine makes petabytes of Earth observation data accessible, and provides hundreds of algorithmic building blocks that can be chained together to produce high-level algorithms and outputs in real-time. We anticipate that this technology pairing will facilitate a better way of creating, documenting, and sharing complex analyses that derive information on our Earth that can be used to promote broader understanding of the complex issues that it faces. http://jupyter.org https://earthengine.google.com
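
    As a rough, hedged illustration of the pairing described in this record, the sketch below calls the Earth Engine Python API from a notebook to reduce an image collection to one regional statistic. It assumes an authenticated Earth Engine account; the dataset ID, band, date range and region are placeholder choices, not values taken from the abstract.

        # Illustrative sketch: regional mean NDVI via the Earth Engine Python API.
        import ee

        ee.Initialize()  # assumes prior authentication (ee.Authenticate())

        # Placeholder dataset, band and region.
        collection = (ee.ImageCollection("MODIS/006/MOD13A2")
                      .filterDate("2015-01-01", "2015-12-31")
                      .select("NDVI"))
        region = ee.Geometry.Rectangle([-122.6, 37.0, -121.8, 37.9])

        stats = collection.mean().reduceRegion(reducer=ee.Reducer.mean(),
                                               geometry=region,
                                               scale=1000)
        print(stats.getInfo())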

  12. Interactive Parallel Data Analysis within Data-Centric Cluster Facilities using the IPython Notebook

    Science.gov (United States)

    Pascoe, S.; Lansdowne, J.; Iwi, A.; Stephens, A.; Kershaw, P.

    2012-12-01

    The data deluge is making traditional analysis workflows for many researchers obsolete. Support for parallelism within popular tools such as MATLAB, IDL and NCO is not well developed and rarely used. However, parallelism is necessary for processing modern data volumes on a timescale conducive to curiosity-driven analysis. Furthermore, for peta-scale datasets such as the CMIP5 archive, it is no longer practical to bring an entire dataset to a researcher's workstation for analysis, or even to their institutional cluster. Therefore, there is an increasing need to develop new analysis platforms which both enable processing at the point of data storage and provide parallelism. Such an environment should, where possible, maintain the convenience and familiarity of our current analysis environments to encourage curiosity-driven research. We describe how we are combining the interactive Python shell (IPython) with our JASMIN data-cluster infrastructure. IPython has been specifically designed to bridge the gap between HPC-style parallel workflows and the opportunistic curiosity-driven analysis usually carried out using domain-specific languages and scriptable tools. IPython offers a web-based interactive environment, the IPython notebook, and a cluster engine for parallelism, all underpinned by the well-respected Python/SciPy scientific programming stack. JASMIN is designed to support the data analysis requirements of the UK and European climate and earth system modeling community. JASMIN, with its sister facility CEMS focusing on the Earth observation community, has 4.5 PB of fast parallel disk storage alongside over 370 computing cores that provide local computation. Through the IPython interface to JASMIN, users can make efficient use of JASMIN's multi-core virtual machines to perform interactive analysis on all cores simultaneously, or can configure IPython clusters across multiple VMs. Larger-scale clusters can be provisioned through JASMIN's batch scheduling system.
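
    The interactive-parallel pattern described here can be sketched with the IPython parallel machinery (distributed today as the ipyparallel package). The snippet below assumes a set of engines has already been started, for example with "ipcluster start -n 8"; the per-file analysis function and the file paths are stand-ins for real processing.

        # Sketch of interactive parallelism with IPython/ipyparallel.
        # Assumes engines are already running (e.g. "ipcluster start -n 8").
        from ipyparallel import Client

        def analyse_one_file(path):
            # Stand-in for real work, e.g. opening a NetCDF file and reducing it.
            return (path, len(path))

        rc = Client()                    # connect to the running engines
        view = rc.load_balanced_view()   # dynamic load balancing

        paths = [f"/data/archive/file_{i:03d}.nc" for i in range(100)]  # hypothetical
        results = view.map_sync(analyse_one_file, paths)
        print(results[:3])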

  13. A suite of Mathematica notebooks for the analysis of protein main chain 15N NMR relaxation data.

    Science.gov (United States)

    Spyracopoulos, Leo

    2006-12-01

    A suite of Mathematica notebooks has been designed to ease the analysis of protein main chain 15N NMR relaxation data collected at a single magnetic field strength. Individual notebooks were developed to perform the following tasks: nonlinear fitting of 15N-T1 and -T2 relaxation decays to a two-parameter exponential decay, calculation of the principal components of the inertia tensor from protein structural coordinates, nonlinear optimization of the principal components and orientation of the axially symmetric rotational diffusion tensor, model-free analysis of 15N-T1, -T2, and {1H}-15N NOE data, and reduced spectral density analysis of the relaxation data. The principal features of the notebooks include use of a minimal number of input files, integrated notebook data management, ease of use, cross-platform compatibility, automatic visualization of results and generation of high-quality graphics, and output of analyses in text format.
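
    The first task in the list, fitting a relaxation decay to a two-parameter exponential, maps directly onto a few lines of nonlinear least squares. The sketch below is a generic Python analogue rather than the authors' Mathematica code; the delay times and intensities are synthetic stand-ins for measured peak heights.

        # Generic two-parameter exponential fit, analogous to the T1/T2 fitting step.
        import numpy as np
        from scipy.optimize import curve_fit

        def decay(t, i0, t1):
            return i0 * np.exp(-t / t1)

        delays = np.array([0.01, 0.05, 0.1, 0.2, 0.4, 0.8, 1.2])   # s (synthetic)
        rng = np.random.default_rng(0)
        intensities = decay(delays, 1.0, 0.45) * (1 + 0.02 * rng.standard_normal(delays.size))

        (i0_fit, t1_fit), _ = curve_fit(decay, delays, intensities, p0=(1.0, 0.5))
        print(f"I0 = {i0_fit:.3f}, T1 = {t1_fit:.3f} s, R1 = {1.0 / t1_fit:.2f} 1/s")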

  14. Evaluating Stream Filtering for Entity Profile Updates in TREC 2012, 2013, and 2014 (KBA Track Overview, Notebook Paper)

    Science.gov (United States)

    2014-11-01

    Evaluating Stream Filtering for Entity Profile Updates in TREC 2012, 2013, and 2014 (KBA Track Overview, Notebook Paper). John R… …analysis – Language parsing and understanding. General Terms: Experimentation, Measurement. This overview paper describes the…

  15. Requirement analysis for an electronic laboratory notebook for sustainable data management in biomedical research.

    Science.gov (United States)

    Menzel, Julia; Weil, Philipp; Bittihn, Philip; Hornung, Daniel; Mathieu, Nadine; Demiroglu, Sara Y

    2013-01-01

    Sustainable data management in biomedical research requires documentation of metadata for all experiments and results. Scientists usually document research data and metadata in laboratory paper notebooks. An electronic laboratory notebook (ELN) can keep metadata linked to research data resulting in a better understanding of the research results, meaning a scientific benefit [1]. Besides other challenges [2], the biggest hurdles for introducing an ELN seem to be usability, file formats, and data entry mechanisms [3] and that many ELNs are assigned to specific research fields such as biology, chemistry, or physics [4]. We aimed to identify requirements for the introduction of ELN software in a biomedical collaborative research center [5] consisting of different scientific fields and to find software fulfilling most of these requirements.

  16. Conceptions of Hegemony in Antonio Gramsci’s Southern Question and the Prison Notebooks

    OpenAIRE

    Ercan Gündoğan

    2008-01-01

    The article focuses on Antonio Gramsci’s Southern Question and the Prison Notebooks and tries to demonstrate that he just re-theorises the formative stages of class power beginning from economic relations to political power, in other words, ruling class power developing from civil hegemony into political hegemony along the lines of classical Marxist texts. For Gramsci, hegemony does not only refer to ideological and cultural leadership of the ruling groups and classes over the allies, but als...

  17. The retextualization of the genre Notebook of Reality in the Pedagogy of Alternation

    OpenAIRE

    Cicero da Silva; Karylleila dos Santos Andrade (UFT); Flavio Moreira

    2015-01-01

    This work has the purpose to discuss constitutive aspects of the genre Notebook of Reality (NR) and its process of retextualization. The approach of the investigated object is from a perspective of interpretative analysis, since it is a qualitative research based on studies of discourse genres (BAKHTIN, 2006), with emphasis on retextualization (DELL’ISOLA, 2007; MARCUSCHI, 2007). It is an exploratory and descriptive research, with documental data collection. The sample consists of texts of 01...

  18. OpenLabNotes - An Electronic Laboratory Notebook Extension for OpenLabFramework

    OpenAIRE

    List, Markus; Franz, Michael; Tan, Qihua; Mollenhauer, Jan; Baumbach, Jan

    2015-01-01

    Electronic laboratory notebooks (ELNs) are more accessible and reliable than their paper-based alternatives and thus find widespread adoption. While a large number of commercial products are available, small- to mid-sized laboratories often cannot afford the costs or are concerned about the longevity of the providers. Turning towards free alternatives, however, raises questions about data protection, which are not sufficiently addressed by available solutions. To serve as legal documents, ELN...

  19. Lesson Activity Toolkit v Smart Notebook pro interaktivní tabule

    OpenAIRE

    HAJDUCH, Petr

    2013-01-01

    Lesson Activity Toolkit is a collection of mini applications (applets) in the Smart Notebook software. It is used to create educational materials for the interactive whiteboard. It is a Flash application which can be loaded with extra content. This Bachelor's thesis focuses on these applications: it elaborates on how these activities are used in contemporary education, how to work with them, and which modules are not present in the gallery and could usefully extend it. It a...

  20. The retextualization of the genre Notebook of Reality in the Pedagogy of Alternation

    Directory of Open Access Journals (Sweden)

    Cicero da Silva

    2015-10-01

    Full Text Available This work has the purpose to discuss constitutive aspects of the genre Notebook of Reality (NR) and its process of retextualization. The approach of the investigated object is from a perspective of interpretative analysis, since it is a qualitative research based on studies of discourse genres (BAKHTIN, 2006), with emphasis on retextualization (DELL’ISOLA, 2007; MARCUSCHI, 2007). It is an exploratory and descriptive research, with documental data collection. The sample consists of texts of 01 (one) copy of the Notebook of Reality (NR), a didactic-pedagogic tool of the educational units that adopt the Pedagogy of Alternation (PA). The texts were produced by 01 (one) student of the 9th grade of a school named Escola Família Agrícola (EFA), situated in a city of Tocantins. The research takes into account the theoretical and methodological conceptions of the PA context, as well as the discursive approach of the genre Notebook of Reality (NR), focusing on the activity of retextualization. On the retextualizations we focused on the transformation of written genres, the transposition of the contents of a text to another, as well as the genre change. Thus, we seek to identify and characterize the texts of NR produced from the retextualization process as well as the required pedagogical practices.

  1. Drawing and Writing in Digital Science Notebooks: Sources of Formative Assessment Data

    Science.gov (United States)

    Shelton, Angi; Smith, Andrew; Wiebe, Eric; Behrle, Courtney; Sirkin, Ruth; Lester, James

    2016-06-01

    Formative assessment strategies are used to direct instruction by establishing where learners' understanding is, how it is developing, informing teachers and students alike as to how they might get to their next set of goals of conceptual understanding. For the science classroom, one rich source of formative assessment data about scientific thinking and practice is in notebooks used during inquiry-oriented activities. In this study, the goal was to better understand how student knowledge was distributed between student drawings and writings about magnetism in notebooks, and how these findings might inform formative assessment strategies. Here, drawing and writing samples were extracted and evaluated from our digital science notebook, with embedded content and laboratories. Three drawings and five writing samples from 309 participants were analyzed using a common ten-dimensional rubric. Descriptive and inferential statistics revealed that fourth-grade student understanding of magnetism was distributed unevenly between writing and drawing. Case studies were then presented for two exemplar students. Based on the rubric we developed, students were able to articulate more of their knowledge through the drawing activities than through written word, but the combination of the two mediums provided the richest understanding of student conceptions and how they changed over the course of their investigations.

  2. IPython: components for interactive and parallel computing across disciplines. (Invited)

    Science.gov (United States)

    Perez, F.; Bussonnier, M.; Frederic, J. D.; Froehle, B. M.; Granger, B. E.; Ivanov, P.; Kluyver, T.; Patterson, E.; Ragan-Kelley, B.; Sailer, Z.

    2013-12-01

    Scientific computing is an inherently exploratory activity that requires constantly cycling between code, data and results, each time adjusting the computations as new insights and questions arise. To support such a workflow, good interactive environments are critical. The IPython project (http://ipython.org) provides a rich architecture for interactive computing with: 1. Terminal-based and graphical interactive consoles. 2. A web-based Notebook system with support for code, text, mathematical expressions, inline plots and other rich media. 3. Easy to use, high performance tools for parallel computing. Despite its roots in Python, the IPython architecture is designed in a language-agnostic way to facilitate interactive computing in any language. This allows users to mix Python with Julia, R, Octave, Ruby, Perl, Bash and more, as well as to develop native clients in other languages that reuse the IPython clients. In this talk, I will show how IPython supports all stages in the lifecycle of a scientific idea: 1. Individual exploration. 2. Collaborative development. 3. Production runs with parallel resources. 4. Publication. 5. Education. In particular, the IPython Notebook provides an environment for "literate computing" with a tight integration of narrative and computation (including parallel computing). These Notebooks are stored in a JSON-based document format that provides an "executable paper": notebooks can be version controlled, exported to HTML or PDF for publication, and used for teaching.
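
    The "executable paper" property mentioned at the end of this record rests on the notebook being an ordinary JSON document. As a small, hedged illustration (the file name is hypothetical), the nbformat library can read and inspect that document directly:

        # Sketch: a notebook is a JSON document that can be inspected programmatically.
        import nbformat

        nb = nbformat.read("analysis.ipynb", as_version=4)   # hypothetical file
        print(nb.metadata.get("kernelspec", {}).get("name", "unknown kernel"))

        for cell in nb.cells:
            first_line = cell.source.splitlines()[0] if cell.source else ""
            print(f"{cell.cell_type:8s} | {first_line[:60]}")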

  3. Assured Access/Mobile Computing Initiatives on Five University Campuses.

    Science.gov (United States)

    Blurton, Craig; Chee, Yam San; Long, Phillip D.; Resmer, Mark; Runde, Craig

    Mobile computing and assured access are becoming popular terms to describe a growing number of university programs which take advantage of ubiquitous network access points and the portability of notebook computers to ensure all students have access to digital tools and resources. However, the implementation of such programs varies widely from…

  4. Notebook Positioning by Perceptual Map and Laddering Method

    Directory of Open Access Journals (Sweden)

    Mehdi Zaribaf

    2012-01-01

    Full Text Available The purpose of this applied paper is to determine the positioning of laptops by perceptual map and laddering method. The data collection method was a survey using a questionnaire. First, using the views of computer experts on the selected brands, including Sony, Acer, Asus, Dell and Msi, 5 of 20 attributes that were considered most important were selected through application of the factor analysis technique. Then, after classification of these features by the laddering technique, they were analyzed using another questionnaire to determine the position of each brand in the perceptual map. We show the factors and the positions of the laptop brands with regard to the two factors of price and quality in the resulting perceptual map.

  5. Universal Design for Learning and Elementary School Science: Exploring the Efficacy, Use, and Perceptions of a Web-Based Science Notebook

    Science.gov (United States)

    Rappolt-Schlichtmann, Gabrielle; Daley, Samantha G.; Lim, Seoin; Lapinski, Scott; Robinson, Kristin H.; Johnson, Mindy

    2013-01-01

    Science notebooks can play a critical role in activity-based science learning, but the tasks of recording, organizing, analyzing, and interpreting data create barriers that impede science learning for many students. This study (a) assessed in a randomized controlled trial the potential for a web-based science notebook designed using the Universal…

  6. Communication of the monitoring and evaluation process through the use of storyboards and story notebooks.

    Science.gov (United States)

    Lewis, L C; Honea, S H; Kanter, D F; Haney, P E

    1993-10-01

    In preparation for the 1993 Joint Commission on Accreditation of Health Care Organizations (JCAHO) survey, Audie L. Murphy Memorial Veterans Hospital Nursing Service was faced with determining the best approach to presenting their Total Quality Improvement/Total Quality Management (TQI/TQM) process. Nursing Service management and staff, Quality Improvement Clinicians, and medical staff used the Storyboard concept and the accompanying Story Notebooks to organize and to communicate their TQI/TQM process and findings. This concept was extremely beneficial, enabling staff to successfully present the multidisciplinary TQI/TQM data to the JCAHO surveyors.

  7. Two Combinatorial Proofs, and a Generalization, of an Identity from Ramanujan's Lost Notebook

    CERN Document Server

    Levande, Paul

    2009-01-01

    We examine an identity originally stated in Ramanujan's "lost notebook" and first proven, algebraically, by Andrews. We first show that the identity can be rewritten, using minor algebraic manipulation, into an identity that can be proven with a direct bijection. We provide such a bijection using a generalization of a standard bijection from partition theory. We then show that this proof implies further generalizations of the original identity, and that when these generalizations are written in the identity's original form, they can be proven using the involution principle. We give such an involution explicitly, which also provides an involution-based proof of the original identity.

  8. An Analysis Of Leading Character’s Conflict In Nicholas Sparks’ Novel The Notebook

    OpenAIRE

    Simamora, Agreny Melisa

    2014-01-01

    The title of this thesis is An Analysis of Leading Character’s Conflict In Nicholas Sparks’ The Notebook, which deals with Allie’s conflicts in her life. The conflict is divided into three types: Allie’s internal conflict, Allie’s external conflict with Noah (her boyfriend), and Allie’s external conflict with her parents. The conflict arises because the choices cannot be fulfilled, where Allie’s parents do not agree with her decision because of their different status; Allie comes from a rich fami...

  9. Personalised Medical Reference to General Practice Notebook (GPnotebook - an evolutionary tale

    Directory of Open Access Journals (Sweden)

    James McMorran

    2002-09-01

    What has happened to this resource now? This brief paper outlines how the developers of the reference resource have improved on the design and content of the medical database. Now the reference resource is an Internet-based resource called General Practice Notebook (www.gpnotebook.co.uk) and is currently attracting 5000 to 9000 page views per day and containing over 30 000 index terms in a complex web structure of over 60 000 links. This paper describes the evolutionary process that has occurred over the last decade.

  10. «La bellezza è un sentimento istintivo». L’estetico nei Notebooks darwiniani

    Directory of Open Access Journals (Sweden)

    Lorenzo Bartalesi

    2012-12-01

    Full Text Available Since Charles Darwin, the theoretical framework of evolutionary aesthetics has been sexual selection. Recent debate focuses attention particularly on the criterion of female choice. The aim of this article is to sketch a Darwinian way to aesthetics complementary to the one that Darwin himself presents in The descent of man (1871). A series of notes in Darwin's notebooks, traditionally known as “Metaphysical Enquiries”, will constitute the point of departure for a hypothetical reconstruction of the evolutionary history of the aesthetic.

  11. Analysis and Implementation of an Electronic Laboratory Notebook in a Biomedical Research Institute.

    Science.gov (United States)

    Guerrero, Santiago; Dujardin, Gwendal; Cabrera-Andrade, Alejandro; Paz-Y-Miño, César; Indacochea, Alberto; Inglés-Ferrándiz, Marta; Nadimpalli, Hima Priyanka; Collu, Nicola; Dublanche, Yann; De Mingo, Ismael; Camargo, David

    2016-01-01

    Electronic laboratory notebooks (ELNs) will probably replace paper laboratory notebooks (PLNs) in academic research due to their advantages in data recording, sharing and security. Despite several reports describing technical characteristics of ELNs and their advantages over PLNs, no study has directly tested ELN performance among researchers. In addition, the usage of tablet-based devices or wearable technology as ELN complements has never been explored in the field. To implement an ELN in our biomedical research institute, here we first present a technical comparison of six ELNs using 42 parameters. Based on this, we chose two ELNs, which were tested by 28 scientists for a 3-month period and by 80 students via hands-on practical exercises. Second, we provide two survey-based studies aimed to compare these two ELNs (PerkinElmer Elements and Microsoft OneNote) and to analyze the use of tablet-based devices. We finally explore the advantages of using wearable technology as ELNs tools. Among the ELNs tested, we found that OneNote presents almost all parameters evaluated (39/42) and both surveyed groups preferred OneNote as an ELN solution. In addition, 80% of the surveyed scientists reported that tablet-based devices improved the use of ELNs in different respects. We also describe the advantages of using OneNote application for Apple Watch as an ELN wearable complement. This work defines essential features of ELNs that could be used to improve ELN implementation and software development.

  12. Development of a prediction model on the acceptance of electronic laboratory notebooks in academic environments.

    Science.gov (United States)

    Kloeckner, Frederik; Farkas, Robert; Franken, Tobias; Schmitz-Rode, Thomas

    2014-04-01

    Documentation of research data plays a key role in biomedical engineering innovation processes. It makes an important contribution to the protection of intellectual property, the traceability of results and to fulfilling regulatory requirements. Because of the increasing digitalization in laboratories, an electronic alternative to the commonly used paper-bound notebooks could contribute to the production of sophisticated documentation. However, compared with an industrial environment, the use of electronic laboratory notebooks is not widespread in academic laboratories. Little is known about the acceptance of an electronic documentation system and the underlying reasons for this. Thus, this paper aims to establish a prediction model of the potential preference and acceptance of scientists for either paper-based or electronic documentation. The underlying data for the analysis originate from an online survey of 101 scientists in industrial, academic and clinical environments. Various parameters were analyzed to identify crucial factors for the system preference using binary logistic regression. The analysis showed a significant dependency between the documentation system preference and the supposed workload associated with the documentation system (p<…) …notebook before implementation.

  13. Interacting with notebook input devices: an analysis of motor performance and users' expertise.

    Science.gov (United States)

    Sutter, Christine; Ziefle, Martina

    2005-01-01

    In the present study the usability of two different types of notebook input devices was examined. The independent variables were input device (touchpad vs. mini-joystick) and user expertise (expert vs. novice state). There were 30 participants, of whom 15 were touchpad experts and the other 15 were mini-joystick experts. The experimental tasks were a point-click task (Experiment 1) and a point-drag-drop task (Experiment 2). Dependent variables were the time and accuracy of cursor control. To assess carryover effects, we had the participants complete both experiments, using not only the input device for which they were experts but also the device for which they were novices. Results showed the touchpad performance to be clearly superior to mini-joystick performance. Overall, experts showed better performance than did novices. The significant interaction of input device and expertise showed that the use of an unknown device is difficult, but only for touchpad experts, who were remarkably slower and less accurate when using a mini-joystick. Actual and potential applications of this research include an evaluation of current notebook input devices. The outcomes allow ergonomic guidelines to be derived for optimized usage and design of the mini-joystick and touchpad devices.

  14. Segmenting Notebook PC market based on life style (The case of Ferdowsi University students

    Directory of Open Access Journals (Sweden)

    Mostafa Kazemi

    2014-05-01

    Full Text Available This study aims to segment the Notebook PC market using the VALS life style framework and to find out the differences between these segments. VALS specifically defines consumer segments based on personality traits that influence market behavior. According to VALS, consumers are divided into eight categories based not only on psychological variables but also on income, education level, buying demands, and some other factors. Life style and values are determinant factors in consumers' decision-making. This study seeks to identify different segments of the Notebook PC market and specify the requirements of each segment. To do this, we designed a written questionnaire, distributed 400 questionnaires among the students of Ferdowsi University of Mashhad, and collected 308 of them. The final response rate after eliminating five non-eligible questionnaires was 77%. For data analysis and answering the research questions, descriptive and inferential statistics were used. The results indicated that each market segment requires different features and requirements according to its psychological attributes. At the end, research limitations were enumerated and some suggestions for future studies were made.

  15. The Effects of Semantic Mapping, Thematic Clustering, and Notebook Keeping on L2 Vocabulary Recognition and Production

    Science.gov (United States)

    Zarei, Abbas Ali; Adami, Saba

    2013-01-01

    To investigate the effects of semantic mapping, thematic clustering, and notebook keeping on L2 vocabulary recognition and production, four groups of intermediate level learners in an EFL institute in Zanjan, Iran participated in the study. Three experimental groups consisted of semantic mapping, semantic feature analysis, and vocabulary notebook…

  16. The Impact of Science Notebook Writing on ELL and Low-SES Students' Science Language Development and Conceptual Understanding

    Science.gov (United States)

    Huerta, Margarita

    2013-01-01

    This quantitative study explored the impact of literacy integration in a science inquiry classroom involving the use of science notebooks on the academic language development and conceptual understanding of students from diverse (i.e., English Language Learners, or ELLs) and low socio-economic status (low-SES) backgrounds. The study derived from a…

  17. Developing an audiovisual notebook as a self-learning tool in histology: perceptions of teachers and students.

    Science.gov (United States)

    Campos-Sánchez, Antonio; López-Núñez, Juan-Antonio; Scionti, Giuseppe; Garzón, Ingrid; González-Andrades, Miguel; Alaminos, Miguel; Sola, Tomás

    2014-01-01

    Videos can be used as didactic tools for self-learning under several circumstances, including those cases in which students are responsible for the development of this resource as an audiovisual notebook. We compared students' and teachers' perceptions regarding the main features that an audiovisual notebook should include. Four questionnaires with items about information, images, text and music, and filmmaking were used to investigate students' (n = 115) and teachers' perceptions (n = 28) regarding the development of a video focused on a histological technique. The results show that both students and teachers significantly prioritize informative components, images and filmmaking more than text and music. The scores were significantly higher for teachers than for students for all four components analyzed. The highest scores were given to items related to practical and medically oriented elements, and the lowest values were given to theoretical and complementary elements. For most items, there were no differences between genders. A strong positive correlation was found between the scores given to each item by teachers and students. These results show that both students' and teachers' perceptions tend to coincide for most items, and suggest that audiovisual notebooks developed by students would emphasize the same items as those perceived by teachers to be the most relevant. Further, these findings suggest that the use of video as an audiovisual learning notebook would not only preserve the curricular objectives but would also offer the advantages of self-learning processes. © 2013 American Association of Anatomists.

  18. Teacher Assessment of A-Level Biology Practical Notebooks--The Development of a System of Moderation.

    Science.gov (United States)

    Kingdom, J. M.; Hartley, D. J.

    1980-01-01

    From June 1980 onwards most home candidates taking University of London Advanced-level Biology are required to submit their practical and field work notebooks to their teachers for assessment. This paper describes a trial run assessment of the practical books of 700 candidates, conducted in June 1979, and the statistical moderation procedure…

  19. Nature's Notebook and Extension: Engaging Citizen-Scientists and 4-H Youth to Observe a Changing Environment

    Science.gov (United States)

    Posthumus, Erin E.; Barnett, LoriAnne; Crimmins, Theresa M.; Kish, George R.; Sheftall, Will; Stancioff, Esperanza; Warren, Peter

    2013-01-01

    Extension, with its access to long-term volunteers, has the unique ability to teach citizen scientists about the connection between climate variability and the resulting effects on plants, animals, and thus, humans. The USA National Phenology Network's Nature's Notebook on-line program provides a science learning tool for Extension's Master…

  20. El Universo a Sus Pies: Actividades y Recursos para Astronomia (Universe at Your Fingertips: An Astronomy Activity and Resource Notebook).

    Science.gov (United States)

    Fraknoi, Andrew, Ed.; Schatz, Dennis, Ed.

    The goal of this resource notebook is to provide activities selected by astronomers and classroom teachers, comprehensive resource lists and bibliographies, background material on astronomical topics, and teaching ideas from experienced astronomy educators. Activities are grouped into several major areas of study in astronomy including lunar…

  1. The Jupyter/IPython architecture: a unified view of computational research, from interactive exploration to communication and publication.

    Science.gov (United States)

    Ragan-Kelley, M.; Perez, F.; Granger, B.; Kluyver, T.; Ivanov, P.; Frederic, J.; Bussonnier, M.

    2014-12-01

    IPython has provided terminal-based tools for interactive computing in Python since 2001. The notebook document format and multi-process architecture introduced in 2011 have expanded the applicable scope of IPython into teaching, presenting, and sharing computational work, in addition to interactive exploration. The new architecture also allows users to work in any language, with implementations in Python, R, Julia, Haskell, and several other languages. The language agnostic parts of IPython have been renamed to Jupyter, to better capture the notion that a cross-language design can encapsulate commonalities present in computational research regardless of the programming language being used. This architecture offers components like the web-based Notebook interface, that supports rich documents that combine code and computational results with text narratives, mathematics, images, video and any media that a modern browser can display. This interface can be used not only in research, but also for publication and education, as notebooks can be converted to a variety of output formats, including HTML and PDF. Recent developments in the Jupyter project include a multi-user environment for hosting notebooks for a class or research group, a live collaboration notebook via Google Docs, and better support for languages other than Python.
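
    The export step mentioned above (notebooks converted to HTML or PDF for publication) is handled by the nbconvert component. A minimal, hedged example is shown below; the file name is hypothetical, and PDF export additionally requires a LaTeX installation. The command-line equivalent is "jupyter nbconvert --to html analysis.ipynb".

        # Sketch: exporting a notebook to HTML with nbconvert's Python API.
        from nbconvert import HTMLExporter

        exporter = HTMLExporter()                                   # or PDFExporter
        body, resources = exporter.from_filename("analysis.ipynb")  # hypothetical file

        with open("analysis.html", "w", encoding="utf-8") as out:
            out.write(body)
        print(f"wrote {len(body)} characters of HTML")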

  2. STYLE NOTEBOOK

    Institute of Scientific and Technical Information of China (English)

    Kiki Feng; Ron Lam; Kim Au

    2009-01-01

    Fashion photography spanning half a century. 01 From now until 6 September, ICP (International Center of Photography) is holding an exhibition entitled "Avedon Fashion 1944-2000", devoted specifically to Richard Avedon's fashion photography. Avedon's photographic style was deeply influenced by the Hungarian photographer Martin Munkacsi, and his distinctive American style shook up the rigid post-war European fashion photography scene. He took his models out of the studio and onto streets and beaches, jumping, dancing and laughing. He captured models in motion and froze them in his trademark "Avedon Blur", giving fashion photography a quite different…

  3. STYLE NOTEBOOK

    Institute of Scientific and Technical Information of China (English)

    Tan; Nicola Lai

    2010-01-01

    01 Alice dreams of a water world. Alice's story may be an old one, but it has given many people creative inspiration; besides the maverick director Tim Burton, the Russian visual artist Elena Kalis is among them. Tim Burton made this familiar story fantastical, while Kalis turned it into a dream. In this dream, however, Alice does not fall down a rabbit hole but arrives in a world of water. Kalis used her specialty, underwater photography, to reinterpret the story, shooting "Alice in the Waterland" in the Bahamas, where she has lived for ten years. Kalis's daughter became the Alice in front of her lens, and beautiful red coral reefs provided the ideal setting. Kalis's work is always full of fairy-tale atmosphere; its distinctive light and vivid colours are not only a visual pleasure but also touch the viewer's mood. This new version of the Alice story carries us through scenes both real and illusory, conjuring a dreamlike atmosphere and adding new meaning to a well-known tale. www.elenakalisphoto.com

  4. Style Notebook

    Institute of Scientific and Technical Information of China (English)

    Tan; Nicola Lai

    2010-01-01

    01 Linda Farrow x House Of Holland playful rimless glasses. With the retro eyewear trend making a comeback, Linda Farrow has become a hot brand. In recent years she has actively sought out collaborations with different fashion designers; following earlier successful projects with Jeremy Scott, Matthew Williamson and Luella, for Spring/Summer 2010 she will launch new collaborative collections with Raf Simons and House Of Holland. The former has long worked closely with Linda Farrow; as for the newcomer Henry Holland, his acrylic frames look simple, yet in bright orange and green they show Henry Holland's characteristically playful personality. The glasses go on sale first at Colette in Paris and should attract plenty of young fashion followers. www.colette.fr

  5. Using Browser Notebooks to Analyse Big Atmospheric Data-sets in the Cloud

    Science.gov (United States)

    Robinson, N.; Tomlinson, J.; Arribas, A.; Prudden, R.

    2016-12-01

    We present an account of our experience building an ecosystem for the analysis of big atmospheric data-sets. Using modern technologies we have developed a prototype platform which is scalable and capable of analysing very large atmospheric datasets. We tested different big-data ecosystems such as Hadoop MapReduce, Spark and Dask, in order to find the one best suited to analysis of multidimensional binary data such as NetCDF. We make extensive use of infrastructure-as-code and containerisation to provide a platform which is reusable, and which can scale to accommodate changes in demand. We make this platform readily accessible using browser-based notebooks. As a result, analysts with minimal technology experience can, in tens of lines of Python, make interactive data-visualisation web pages which can analyse very large amounts of data using cutting-edge big-data technology.
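
    For multidimensional binary data such as NetCDF, the kind of lazy, chunked analysis described here is commonly written with xarray backed by Dask. The sketch below is illustrative rather than taken from the authors' platform; the file name, variable name and chunk size are assumptions.

        # Sketch: chunked (Dask-backed) analysis of a NetCDF file from a notebook.
        import xarray as xr

        ds = xr.open_dataset("surface_temperature.nc", chunks={"time": 365})
        temp = ds["t2m"]                      # hypothetical variable name

        annual_mean = temp.mean(dim="time")   # built lazily as a Dask graph
        result = annual_mean.compute()        # executed across workers/threads
        print(result)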

  6. Differences in typing forces, muscle activity, comfort, and typing performance among virtual, notebook, and desktop keyboards.

    Science.gov (United States)

    Kim, Jeong Ho; Aulck, Lovenoor; Bartha, Michael C; Harper, Christy A; Johnson, Peter W

    2014-11-01

    The present study investigated whether there were physical exposure and typing productivity differences between a virtual keyboard with no tactile feedback and two conventional keyboards where key travel and tactile feedback are provided by mechanical switches under the keys. The key size and layout were the same across all the keyboards. Typing forces; finger and shoulder muscle activity; self-reported comfort; and typing productivity were measured from 19 subjects while typing on a virtual (0 mm key travel), notebook (1.8 mm key travel), and desktop keyboard (4 mm key travel). When typing on the virtual keyboard, subjects typed with less force (p's < …); however, the lower typing forces and finger muscle activity came at the expense of a 60% reduction in typing productivity (p < …). For longer typing sessions, or when typing productivity is at a premium, conventional keyboards with tactile feedback may be the more suitable interface.

  7. Recommendations for the use of notebooks in upper-division physics lab courses

    CERN Document Server

    Stanley, Jacob T

    2016-01-01

    The use of lab notebooks for scientific documentation is a ubiquitous part of physics research. However, it is common for undergraduate physics laboratory courses not to emphasize the development of these documentation skills, despite the fact that these lab courses are some of the earliest opportunities for students to start engaging in this practice. One potential impediment to the inclusion of explicit documentation training is that it may be unclear to instructors what constitutes "best practices" and how those best practices can be incorporated into the lab class environment. In this work, we outline some of the salient features of authentic documentation, informed by interviews with physics researchers, and provide recommendations for how these can be incorporated into the lab curriculum. The features of documentation that we outline do not focus on structural details or templates for what to record, but rather on holistic considerations for the purpose of scientific documentation that can guide student...

  8. Going paperless: implementing an electronic laboratory notebook in a bioanalytical laboratory.

    Science.gov (United States)

    Beato, Brian; Pisek, April; White, Jessica; Grever, Timothy; Engel, Brian; Pugh, Michael; Schneider, Michael; Carel, Barbara; Branstrator, Laurel; Shoup, Ronald

    2011-07-01

    AIT Bioscience, a bioanalytical CRO, implemented a highly configurable, Oracle-based electronic laboratory notebook (ELN) from IDBS called E-WorkBook Suite (EWBS). This ELN provides a high degree of connectivity with other databases, including Watson LIMS. Significant planning and training, along with considerable design effort and template validation for dozens of laboratory workflows were required prior to EWBS being viable for either R&D or regulated work. Once implemented, EWBS greatly reduced the need for traditional quality review upon experiment completion. Numerous real-time error checks occur automatically when conducting EWBS experiments, preventing the majority of laboratory errors by pointing them out while there is still time to correct any issues. Auditing and reviewing EWBS data are very efficient, because all data are forever securely (and even remotely) accessible, provided a reviewer has appropriate credentials. Use of EWBS significantly increases both data quality and laboratory efficiency.

  9. Wiki Laboratory Notebooks: Supporting Student Learning in Collaborative Inquiry-Based Laboratory Experiments

    Science.gov (United States)

    Lawrie, Gwendolyn Angela; Grøndahl, Lisbeth; Boman, Simon; Andrews, Trish

    2016-06-01

    Recent examples of high-impact teaching practices in the undergraduate chemistry laboratory that include course-based undergraduate research experiences and inquiry-based experiments require new approaches to assessing individual student learning outcomes. Instructors require tools and strategies that can provide them with insight into individual student contributions to collaborative group/teamwork throughout the processes of experimental design, data analysis, display and communication of their outcomes in relation to their research question(s). Traditional assessments in the form of laboratory notebooks or experimental reports provide limited insight into the processes of collaborative inquiry-based activities. A wiki environment offers a collaborative domain that can potentially support collaborative laboratory processes and scientific record keeping. In this study, the effectiveness of the wiki in supporting laboratory learning and assessment has been evaluated through analysis of the content and histories for three consenting, participating groups of students. The conversational framework has been applied to map the relationships between the instructor, tutor, students and laboratory activities. Analytics that have been applied to the wiki platform include: character counts, page views, edits, timelines and the extent and nature of the contribution by each student to the wiki. Student perceptions of both the role and the impact of the wiki on their experiences and processes have also been collected. Evidence has emerged from this study that the wiki environment has enhanced co-construction of understanding of both the experimental process and subsequent communication of outcomes and data. A number of features are identified to support success in the use of the wiki platform for laboratory notebooks.

  10. From Notebook to Novel and from Diary to Dante: Reading Robert Dessaix’s Night Letters

    Directory of Open Access Journals (Sweden)

    Roberta Trapè

    2009-06-01

    Full Text Available This paper has developed out of a larger work in progress, which focuses on representations of Italy in contemporary Australian fiction and non-fiction prose. This larger project aims to add to an established body of work on travel writing by considering Australian texts that describe Australian travel in Italy, Italian people and Italian places. In this paper, I will specifically focus on the representations of Italy in Robert Dessaix’s novel Night Letters (1996). My paper will explore the relationship between the writer’s actual journey in Italy and that of the creative work’s main character. The novel offers the protagonist’s account in the form of letters, which describe his travel from Switzerland across Northern Italy to Venice. I will begin by briefly outlining the Italian itinerary followed by Dessaix that would eventually inspire the novel. I will then explore the relationship between Dessaix’s notebooks recording his two journeys in Italy and the literary accomplishment of Night Letters. My aim is to show ways in which an itinerary becomes a story, a complex narrative. Reference will be made to factual accounts and descriptions in the author’s own diaries with an analysis of their generative role as key sources for the fictional work. This will be done through a close reading of particular passages, in the diaries and in the novel, concerning the same event. A comparative analysis of the notebooks and Night Letters can show that Dessaix’s diary entries relating to Italian places are woven into the fictional fabric of the ‘night letters’ according to a unifying principle.

  11. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  12. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and re-enforcing shift and operational procedures for data production and transfer, MC production and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity for discussing the impact and for addressing issues and solutions to the main challenges facing CMS computing. The lack of manpower is particul...

  13. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  14. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies ramping to 50 M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS specific tests to be included in Service Availa...

  15. Integration of ROOT Notebooks as a Web-based ATLAS Analysis tool for Public Data Releases and Outreach

    CERN Document Server

    Abah, Anthony

    2016-01-01

    The project worked on the development of a physics analysis and its software under the ROOT framework and Jupyter notebooks for the ATLAS Outreach and Naples teams. This analysis was created in the context of the release of data and Monte Carlo samples by the ATLAS collaboration. The project focuses on the enhancement of the recent opendata.atlas.cern web platform to be used as an educational resource for university students and new researchers. The generated analysis structure and tutorials will be used to extend the participation of students from other locations around the world. We conclude the project with the creation of a complete notebook representing the so-called W analysis, written in C++, for the mentioned platform.
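
    The published analysis for the platform is written in C++ under ROOT, as the record notes; purely to illustrate the same event-loop idea in the single language used for the sketches in this document, the PyROOT fragment below opens a sample file and histograms a leading-lepton quantity. The file path, tree name and branch names are placeholders, not the actual ATLAS open data layout.

        # Illustrative PyROOT event loop (placeholders; not the actual ATLAS analysis).
        import ROOT

        f = ROOT.TFile.Open("sample_Wlnu.root")   # placeholder open-data sample
        tree = f.Get("mini")                      # placeholder tree name

        h_pt = ROOT.TH1F("h_pt", "Leading lepton p_{T};p_{T} [GeV];events", 50, 0.0, 200.0)
        for event in tree:                        # PyROOT iterates over TTree entries
            if event.lep_n >= 1:                  # placeholder branch names
                h_pt.Fill(event.lep_pt[0] / 1000.0)

        c = ROOT.TCanvas("c", "c", 800, 600)
        h_pt.Draw()
        c.SaveAs("lead_lepton_pt.png")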

  16. GeoNotebook: Browser based Interactive analysis and visualization workflow for very large climate and geospatial datasets

    Science.gov (United States)

    Ozturk, D.; Chaudhary, A.; Votava, P.; Kotfila, C.

    2016-12-01

    Jointly developed by Kitware and NASA Ames, GeoNotebook is an open source tool designed to give the maximum amount of flexibility to analysts, while dramatically simplifying the process of exploring geospatially indexed datasets. Packages like Fiona (backed by GDAL), Shapely, Descartes, Geopandas, and PySAL provide a stack of technologies for reading, transforming, and analyzing geospatial data. Combined with the Jupyter notebook and libraries like matplotlib/Basemap, it is possible to generate detailed geospatial visualizations. Unfortunately, the visualizations generated are either static or do not perform well for very large datasets. Also, this setup requires a great deal of boilerplate code to create and maintain. Other extensions exist to remedy these problems, but they provide a separate map for each input cell and do not support map interactions that feed back into the Python environment. To support interactive data exploration and visualization on large datasets we have developed an extension to the Jupyter notebook that provides a single dynamic map that can be managed from the Python environment, and that can communicate back with a server which can perform operations like data subsetting on a cloud-based cluster.
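
    For contrast with the interactive map the record describes, the static baseline it mentions (GeoPandas plus matplotlib) fits in a few lines; the shapefile path and attribute column below are placeholders.

        # Baseline static plot with GeoPandas/matplotlib (placeholder inputs).
        import geopandas as gpd
        import matplotlib.pyplot as plt

        gdf = gpd.read_file("watersheds.shp")     # hypothetical vector dataset
        gdf = gdf.to_crs(epsg=4326)

        ax = gdf.plot(column="mean_runoff", legend=True, figsize=(8, 6))
        ax.set_title("Static choropleth (vs. GeoNotebook's single interactive map)")
        plt.savefig("watersheds.png", dpi=150)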

  17. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources like the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 T. (Figure 3: Number of events per month, data.) In LS1, our emphasis is to increase the efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  18. Computer

    CERN Document Server

    Atkinson, Paul

    2011-01-01

    The pixelated rectangle we spend most of our day staring at in silence is not the television, as many long feared, but the computer: the ubiquitous portal of our work and personal lives. At this point, the computer is so common that we hardly notice it in our view. It is difficult to envision that, not that long ago, it was a gigantic, room-sized structure accessible only to a few, inspiring as much awe and respect as fear and mystery. Now that the machine has decreased in size and increased in popular use, the computer has become a prosaic appliance, little more noted than a toaster. These dramati

  19. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up, and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, which was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger, at roughly 11 MB per event of RAW. The central collisions are more complex and...

  20. COMPUTING

    CERN Multimedia

    M. Kasemann, P. McBride; edited by M-C. Sawley, with contributions from P. Kreuzer, D. Bonacorsi, S. Belforte, F. Wuerthwein, L. Bauerdick, K. Lassila-Perini and M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  1. DOC-a file system cache to support mobile computers

    Science.gov (United States)

    Huizinga, D. M.; Heflinger, K.

    1995-09-01

    This paper identifies design requirements of system-level support for mobile computing in small form-factor battery-powered portable computers and describes their implementation in DOC (Disconnected Operation Cache). DOC is a three-level client caching system designed and implemented to allow mobile clients to transition between connected, partially disconnected and fully disconnected modes of operation with minimal user involvement. Implemented for notebook computers, DOC addresses not only typical issues of mobile elements such as resource scarcity and fluctuations in service quality but also deals with the pitfalls of MS-DOS, the operating system which prevails in the commercial notebook market. Our experiments performed in the software engineering environment of AST Research indicate not only considerable performance gains for connected and partially disconnected modes of DOC, but also the successful operation of the disconnected mode.

  2. Fast Deployment on the Cloud of Integrated Postgres, API and a Jupyter Notebook for Geospatial Collaboration

    Science.gov (United States)

    Fatland, R.; Tan, A.; Arendt, A. A.

    2016-12-01

    We describe a Python-based implementation of a PostgreSQL database accessed through an Application Programming Interface (API) hosted on the Amazon Web Services public cloud. The data are geospatial and concern hydrological model results in the glaciated catchment basins of southcentral and southeast Alaska. This implementation, however, is intended to generalize to other forms of geophysical data, particularly data that are intended to be shared across a collaborative team or publicly. An example (moderate-size) dataset is provided together with the code base and a complete installation tutorial on GitHub. An enthusiastic scientist with some familiarity with software installation can replicate the example system in two hours. This installation includes the database, the API, a test client and a supporting Jupyter Notebook, specifically oriented towards Python 3 and markup text to comprise an executable paper. The installation 'on the cloud' often engenders discussion and consideration of cloud cost and safety. By treating the process as somewhat of a "cookbook", we hope first to demonstrate the feasibility of the proposition. A discussion of cost and data security is provided in this presentation and in the accompanying tutorial/documentation. This geospatial data system case study is part of a larger effort at the University of Washington to enable research teams to take advantage of the public cloud to meet challenges in data management and analysis.
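
    A minimal version of the database-plus-API pattern described in this record can be sketched with Flask and psycopg2. The route, table, columns and connection settings below are invented for illustration and do not reflect the project's actual schema or its GitHub code; a notebook client would then simply call requests.get("http://localhost:5000/basins/3/runoff").json().

        # Minimal sketch of an HTTP API in front of PostgreSQL (hypothetical schema).
        from flask import Flask, jsonify
        import psycopg2

        app = Flask(__name__)
        DSN = "dbname=hydro user=reader password=secret host=localhost"  # placeholder

        @app.route("/basins/<int:basin_id>/runoff")
        def basin_runoff(basin_id):
            with psycopg2.connect(DSN) as conn, conn.cursor() as cur:
                cur.execute(
                    "SELECT date, runoff_mm FROM daily_runoff "
                    "WHERE basin_id = %s ORDER BY date",
                    (basin_id,),
                )
                rows = [{"date": d.isoformat(), "runoff_mm": float(r)}
                        for d, r in cur.fetchall()]
            return jsonify(rows)

        if __name__ == "__main__":
            app.run(port=5000)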

  3. The Oriented Difference of Gaussians (ODOG) model of brightness perception: Overview and executable Mathematica notebooks.

    Science.gov (United States)

    Blakeslee, Barbara; Cope, Davis; McCourt, Mark E

    2016-03-01

    The Oriented Difference of Gaussians (ODOG) model of brightness (perceived intensity) by Blakeslee and McCourt (Vision Research 39:4361-4377, 1999), which is based on linear spatial filtering by oriented receptive fields followed by contrast normalization, has proven highly successful in parsimoniously predicting the perceived intensity (brightness) of regions in complex visual stimuli such as White's effect, which had been believed to defy filter-based explanations. Unlike competing explanations such as anchoring theory, filling-in, edge-integration, or layer decomposition, the spatial filtering approach embodied by the ODOG model readily accounts for the often overlooked but ubiquitous gradient structure of induction which, while most striking in grating induction, also occurs within the test fields of classical simultaneous brightness contrast and the White stimulus. Also, because the ODOG model does not require defined regions of interest, it is generalizable to any stimulus, including natural images. The ODOG model has motivated other researchers to develop modified versions (LODOG and FLODOG), and has served as an important counterweight and proof of concept to constrain high-level theories which rely on less well understood or justified mechanisms such as unconscious inference, transparency, perceptual grouping, and layer decomposition. Here we provide a brief but comprehensive description of the ODOG model as it has been implemented since 1999, as well as working Mathematica (Wolfram, Inc.) notebooks which users can employ to generate ODOG model predictions for their own stimuli.
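    The filter-then-normalize structure of the model can be sketched in a few lines of NumPy/SciPy. The code below is not the authors' Mathematica implementation: it uses isotropic difference-of-Gaussians filters at arbitrary scales and omits the oriented filters and the specific scales and orientations of the published model, so it only illustrates the general idea of multiscale filtering followed by contrast (RMS) normalization.

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def dog_response(image, sigma_center, ratio=2.0):
        """Difference-of-Gaussians response at one spatial scale (isotropic sketch)."""
        center = gaussian_filter(image, sigma_center)
        surround = gaussian_filter(image, sigma_center * ratio)
        return center - surround

    def odog_like(image, sigmas=(1, 2, 4, 8, 16)):
        """Sum multi-scale DoG outputs after RMS (contrast) normalization."""
        responses = [dog_response(image, s) for s in sigmas]
        normalized = [r / (np.sqrt(np.mean(r ** 2)) + 1e-12) for r in responses]
        return np.sum(normalized, axis=0)

    # Example: a grating with a uniform test strip embedded in it
    x = np.linspace(0, 4 * np.pi, 256)
    stimulus = 0.5 + 0.25 * np.sin(x)[None, :] * np.ones((256, 1))
    stimulus[120:136, :] = 0.5            # uniform strip whose brightness is predicted
    prediction = odog_like(stimulus)
    ```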

  4. Service Integration to Enhance Research Data Management: RSpace Electronic Laboratory Notebook Case Study

    Directory of Open Access Journals (Sweden)

    Stuart Macdonald

    2015-02-01

    Research Data Management (RDM) provides a framework that supports researchers and their data throughout the course of their research and is increasingly regarded as one of the essential areas of responsible conduct of research. New tools and infrastructures make possible the generation of large volumes of digital research data in a myriad of formats. This facilitates new ways to analyse, share and reuse these outputs, with libraries, IT services and other service units within academic institutions working together with the research community to develop RDM infrastructures to curate and preserve this type of research output and make them re-usable for future generations. Working on the principle that a rationalised and continuous flow of data between systems and across institutional boundaries is one of the core goals of information management, this paper will highlight service integration via Electronic Laboratory Notebooks (ELNs), which streamline research data workflows, result in efficiency gains for researchers, research administrators and other stakeholders, and ultimately enhance the RDM process.

  5. Focused campaign increases activity among participants in Nature's Notebook, a citizen science project

    Science.gov (United States)

    Crimmins, Theresa M.; Weltzin, Jake F.; Rosemartin, Alyssa H.; Surina, Echo M.; Marsh, Lee; Denny, Ellen G.

    2014-01-01

    Citizen science projects, which engage non-professional scientists in one or more stages of scientific research, have been gaining popularity; yet maintaining participants’ activity level over time remains a challenge. The objective of this study was to evaluate the potential for a short-term, focused campaign to increase participant activity in a national-scale citizen science program. The campaign that we implemented was designed to answer a compelling scientific question. We invited participants in the phenology-observing program, Nature’s Notebook, to track trees throughout the spring of 2012, to ascertain whether the season arrived as early as the anomalous spring of 2010. Consisting of a series of six electronic newsletters and costing our office slightly more than 1 week of staff resources, our effort was successful; compared with previous years, the number of observations collected in the region where the campaign was run increased by 184%, the number of participants submitting observations increased by 116%, and the number of trees registered increased by 110%. In comparison, these respective metrics grew by 25, 55, and 44%, over previous years, in the southeastern quadrant of the United States, where no such campaign was carried out. The campaign approach we describe here is a model that could be adapted by a wide variety of programs to increase engagement and thereby positively influence participant retention.

  6. Nature's Notebook Provides Phenology Observations for NASA Juniper Phenology and Pollen Transport Project

    Science.gov (United States)

    Luval, J. C.; Crimmins, T. M.; Sprigg, W. A.; Levetin, E.; Huete, A.; Nickovic, S.; Prasad, A.; Vukovic, A.; VandeWater, P. K.; Budge, A. M.

    2014-01-01

    The USA National Phenology Network has been established to provide nationwide observations of vegetation phenology. However, as the Network is still in the early phases of establishment and growth, the density of observers is not yet adequate to sufficiently document phenological variability over large regions. Hence a combination of satellite data and ground observations can provide optimal information regarding Juniperus spp. pollen phenology. MODIS data were used to observe Juniperus spp. pollen phenology; the MODIS surface reflectance product provided information on Juniperus spp. cone formation and cone density. Ground-based observational records of pollen release timing and quantities were used as verification. Approximately 10,818 records of juniper phenology for male cone formation of Juniperus ashei, J. monosperma, J. scopulorum, and J. pinchotti were reported by Nature's Notebook observers in 2013. These observations provided valuable information for the analysis of satellite images for developing the pollen concentration masks for input into the PREAM (Pollen REgional Atmospheric Model) pollen transport model. The combination of satellite data and ground observations allowed us to improve our confidence in predicting pollen release and spread, thereby improving asthma and allergy alerts.

  7. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  8. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  9. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave a good indication of the readiness of the WLCG infrastructure, with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated that they are fully capable of performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  10. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing readiness challenges (CCRC’08), the CMS computing team had achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files and with a high writing speed to tapes.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1’s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  11. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  12. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  13. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  14. Tablet Personal Computer Integration in Higher Education: Applying the Unified Theory of Acceptance and Use Technology Model to Understand Supporting Factors

    Science.gov (United States)

    Moran, Mark; Hawkes, Mark; El Gayar, Omar

    2010-01-01

    Many educational institutions have implemented ubiquitous or required laptop, notebook, or tablet personal computing programs for their students. Yet, limited evidence exists to validate integration and acceptance of the technology among student populations. This research examines student acceptance of mobile computing devices using a modification…

  15. COMPUTING

    CERN Multimedia

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns led by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier1 and Tier2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking the roles of a Computing Run Coordinator and regular computing shifts monitoring the services and infrastructure as well as interfacing to the data operations tasks are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  16. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort was focused during the last period on the preparation and monitoring of the February tests of the Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads, and identify possible bottlenecks. Many tests are scheduled together with other VOs, allowing the full scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  17. COMPUTING

    CERN Multimedia

    P. MacBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format and the samples were run through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  18. COMPUTING

    CERN Multimedia

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences. Operations Office Figure 6 shows transfers from all sites in the last 90 days. For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  19. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users, we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently. Operations Office Figure 2 shows the number of events per month for 2012. Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  20. COMPUTING

    CERN Document Server

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  1. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing operation has been lower as the Run 1 samples are completing and smaller samples for upgrades and preparations are ramping up. Much of the computing activity is focusing on preparations for Run 2 and improvements in data access and flexibility of using resources. Operations Office Data processing was slow in the second half of 2013 with only the legacy re-reconstruction pass of 2011 data being processed at the sites. MC production and processing was more in demand, with a peak of over 750 million GEN-SIM events in a single month (Figure 1). The transfer system worked reliably and efficiently and transferred on average close to 520 TB per week, with peaks at close to 1.2 PB (Figure 2). Figure 3 shows the volume of data moved between CMS sites in the last six months. The tape utilisation was a focus for the operation teams with frequent deletion campaigns from deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...

  2. Computing with Mathematica

    CERN Document Server

    Hoft, Margret H

    2002-01-01

    Computing with Mathematica, 2nd edition is engaging and interactive. It is designed to teach readers how to use Mathematica efficiently for solving problems arising in fields such as mathematics, computer science, physics, and engineering. The text moves from simple to complex, often following a specific example on a number of different levels. This gradual increase in complexity allows readers to steadily build their competence without being overwhelmed. The 2nd edition of this acclaimed book features: * An enclosed CD for Mac and Windows that contains the entire text as a collection of Mathematica notebooks * Substantive real world examples * Challenging exercises, moving from simple to complex * A collection of interactive projects from a variety of applications. "I really think this is an almost perfect text." - Stephen Brick, University of South Alabama. * Substantive real world examples * Challenging exercises, moving from simple to complex examples * Interactive explorations (on the included CD-ROM) from a ...

  3. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for the WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, is provided. The GlideInWMS and its components are now also deployed at CERN, adding to the GlideInWMS factory located in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each other's time zones by monitoring/debugging pilot jobs sent from the facto...

  4. On Darwin's 'metaphysical notebooks'. I: teleology and the project of a theory.

    Science.gov (United States)

    Calabi, L

    2001-01-01

    Huxley's essay On the Reception of the 'Origin of Species' brings us close to the issue of cause and of why- and how-questions in the understanding of the living world. The present contribution, which is divided into two parts, reviews the problem of Teleology as conceived by Huxley and re-examines Darwin as the author who revealed the existence of a 'foundations problem' in the explanation of an entire realm of nature, i.e., the problem of explaining such a realm in terms of its own, specific legality, or iuxta sua propria principia. In the first part the enquiry is mainly focused on the secularization of natural history after Paley; in the second part it is mainly focused on the desubjectivization of the inquiry into natural history after Erasmus Darwin and Lamarck. The second part will be published in the next issue of Rivista di Biologia/Biology Forum. In the first part below an analysis is made of Notebooks M and N. The author disputes the correctness of conceiving them only as the works where Darwin envisages the 'metaphysical' themes later to become the subject of The Expression of the Emotions. He suggests conceiving of them also as the works where Darwin defines the terms of the general project of his own, peculiar evolutionary theory. The author then outlines the intellectual progress of Darwin from the inosculation to the transmutation hypotheses. Darwin's reading of Malthus appears to be analytically decisive, because it offers him the vantage point from which to attack the metaphysical and theological citadels on the morphological side. Darwin is thus able to re-consider Erasmus' comprehensive zoonomic project, by displacing it, however, from the old idea of the scala naturae to the new one of the "coral of life", and by emphasising the distinction between "the fittest" and "the best" vs. the tradition of Natural Theology.

  5. Web-based workflows to produce ocean climatologies using DIVA (Data-Interpolating Variational Analysis) and Jupyter notebooks

    Science.gov (United States)

    Barth, Alexander; Troupin, Charles; Watelet, Sylvain; Alvera-Azcarate, Aida; Beckers, Jean-Marie

    2017-04-01

    The analysis tool DIVA (Data-Interpolating Variational Analysis) is designed to generate gridded fields or climatologies from in situ observations. DIVA minimizes a cost function to ensure that the analysed field is relatively close to the observations while conforming at the same time to a set of dynamical constraints. In particular, DIVA naturally decouples water bodies which are not directly connected, and it uses a (potentially spatially varying) correlation length to describe over which length scale the analysed variable is correlated. In addition, DIVA can also take ocean currents into account to introduce a preferential direction for the correlation. The SeaDataCloud project aims to facilitate the access and use of ocean in situ data from 45 national oceanographic data centres and marine data centres from 35 countries riparian to all European seas. A central aspect is to provide a web-based virtual research environment, where scientists can easily access and explore the data sets through the SeaDataCloud infrastructure. For users familiar with programming languages like Julia and Python, Jupyter (an acronym for Julia, Python and R) notebooks provide an exciting way to analyse and interact with ocean data. Jupyter notebooks are made up of cells that can be run individually and can contain text, formulas or code fragments. A complete notebook explains how to go from input data and parameters to a result, in this case a gridded field obtained by executing DIVA. This presentation discusses this new web-based workflow for generating climatologies using DIVA and explores its new possibilities, in particular in terms of improved ease of use and reproducibility of the results. The integration into the EUDAT infrastructure is also addressed.
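    For orientation, the variational principle behind DIVA can be written, in a commonly cited two-dimensional form, as the minimization of a cost function of roughly the following shape; this is a generic statement drawn from the published DIVA literature, and the exact weights and notation used by the tool may differ.

    $$
    J[\varphi] \;=\; \sum_{j=1}^{N_d} \mu_j \bigl[ d_j - \varphi(x_j, y_j) \bigr]^2
    \;+\; \int_{\Omega} \Bigl( \alpha_2\, \nabla\nabla\varphi : \nabla\nabla\varphi
    \;+\; \alpha_1\, \nabla\varphi \cdot \nabla\varphi
    \;+\; \alpha_0\, \varphi^2 \Bigr)\, d\Omega
    $$

    Here the $d_j$ are the $N_d$ observations with weights $\mu_j$, $\varphi$ is the analysed field on the domain $\Omega$, and the coefficients $\alpha_i$ set the smoothness penalty that encodes the correlation length.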

  6. COMPUTING

    CERN Multimedia

    M. Kasemann

    CMS relies on a well functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental for site commissioning in order to increase availability of more sites such that they are available to participate in CSA07 and are ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned, it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a 4 times increase in throughput with respect to LCG Resource Broker is observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and to keep exercised all data transfer routes in the CMS PhE-DEx topology. Since mid-February, a transfer volume of about 12 P...

  7. Perancangan Aplikasi e-Commerce Pada Toko Pacific Computer (Design of an e-Commerce Application for the Pacific Computer Store)

    OpenAIRE

    2015-01-01

    This study aims to design an e-commerce application for the Pacific Computer store. The e-commerce application was developed using PHP and MySQL. It supports the buying and selling of computer hardware, displaying various categories and brands of notebooks and netbooks. The purpose of designing this e-commerce application is to understand the sales system and online sales strategies in order to ...

  8. Lessons Learned from the First Two Years of Nature's Notebook, the USA National Phenology Network's Plant and Animal Observation Program

    Science.gov (United States)

    Crimmins, T. M.; Rosemartin, A.; Denny, E. G.; Weltzin, J. F.; Marsh, L.

    2010-12-01

    Nature’s Notebook is the USA National Phenology Network’s (USA-NPN) national-scale plant and animal phenology observation program. The program was launched in March 2009 focusing only on plants; 2010 saw the addition of animals and the name and identity “Nature’s Notebook.” Over these two years, we have learned much about how to effectively recruit, train, and retain participants. We have engaged several thousand participants and can report a retention rate, reflected in the number of registered individuals that report observations, of approximately 25%. In 2009, participants reported observations on 133 species of plants on an average of nine days of the year, resulting in over 151,000 records in the USA-NPN phenology database. Results for the 2010 growing season are still being reported. Some of our most valuable lessons learned have been gleaned from communications with our observers. Through an informal survey, participants indicated that they would like to see more regular and consistent communications from USA-NPN program staff; clear, concise, and readily available training materials; mechanisms to keep them engaged and continuing to participate; and quick turn-around on data summaries. We are using this feedback to shape our program into the future. Another key observation we’ve made about our program is the value of locally and regionally-based efforts to implement Nature’s Notebook; some of our most committed observers are participating through partner programs such as the University of California-Santa Barbara Phenology Stewardship Program, Arbor Day Foundation, and the Great Sunflower Project. Future plans include reaching out to more partner organizations and improving our support for locally-based implementations of the Nature’s Notebook program. We have also recognized that the means for reaching and retaining potential participants in Nature’s Notebook vary greatly across generations. As the majority of our participants to

  9. Researcher-driven Campaigns Engage Nature's Notebook Participants in Scientific Data Collection

    Science.gov (United States)

    Crimmins, Theresa M.; Elmore, Andrew J.; Huete, Alfredo; Keller, Stephen; Levetin, Estelle; Luvall, Jeffrey; Meyers, Orrin; Stylinski, Cathlyn D.; Van De Water, Peter K.; Vukovic, Ana

    2013-01-01

    One of the many benefits of citizen science projects is the capacity they hold for facilitating data collection on a grand scale and thereby enabling scientists to answer questions they would otherwise not have been able to address. Nature's Notebook, the plant and animal phenology observing program of the USA National Phenology Network (USA-NPN) suitable for scientists and non-scientists alike, offers scientifically-vetted data collection protocols and infrastructure and mechanisms to quickly reach out to hundreds to thousands of potential contributors. The USA-NPN has recently partnered with several research teams to engage participants in contributing to specific studies. In one example, a team of scientists from NASA, the New Mexico Department of Health, and universities in Arizona, New Mexico, Oklahoma, and California are using juniper phenology observations submitted by Nature's Notebook participants to improve predictions of pollen release and inform asthma and allergy alerts. In a second effort, researchers from the University of Maryland Center for Environmental Science are engaging Nature's Notebook participants in tracking leafing phenophases of poplars across the U.S. These observations will be compared to information acquired via satellite imagery and used to determine geographic areas where the tree species are most and least adapted to predicted climate change. Researchers in these partnerships receive benefits primarily in the form of ground observations. Launched in 2010, the juniper pollen effort has engaged participants in several western states and has yielded thousands of observations that can play a role in model ground validation. Periodic evaluation of these observations has prompted the team to improve and enhance the materials that participants receive, in an effort to boost data quality. The poplar project is formally launching in spring of 2013 and will run for three years; preliminary findings from 2013 will be presented. Participants in these

  10. Cuadernos de Autoformacion en Participacion Social: Normatividad. Volumen 5. Primera Edicion (Self-Instructional Notebooks on Social Participation: Legal Issues. Volume 5. First Edition).

    Science.gov (United States)

    Instituto Nacional para la Educacion de los Adultos, Mexico City (Mexico).

    The series "Self-instructional Notes on Social Participation" is a six volume series intended as teaching aids for adult educators. The theoretical, methodological, informative and practical elements of this series will assist professionals in their work and help them achieve greater success. The specific purpose of each notebook is…

  11. Cuadernos de Autoformacion en Participacion Social: Metodologia. Volumen 2. Primera Edicion (Self-Instructional Notebooks on Social Participation: Methodology. Volume 2. First Edition).

    Science.gov (United States)

    Instituto Nacional para la Educacion de los Adultos, Mexico City (Mexico).

    The series "Self-instructional Notes on Social Participation" is a six-volume series intended as teaching aids for adult educators. The theoretical, methodological, informative and practical elements of this series will assist professionals in their work and help them achieve greater success. The specific purpose of each notebook is…

  12. Cuadernos de Autoformacion en Participacion Social: Proyectos del INEA. Volumen 3. Primera Edicion (Self-Instructional Notebooks on Social Participation: INEA Projects. Volume 3. First Edition).

    Science.gov (United States)

    Instituto Nacional para la Educacion de los Adultos, Mexico City (Mexico).

    The series "Self-instructional Notes on Social Participation" is a six volume series intended as teaching aids for adult educators. The theoretical, methodological, informative and practical elements of this series will assist professionals in their work and help them achieve greater success. The specific purpose of each notebook is…

  13. Cuadernos de Autoformacion en Participacion Social. Principios y Valores. Volumen 1 (Self Instructional Notebooks on Social Participation. Principles and Values. Volume 1).

    Science.gov (United States)

    Instituto Nacional para la Educacion de los Adultos, Mexico City (Mexico).

    The series "Self-instructional Notes on Social Participation" is a six-volume series intended as teaching aids for adult educators. The theoretical, methodological, informative and practical elements of this series will assist professionals in their work and help them achieve greater success. The specific purpose of each notebook is…

  14. Cuadernos de Autoformacion en Participacion Social. Para que y para quienes. Primera Edicion (Self-Informational Notebooks on Social Participation. For What and for Whom)? First Edition.

    Science.gov (United States)

    Instituto Nacional para la Educacion de los Adultos, Mexico City (Mexico).

    The series "Self-instructional Notes on Social Participation" is a six-volume series intended as teaching aids for adult educators. The theoretical, methodological, informative and practical elements of this series will assist professionals in their work and help them achieve greater success. The specific purpose of each notebook is…

  15. Cuadernos de Autoformacion en Participacion Social: Orientaciones Practicas. Volumen 4. Primera Edicion (Self-Instructional Notebooks on Social Participation: Practical Orientations. Volume 4. First Edition).

    Science.gov (United States)

    Instituto Nacional para la Educacion de los Adultos, Mexico City (Mexico).

    The series "Self-instructional Notes on Social Participation" is a six volume series intended as teaching aids for adult educators. The theoretical, methodological, informative and practical elements of this series will assist professionals in their work and help them achieve greater success. The specific purpose of each notebook is…

  16. Cuaderno de Proyectos Manuales Sugeridos para Estudiantes con Impedimentos (Notebook of Manual Projects Suggested for Students with Light and Moderate Disabilities).

    Science.gov (United States)

    Puerto Rico State Dept. of Education, Hato Rey. Office of Special Education.

    This notebook is a reference source that lists suggested manual projects that can be completed by students with disabilities. The projects were developed for use with special education students and those in prevocational education. Information is included about materials, hardware, equipment, and safety, with recommendations for each project and…

  17. A Best Practices Notebook for Disaster Risk Reduction and Climate Change Adaptation: Guidance and Insights for Policy and Practice from the CATALYST Project

    NARCIS (Netherlands)

    Hare, M.; Bers, van C.; Mysiak, J.; Calliari, E.; Haque, A.; Warner, K.; Yuzva, K.; Zissener, M.; Jaspers, A.M.J.; Timmerman, J.G.

    2014-01-01

    This publication, A Best Practices Notebook for Disaster Risk Reduction and Climate Change Adaptation: Guidance and Insights for Policy and Practice from the CATALYST Project is one of two main CATALYST knowledge products that focus on the transformative approaches and measures that can support Disa

  19. N. S. LESKOV’S NOTEBOOK WITH EXTRACTS FROM “PROLOGUE” (THE EXPERIENCE OF TEXTUAL COMMENTS)

    Directory of Open Access Journals (Sweden)

    Inna N. Mineeva

    2016-03-01

    The article, for the first time, provides a detailed textual commentary on N. S. Leskov’s notebook with extracts from “Prologue”. The extant literary materials include extracts and abstracts from the early printed Prologue, fiction and historical literature of the 19th century, letters of European and Russian scholars and authors (Pushkin A., Tolstoy L., Pigault-Lebrun, Sher I.) devoted to doctrine matters and religious aspects, and description and analysis of anthropologic categories. The autograph is the evidence of the spiritual search and creative experiments of the writer. In the books the writer found endorsement of both his own ideas and those that required further inner understanding, questioning and emotional upheaval. Meanwhile, studying the history, structure and contents of Prologue in the 1880s, Leskov found an exceptional existential and creative experience. The greater part of the notebook shows the writer’s learning process through various examples of repentance, atonement, the sudden rebirth of a sinner, active love, the benefits of obedience, the miracle of movement of a saint in space, and the phenomenon of manifestation of supernatural power and its intervention in the life of a man (God, the Holy Spirit, Angels, etc.). While working with Prologue texts Leskov enunciated some principles of their artistic processing (quoting “crisis”, “turning”, unusual fragments in the Church Slavonic language, emphasizing key situations by changing the name, specifying the narration, acronyms, graphic intonation). General trends in the understanding of the Prologue source (ideological, imaginative, plot-compositional, stylistic) identified in the notebook are subsequently transformed by the author in a series of “Byzantine Legends” where they receive additional semantic and functional load.

  20. Teaching compound words to a spelling-disabled child via Smart Notebook Technology: A case study approach

    Directory of Open Access Journals (Sweden)

    Styliani N. Tsesmeli

    2015-10-01

    The case study aims to examine the effectiveness of training in morphological structure on the spelling of compounds by a spelling-disabled primary school student. The experimental design of the intervention was based on the word-pair paradigm and included a pre-test, a training program and a post-test (n = 50 pairs). The training program aimed to offer systematic, targeted and step-by-step instruction in the morphological decomposition of words to the student and was delivered via the Smart Notebook educational software. The intervention had a substantial impact in enhancing the spelling of compounds by the individual. In particular, instructional gains were statistically significant, and generalized substantially to untrained but analogous words and pseudowords in terms of structure and common stems. These findings are particularly important for the development of alternative approaches to educational interventions for individuals with spelling difficulties and developmental dyslexia, and are consistent with the experimental literature.

  1. Making sense of monitoring data using Jupyter Notebooks: a case study of dissolved oxygen dynamics across a fresh-estuarine gradient

    Science.gov (United States)

    Nelson, N.; Munoz-Carpena, R.

    2016-12-01

    In the presented exercise, students (advanced undergraduate-graduate) explore dissolved oxygen (DO) dynamics at three locations along a fresh-estuarine gradient of the Lower St. Johns River, FL (USA). Spatiotemporal DO trends along this gradient vary as a function of (1) tidal influence, and (2) biotic productivity (phytoplankton photosynthesis and community respiration). This combination of influences produces distinct DO behavior across each of the three hydrologically-connected sites. Through analysis of high frequency monitoring data, students are encouraged to think critically about the roles of physical and biological drivers of DO, and how the relative importance of these factors can vary among different locations within a single tidal waterbody. Data from each of the three locations along the river are downloaded with CUAHSI HydroClient, and analysis is performed with a Python-enabled Jupyter Notebook that has been specifically created for this assignment. Jupyter Notebooks include annotated code organized into blocks that are executed one-at-a-time; this format is amenable to classroom teaching, and provides an approachable introduction to Python for inexperienced coders. The outputs from each code block (i.e. graphs, tables) are produced within the Jupyter Notebook, thus allowing students to directly interact with the code. Expected student learning outcomes include increased spatial reasoning, as well as greater understanding of DO cycling, spatiotemporal variability in tidal systems, and challenges associated with collecting and evaluating large data sets. Specific technical learning outcomes include coding in Python for data management and analysis using Jupyter notebooks. This assignment and associated materials are open-access and available on the Science Education Resource Center website.
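    A minimal sketch of the kind of code cell such a notebook might contain is shown below; the site names and the synthetic diel dissolved-oxygen signal are invented placeholders standing in for the HydroClient export used in the actual assignment.

    ```python
    import numpy as np
    import pandas as pd
    import matplotlib.pyplot as plt

    # Synthetic stand-in for a monitoring export: 15-minute DO readings at three sites.
    idx = pd.date_range("2015-06-01", periods=4 * 24 * 30, freq="15min")
    rng = np.random.default_rng(1)
    diel = np.sin(2 * np.pi * idx.hour / 24)               # simple day-night cycle
    sites = {
        "Freshwater":  8.0 + 1.5 * diel + rng.normal(0, 0.2, len(idx)),
        "Oligohaline": 7.0 + 1.0 * diel + rng.normal(0, 0.3, len(idx)),
        "Mesohaline":  6.0 + 0.5 * diel + rng.normal(0, 0.3, len(idx)),
    }

    fig, ax = plt.subplots(figsize=(10, 4))
    for name, values in sites.items():
        daily = pd.Series(values, index=idx).resample("D").mean()   # daily means smooth the diel cycle
        ax.plot(daily.index, daily.values, label=name)

    ax.set_ylabel("Dissolved oxygen (mg/L)")
    ax.legend()
    plt.show()
    ```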

  2. 《金色笔记》的后殖民主义解读 (A Postcolonial Interpretation of The Golden Notebook)

    Institute of Scientific and Technical Information of China (English)

    梁静

    2014-01-01

    Doris Lessing's The Golden Notebook is multiple in structure and rich in content. The black notebook, one of the five notebooks in the novel, depicts three kinds of people living in Africa: European whites, colonial whites (Creoles) and native blacks. Their different cultural identities give rise to antagonism, yet because they share the same "in-between" space, in Homi Bhabha's term, their lives inevitably intersect, so their relationships are not only antagonistic but also hybrid. By portraying the complex relations of conflict and hybridity between whites and blacks and between whites and Creoles, the novel embodies key concerns of postcolonial criticism such as self and other, cultural identity and hybridity.

  3. Magnetic resonance: Using computer simulations and visualizations to connect quantum theory with classical concepts

    Science.gov (United States)

    Engelhardt, Larry

    2015-12-01

    We discuss how computers can be used to solve the ordinary differential equations that provide a quantum mechanical description of magnetic resonance. By varying the parameters in these equations and visually exploring how these parameters affect the results, students can quickly gain insights into the nature of magnetic resonance that go beyond the standard presentation found in quantum mechanics textbooks. The results were generated using an IPython notebook, which we provide as an online supplement with interactive plots and animations.
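    As an illustration of the kind of calculation described (a sketch, not the authors' IPython notebook), the code below integrates the classical equation of motion dS/dt = γ S × B for a magnetic moment in a static field plus a resonant rotating transverse field, using SciPy; the field amplitudes and units are arbitrary choices. On resonance, S_z undergoes full Rabi-like oscillations between +1 and −1.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    gamma = 1.0          # gyromagnetic ratio (arbitrary units)
    B0, B1 = 1.0, 0.05   # static and rotating field amplitudes (assumptions)
    omega = gamma * B0   # drive exactly on resonance

    def precession(t, S):
        # Rotating transverse field plus static z field
        B = np.array([B1 * np.cos(omega * t), -B1 * np.sin(omega * t), B0])
        return gamma * np.cross(S, B)

    sol = solve_ivp(precession, (0.0, 200.0), y0=[0.0, 0.0, 1.0], max_step=0.05)

    # On resonance, S_z oscillates between +1 and -1 at the Rabi frequency gamma*B1.
    print(sol.y[2].min(), sol.y[2].max())
    ```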

  4. Computational seismology a practical introduction

    CERN Document Server

    Igel, Heiner

    2016-01-01

    This volume is an introductory text to a range of numerical methods used today to simulate time-dependent processes in Earth science, physics, engineering, and many other fields. The physical problem of elastic wave propagation in 1D serves as a model system with which the various numerical methods are introduced and compared. The theoretical background is presented with substantial graphical material supporting the concepts. The results can be reproduced with the supplementary electronic material provided as Python codes embedded in Jupyter notebooks. The volume starts with a primer on the physics of elastic wave propagation, and a chapter on the fundamentals of parallel programming, computational grids, mesh generation, and hardware models. The core of the volume is the presentation of numerical solutions of the wave equation with six different methods: (1) the finite-difference method; (2) the pseudospectral method (Fourier and Chebyshev); (3) the linear finite-element method; (4) the spectral-element meth...
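    In the spirit of the finite-difference chapter, though not taken from the book's own notebooks, the sketch below advances the 1D wave equation u_tt = c² u_xx with a standard second-order scheme in space and time; the grid size, velocity and initial pulse are arbitrary illustrative choices.

    ```python
    import numpy as np

    # 1D wave equation u_tt = c^2 u_xx, second-order finite differences.
    nx, nt = 500, 1000
    dx, c = 1.0, 1.0
    dt = 0.8 * dx / c                    # respects the CFL stability condition

    u_prev = np.zeros(nx)
    u = np.zeros(nx)
    u[nx // 2] = 1.0                     # crude initial displacement pulse

    for _ in range(nt):
        lap = np.zeros(nx)
        lap[1:-1] = u[2:] - 2.0 * u[1:-1] + u[:-2]   # discrete second derivative
        u_next = 2.0 * u - u_prev + (c * dt / dx) ** 2 * lap
        u_next[0] = u_next[-1] = 0.0     # fixed (Dirichlet) boundaries
        u_prev, u = u, u_next

    print(u.max(), u.min())
    ```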

  5. Requirements analysis notebook for the flight data systems definition in the Real-Time Systems Engineering Laboratory (RSEL)

    Science.gov (United States)

    Wray, Richard B.

    1991-01-01

    A hybrid requirements analysis methodology was developed, based on the practices actually used in developing a Space Generic Open Avionics Architecture. During the development of this avionics architecture, a method of analysis able to effectively define the requirements for this space avionics architecture was developed. In this methodology, external interfaces and relationships are defined, a static analysis resulting in a static avionics model was developed, operating concepts for simulating the requirements were put together, and a dynamic analysis of the execution needs for the dynamic model operation was planned. The systems engineering approach was used to perform a top down modified structured analysis of a generic space avionics system and to convert actual program results into generic requirements. CASE tools were used to model the analyzed system and automatically generate specifications describing the model's requirements. Lessons learned in the use of CASE tools, the architecture, and the design of the Space Generic Avionics model were established, and a methodology notebook was prepared for NASA. The weaknesses of standard real-time methodologies for practicing systems engineering, such as Structured Analysis and Object Oriented Analysis, were identified.

  6. ROLE OF COMPUTER ORIENTED LABORATORY TRAINING COURSE IN PHYSICS FOR DEVELOPMENT OF KEY COMPETENCES OF FUTURE ENGINEERS

    Directory of Open Access Journals (Sweden)

    Iryna Slipukhina

    2014-06-01

    The article describes the key competences that are formed in the study of physics at a technical university. Features and examples of the use of computer-oriented laboratory work for building the technological competences of engineering students are highlighted. Possible elements of an interactive notebook with integrated software for the analysis of experimental data are also identified.

  7. Synchronizing files or images among several computers or removable devices. A utility to avoid frequent back-ups.

    Science.gov (United States)

    Leonardi, Rosalia; Maiorana, Francesco; Giordano, Daniela

    2008-06-01

    Many of us use and maintain files on more than 1 computer--a desktop part of the time, and a notebook, a palmtop, or removable devices at other times. It can be easy to forget which device contains the latest version of a particular file, and time-consuming searches often ensue. One way to solve this problem is to use software that synchronizes the files. This allows users to maintain updated versions of the same file in several locations.
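    A simplified sketch of the comparison step that underlies such synchronization tools is given below. It is a hypothetical illustration rather than the software discussed in the article: it walks two directory trees, hashes file contents, and reports files that differ or exist on only one side, leaving the actual resolution (copying the newer version) to the user.

    ```python
    import hashlib
    from pathlib import Path

    def file_hash(path, chunk=65536):
        """SHA-256 of a file, read in chunks to keep memory use low."""
        h = hashlib.sha256()
        with open(path, "rb") as f:
            while block := f.read(chunk):
                h.update(block)
        return h.hexdigest()

    def compare_trees(dir_a, dir_b):
        """Return relative paths that differ or are missing on one side."""
        a, b = Path(dir_a), Path(dir_b)
        files_a = {p.relative_to(a) for p in a.rglob("*") if p.is_file()}
        files_b = {p.relative_to(b) for p in b.rglob("*") if p.is_file()}
        report = {"only_in_a": files_a - files_b, "only_in_b": files_b - files_a}
        report["different"] = {
            rel for rel in files_a & files_b
            if file_hash(a / rel) != file_hash(b / rel)
        }
        return report

    # Example usage (paths are placeholders):
    # print(compare_trees("/media/usb/docs", "/home/user/docs"))
    ```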

  8. Uma análise dos atributos importantes no processo de decisão de compra de notebooks utilizando análise fatorial e escalonamento multidimensional (An analysis of the important attributes in the notebook purchase decision process using factor analysis and multidimensional scaling)

    Directory of Open Access Journals (Sweden)

    Valter Afonso Vieira

    2006-12-01

    Identifying the important attributes in the consumer decision process is a hard task for marketing professionals, and many market segments require this type of research. In this context, this article aims to identify the important attributes considered by consumers when buying a notebook. To this end, an exploratory, qualitative study was carried out through in-depth interviews with IT professionals and potential notebook buyers. After content analysis, the results revealed 42 attributes considered in the purchase. In a second stage, a quantitative survey was conducted with a snowball sample of 131 respondents. After exploratory factor analysis, five dimensions were identified, corresponding to the most important attributes in the purchase decision process. The dimensions were classified as pleasure and benefit, device characteristics, performance, caution, and operational aspects. Finally, conclusions and directions for future research are presented and discussed.

  9. A N-D VIRTUAL NOTEBOOK ABOUT THE BASILICA OF S. AMBROGIO IN MILAN: INFORMATION MODELING FOR THE COMMUNICATION OF HISTORICAL PHASES SUBTRACTION PROCESS

    Directory of Open Access Journals (Sweden)

    C. Stanga

    2017-08-01

    This essay describes the combination of 3D solutions and software techniques with traditional studies and research in order to achieve an integrated digital documentation linking performed surveys, collected data, and historical research. The approach of this study is based on the comparison of survey data with historical research, and on interpretations deduced from a data cross-check between the two sources. The case study is the Basilica of S. Ambrogio in Milan, one of the greatest monuments in the city, a pillar of Christianity and of the history of architecture. It is characterized by a complex stratification of phases of restoration and transformation. Rediscovering the great richness of the traditional architectural notebook, which collected surveys and data, this research aims to realize a virtual notebook, based on a 3D model that supports the dissemination of the collected information. It can potentially be understood and accessed by anyone through the development of a mobile app. The 3D model was used to explore the different historical phases, starting from the recent layers down to the oldest ones, through a virtual subtraction process, following the methods of the Archaeology of Architecture. Its components can be imported into parametric software and recognized both in their morphological and typological aspects. It is based on the concept of LoD and ReverseLoD in order to fit the accuracy required by each step of the research.

  10. [The role of the pharmacist to an inpatient in order to switch from inpatient treatment to a home-based care of an outpatient--the usefulness of a patient record notebook in cancer chemotherapy].

    Science.gov (United States)

    Andoh, Naomi; Katoh, Eiko; Katoh, Junichiroh; Kikuno, Fumitoyo; Sakuyama, Toshikazu; Uno, Shinji; Hirano, Akio; Inoue, Daisuke; Kobayashi, Tadashi; Mouri, Junichi; Aiba, Keisuke

    2006-12-01

    We have been successfully using a patient record notebook in home-based outpatient cancer chemotherapy since 2003. Through our questionnaires designed to elicit details of their side effects during chemotherapy, many of the patients expressed their satisfaction with carrying a patient record notebook. There are many tasks the patient has to manage on his own once he leaves the hospital and becomes an outpatient, and one of the most important is taking care of side effects by himself. In fact, some of the patients had difficulty evaluating their own side effect symptoms. When evaluating a patient's side effects, the pharmacist should not rely on the patient record notebook alone; careful attention also has to be paid to the patient's general condition by our medical team members, consisting of inpatient pharmacists, surgeons, chemotherapists, palliative care physicians, nurses, social workers and others. In order to carry out chemotherapy safely, it is critical to have a consensus based on medical policies concerning the reduction of side effects and to support the fight against cancer together with the medical team members. The results also suggest that the patient record notebook is useful for pharmacists in controlling side effects and in adopting a prudent policy for chemotherapy.

  11. Cuadernos de Autoformacion en Participacion Social: Educacion con la comunidad. Volumen 6. Primera Edicion (Self-Instructional Notebooks on Social Participation: Education with the Community. Volume 6. First Edition).

    Science.gov (United States)

    Instituto Nacional para la Educacion de los Adultos, Mexico City (Mexico).

    The series "Self-instructional Notes on Social Participation" is a six volume series intended as teaching aids for adult educators. The theoretical, methodological, informative and practical elements of this series will assist professionals in their work and help them achieve greater success. The specific purpose of each notebook is…

  12. Classroom Teacher's Idea Notebook.

    Science.gov (United States)

    Campanella, Alfred J., Ed.

    1989-01-01

    "Using Geography to Teach Comparative Values: Japan and the United States" (H. Flater) uses baseball to introduce junior high students to basic differences in the way various cultures view similar events. "Defining Directionality: Early Learning Experiences that Teach Concepts in a Pragmatic and Engaging Format" (C. Grace ; M.…

  13. The Kourovka notebook

    CERN Document Server

    Johnson, D L

    1983-01-01

    To form an up-to-date picture of what is going on in a given area of mathematics, we usually consult a shelf of current periodicals or, to save time, the appropriate section of a reviewing journal. Thus we learn of new advances in the area, which problems have been solved, what progress has been made with others, while rarely, and then only in the context of the author's own results, do we learn which problems the author failed to solve but considers interesting. In all this, a summary of current problems has no less a place in the development of a subject than a list of achievements, though t

  14. A Study of Female Alienation in The Golden Notebook

    Institute of Scientific and Technical Information of China (English)

    张英雪; 刘秀玲

    2013-01-01

    The capitalist system is essentially a patriarchal system, and female alienation is an inevitable outcome of capitalist patriarchy. Under capitalism, women are alienated into three identities: sexual partner, mother and wife. The Golden Notebook is Doris Lessing's second novel; it mainly records the life experiences of the heroine, Anna, whose mental collapse is the final result of female alienation under capitalism. The Golden Notebook reveals the fragmentation of whole social groups, including women. In general, female alienation is reflected in three aspects: sexual experience, motherhood, and intelligence, and a study of these three aspects will be beneficial for women's final liberation.

  15. Energy technology personal computer uses 1993. Einsatzmoeglichkeiten des PC in der Energietechnik '93; Vortraege

    Energy Technology Data Exchange (ETDEWEB)

    1993-01-01

    This VDI report discusses the various energy technology personal computer uses and outlines future developments. The trend of decentralization accounts for the fact that personal computers are found increasingly in smaller systems and in larger subsystems. Industrial personal computers are as available and safe today as process computers. Various data base functions and a high graphical resolution make notebooks the constant, handy companions of field engineers. Comfortable user interfaces improve the control and management of plants and help to analyze disturbances without requiring programming knowledge. Standard personal computer programs replace expensive analysis programs. This VDI report addresses users from the industry and from the utilities, production engineers, licensing authorities, manufacturers, and the universities. (orig.)

  16. Academic writing in a corpus of 4th grade science notebooks: An analysis of student language use and adult expectations of the genres of school science

    Science.gov (United States)

    Esquinca, Alberto

    This is a study of language use in the context of an inquiry-based science curriculum, in which conceptual understanding ratings are used to split texts into groups of "successful" and "unsuccessful" texts; "successful" texts could include known features of science language. The data sources are 420 texts generated by students in 14 classrooms from three school districts, culled from a prior study on the effectiveness of science notebooks for assessing understanding, together with the aforementioned ratings. In science notebooks, students write in the process of learning (here, a unit on electricity). The analytical framework is systemic functional linguistics (Halliday and Matthiessen, 2004; Eggins, 2004), specifically the concepts of genre, register and nominalization. Genre classification involves an analysis of the purpose and register features of the text (Schleppegrell, 2004). Analyzing the use of features of the scientific academic register, namely relational processes and nominalization (Halliday and Martin, 1993), requires transitivity analysis and noun analysis. Transitivity analysis, consisting of the identification of the process type, is conducted on 4737 ranking clauses. A manual count of each noun used in the corpus allows for a typology of nouns. Four school science genres are found: procedures, procedural recounts, reports and explanations. Most texts (85.4%) are factual, and 14.1% are classified as explanations, the analytical genre. Logistic regression analysis indicates that there is no significant probability that the texts classified as explanations are placed in the group of "successful" texts. In addition, material process clauses predominate in the corpus, followed by relational process clauses. Results of a logistic regression analysis indicate that there is a significant probability (Chi square = 15.23, p …) that texts with a high rate of relational processes are placed in the group of "successful" texts. In addition, 59.5% of 6511 nouns are references to

  17. Chalk and computers

    DEFF Research Database (Denmark)

    Rasmussen, Lisa Rosén

    Since 1970, school books have been supplemented first by photocopies and later by PDF files and Internet sites. Chalkboards have been replaced by Smart Boards, and paper notebooks by laptops and iPads. Digital media has made its way into the classroom and into everyday school life. This has been ...

  18. Integration of computer technology into the medical curriculum: the King's experience

    Directory of Open Access Journals (Sweden)

    Vickie Aitken

    1997-12-01

    Full Text Available Recently, there have been major changes in the requirements of medical education which have set the scene for the revision of medical curricula (Towle, 1991; GMC, 1993). As part of the new curriculum at King's, the opportunity has been taken to integrate computer technology into the course through Computer-Assisted Learning (CAL), and to train graduates in core IT skills. Although the use of computers in the medical curriculum has up to now been limited, recent studies have shown encouraging steps forward (see Boelen, 1995). One area where there has been particular interest is the use of notebook computers to allow students increased access to IT facilities (Maulitz et al., 1996).

  19. Science Notebooks for the 21st Century. Going Digital Provides Opportunities to Learn "with" Technology Rather than "from" Technology

    Science.gov (United States)

    Fulton, Lori; Paek, Seungoh; Taoka, Mari

    2017-01-01

    Students of today are digital natives who for the most part come to school with experiences that may surpass those of their teachers. They use tablet computers and other devices in their personal lives and are eager to use them in the classroom. For teachers, this means they must integrate technology in ways that allow their students to learn with…

  20. Epigrafía de Clunia (Burgos) en los Cuadernos de Excavación de Blas Taracena = Clunian Epigraphy in Blas Taracena’s Notebooks

    Directory of Open Access Journals (Sweden)

    Javier Del Hoyo Calleja

    2015-03-01

    Full Text Available Blas Taracena carried out several excavation campaigns at Clunia during the first half of the 1930s. His results were never published except for one article dated 1946, centred on the architectural aspects of the structure known as house no. 1. However, in the personal notebooks he kept day by day, still unpublished, he gave a full account of the discoveries as they were made. Besides three inscriptions from those notebooks, already partially edited, we present two unpublished small altars (árulas) held in the collections of the Museum of Burgos, also the fruit of Taracena's work.

  1. An inquiry-based biochemistry laboratory structure emphasizing competency in the scientific process: a guided approach with an electronic notebook format.

    Science.gov (United States)

    L Hall, Mona; Vardar-Ulu, Didem

    2014-01-01

    The laboratory setting is an exciting and gratifying place to teach because you can actively engage the students in the learning process through hands-on activities; it is a dynamic environment amenable to collaborative work, critical thinking, problem-solving and discovery. The guided inquiry-based approach described here guides the students through their laboratory work at a steady pace that encourages them to focus on quality observations, careful data collection and thought processes surrounding the chemistry involved. It motivates students to work in a collaborative manner with frequent opportunities for feedback, reflection, and modification of their ideas. Each laboratory activity has four stages to keep the students' efforts on track: pre-lab work, an in-lab discussion, in-lab work, and a post-lab assignment. Students are guided at each stage by an instructor created template that directs their learning while giving them the opportunity and flexibility to explore new information, ideas, and questions. These templates are easily transferred into an electronic journal (termed the E-notebook) and form the basic structural framework of the final lab reports the students submit electronically, via a learning management system. The guided-inquiry based approach presented here uses a single laboratory activity for undergraduate Introductory Biochemistry as an example. After implementation of this guided learning approach student surveys reported a higher level of course satisfaction and there was a statistically significant improvement in the quality of the student work. Therefore we firmly believe the described format to be highly effective in promoting student learning and engagement.

  2. GISpark: A Geospatial Distributed Computing Platform for Spatiotemporal Big Data

    Science.gov (United States)

    Wang, S.; Zhong, E.; Wang, E.; Zhong, Y.; Cai, W.; Li, S.; Gao, S.

    2016-12-01

    Geospatial data are growing exponentially because of the proliferation of cost-effective and ubiquitous positioning technologies such as global remote-sensing satellites and location-based devices. Analyzing large amounts of geospatial data can provide great value for both industrial and scientific applications. The data- and compute-intensive characteristics inherent in geospatial big data increasingly pose great challenges to technologies for data storage, computing and analysis. Such challenges require a scalable and efficient architecture that can store, query, analyze, and visualize large-scale spatiotemporal data. Therefore, we developed GISpark - a geospatial distributed computing platform for processing large-scale vector, raster and stream data. GISpark is constructed based on the latest virtualized computing infrastructures and distributed computing architecture. OpenStack and Docker are used to build a multi-user hosting cloud computing infrastructure for GISpark. Virtual storage systems such as HDFS, Ceph, and MongoDB are combined and adopted for spatiotemporal data storage management. A Spark-based algorithm framework is developed for efficient parallel computing. Within this framework, SuperMap GIScript and various open-source GIS libraries can be integrated into GISpark. GISpark can also be integrated with scientific computing environments (e.g., Anaconda), interactive computing web applications (e.g., Jupyter notebook), and machine learning tools (e.g., TensorFlow/Orange). The associated geospatial facilities of GISpark, in conjunction with the scientific computing environment, exploratory spatial data analysis tools, and temporal data management and analysis systems, make up a powerful geospatial computing tool. GISpark not only provides spatiotemporal big data processing capacity in the geospatial field, but also provides a spatiotemporal computational model and advanced geospatial visualization tools that deal with other domains related to spatial properties. We
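
    As a rough illustration of the kind of notebook-driven spatial filtering such a platform supports, the sketch below uses plain PySpark rather than GISpark's own GIScript/SuperMap APIs (an assumption made purely for illustration); the file name trajectories.csv and its columns are hypothetical.

```python
# Minimal sketch of a notebook-style bounding-box filter with PySpark
# (a generic stand-in, not GISpark's own API). File name and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("bbox-filter-sketch").getOrCreate()

# Assumed CSV of GPS fixes with columns: id, lon, lat, ts
points = spark.read.csv("trajectories.csv", header=True, inferSchema=True)

# Keep only fixes inside a bounding box (min_lon, min_lat, max_lon, max_lat)
min_lon, min_lat, max_lon, max_lat = 116.0, 39.5, 117.0, 40.5
inside = points.filter(
    F.col("lon").between(min_lon, max_lon) &
    F.col("lat").between(min_lat, max_lat)
)

print(inside.count())
spark.stop()
```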

  3. An Interactive Web-Based Analysis Framework for Remote Sensing Cloud Computing

    Science.gov (United States)

    Wang, X. Z.; Zhang, H. M.; Zhao, J. H.; Lin, Q. H.; Zhou, Y. C.; Li, J. H.

    2015-07-01

    Spatiotemporal data, especially remote sensing data, are widely used in ecological, geographical, agriculture, and military research and applications. With the development of remote sensing technology, more and more remote sensing data are accumulated and stored in the cloud. An effective way for cloud users to access and analyse these massive spatiotemporal data in the web clients becomes an urgent issue. In this paper, we proposed a new scalable, interactive and web-based cloud computing solution for massive remote sensing data analysis. We build a spatiotemporal analysis platform to provide the end-user with a safe and convenient way to access massive remote sensing data stored in the cloud. The lightweight cloud storage system used to store public data and users' private data is constructed based on open source distributed file system. In it, massive remote sensing data are stored as public data, while the intermediate and input data are stored as private data. The elastic, scalable, and flexible cloud computing environment is built using Docker, which is a technology of open-source lightweight cloud computing container in the Linux operating system. In the Docker container, open-source software such as IPython, NumPy, GDAL, and Grass GIS etc., are deployed. Users can write scripts in the IPython Notebook web page through the web browser to process data, and the scripts will be submitted to IPython kernel to be executed. By comparing the performance of remote sensing data analysis tasks executed in Docker container, KVM virtual machines and physical machines respectively, we can conclude that the cloud computing environment built by Docker makes the greatest use of the host system resources, and can handle more concurrent spatial-temporal computing tasks. Docker technology provides resource isolation mechanism in aspects of IO, CPU, and memory etc., which offers security guarantee when processing remote sensing data in the IPython Notebook. Users can write
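
    To make the workflow concrete, here is a hedged sketch of the sort of script a user might run in the hosted IPython/Jupyter Notebook: it computes NDVI from a multi-band raster with GDAL and NumPy. The file name scene.tif and the band order are assumptions, not details from the paper.

```python
# Sketch of a notebook cell for the described environment: NDVI from a raster
# using GDAL + NumPy. "scene.tif" and its band order (red=3, NIR=4) are assumptions.
import numpy as np
from osgeo import gdal

ds = gdal.Open("scene.tif")
red = ds.GetRasterBand(3).ReadAsArray().astype("float32")
nir = ds.GetRasterBand(4).ReadAsArray().astype("float32")

ndvi = (nir - red) / np.maximum(nir + red, 1e-6)  # guard against division by zero
print(float(ndvi.mean()))
```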

  4. Analysis of the Marketing Strategy of Domestic Notebook Brands: Taking College Students as an Example

    Institute of Scientific and Technical Information of China (English)

    翟天昶

    2013-01-01

    Starting from campus users, the main consumer group of the notebook market, the paper reports a series of surveys and a statistical analysis of the respondents' basic characteristics and data. Combining SWOT analysis, the marketing mix, market segmentation and relevant theories of international trade, it analyzes the domestic market and the trade environment for domestic and foreign notebooks, and proposes marketing strategies for domestic notebook brands targeting campus users.

  5. The Experimentalism and Symbolism in The Golden Notebook

    Institute of Scientific and Technical Information of China (English)

    王才凤

    2015-01-01

    Doris Lessing is one of the most influential writers in contemporary British literature, especially of the period after the Second World War, and her distinctive literary work earned her the 2007 Nobel Prize in Literature. Published in 1962, The Golden Notebook stands out for its profound themes and creative form. Through this novel, contemporary readers can form a vivid picture of the social reality of postwar Britain, and indeed of the whole Western world, come to know the living conditions of the people of that time, especially their spiritual state, and thereby approach the profound theme of the meaning of human life. The most significant characteristic of the book is the close combination of its themes with the experimentalism and symbolism of its text: the form itself serves to reveal the themes, and the extensive use of symbolism gives the work its deep connotations. This paper examines how the experimentalism and symbolism of the text serve to reveal its themes.

  6. The AtChem On-line model and Electronic Laboratory Notebook (ELN): A free community modelling tool with provenance capture

    Science.gov (United States)

    Young, J. C.; Boronska, K.; Martin, C. J.; Rickard, A. R.; Vázquez Moreno, M.; Pilling, M. J.; Haji, M. H.; Dew, P. M.; Lau, L. M.; Jimack, P. K.

    2010-12-01

    AtChem On-line is a simple-to-use zero-dimensional box modelling toolkit, developed for use by laboratory, field and chamber scientists. Any set of chemical reactions can be simulated, in particular the whole Master Chemical Mechanism (MCM) or any subset of it. Parameters and initial data can be provided through a self-explanatory web form and the resulting model is compiled and run on a dedicated server. The core part of the toolkit, providing a robust solver for thousands of chemical reactions, is written in Fortran and uses the SUNDIALS CVODE libraries. Chemical systems can be constrained at multiple, user-determined timescales; this enabled studies of radical chemistry at one-minute timescales. AtChem On-line is free to use and requires no installation - a web browser, text editor and any compressing software is all the user needs. CPU and storage are provided by the server (input and output data are saved indefinitely). An off-line version is also being developed, which will provide batch processing, an advanced graphical user interface and post-processing tools, for example, Rate of Production Analysis (ROPA) and chain-length analysis. The source code is freely available for advanced users wishing to adapt and run the program locally. Data management, dissemination and archiving are essential in all areas of science. In order to do this in an efficient and transparent way, there is a critical need to capture high quality metadata/provenance for modelling activities. An Electronic Laboratory Notebook (ELN) has been developed in parallel with AtChem Online as part of the EC EUROCHAMP2 project. In order to use controlled chamber experiments to evaluate the MCM, we need to be able to archive, track and search information on all associated chamber model runs, so that they can be used in subsequent mechanism development. Therefore it would be extremely useful if experiment and model metadata/provenance could be easily and automatically stored electronically
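
    For readers unfamiliar with zero-dimensional box models, the hedged sketch below integrates a toy two-reaction mechanism with SciPy's stiff ODE solver. It is a conceptual stand-in, not the AtChem Fortran/SUNDIALS code, and the rate constants are made up.

```python
# Toy zero-dimensional box model (A -> B -> C) integrated with SciPy's LSODA solver.
# A conceptual stand-in for an AtChem-style run; rate constants are invented.
from scipy.integrate import solve_ivp

k1 = 1.0e-3   # s^-1, assumed conversion rate A -> B
k2 = 5.0e-4   # s^-1, assumed loss rate B -> C

def rhs(t, y):
    a, b, c = y
    return [-k1 * a, k1 * a - k2 * b, k2 * b]

y0 = [1.0e10, 0.0, 0.0]                               # initial number densities (arbitrary units)
sol = solve_ivp(rhs, (0.0, 3600.0), y0, method="LSODA", max_step=60.0)
print(sol.y[:, -1])                                   # concentrations after one hour
```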

  7. Evidence-based guidelines for the wise use of computers by children: physical development guidelines.

    Science.gov (United States)

    Straker, L; Maslen, B; Burgess-Limerick, R; Johnson, P; Dennerlein, J

    2010-04-01

    Computer use by children is common and there is concern over the potential impact of this exposure on child physical development. Recently principles for child-specific evidence-based guidelines for wise use of computers have been published and these included one concerning the facilitation of appropriate physical development. This paper reviews the evidence and presents detailed guidelines for this principle. The guidelines include encouraging a mix of sedentary and whole body movement tasks, encouraging reasonable postures during computing tasks through workstation, chair, desk, display and input device selection and adjustment and special issues regarding notebook computer use and carriage, computing skills and responding to discomfort. The evidence limitations highlight opportunities for future research. The guidelines themselves can inform parents and teachers, equipment designers and suppliers and form the basis of content for teaching children the wise use of computers. STATEMENT OF RELEVANCE: Many children use computers and computer-use habits formed in childhood may track into adulthood. Therefore child-computer interaction needs to be carefully managed. These guidelines inform those responsible for children to assist in the wise use of computers.

  8. Perl Testing A Developer's Notebook

    CERN Document Server

    Langworth, Ian

    2005-01-01

    Is there any sexier topic in software development than software testing? That is, besides game programming, 3D graphics, audio, high-performance clustering, cool websites, et cetera? Okay, so software testing is low on the list. And that's unfortunate, because good software testing can increase your productivity, improve your designs, raise your quality, ease your maintenance burdens, and help to satisfy your customers, coworkers, and managers. Perl has a strong history of automated tests. A very early release of Perl 1.0 included a comprehensive test suite, and it's only improved from th

  9. Media How-To Notebook.

    Science.gov (United States)

    Durbo, Alec

    Designed to assist public relations personnel deal effectively with print and non-print media, this booklet contains guidelines for: (1) analyzing an audience and selecting the appropriate media; (2) developing persuasion techniques; (3) writing for public relations; (4) determining newsworthy events; (5) detailed planning; (6) assessing results;…

  10. Oxford TRECVID 2006 - Notebook paper

    NARCIS (Netherlands)

    Philbin, J.; Bosch, A.; Chum, O.; Geusebroek, J.M.; Sivic, J.; Zisserman, A.

    2006-01-01

    The Oxford team participated in the high-level feature extraction and interactive search tasks. A vision only approach was used for both tasks, with no use of the text or audio information. For the high-level feature extraction task, we used two different approaches, one using sparse and one using d

  11. Banning standard cell engineering notebook

    Science.gov (United States)

    1976-01-01

    A family of standardized thick-oxide P-MOS building blocks (standard cells) is described. The information is presented in a form useful for systems designs, logic design, and the preparation of inputs to both sets of Design Automation programs for array design and analysis. A data sheet is provided for each cell and gives the cell name, the cell number, its logic symbol, Boolean equation, truth table, circuit schematic circuit composite, input-output capacitances, and revision date. The circuit type file, also given for each cell, together with the logic drawing contained on the data sheet provides all the information required to prepare input data files for the Design Automation Systems. A detailed description of the electrical design procedure is included.

  12. A Non-Classic Classic--The Appreciation of The Notebook under Writer Determinism Theory

    Institute of Scientific and Technical Information of China (English)

    欧阳乐

    2013-01-01

    There are many ways to interpret a literary work, and in today's context of merging disciplines it is undoubtedly feasible to do so from a stylistic perspective. Can The Notebook, the representative work of American bestselling writer Nicholas Sparks, which has been adapted into a film and was a bestseller on the New York Times list, be called a world classic? The main reflection of the 18th-century Romantic Movement in stylistic theory is writer determinism. Analyzing The Notebook within the framework of writer determinism reveals both its popularity and some of its shortcomings. It can be concluded that it is indeed a non-classic classic.

  13. Computer Engineers.

    Science.gov (United States)

    Moncarz, Roger

    2000-01-01

    Looks at computer engineers and describes their job, employment outlook, earnings, and training and qualifications. Provides a list of resources related to computer engineering careers and the computer industry. (JOW)

  14. Computer Music

    Science.gov (United States)

    Cook, Perry R.

    This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and the study of perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.).

  15. Computer Virus

    Institute of Scientific and Technical Information of China (English)

    高振桥

    2002-01-01

    If you work with a computer, it is certain that you cannot avoid dealing with at least one computer virus. But how much do you know about it? Well, actually, a computer virus is not a biological one such as those that cause illnesses in people. It is a kind of computer program

  16. Grid Computing

    Indian Academy of Sciences (India)

    2016-05-01

    A computing grid interconnects resources such as high performance computers, scientific databases, and computer-controlled scientific instruments of cooperating organizations, each of which is autonomous. It precedes and is quite different from cloud computing, which provides computing resources by vendors to customers on demand. In this article, we describe the grid computing model and enumerate the major differences between grid and cloud computing.

  17. Analog computing

    CERN Document Server

    Ulmann, Bernd

    2013-01-01

    This book is a comprehensive introduction to analog computing. As most textbooks about this powerful computing paradigm date back to the 1960s and 1970s, it fills a void and forges a bridge from the early days of analog computing to future applications. The idea of analog computing is not new. In fact, this computing paradigm is nearly forgotten, although it offers a path to both high-speed and low-power computing, which are in even more demand now than they were back in the heyday of electronic analog computers.

  18. Computational composites

    DEFF Research Database (Denmark)

    Vallgårda, Anna K. A.; Redström, Johan

    2007-01-01

    Computational composite is introduced as a new type of composite material. Arguing that this is not just a metaphorical maneuver, we provide an analysis of computational technology as material in design, which shows how computers share important characteristics with other materials used in design...... and architecture. We argue that the notion of computational composites provides a precise understanding of the computer as material, and of how computations need to be combined with other materials to come to expression as material. Besides working as an analysis of computers from a designer’s point of view......, the notion of computational composites may also provide a link for computer science and human-computer interaction to an increasingly rapid development and use of new materials in design and architecture....

  19. Computational chemistry

    OpenAIRE

    2000-01-01

    Computational chemistry has come of age. With significant strides in computer hardware and software over the last few decades, computational chemistry has achieved full partnership with theory and experiment as a tool for understanding and predicting the behavior of a broad range of chemical, physical, and biological phenomena. The Nobel Prize award to John Pople and Walter Kohn in 1998 highlighted the importance of these advances in computational chemistry. With massively parallel computers ...

  20. Duality Computing in Quantum Computers

    Institute of Scientific and Technical Information of China (English)

    LONG Gui-Lu; LIU Yang

    2008-01-01

    In this letter, we propose a duality computing mode, which resembles particle-wave duality property when a quantum system such as a quantum computer passes through a double-slit. In this mode, computing operations are not necessarily unitary. The duality mode provides a natural link between classical computing and quantum computing. In addition, the duality mode provides a new tool for quantum algorithm design.

  1. Computational manufacturing

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    This paper presents a general framework for computational manufacturing. The methodology of computational manufacturing aims at integrating computational geometry, machining principle, sensor information fusion, optimization, computational intelligence and virtual prototyping to solve problems of the modeling, reasoning, control, planning and scheduling of manufacturing processes and systems. There are three typical problems in computational manufacturing, i.e., scheduling (time-domain), geometric reasoning (space-domain) and decision- making (interaction between time-domain and space-domain). Some theoretical fundamentals of computational manufacturing are also discussed.

  2. Contextual Computing

    CERN Document Server

    Porzel, Robert

    2011-01-01

    This book uses the latest in knowledge representation and human-computer interaction to address the problem of contextual computing in artificial intelligence. It uses high-level context to solve some challenging problems in natural language understanding.

  3. Computer Algebra.

    Science.gov (United States)

    Pavelle, Richard; And Others

    1981-01-01

    Describes the nature and use of computer algebra and its applications to various physical sciences. Includes diagrams illustrating, among others, a computer algebra system and flow chart of operation of the Euclidean algorithm. (SK)

  4. Computational dosimetry

    Energy Technology Data Exchange (ETDEWEB)

    Siebert, B.R.L.; Thomas, R.H.

    1996-01-01

    The paper presents a definition of the term "Computational Dosimetry" that is interpreted as the sub-discipline of computational physics which is devoted to radiation metrology. It is shown that computational dosimetry is more than a mere collection of computational methods. Computational simulations directed at basic understanding and modelling are important tools provided by computational dosimetry, while another very important application is the support that it can give to the design, optimization and analysis of experiments. However, the primary task of computational dosimetry is to reduce the variance in the determination of absorbed dose (and its related quantities), for example in the disciplines of radiological protection and radiation therapy. In this paper emphasis is given to the discussion of potential pitfalls in the applications of computational dosimetry and recommendations are given for their avoidance. The need for comparison of calculated and experimental data whenever possible is strongly stressed.

  5. Computational Medicine

    DEFF Research Database (Denmark)

    Nygaard, Jens Vinge

    2017-01-01

    The Health Technology Program at Aarhus University applies computational biology to investigate the heterogeneity of tumours.

  6. Quantum computing

    OpenAIRE

    Li, Shu-Shen; Long, Gui-lu; Bai, Feng-Shan; Feng, Song-Lin; Zheng, Hou-Zhi

    2001-01-01

    Quantum computing is a quickly growing research field. This article introduces the basic concepts of quantum computing, recent developments in quantum searching, and decoherence in a possible quantum dot realization.

  7. Green Computing

    Directory of Open Access Journals (Sweden)

    K. Shalini

    2013-01-01

    Full Text Available Green computing is all about using computers in a smarter and more eco-friendly way. It is the environmentally responsible use of computers and related resources, which includes the implementation of energy-efficient central processing units, servers and peripherals, as well as reduced resource consumption and proper disposal of electronic waste. Computers certainly make up a large part of many people's lives and have traditionally been extremely damaging to the environment. Manufacturers of computers and their parts have been espousing the green cause to help protect the environment from computers and electronic waste in any way they can. Research continues into key areas such as making the use of computers as energy-efficient as possible, and designing algorithms and systems for efficiency-related computer technologies.

  8. Computable models

    CERN Document Server

    Turner, Raymond

    2009-01-01

    Computational models can be found everywhere in present day science and engineering. In providing a logical framework and foundation for the specification and design of specification languages, Raymond Turner uses this framework to introduce and study computable models. In doing so he presents the first systematic attempt to provide computational models with a logical foundation. Computable models have wide-ranging applications from programming language semantics and specification languages, through to knowledge representation languages and formalism for natural language semantics. They are al

  9. Phenomenological Computation?

    DEFF Research Database (Denmark)

    Brier, Søren

    2014-01-01

    Open peer commentary on the article “Info-computational Constructivism and Cognition” by Gordana Dodig-Crnkovic. Upshot: The main problems with info-computationalism are: (1) Its basic concept of natural computing has neither been defined theoretically nor implemented practically. (2) It cannot en...... cybernetics and Maturana and Varela’s theory of autopoiesis, which are both erroneously taken to support info-computationalism....

  10. Computing fundamentals introduction to computers

    CERN Document Server

    Wempen, Faithe

    2014-01-01

    The absolute beginner's guide to learning basic computer skills Computing Fundamentals, Introduction to Computers gets you up to speed on basic computing skills, showing you everything you need to know to conquer entry-level computing courses. Written by a Microsoft Office Master Instructor, this useful guide walks you step-by-step through the most important concepts and skills you need to be proficient on the computer, using nontechnical, easy-to-understand language. You'll start at the very beginning, getting acquainted with the actual, physical machine, then progress through the most common

  11. Quantum Computing for Computer Architects

    CERN Document Server

    Metodi, Tzvetan

    2011-01-01

    Quantum computers can (in theory) solve certain problems far faster than a classical computer running any known classical algorithm. While existing technologies for building quantum computers are in their infancy, it is not too early to consider their scalability and reliability in the context of the design of large-scale quantum computers. To architect such systems, one must understand what it takes to design and model a balanced, fault-tolerant quantum computer architecture. The goal of this lecture is to provide architectural abstractions for the design of a quantum computer and to explore

  12. Computational Complexity

    Directory of Open Access Journals (Sweden)

    J. A. Tenreiro Machado

    2017-02-01

    Full Text Available Complex systems (CS involve many elements that interact at different scales in time and space. The challenges in modeling CS led to the development of novel computational tools with applications in a wide range of scientific areas. The computational problems posed by CS exhibit intrinsic difficulties that are a major concern in Computational Complexity Theory. [...

  13. Computational Composites

    DEFF Research Database (Denmark)

    Vallgårda, Anna K. A.

    of the new microprocessors and network technologies. However, the understanding of the computer represented within this program poses a challenge for the intentions of the program. The computer is understood as a multitude of invisible intelligent information devices which confines the computer as a tool...

  14. Distributed Computing.

    Science.gov (United States)

    Ryland, Jane N.

    1988-01-01

    The microcomputer revolution, in which small and large computers have gained tremendously in capability, has created a distributed computing environment. This circumstance presents administrators with the opportunities and the dilemmas of choosing appropriate computing resources for each situation. (Author/MSE)

  15. Computational vision

    CERN Document Server

    Wechsler, Harry

    1990-01-01

    The book is suitable for advanced courses in computer vision and image processing. In addition to providing an overall view of computational vision, it contains extensive material on topics that are not usually covered in computer vision texts (including parallel distributed processing and neural networks) and considers many real applications.

  16. Phenomenological Computation?

    DEFF Research Database (Denmark)

    Brier, Søren

    2014-01-01

    Open peer commentary on the article “Info-computational Constructivism and Cognition” by Gordana Dodig-Crnkovic. Upshot: The main problems with info-computationalism are: (1) Its basic concept of natural computing has neither been defined theoretically nor implemented practically. (2) It cannot en...

  17. Computer Ease.

    Science.gov (United States)

    Drenning, Susan; Getz, Lou

    1992-01-01

    Computer Ease is an intergenerational program designed to put an Ohio elementary school's computer lab, software library, staff, and students at the disposal of older adults desiring to become computer literate. Three 90-minute instructional sessions allow seniors to experience 1-to-1 high-tech instruction by enthusiastic, nonthreatening…

  18. Symbolic computation of the Hartree-Fock energy from a chiral EFT three-nucleon interaction at N 2LO

    Science.gov (United States)

    Gebremariam, B.; Bogner, S. K.; Duguet, T.

    2010-06-01

    We present the first of a two-part Mathematica notebook collection that implements a symbolic approach for the application of the density matrix expansion (DME) to the Hartree-Fock (HF) energy from a chiral effective field theory (EFT) three-nucleon interaction at N 2LO. The final output from the notebooks is a Skyrme-like energy density functional that provides a quasi-local approximation to the non-local HF energy. In this paper, we discuss the derivation of the HF energy and its simplification in terms of the scalar/vector-isoscalar/isovector parts of the one-body density matrix. Furthermore, a set of steps is described and illustrated on how to extend the approach to other three-nucleon interactions. Program summaryProgram title: SymbHFNNN Catalogue identifier: AEGC_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEGC_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 96 666 No. of bytes in distributed program, including test data, etc.: 378 083 Distribution format: tar.gz Programming language: Mathematica 7.1 Computer: Any computer running Mathematica 6.0 and later versions Operating system: Windows Xp, Linux/Unix RAM: 256 Mb Classification: 5, 17.16, 17.22 Nature of problem: The calculation of the HF energy from the chiral EFT three-nucleon interaction at N 2LO involves tremendous spin-isospin algebra. The problem is compounded by the need to eventually obtain a quasi-local approximation to the HF energy, which requires the HF energy to be expressed in terms of scalar/vector-isoscalar/isovector parts of the one-body density matrix. The Mathematica notebooks discussed in this paper solve the latter issue. Solution method: The HF energy from the chiral EFT three-nucleon interaction at N 2LO is cast into a form suitable for an automatic simplification of

  19. Human Computation

    CERN Document Server

    CERN. Geneva

    2008-01-01

    What if people could play computer games and accomplish work without even realizing it? What if billions of people collaborated to solve important problems for humanity or generate training data for computers? My work aims at a general paradigm for doing exactly that: utilizing human processing power to solve computational problems in a distributed manner. In particular, I focus on harnessing human time and energy for addressing problems that computers cannot yet solve. Although computers have advanced dramatically in many respects over the last 50 years, they still do not possess the basic conceptual intelligence or perceptual capabilities...

  20. Computer science

    CERN Document Server

    Blum, Edward K

    2011-01-01

    Computer Science: The Hardware, Software and Heart of It focuses on the deeper aspects of the two recognized subdivisions of Computer Science, Software and Hardware. These subdivisions are shown to be closely interrelated as a result of the stored-program concept. Computer Science: The Hardware, Software and Heart of It includes certain classical theoretical computer science topics such as Unsolvability (e.g. the halting problem) and Undecidability (e.g. Godel's incompleteness theorem) that treat problems that exist under the Church-Turing thesis of computation. These problem topics explain in

  1. Computer Science Research: Computation Directorate

    Energy Technology Data Exchange (ETDEWEB)

    Durst, M.J. (ed.); Grupe, K.F. (ed.)

    1988-01-01

    This report contains short papers in the following areas: large-scale scientific computation; parallel computing; general-purpose numerical algorithms; distributed operating systems and networks; knowledge-based systems; and technology information systems.

  2. Computer software.

    Science.gov (United States)

    Rosenthal, L E

    1986-10-01

    Software is the component in a computer system that permits the hardware to perform the various functions that a computer system is capable of doing. The history of software and its development can be traced to the early nineteenth century. All computer systems are designed to utilize the "stored program concept" as first developed by Charles Babbage in the 1850s. The concept was lost until the mid-1940s, when modern computers made their appearance. Today, because of the complex and myriad tasks that a computer system can perform, there has been a differentiation of types of software. There is software designed to perform specific business applications. There is software that controls the overall operation of a computer system. And there is software that is designed to carry out specialized tasks. Regardless of type, software is the most critical component of any computer system. Without it, all one has is a collection of circuits, transistors, and silicon chips.

  3. Computer sciences

    Science.gov (United States)

    Smith, Paul H.

    1988-01-01

    The Computer Science Program provides advanced concepts, techniques, system architectures, algorithms, and software for both space and aeronautics information sciences and computer systems. The overall goal is to provide the technical foundation within NASA for the advancement of computing technology in aerospace applications. The research program is improving the state of knowledge of fundamental aerospace computing principles and advancing computing technology in space applications such as software engineering and information extraction from data collected by scientific instruments in space. The program includes the development of special algorithms and techniques to exploit the computing power provided by high performance parallel processors and special purpose architectures. Research is being conducted in the fundamentals of data base logic and improvement techniques for producing reliable computing systems.

  4. Computer Literacy: Teaching Computer Ethics.

    Science.gov (United States)

    Troutner, Joanne

    1986-01-01

    Suggests learning activities for teaching computer ethics in three areas: (1) equal access; (2) computer crime; and (3) privacy. Topics include computer time, advertising, class enrollments, copyright law, sabotage ("worms"), the Privacy Act of 1974 and the Freedom of Information Act of 1966. (JM)

  5. Doris Lessing's Notion of Novel-Writing in The Golden Notebook

    Institute of Scientific and Technical Information of China (English)

    肖锦龙

    2011-01-01

    The Golden Notebook by Lessing is an encyclopedic work which, from the perspective of fiction theory, can be read as a novel about how to write a novel. Through the protagonist Anna's imitation of, and commentary on, realist and modernist fictional styles, Lessing makes a thorough repudiation of the traditional patterns of realist and modernist fiction. Through Anna's theoretical elucidation and experimental creation of a new fictional style, Lessing also invents a new fiction-writing pattern. In short, Lessing undermines traditional notions of novel-writing and explores a new notion of novel-writing.

  6. Computer programming and computer systems

    CERN Document Server

    Hassitt, Anthony

    1966-01-01

    Computer Programming and Computer Systems imparts a "reading knowledge" of computer systems. This book describes the aspects of machine-language programming, monitor systems, computer hardware, and advanced programming that every thorough programmer should be acquainted with. This text discusses the automatic electronic digital computers, symbolic language, Reverse Polish Notation, and Fortran into assembly language. The routine for reading blocked tapes, dimension statements in subroutines, general-purpose input routine, and efficient use of memory are also elaborated. This publication is inten

  7. Organic Computing

    CERN Document Server

    Würtz, Rolf P

    2008-01-01

    Organic Computing is a research field emerging around the conviction that problems of organization in complex systems in computer science, telecommunications, neurobiology, molecular biology, ethology, and possibly even sociology can be tackled scientifically in a unified way. From the computer science point of view, the apparent ease in which living systems solve computationally difficult problems makes it inevitable to adopt strategies observed in nature for creating information processing machinery. In this book, the major ideas behind Organic Computing are delineated, together with a sparse sample of computational projects undertaken in this new field. Biological metaphors include evolution, neural networks, gene-regulatory networks, networks of brain modules, hormone system, insect swarms, and ant colonies. Applications are as diverse as system design, optimization, artificial growth, task allocation, clustering, routing, face recognition, and sign language understanding.

  8. Quantum Computing

    CERN Document Server

    Steane, A M

    1998-01-01

    The subject of quantum computing brings together ideas from classical information theory, computer science, and quantum physics. This review aims to summarise not just quantum computing, but the whole subject of quantum information theory. It turns out that information theory and quantum mechanics fit together very well. In order to explain their relationship, the review begins with an introduction to classical information theory and computer science, including Shannon's theorem, error correcting codes, Turing machines and computational complexity. The principles of quantum mechanics are then outlined, and the EPR experiment described. The EPR-Bell correlations, and quantum entanglement in general, form the essential new ingredient which distinguishes quantum from classical information theory, and, arguably, quantum from classical physics. Basic quantum information ideas are described, including key distribution, teleportation, data compression, quantum error correction, the universal quantum computer and qua...

  9. Computer Virus

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Computer viruses are small software programs that are designed to spread from one computer to another and to interfere with computer operation. A virus might delete data on your computer, use your e-mail program to spread itself to other computers, or even erase everything on your hard disk. Viruses are most easily spread by attachments in e-mail messages or instant messaging messages. That is why it is essential that you never

  10. Fog computing

    OpenAIRE

    Poplštein, Karel

    2016-01-01

    The purpose of this bachelor's thesis is to address fog computing technology, that emerged as a possible solution for the internet of things requirements and aims to lower latency and network bandwidth by moving a substantial part of computing operation to the network edge. The thesis identifies advantages as well as potential threats and analyses the possible solutions to these problems, proceeding to comparison of cloud and fog computing and specifying areas of use for both of them. Finally...

  11. SciServer Compute brings Analysis to Big Data in the Cloud

    Science.gov (United States)

    Raddick, Jordan; Medvedev, Dmitry; Lemson, Gerard; Souter, Barbara

    2016-06-01

    SciServer Compute uses Jupyter Notebooks running within server-side Docker containers attached to big data collections to bring advanced analysis to big data "in the cloud." SciServer Compute is a component in the SciServer Big-Data ecosystem under development at JHU, which will provide a stable, reproducible, sharable virtual research environment.SciServer builds on the popular CasJobs and SkyServer systems that made the Sloan Digital Sky Survey (SDSS) archive one of the most-used astronomical instruments. SciServer extends those systems with server-side computational capabilities and very large scratch storage space, and further extends their functions to a range of other scientific disciplines.Although big datasets like SDSS have revolutionized astronomy research, for further analysis, users are still restricted to downloading the selected data sets locally - but increasing data sizes make this local approach impractical. Instead, researchers need online tools that are co-located with data in a virtual research environment, enabling them to bring their analysis to the data.SciServer supports this using the popular Jupyter notebooks, which allow users to write their own Python and R scripts and execute them on the server with the data (extensions to Matlab and other languages are planned). We have written special-purpose libraries that enable querying the databases and other persistent datasets. Intermediate results can be stored in large scratch space (hundreds of TBs) and analyzed directly from within Python or R with state-of-the-art visualization and machine learning libraries. Users can store science-ready results in their permanent allocation on SciDrive, a Dropbox-like system for sharing and publishing files. Communication between the various components of the SciServer system is managed through SciServer‘s new Single Sign-on Portal.We have created a number of demos to illustrate the capabilities of SciServer Compute, including Python and R scripts
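
    As a hedged illustration of the bring-analysis-to-the-data workflow, the snippet below runs a small SQL query against the public SDSS SkyServer from a notebook using astroquery. This mirrors the CasJobs/SkyServer-style access described above but is not the SciServer Compute API itself, and the astroquery dependency is an assumption.

```python
# Notebook-style SDSS query via astroquery (an assumed dependency); this mirrors the
# SkyServer/CasJobs workflow but does not use the SciServer Compute libraries themselves.
from astroquery.sdss import SDSS

sql = """
SELECT TOP 10 objID, ra, dec, u, g, r
FROM PhotoObj
WHERE type = 6 AND r BETWEEN 15 AND 17
"""
table = SDSS.query_sql(sql)   # returns an astropy Table
print(table)
```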

  12. AN INTERACTIVE WEB-BASED ANALYSIS FRAMEWORK FOR REMOTE SENSING CLOUD COMPUTING

    Directory of Open Access Journals (Sweden)

    X. Z. Wang

    2015-07-01

    Full Text Available Spatiotemporal data, especially remote sensing data, are widely used in ecological, geographical, agriculture, and military research and applications. With the development of remote sensing technology, more and more remote sensing data are accumulated and stored in the cloud. An effective way for cloud users to access and analyse these massive spatiotemporal data in the web clients becomes an urgent issue. In this paper, we proposed a new scalable, interactive and web-based cloud computing solution for massive remote sensing data analysis. We build a spatiotemporal analysis platform to provide the end-user with a safe and convenient way to access massive remote sensing data stored in the cloud. The lightweight cloud storage system used to store public data and users’ private data is constructed based on open source distributed file system. In it, massive remote sensing data are stored as public data, while the intermediate and input data are stored as private data. The elastic, scalable, and flexible cloud computing environment is built using Docker, which is a technology of open-source lightweight cloud computing container in the Linux operating system. In the Docker container, open-source software such as IPython, NumPy, GDAL, and Grass GIS etc., are deployed. Users can write scripts in the IPython Notebook web page through the web browser to process data, and the scripts will be submitted to IPython kernel to be executed. By comparing the performance of remote sensing data analysis tasks executed in Docker container, KVM virtual machines and physical machines respectively, we can conclude that the cloud computing environment built by Docker makes the greatest use of the host system resources, and can handle more concurrent spatial-temporal computing tasks. Docker technology provides resource isolation mechanism in aspects of IO, CPU, and memory etc., which offers security guarantee when processing remote sensing data in the IPython Notebook

  13. Biological computation

    CERN Document Server

    Lamm, Ehud

    2011-01-01

    Table of contents: Introduction and Biological Background; Biological Computation; The Influence of Biology on Mathematics - Historical Examples; Biological Introduction; Models and Simulations; Cellular Automata; Biological Background; The Game of Life; General Definition of Cellular Automata; One-Dimensional Automata; Examples of Cellular Automata; Comparison with a Continuous Mathematical Model; Computational Universality; Self-Replication; Pseudo Code; Evolutionary Computation; Evolutionary Biology and Evolutionary Computation; Genetic Algorithms; Example Applications; Analysis of the Behavior of Genetic Algorithms; Lamarckian Evolution; Genet

  14. Cloud Computing

    CERN Document Server

    Mirashe, Shivaji P

    2010-01-01

    Computing as you know it is about to change, your applications and documents are going to move from the desktop into the cloud. I'm talking about cloud computing, where applications and files are hosted on a "cloud" consisting of thousands of computers and servers, all linked together and accessible via the Internet. With cloud computing, everything you do is now web based instead of being desktop based. You can access all your programs and documents from any computer that's connected to the Internet. How will cloud computing change the way you work? For one thing, you're no longer tied to a single computer. You can take your work anywhere because it's always accessible via the web. In addition, cloud computing facilitates group collaboration, as all group members can access the same programs and documents from wherever they happen to be located. Cloud computing might sound far-fetched, but chances are you're already using some cloud applications. If you're using a web-based email program, such as Gmail or Ho...

  15. Forecasting Computer Products Sales by Integrating Ensemble Empirical Mode Decomposition and Extreme Learning Machine

    Directory of Open Access Journals (Sweden)

    Chi-Jie Lu

    2012-01-01

    Full Text Available A hybrid forecasting model that integrates ensemble empirical mode decomposition (EEMD) and extreme learning machine (ELM) for computer products sales is proposed. The EEMD is a new piece of signal processing technology. It is based on the local characteristic time scales of a signal and can decompose the complicated signal into intrinsic mode functions (IMFs). The ELM is a novel learning algorithm for single-hidden-layer feedforward networks. In our proposed approach, the initial task is to apply the EEMD method to decompose the original sales data into a number of IMFs. The hidden useful information of the original data can be discovered in those IMFs. The IMFs are then integrated with the ELM method to develop an effective forecasting model for computer products sales. Experimental results from three real computer products sales data sets, including hard disk, display card, and notebook, showed that the proposed hybrid sales forecasting method outperforms the four comparative models and is an effective alternative for forecasting sales of computer products.
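
    To make the ELM half of the hybrid concrete, here is a hedged NumPy sketch of an extreme learning machine regressor (random hidden layer, least-squares output weights) fitted to lagged values of a synthetic series. In the hybrid scheme each IMF produced by EEMD (for example via the PyEMD package, an assumption) would be modelled this way and the component forecasts recombined; this sketch is not the authors' implementation.

```python
# Minimal ELM regressor in NumPy: random hidden layer + analytic output weights.
# The synthetic series stands in for a single IMF from EEMD; illustrative sketch only.
import numpy as np

rng = np.random.default_rng(42)

def make_lagged(series, n_lags=4):
    X = np.column_stack([series[i:len(series) - n_lags + i] for i in range(n_lags)])
    y = series[n_lags:]
    return X, y

def elm_fit(X, y, n_hidden=32):
    W = rng.normal(size=(X.shape[1], n_hidden))    # random, untrained input weights
    b = rng.normal(size=n_hidden)
    H = np.tanh(X @ W + b)
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)   # least-squares output weights
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

series = np.sin(np.linspace(0, 12, 120)) + rng.normal(scale=0.1, size=120)
X, y = make_lagged(series)
W, b, beta = elm_fit(X[:-10], y[:-10])
print(np.abs(elm_predict(X[-10:], W, b, beta) - y[-10:]).mean())   # mean absolute error
```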

  16. QDENSITY—A Mathematica quantum computer simulation

    Science.gov (United States)

    Juliá-Díaz, Bruno; Burdis, Joseph M.; Tabakin, Frank

    2009-03-01

    This Mathematica 6.0 package is a simulation of a Quantum Computer. The program provides a modular, instructive approach for generating the basic elements that make up a quantum circuit. The main emphasis is on using the density matrix, although an approach using state vectors is also implemented in the package. The package commands are defined in Qdensity.m which contains the tools needed in quantum circuits, e.g., multiqubit kets, projectors, gates, etc.
    New version program summary
    Program title: QDENSITY 2.0
    Catalogue identifier: ADXH_v2_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADXH_v2_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 26 055
    No. of bytes in distributed program, including test data, etc.: 227 540
    Distribution format: tar.gz
    Programming language: Mathematica 6.0
    Operating system: Any which supports Mathematica; tested under Microsoft Windows XP, Macintosh OS X, and Linux FC4
    Catalogue identifier of previous version: ADXH_v1_0
    Journal reference of previous version: Comput. Phys. Comm. 174 (2006) 914
    Classification: 4.15
    Does the new version supersede the previous version?: Offers an alternative, more up to date, implementation
    Nature of problem: Analysis and design of quantum circuits, quantum algorithms and quantum clusters.
    Solution method: A Mathematica package is provided which contains commands to create and analyze quantum circuits. Several Mathematica notebooks containing relevant examples (Teleportation, Shor's Algorithm and Grover's search) are explained in detail. A tutorial, Tutorial.nb, is also enclosed.
    Reasons for new version: The package has been updated to make it fully compatible with Mathematica 6.0
    Summary of revisions: The package has been updated to make it fully compatible with Mathematica 6.0
    Running time: Most examples
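
    To illustrate the density-matrix approach that the package emphasizes, the short NumPy sketch below (not the Mathematica package itself) builds a one-qubit density matrix, applies a Hadamard gate, and reads off measurement probabilities with projectors.

```python
# NumPy illustration of the density-matrix approach described above.
# This is not QDENSITY; it only mirrors the basic objects (kets, gates, projectors).
import numpy as np

ket0 = np.array([[1.0], [0.0]])            # |0>
rho = ket0 @ ket0.conj().T                 # density matrix |0><0|

H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2.0)   # Hadamard gate
rho = H @ rho @ H.conj().T                 # unitary evolution of the density matrix

P0 = np.diag([1.0, 0.0])                   # projector onto |0>
P1 = np.diag([0.0, 1.0])                   # projector onto |1>
p0 = np.trace(P0 @ rho).real               # Born rule: p(i) = Tr(P_i rho)
p1 = np.trace(P1 @ rho).real
print("p(0), p(1) =", round(p0, 3), round(p1, 3))   # 0.5, 0.5 after a Hadamard
```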

  17. GPGPU COMPUTING

    Directory of Open Access Journals (Sweden)

    BOGDAN OANCEA

    2012-05-01

    Full Text Available Since the first idea of using GPUs for general-purpose computing, things have evolved over the years and there are now several approaches to GPU programming. GPU computing practically began with the introduction of CUDA (Compute Unified Device Architecture) by NVIDIA and Stream by AMD. These are APIs designed by the GPU vendors to be used together with the hardware they provide. An emerging standard, OpenCL (Open Computing Language), tries to unify the different general-purpose GPU computing APIs and provides a framework for writing programs executed across heterogeneous platforms consisting of both CPUs and GPUs. OpenCL supports parallel computing using both task-based and data-based parallelism. In this paper we focus on the CUDA parallel computing architecture and programming model introduced by NVIDIA and present the benefits of the CUDA programming model. We also compare the two main approaches, CUDA and AMD APP (STREAM), and the new framework, OpenCL, which tries to unify the GPGPU computing models.

  18. Computational Sustainability

    OpenAIRE

    Eaton, Eric; University of Pennsylvania; Gomes, Carla P.; Cornell University; Williams, Brian; Massachusetts Institute of Technology

    2014-01-01

    Computational sustainability problems, which exist in dynamic environments with high amounts of uncertainty, provide a variety of unique challenges to artificial intelligence research and the opportunity for significant impact upon our collective future. This editorial provides an overview of artificial intelligence for computational sustainability, and introduces this special issue of AI Magazine.

  19. Platform computing

    CERN Multimedia

    2002-01-01

    "Platform Computing releases first grid-enabled workload management solution for IBM eServer Intel and UNIX high performance computing clusters. This Out-of-the-box solution maximizes the performance and capability of applications on IBM HPC clusters" (1/2 page) .

  20. Grid Computing

    Science.gov (United States)

    Foster, Ian

    2001-08-01

    The term "Grid Computing" refers to the use, for computational purposes, of emerging distributed Grid infrastructures: that is, network and middleware services designed to provide on-demand and high-performance access to all important computational resources within an organization or community. Grid computing promises to enable both evolutionary and revolutionary changes in the practice of computational science and engineering based on new application modalities such as high-speed distributed analysis of large datasets, collaborative engineering and visualization, desktop access to computation via "science portals," rapid parameter studies and Monte Carlo simulations that use all available resources within an organization, and online analysis of data from scientific instruments. In this article, I examine the status of Grid computing circa 2000, briefly reviewing some relevant history, outlining major current Grid research and development activities, and pointing out likely directions for future work. I also present a number of case studies, selected to illustrate the potential of Grid computing in various areas of science.

  1. Computational Deception

    NARCIS (Netherlands)

    Nijholt, Antinus; Acosta, P.S.; Cravo, P.

    2010-01-01

    In the future our daily life interactions with other people, with computers, robots and smart environments will be recorded and interpreted by computers or embedded intelligence in environments, furniture, robots, displays, and wearables. These sensors record our activities, our behaviour, and our

  2. Computational Science

    Institute of Scientific and Technical Information of China (English)

    K. Li

    2007-01-01

    Computer science is the discipline that anchors the computer industry, which has been improving processor performance, communication bandwidth and storage capacity on the so-called "Moore's law" curve, that is, at the rate of doubling every 18 to 24 months, during the past decades.

  3. Granular Computing

    Institute of Scientific and Technical Information of China (English)

    2004-01-01

    The basic ideas and principles of granular computing (GrC) have been studied explicitly or implicitly in many fields in isolation. With the recent renewed and fast growing interest, it is time to extract the commonality from a diversity of fields and to study systematically and formally the domain-independent principles of granular computing in a unified model. A framework of granular computing can be established by applying its own principles. We examine such a framework from two perspectives, granular computing as structured thinking and structured problem solving. From the philosophical perspective or the conceptual level, granular computing focuses on structured thinking based on multiple levels of granularity. The implementation of such a philosophy at the application level deals with structured problem solving.

  4. Multiparty Computations

    DEFF Research Database (Denmark)

    Dziembowski, Stefan

    In this thesis we study the problem of doing Verifiable Secret Sharing (VSS) and Multiparty Computations in a model where private channels between the players and a broadcast channel are available. The adversary is active, adaptive and has unbounded computing power. The thesis is based on two...... an impossibility result indicating that a similar equivalence does not hold for Multiparty Computation (MPC): we show that even if protocols are given black-box access for free to an idealized secret sharing scheme secure for the access structure in question, it is not possible to handle all relevant access...

  5. Computational Streetscapes

    Directory of Open Access Journals (Sweden)

    Paul M. Torrens

    2016-09-01

    Full Text Available Streetscapes have presented a long-standing interest in many fields. Recently, there has been a resurgence of attention on streetscape issues, catalyzed in large part by computing. Because of computing, there is more understanding, vistas, data, and analysis of and on streetscape phenomena than ever before. This diversity of lenses trained on streetscapes permits us to address long-standing questions, such as how people use information while mobile, how interactions with people and things occur on streets, how we might safeguard crowds, how we can design services to assist pedestrians, and how we could better support special populations as they traverse cities. Amid each of these avenues of inquiry, computing is facilitating new ways of posing these questions, particularly by expanding the scope of what-if exploration that is possible. With assistance from computing, consideration of streetscapes now reaches across scales, from the neurological interactions that form among place cells in the brain up to informatics that afford real-time views of activity over whole urban spaces. For some streetscape phenomena, computing allows us to build realistic but synthetic facsimiles in computation, which can function as artificial laboratories for testing ideas. In this paper, I review the domain science for studying streetscapes from vantages in physics, urban studies, animation and the visual arts, psychology, biology, and behavioral geography. I also review the computational developments shaping streetscape science, with particular emphasis on modeling and simulation as informed by data acquisition and generation, data models, path-planning heuristics, artificial intelligence for navigation and way-finding, timing, synthetic vision, steering routines, kinematics, and geometrical treatment of collision detection and avoidance. I also discuss the implications that the advances in computing streetscapes might have on emerging developments in cyber

  6. Utilization of Notebook Computers in the Department of Environmental Information and Evaluation of Their Educational Effects

    OpenAIRE

    喜久川,政吉; 中村, 靖; 小嶋,弘行; 横田,壽; 藤本,勲

    2002-01-01

    Utilization of information equipment in information education is becoming more important year after year. In the Department of Environmental Information at HIT, we have been promoting the practical use of notebook computers: every student owns an individual notebook computer, and educators have developed a new type of lecture that uses the notebook computer as a media tool. This paper presents: (1) the actual state of students concerning the computer system; (2) the form of use of notebook computers for informat...

  7. Development of a research prototype computer `Wearables` that one can wear on his or her body; Minitsukeru computer `Wearables` kenkyuyo shisakuki wo kaihatsu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-02-01

    A research prototype of a wearable computer, `Wearables`, has been developed; it shrinks the present notebook-type PC still further, can be worn on the body for use at any time and from anywhere, and aims at becoming part of the social infrastructure. Using the company's portable PC, the Libretto, as the base, the keyboard and the liquid crystal display panel were removed. To replace these functions, a voice-input microphone and various types of head-mounted (glasses-type) displays for viewing images are connected. An infrared interface and a wireless (radio) data communication function are provided as the means of information exchange between the prototype computer and the outside environment. Wireless desk area network (DAN) technology, which can dynamically build a network among multiple computers, enables smooth communication with external environments. Noise-robust voice recognition technology enables keyboard-free operation that places no mental stress on users. The `wearable computer` aims not only at being used simply by wearing it, but also at providing new powers of perception, things that could not be seen or heard directly to date, in other words realizing digital sensation. With the computer, a society can be built in which people live comfortably and safely, with conversations between users and computers and interactions between the surrounding environment and the social infrastructure, while individual privacy and information security are protected. The company is working with the Massachusetts Institute of Technology (MIT) on research and development of the `wearable computer`, covering how it can be used and the basic technologies that will be required in the future. (translated by NEDO)

  8. COMPUTATIONAL THINKING

    Directory of Open Access Journals (Sweden)

    Evgeniy K. Khenner

    2016-01-01

    Full Text Available Abstract. The aim of the research is to draw the attention of the educational community to the phenomenon of computational thinking, which has been actively discussed in the foreign scientific and educational literature over the last decade, and to substantiate its importance, its practical utility and its right to a place in Russian education. Methods. The research is based on the analysis of foreign studies of the phenomenon of computational thinking and the ways of its formation in the process of education, and on comparing the notion of «computational thinking» with related concepts used in the Russian scientific and pedagogical literature. Results. The concept of «computational thinking» is analyzed from the point of view of intuitive understanding and of its scientific and applied aspects. It is shown how computational thinking has evolved alongside the development of computer hardware and software. The practice-oriented interpretation of computational thinking that is dominant among educators is described, along with some ways of forming it. It is shown that computational thinking is a metasubject result of general education as well as one of its tools. In the author's view, the purposeful development of computational thinking should be one of the tasks of Russian education. Scientific novelty. The author gives a theoretical justification of the role of computational thinking schemes as metasubject results of learning. The dynamics of the development of this concept is described; this process is connected with the evolution of computer and information technologies, as well as with the growing number of tasks whose effective solution requires computational thinking. The author substantiates the claim that including «computational thinking» in the set of pedagogical concepts used in the national education system fills an existing gap. Practical significance. New metasubject result of education associated with

  9. A Novel Design of Direct Methanol Fuel Cell for Notebook Computer%一种新型笔记本电脑用直接甲醇燃料电池的设计

    Institute of Scientific and Technical Information of China (English)

    王建萍

    2010-01-01

    This paper briefly describes the design of a novel direct methanol fuel cell (DMFC) for notebook computers and introduces the composition of the cell and the structural features of its components. The design has the advantages of a simple structure, convenient fuel refilling, and good exhaust performance.

  10. Chromatin computation.

    Directory of Open Access Journals (Sweden)

    Barbara Bryant

    Full Text Available In living cells, DNA is packaged along with protein and RNA into chromatin. Chemical modifications to nucleotides and histone proteins are added, removed and recognized by multi-functional molecular complexes. Here I define a new computational model, in which chromatin modifications are information units that can be written onto a one-dimensional string of nucleosomes, analogous to the symbols written onto cells of a Turing machine tape, and chromatin-modifying complexes are modeled as read-write rules that operate on a finite set of adjacent nucleosomes. I illustrate the use of this "chromatin computer" to solve an instance of the Hamiltonian path problem. I prove that chromatin computers are computationally universal--and therefore more powerful than the logic circuits often used to model transcription factor control of gene expression. Features of biological chromatin provide a rich instruction set for efficient computation of nontrivial algorithms in biological time scales. Modeling chromatin as a computer shifts how we think about chromatin function, suggests new approaches to medical intervention, and lays the groundwork for the engineering of a new class of biological computing machines.
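
    As a toy illustration of the model sketched above, a row of nucleosomes can be represented as a list of modification symbols and local read-write rules applied until no rule matches; the marks and rules in the sketch below are invented for the example and are not the paper's actual instruction set.

```python
# Toy "chromatin computer": nucleosomes as a 1D tape of modification symbols,
# chromatin-modifying complexes as local read-write rules on adjacent positions.
# The marks and rules here are illustrative only, not the paper's rule set.

# Each rule maps a pattern of two adjacent marks to a replacement pair,
# loosely mimicking a complex that recognizes one mark and writes another.
RULES = {
    ("A", "U"): ("A", "A"),   # an "activating" mark spreads rightward
    ("U", "R"): ("R", "R"),   # a "repressive" mark spreads leftward
}

def step(tape):
    """Apply the first matching rule once; return (new_tape, changed?)."""
    for i in range(len(tape) - 1):
        pair = (tape[i], tape[i + 1])
        if pair in RULES:
            new = list(tape)
            new[i], new[i + 1] = RULES[pair]
            return new, True
    return tape, False

tape = list("AUUUURUU")   # initial state: A = active, R = repressive, U = unmodified
changed = True
while changed:
    tape, changed = step(tape)
print("final state:", "".join(tape))
```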

  11. Computing methods

    CERN Document Server

    Berezin, I S

    1965-01-01

    Computing Methods, Volume 2 is a five-chapter text that presents the numerical methods of solving sets of several mathematical equations. This volume includes computation sets of linear algebraic equations, high degree equations and transcendental equations, numerical methods of finding eigenvalues, and approximate methods of solving ordinary differential equations, partial differential equations and integral equations. The book is intended as a text-book for students in mechanical mathematical and physics-mathematical faculties specializing in computer mathematics and persons interested in the

  12. Cloud Computing

    CERN Document Server

    Baun, Christian; Nimis, Jens; Tai, Stefan

    2011-01-01

    Cloud computing is a buzz-word in today's information technology (IT) that nobody can escape. But what is really behind it? There are many interpretations of this term, but no standardized or even uniform definition. Instead, as a result of the multi-faceted viewpoints and the diverse interests expressed by the various stakeholders, cloud computing is perceived as a rather fuzzy concept. With this book, the authors deliver an overview of cloud computing architecture, services, and applications. Their aim is to bring readers up to date on this technology and thus to provide a common basis for d

  13. Computer interfacing

    CERN Document Server

    Dixey, Graham

    1994-01-01

    This book explains how computers interact with the world around them and therefore how to make them a useful tool. Topics covered include descriptions of all the components that make up a computer, principles of data exchange, interaction with peripherals, serial communication, input devices, recording methods, computer-controlled motors, and printers.In an informative and straightforward manner, Graham Dixey describes how to turn what might seem an incomprehensible 'black box' PC into a powerful and enjoyable tool that can help you in all areas of your work and leisure. With plenty of handy

  14. Computational physics

    CERN Document Server

    Newman, Mark

    2013-01-01

    A complete introduction to the field of computational physics, with examples and exercises in the Python programming language. Computers play a central role in virtually every major physics discovery today, from astrophysics and particle physics to biophysics and condensed matter. This book explains the fundamentals of computational physics and describes in simple terms the techniques that every physicist should know, such as finite difference methods, numerical quadrature, and the fast Fourier transform. The book offers a complete introduction to the topic at the undergraduate level, and is also suitable for the advanced student or researcher who wants to learn the foundational elements of this important field.
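
    As a flavour of the techniques mentioned, the sketch below shows a central finite difference and trapezoidal-rule quadrature in a few lines of Python; it is an independent illustration, not an excerpt from the book.

```python
# Two staple techniques of computational physics, written as plain Python/NumPy:
# a central finite-difference derivative and trapezoidal-rule integration.
# Illustrative only; not code from the book.
import numpy as np

def central_diff(f, x, h=1e-5):
    """Approximate f'(x) with a central finite difference."""
    return (f(x + h) - f(x - h)) / (2.0 * h)

def trapezoid(f, a, b, n=1000):
    """Approximate the integral of f over [a, b] with the trapezoidal rule."""
    x = np.linspace(a, b, n + 1)
    y = f(x)
    return (b - a) / n * (0.5 * y[0] + y[1:-1].sum() + 0.5 * y[-1])

print(central_diff(np.sin, 0.0))          # ~1.0, since d/dx sin(x) = cos(0)
print(trapezoid(np.sin, 0.0, np.pi))      # ~2.0, the exact value of the integral
```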

  15. A-4 size all-in-one notebook PC DynaBook 2650; A4 all in one note PC Dynabook 2650

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-03-01

    The computer uses a mobile Intel® Celeron™ processor (466 MHz) as the CPU and 64 MB of SDRAM (Synchronous Dynamic Random Access Memory) as the main memory. It is an all-in-one computer equipped with a 14.1-inch TFT-LCD (Thin Film Transistor type Liquid Crystal Display) capable of XGA (1024 × 768 dots) resolution, a built-in 12 GB HDD, a quad-speed DVD-ROM drive (a 24-speed CD-ROM drive in some models), and a floppy disk drive (FDD). Also provided, to improve usability, are an Internet button for one-touch connection to the Internet and a CD button for music CD playback with the display closed. It is loaded with a number of software programs, such as Office 2000 Personal, a postcard layout application, Internet/e-mail software, and multimedia programs for pictorial data editing, MIDI (Musical Instrument Digital Interface) playback, etc. (translated by NEDO)

  16. Computing Religion

    DEFF Research Database (Denmark)

    Nielbo, Kristoffer Laigaard; Braxton, Donald M.; Upal, Afzal

    2012-01-01

    The computational approach has become an invaluable tool in many fields that are directly relevant to research in religious phenomena. Yet the use of computational tools is almost absent in the study of religion. Given that religion is a cluster of interrelated phenomena and that research...... concerning these phenomena should strive for multilevel analysis, this article argues that the computational approach offers new methodological and theoretical opportunities to the study of religion. We argue that the computational approach offers 1.) an intermediary step between any theoretical construct...... and its targeted empirical space and 2.) a new kind of data which allows the researcher to observe abstract constructs, estimate likely outcomes, and optimize empirical designs. Because sophisticated multilevel research is a collaborative project, we also seek to introduce to scholars of religion some

  17. COMPUTERS HAZARDS

    Directory of Open Access Journals (Sweden)

    Andrzej Augustynek

    2007-01-01

    Full Text Available In June 2006, over 12.6 million Polish users of the Web were registered. On average, each of them spent 21 hours and 37 minutes monthly browsing the Web. That is why the psychological aspects of computer utilization have become an urgent research subject. The results of research into the development of the Polish information society, carried out at AGH University of Science and Technology under the leadership of Leslaw H. Haber from 2000 until the present time, indicate the emergence of dynamic changes in the ways computers are used and in their circumstances. One of the interesting regularities has been the inverse proportional relation between the level of computer skills and the frequency of Web utilization. It has been found that in 2005, compared to 2000, the following changes occurred: a significant drop in the number of students who never used computers and the Web; a remarkable increase in computer knowledge and skills (particularly pronounced in the case of first-year students); a decreasing gap in computer skills between students of the first and the third year, and between male and female students; and a declining popularity of computer games. It has also been demonstrated that the hazard of computer screen addiction was highest in the case of unemployed youth outside the school system. As much as 12% of this group of young people were addicted to the computer. The large amount of leisure time that these youths enjoyed induced them to excessive utilization of the Web. Polish housewives are another population group at risk of addiction to the Web. The duration of long Web chats carried out by younger and younger youths has been another matter of concern. Since the phenomenon of computer addiction is relatively new, no specific therapy methods have been developed. In general, the therapy applied to computer addiction syndrome is similar to the techniques applied in cases of alcohol or gambling addiction. Individual and group

  18. Quantum Computers

    Science.gov (United States)

    2010-03-04

    efficient or less costly than their classical counterparts. A large-scale quantum computer is certainly an extremely ambitious goal, appearing to us...outperform the largest classical supercomputers in solving some specific problems important for data encryption. In the long term, another application...which the quantum computer depends, causing the quantum mechanically destructive process known as decoherence. Decoherence comes in several forms

  19. Computational sustainability

    CERN Document Server

    Kersting, Kristian; Morik, Katharina

    2016-01-01

    The book at hand gives an overview of the state of the art research in Computational Sustainability as well as case studies of different application scenarios. This covers topics such as renewable energy supply, energy storage and e-mobility, efficiency in data centers and networks, sustainable food and water supply, sustainable health, industrial production and quality, etc. The book describes computational methods and possible application scenarios.

  20. Efficient scatter model for simulation of ultrasound images from computed tomography data

    Science.gov (United States)

    D'Amato, J. P.; Lo Vercio, L.; Rubi, P.; Fernandez Vera, E.; Barbuzza, R.; Del Fresno, M.; Larrabide, I.

    2015-12-01

    Background and motivation: Real-time ultrasound simulation refers to the process of computationally creating fully synthetic ultrasound images instantly. Due to the high value of specialized low-cost training for healthcare professionals, there is a growing interest in the use of this technology and in the development of high-fidelity systems that simulate the acquisition of echographic images. The objective is to create an efficient and reproducible simulator that can run either on notebooks or desktops using low-cost devices. Materials and methods: We present an interactive ultrasound simulator based on CT data. The simulator is based on ray-casting and provides real-time interaction capabilities. The simulation of scattering that is coherent with the transducer position in real time is also introduced. Such noise is produced using a simplified model of multiplicative noise and convolution with point spread functions (PSF) tailored for this purpose. Results: The computational efficiency of scattering-map generation was revised, with improved performance. This allowed a more efficient simulation of coherent scattering in the synthetic echographic images while providing highly realistic results. We describe quality and performance metrics to validate these results; a performance of up to 55 fps was achieved. Conclusion: The proposed technique for real-time scattering modeling provides realistic yet computationally efficient scatter distributions. The error between the original image and the simulated scattering image was compared for the proposed method and the state of the art, showing negligible differences in its distribution.
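
    A rough sketch of the scatter model described above (multiplicative noise followed by convolution with a point spread function) is given below in NumPy/SciPy; the noise law, PSF width and input map are placeholders rather than the authors' actual parameters.

```python
# Sketch of speckle-like scatter generation: multiplicative noise applied to an
# echogenicity map, then blurred with a separable Gaussian PSF. The specific
# noise distribution and PSF here are illustrative, not the paper's parameters.
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(42)

def simulate_scatter(echogenicity, psf_sigma=(1.0, 3.0)):
    """Return a speckle-textured image from a CT-derived echogenicity map."""
    noise = rng.rayleigh(scale=1.0, size=echogenicity.shape)   # multiplicative noise
    scattered = echogenicity * noise
    # Convolution with an anisotropic PSF (axial vs. lateral resolution differ).
    return gaussian_filter(scattered, sigma=psf_sigma)

# Placeholder "tissue" map with a bright circular inclusion.
yy, xx = np.mgrid[0:256, 0:256]
tissue = 0.3 + 0.7 * ((xx - 128) ** 2 + (yy - 128) ** 2 < 40 ** 2)

image = simulate_scatter(tissue)
print(image.shape, float(image.mean()))
```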

  1. Computational oncology.

    Science.gov (United States)

    Lefor, Alan T

    2011-08-01

    Oncology research has traditionally been conducted using techniques from the biological sciences. The new field of computational oncology has forged a new relationship between the physical sciences and oncology to further advance research. By applying physics and mathematics to oncologic problems, new insights will emerge into the pathogenesis and treatment of malignancies. One major area of investigation in computational oncology centers around the acquisition and analysis of data, using improved computing hardware and software. Large databases of cellular pathways are being analyzed to understand the interrelationship among complex biological processes. Computer-aided detection is being applied to the analysis of routine imaging data including mammography and chest imaging to improve the accuracy and detection rate for population screening. The second major area of investigation uses computers to construct sophisticated mathematical models of individual cancer cells as well as larger systems using partial differential equations. These models are further refined with clinically available information to more accurately reflect living systems. One of the major obstacles in the partnership between physical scientists and the oncology community is communications. Standard ways to convey information must be developed. Future progress in computational oncology will depend on close collaboration between clinicians and investigators to further the understanding of cancer using these new approaches.

  2. Computational chemistry

    Science.gov (United States)

    Arnold, J. O.

    1987-01-01

    With the advent of supercomputers and of modern computational chemistry algorithms and codes, a powerful tool was created to help fill NASA's continuing need for information on the properties of matter in hostile or unusual environments. Computational resources provided under the National Aerodynamics Simulator (NAS) program were a cornerstone for recent advancements in this field. Properties of gases, materials, and their interactions can be determined from solutions of the governing equations. In the case of gases, for example, radiative transition probabilities per particle, bond-dissociation energies, and rates of simple chemical reactions can be determined computationally as reliably as from experiment. The data are proving to be quite valuable in providing inputs to real-gas flow simulation codes used to compute aerothermodynamic loads on NASA's aeroassist orbital transfer vehicles and a host of problems related to the National Aerospace Plane Program. Although more approximate, similar solutions can be obtained for ensembles of atoms simulating small particles of materials with and without the presence of gases. Computational chemistry has applications in studying catalysis and the properties of polymers, both of interest to various NASA missions, including those previously mentioned. In addition to discussing these applications of computational chemistry within NASA, the governing equations and the need for supercomputers for their solution are outlined.

  3. Computational creativity

    Directory of Open Access Journals (Sweden)

    López de Mántaras Badia, Ramon

    2013-12-01

    Full Text Available New technologies, and in particular artificial intelligence, are drastically changing the nature of creative processes. Computers are playing very significant roles in creative activities such as music, architecture, fine arts, and science. Indeed, the computer is already a canvas, a brush, a musical instrument, and so on. However, we believe that we must aim at more ambitious relations between computers and creativity. Rather than just seeing the computer as a tool to help human creators, we could see it as a creative entity in its own right. This view has triggered a new subfield of Artificial Intelligence called Computational Creativity. This article addresses the question of the possibility of achieving computational creativity through some examples of computer programs capable of replicating some aspects of creative behavior in the fields of music and science.

  4. Cloud Computing

    DEFF Research Database (Denmark)

    Krogh, Simon

    2013-01-01

    The second half of the 20th century has been characterized by an explosive development in information technology (Maney, Hamm, & O'Brien, 2011). Processing power, storage capacity and network bandwidth have increased exponentially, resulting in new possibilities and shifting IT paradigms. In step...... with technological changes, the paradigmatic pendulum has swung between increased centralization on one side and a focus on distributed computing that pushes IT power out to end users on the other. With the introduction of outsourcing and cloud computing, centralization in large data centers is again dominating...... the IT scene. In line with the views presented by Nicolas Carr in 2003 (Carr, 2003), it is a popular assumption that cloud computing will be the next utility (like water, electricity and gas) (Buyya, Yeo, Venugopal, Broberg, & Brandic, 2009). However, this assumption disregards the fact that most IT production...

  5. Quantum computers.

    Science.gov (United States)

    Ladd, T D; Jelezko, F; Laflamme, R; Nakamura, Y; Monroe, C; O'Brien, J L

    2010-03-04

    Over the past several decades, quantum information science has emerged to seek answers to the question: can we gain some advantage by storing, transmitting and processing information encoded in systems that exhibit unique quantum properties? Today it is understood that the answer is yes, and many research groups around the world are working towards the highly ambitious technological goal of building a quantum computer, which would dramatically improve computational power for particular tasks. A number of physical systems, spanning much of modern physics, are being developed for quantum computation. However, it remains unclear which technology, if any, will ultimately prove successful. Here we describe the latest developments for each of the leading approaches and explain the major challenges for the future.

  6. Computational mechanics

    Energy Technology Data Exchange (ETDEWEB)

    Raboin, P J

    1998-01-01

    The Computational Mechanics thrust area is a vital and growing facet of the Mechanical Engineering Department at Lawrence Livermore National Laboratory (LLNL). This work supports the development of computational analysis tools in the areas of structural mechanics and heat transfer. Over 75 analysts depend on thrust area-supported software running on a variety of computing platforms to meet the demands of LLNL programs. Interactions with the Department of Defense (DOD) High Performance Computing and Modernization Program and the Defense Special Weapons Agency are of special importance as they support our ParaDyn project in its development of new parallel capabilities for DYNA3D. Working with DOD customers has been invaluable to driving this technology in directions mutually beneficial to the Department of Energy. Other projects associated with the Computational Mechanics thrust area include work with the Partnership for a New Generation Vehicle (PNGV) for "Springback Predictability" and with the Federal Aviation Administration (FAA) for the "Development of Methodologies for Evaluating Containment and Mitigation of Uncontained Engine Debris." In this report for FY-97, there are five articles detailing three code development activities and two projects that synthesized new code capabilities with new analytic research in damage/failure and biomechanics. The articles this year are: (1) Energy- and Momentum-Conserving Rigid-Body Contact for NIKE3D and DYNA3D; (2) Computational Modeling of Prosthetics: A New Approach to Implant Design; (3) Characterization of Laser-Induced Mechanical Failure Damage of Optical Components; (4) Parallel Algorithm Research for Solid Mechanics Applications Using Finite Element Analysis; and (5) An Accurate One-Step Elasto-Plasticity Algorithm for Shell Elements in DYNA3D.

  7. Computational Artifacts

    DEFF Research Database (Denmark)

    Schmidt, Kjeld; Bansler, Jørgen P.

    2016-01-01

    The key concern of CSCW research is that of understanding computing technologies in the social context of their use, that is, as integral features of our practices and our lives, and to think of their design and implementation under that perspective. However, the question of the nature of that which is actually integrated in our practices is often discussed in confusing ways, if at all. The article aims to try to clarify the issue and in doing so revisits and reconsiders the notion of ‘computational artifact’.

  8. Distributed computing

    CERN Document Server

    Van Renesse, R

    1991-01-01

    This series will start with an introduction to distributed computing systems. Distributed computing paradigms will be presented followed by a discussion on how several important contemporary distributed operating systems use these paradigms. Topics will include processing paradigms, storage paradigms, scalability and robustness. Throughout the course everything will be illustrated by modern distributed systems notably the Amoeba distributed operating system of the Free University in Amsterdam and the Plan 9 operating system of AT&T Bell Laboratories. Plan 9 is partly designed and implemented by Ken Thompson, the main person behind the successful UNIX operating system.

  9. Computer busses

    CERN Document Server

    Buchanan, William

    2000-01-01

    As more and more equipment is interface or 'bus' driven, either by the use of controllers or directly from PCs, the question of which bus to use is becoming increasingly important both in industry and in the office. 'Computer Busses' has been designed to help choose the best type of bus for the particular application. There are several books which cover individual busses, but none which provide a complete guide to computer busses. The author provides a basic theory of busses and draws examples and applications from real bus case studies. Busses are analysed using a top-down approach, helpin

  10. Cloud Computing

    CERN Document Server

    Antonopoulos, Nick

    2010-01-01

    Cloud computing has recently emerged as a subject of substantial industrial and academic interest, though its meaning and scope is hotly debated. For some researchers, clouds are a natural evolution towards the full commercialisation of grid systems, while others dismiss the term as a mere re-branding of existing pay-per-use technologies. From either perspective, 'cloud' is now the label of choice for accountable pay-per-use access to third party applications and computational resources on a massive scale. Clouds support patterns of less predictable resource use for applications and services a

  11. Computational artifacts

    DEFF Research Database (Denmark)

    Schmidt, Kjeld; Bansler, Jørgen P.

    2016-01-01

    The key concern of CSCW research is that of understanding computing technologies in the social context of their use, that is, as integral features of our practices and our lives, and to think of their design and implementation under that perspective. However, the question of the nature...... of that which is actually integrated in our practices is often discussed in confusing ways, if at all. The article aims to try to clarify the issue and in doing so revisits and reconsiders the notion of ‘computational artifact’....

  12. Reconfigurable Computing

    CERN Document Server

    Cardoso, Joao MP

    2011-01-01

    As the complexity of modern embedded systems increases, it becomes less practical to design monolithic processing platforms. As a result, reconfigurable computing is being adopted widely for more flexible design. Reconfigurable Computers offer the spatial parallelism and fine-grained customizability of application-specific circuits with the postfabrication programmability of software. To make the most of this unique combination of performance and flexibility, designers need to be aware of both hardware and software issues. FPGA users must think not only about the gates needed to perform a comp

  13. COMPUTATIONAL THINKING

    OpenAIRE

    Evgeniy K. Khenner

    2016-01-01

    Abstract. The aim of the research is to draw the attention of the educational community to the phenomenon of computational thinking, which has been actively discussed in the foreign scientific and educational literature over the last decade, and to substantiate its importance, its practical utility and its right to a place in Russian education. Methods. The research is based on the analysis of foreign studies of the phenomenon of computational thinking and the ways of its formation in the process of education;...

  14. Computational engineering

    CERN Document Server

    2014-01-01

    The book presents state-of-the-art works in computational engineering. Focus is on mathematical modeling, numerical simulation, experimental validation and visualization in engineering sciences. In particular, the following topics are presented: constitutive models and their implementation into finite element codes, numerical models in nonlinear elasto-dynamics including seismic excitations, multiphase models in structural engineering and multiscale models of materials systems, sensitivity and reliability analysis of engineering structures, the application of scientific computing in urban water management and hydraulic engineering, and the application of genetic algorithms for the registration of laser scanner point clouds.

  15. Computer immunology.

    Science.gov (United States)

    Forrest, Stephanie; Beauchemin, Catherine

    2007-04-01

    This review describes a body of work on computational immune systems that behave analogously to the natural immune system. These artificial immune systems (AIS) simulate the behavior of the natural immune system and in some cases have been used to solve practical engineering problems such as computer security. AIS have several strengths that can complement wet lab immunology. It is easier to conduct simulation experiments and to vary experimental conditions, for example, to rule out hypotheses; it is easier to isolate a single mechanism to test hypotheses about how it functions; agent-based models of the immune system can integrate data from several different experiments into a single in silico experimental system.

  16. Computer systems

    Science.gov (United States)

    Olsen, Lola

    1992-01-01

    In addition to the discussions, Ocean Climate Data Workshop hosts gave participants an opportunity to hear about, see, and test for themselves some of the latest computer tools now available for those studying climate change and the oceans. Six speakers described computer systems and their functions. The introductory talks were followed by demonstrations to small groups of participants and some opportunities for participants to get hands-on experience. After this familiarization period, attendees were invited to return during the course of the Workshop and have one-on-one discussions and further hands-on experience with these systems. Brief summaries or abstracts of introductory presentations are addressed.

  17. Computer viruses

    Science.gov (United States)

    Denning, Peter J.

    1988-01-01

    The worm, Trojan horse, bacterium, and virus are destructive programs that attack information stored in a computer's memory. Virus programs, which propagate by incorporating copies of themselves into other programs, are a growing menace in the late-1980s world of unprotected, networked workstations and personal computers. Limited immunity is offered by memory protection hardware, digitally authenticated object programs, and antibody programs that kill specific viruses. Additional immunity can be gained from the practice of digital hygiene, primarily the refusal to use software from untrusted sources. Full immunity requires attention in a social dimension, the accountability of programmers.

  18. Computer security

    CERN Document Server

    Gollmann, Dieter

    2011-01-01

    A completely up-to-date resource on computer security Assuming no previous experience in the field of computer security, this must-have book walks you through the many essential aspects of this vast topic, from the newest advances in software and technology to the most recent information on Web applications security. This new edition includes sections on Windows NT, CORBA, and Java and discusses cross-site scripting and JavaScript hacking as well as SQL injection. Serving as a helpful introduction, this self-study guide is a wonderful starting point for examining the variety of competing sec

  19. Computational Literacy

    DEFF Research Database (Denmark)

    Chongtay, Rocio; Robering, Klaus

    2016-01-01

    for the acquisition of Computational Literacy at basic educational levels, focus on higher levels of education has been much less prominent. The present paper considers the case of courses for higher education programs within the Humanities. A model is proposed which conceives of Computational Literacy as a layered...... set of skills rather than one single skill. Skills acquisition at these layers can be tailored to the specific needs of students. The work presented here builds upon experience from courses for such students from the Humanities in which programming is taught as a tool for other purposes. Results...

  20. NEW SCIENCE OF LEARNING: COGNITION, COMPUTERS AND COLLABORATION IN EDUCATION

    Directory of Open Access Journals (Sweden)

    Reviewed by Onur DONMEZ

    2011-01-01

    Full Text Available Information and Communication Technologies (ICTs) have pervaded and changed much of our lives on both individual and societal scales. PCs, notebooks, tablets, cell phones, RSS feeds, emails, podcasts, tweets and social networks are all technologies we are familiar with, and we use them intensively in our daily lives. It is safe to say that our lives are becoming more and more digitized day by day. We have already invented a bunch of terms to refer to the effects of these technologies on our lives. Digital nomads, grasshopper minds, millennium learners, digital natives, information age, knowledge building, knowledge society and network society are all terms invented to refer to societal changes motivated by ICTs. New opportunities provided by ICTs are also shaping the skill and quality demands of the next age. Individuals have to match these qualities if they want to earn their rightful places in tomorrow's world. Education is of course the sole light to guide them in their transformation into tomorrow's individuals. One question arises, however: "are today's educational paradigms and practices ready to confront such a challenge?" There is a coherent and strong opinion among educators that the answer is "NO". "Today's students think and process information fundamentally differently from their predecessors" (Prensky, 2001). And education has to keep pace with these students and their needs. But how? Khine & Saleh managed to gather distinguished colleagues around this question in their book titled "New Science of Learning: Cognition, Computers and Collaboration". The book is composed of 29 chapters within three major topics, which are: cognition, computers and collaboration.

  1. Riemannian computing in computer vision

    CERN Document Server

    Srivastava, Anuj

    2016-01-01

    This book presents a comprehensive treatise on Riemannian geometric computations and related statistical inferences in several computer vision problems. This edited volume includes chapter contributions from leading figures in the field of computer vision who are applying Riemannian geometric approaches in problems such as face recognition, activity recognition, object detection, biomedical image analysis, and structure-from-motion. Some of the mathematical entities that necessitate a geometric analysis include rotation matrices (e.g. in modeling camera motion), stick figures (e.g. for activity recognition), subspace comparisons (e.g. in face recognition), symmetric positive-definite matrices (e.g. in diffusion tensor imaging), and function-spaces (e.g. in studying shapes of closed contours). Illustrates Riemannian computing theory on applications in computer vision, machine learning, and robotics. Emphasis on algorithmic advances that will allow re-application in other...

  2. Computational biology

    DEFF Research Database (Denmark)

    Hartmann, Lars Røeboe; Jones, Neil; Simonsen, Jakob Grue

    2011-01-01

    Computation via biological devices has been the subject of close scrutiny since von Neumann’s early work some 60 years ago. In spite of the many relevant works in this field, the notion of programming biological devices seems to be, at best, ill-defined. While many devices are claimed or proved t...

  3. Computational Physics.

    Science.gov (United States)

    Borcherds, P. H.

    1986-01-01

    Describes an optional course in "computational physics" offered at the University of Birmingham. Includes an introduction to numerical methods and presents exercises involving fast-Fourier transforms, non-linear least-squares, Monte Carlo methods, and the three-body problem. Recommends adding laboratory work into the course in the…

  4. Computational Finance

    DEFF Research Database (Denmark)

    Rasmussen, Lykke

    One of the major challenges in todays post-crisis finance environment is calculating the sensitivities of complex products for hedging and risk management. Historically, these derivatives have been determined using bump-and-revalue, but due to the increasing magnitude of these computations does...

  5. Computational Finance

    DEFF Research Database (Denmark)

    Rasmussen, Lykke

    One of the major challenges in todays post-crisis finance environment is calculating the sensitivities of complex products for hedging and risk management. Historically, these derivatives have been determined using bump-and-revalue, but due to the increasing magnitude of these computations does...

  6. Computational Logistics

    DEFF Research Database (Denmark)

    Pacino, Dario; Voss, Stefan; Jensen, Rune Møller

    2013-01-01

    This book constitutes the refereed proceedings of the 4th International Conference on Computational Logistics, ICCL 2013, held in Copenhagen, Denmark, in September 2013. The 19 papers presented in this volume were carefully reviewed and selected for inclusion in the book. They are organized...... in topical sections named: maritime shipping, road transport, vehicle routing problems, aviation applications, and logistics and supply chain management....

  7. Computational Logistics

    DEFF Research Database (Denmark)

    This book constitutes the refereed proceedings of the 4th International Conference on Computational Logistics, ICCL 2013, held in Copenhagen, Denmark, in September 2013. The 19 papers presented in this volume were carefully reviewed and selected for inclusion in the book. They are organized...... in topical sections named: maritime shipping, road transport, vehicle routing problems, aviation applications, and logistics and supply chain management....

  8. Computing News

    CERN Multimedia

    McCubbin, N

    2001-01-01

    We are still five years from the first LHC data, so we have plenty of time to get the computing into shape, don't we? Well, yes and no: there is time, but there's an awful lot to do! The recently-completed CERN Review of LHC Computing gives the flavour of the LHC computing challenge. The hardware scale for each of the LHC experiments is millions of 'SpecInt95' (SI95) units of cpu power and tens of PetaBytes of data storage. PCs today are about 20-30SI95, and expected to be about 100 SI95 by 2005, so it's a lot of PCs. This hardware will be distributed across several 'Regional Centres' of various sizes, connected by high-speed networks. How to realise this in an orderly and timely fashion is now being discussed in earnest by CERN, Funding Agencies, and the LHC experiments. Mixed in with this is, of course, the GRID concept...but that's a topic for another day! Of course hardware, networks and the GRID constitute just one part of the computing. Most of the ATLAS effort is spent on software development. What we ...

  9. Computational trigonometry

    Energy Technology Data Exchange (ETDEWEB)

    Gustafson, K. [Univ. of Colorado, Boulder, CO (United States)

    1994-12-31

    By means of the author's earlier theory of antieigenvalues and antieigenvectors, a new computational approach to iterative methods is presented. This enables an explicit trigonometric understanding of iterative convergence and provides new insights into the sharpness of error bounds. Direct applications to Gradient descent, Conjugate gradient, GCR(k), Orthomin, CGN, GMRES, CGS, and other matrix iterative schemes will be given.
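
    A concrete example of the trigonometric reading of convergence (a standard textbook bound used here for illustration; the abstract itself gives no formulas): for a symmetric positive definite matrix A, the quantity sin phi(A) = (lambda_max - lambda_min) / (lambda_max + lambda_min) bounds the per-step reduction of the energy error in steepest descent, which the following sketch checks numerically.

```python
# Numerical illustration of the "trigonometric" view of iterative convergence:
# for a symmetric positive definite A, steepest descent with exact line search
# reduces the energy error per step by at least the factor sin^2(phi(A)),
# where sin(phi(A)) = (lmax - lmin) / (lmax + lmin).
# Standard bound shown for illustration; not code from the paper.
import numpy as np

rng = np.random.default_rng(1)
n = 20
M = rng.normal(size=(n, n))
A = M @ M.T + n * np.eye(n)              # symmetric positive definite matrix
b = rng.normal(size=n)
x_star = np.linalg.solve(A, b)

eigs = np.linalg.eigvalsh(A)
sin_phi = (eigs[-1] - eigs[0]) / (eigs[-1] + eigs[0])

def energy(x):
    """Squared A-norm of the error, the quantity the bound controls."""
    e = x - x_star
    return float(e @ A @ e)

x = np.zeros(n)
worst_ratio = 0.0
for _ in range(10):
    r = b - A @ x                        # residual = negative gradient
    alpha = (r @ r) / (r @ A @ r)        # exact line-search step length
    x_new = x + alpha * r
    worst_ratio = max(worst_ratio, energy(x_new) / energy(x))
    x = x_new

print("worst per-step error ratio:", worst_ratio)
print("trigonometric bound sin^2(phi):", sin_phi ** 2)
```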

  10. [Grid computing

    CERN Multimedia

    Wolinsky, H

    2003-01-01

    "Turn on a water spigot, and it's like tapping a bottomless barrel of water. Ditto for electricity: Flip the switch, and the supply is endless. But computing is another matter. Even with the Internet revolution enabling us to connect in new ways, we are still limited to self-contained systems running locally stored software, limited by corporate, institutional and geographic boundaries" (1 page).

  11. Conversion of IVA Human Computer Model to EVA Use and Evaluation and Comparison of the Result to Existing EVA Models

    Science.gov (United States)

    Hamilton, George S.; Williams, Jermaine C.

    1998-01-01

    This paper describes the methods, rationale, and comparative results of the conversion of an intravehicular (IVA) 3D human computer model (HCM) to extravehicular (EVA) use and compares the converted model to an existing model on another computer platform. The task of accurately modeling a spacesuited human figure in software is daunting: the suit restricts the human's joint range of motion (ROM) and does not have joints collocated with human joints. The modeling of the variety of materials needed to construct a space suit (e. g. metal bearings, rigid fiberglass torso, flexible cloth limbs and rubber coated gloves) attached to a human figure is currently out of reach of desktop computer hardware and software. Therefore a simplified approach was taken. The HCM's body parts were enlarged and the joint ROM was restricted to match the existing spacesuit model. This basic approach could be used to model other restrictive environments in industry such as chemical or fire protective clothing. In summary, the approach provides a moderate fidelity, usable tool which will run on current notebook computers.

  12. Computable Frames in Computable Banach Spaces

    Directory of Open Access Journals (Sweden)

    S.K. Kaushik

    2016-06-01

    Full Text Available We develop some parts of the frame theory in Banach spaces from the point of view of Computable Analysis. We define computable M-basis and use it to construct a computable Banach space of scalar valued sequences. Computable Xd frames and computable Banach frames are also defined and computable versions of sufficient conditions for their existence are obtained.

  13. Emergency Response Equipment and Related Training: Airborne Radiological Computer System (Model II)

    Energy Technology Data Exchange (ETDEWEB)

    David P. Colton

    2007-02-28

    The materials included in the Airborne Radiological Computer System, Model-II (ARCS-II) were assembled with several considerations in mind. First, the system was designed to measure and record the airborne gamma radiation levels and the corresponding latitude and longitude coordinates, and to provide a first overview look of the extent and severity of an accident's impact. Second, the portable system had to be light enough and durable enough that it could be mounted in an aircraft, ground vehicle, or watercraft. Third, the system must control the collection and storage of the data, as well as provide a real-time display of the data collection results to the operator. The notebook computer and color graphics printer components of the system would only be used for analyzing and plotting the data. In essence, the provided equipment is composed of an acquisition system and an analysis system. The data can be transferred from the acquisition system to the analysis system at the end of the data collection or at some other agreeable time.

  14. Um Rio para estudante ver: engenhosidades na produção de cadernos escolares - A Rio de Janeiro to be seen by students: ingenuity in producing school notebooks

    Directory of Open Access Journals (Sweden)

    Ana Chrystina Venancio Mignot, Roberta Lopes da Veiga

    2011-03-01

    Full Text Available Analyzing the intentions that guided the production and marketing of the Rio Collection, published by Casa Cruz to celebrate the stationer's 110th anniversary in partnership with Tilibra, the largest manufacturer of school notebooks in the country, implies discussing the expansion and development of the notebook industry that resulted from the modernization of the printing sector. To that end, the main research sources are the calls for entries and press articles about the painting competition that gave rise to the collection, printed material addressed to retailers of school supplies, and magazines and catalogues of various companies in the sector, since they make it possible to understand both the conceptions these companies hold of students and the concerns that inform and shape the production of school notebooks, which cease to be seen as mere supports for school writing and are turned into objects of consumer desire. The choice of cover images is part of the strategy to win over this privileged consumer: soap-opera stars, famous singers, cartoons, film characters and football players that please the majority. With the Rio Collection, Casa Cruz stands out by departing from the themes that dominate notebook production. By highlighting landscapes of the Cidade Maravilhosa, it also reveals some of its conceptions of, and expectations about, the consumer. Through covers signed by Rio de Janeiro artists, depicting tourist sites and monuments, the stationer conveys the image it would like to perpetuate: a city without violence, fear or exclusion; a city for students to see, love and preserve. Keywords: school notebooks, production, commercialization.

  15. Computational Combustion

    Energy Technology Data Exchange (ETDEWEB)

    Westbrook, C K; Mizobuchi, Y; Poinsot, T J; Smith, P J; Warnatz, J

    2004-08-26

    Progress in the field of computational combustion over the past 50 years is reviewed. Particular attention is given to those classes of models that are common to most system modeling efforts, including fluid dynamics, chemical kinetics, liquid sprays, and turbulent flame models. The developments in combustion modeling are placed into the time-dependent context of the accompanying exponential growth in computer capabilities and Moore's Law. Superimposed on this steady growth, the occasional sudden advances in modeling capabilities are identified and their impacts are discussed. Integration of submodels into system models for spark ignition, diesel and homogeneous charge, compression ignition engines, surface and catalytic combustion, pulse combustion, and detonations are described. Finally, the current state of combustion modeling is illustrated by descriptions of a very large jet lifted 3D turbulent hydrogen flame with direct numerical simulation and 3D large eddy simulations of practical gas burner combustion devices.

  16. Computational Electromagnetics

    CERN Document Server

    Rylander, Thomas; Bondeson, Anders

    2013-01-01

    Computational Electromagnetics is a young and growing discipline, expanding as a result of the steadily increasing demand for software for the design and analysis of electrical devices. This book introduces three of the most popular numerical methods for simulating electromagnetic fields: the finite difference method, the finite element method and the method of moments. In particular it focuses on how these methods are used to obtain valid approximations to the solutions of Maxwell's equations, using, for example, "staggered grids" and "edge elements." The main goal of the book is to make the reader aware of different sources of errors in numerical computations, and also to provide the tools for assessing the accuracy of numerical methods and their solutions. To reach this goal, convergence analysis, extrapolation, von Neumann stability analysis, and dispersion analysis are introduced and used frequently throughout the book. Another major goal of the book is to provide students with enough practical understan...

  17. Computational Physics

    Science.gov (United States)

    Thijssen, Jos

    2013-10-01

    1. Introduction; 2. Quantum scattering with a spherically symmetric potential; 3. The variational method for the Schrödinger equation; 4. The Hartree-Fock method; 5. Density functional theory; 6. Solving the Schrödinger equation in periodic solids; 7. Classical equilibrium statistical mechanics; 8. Molecular dynamics simulations; 9. Quantum molecular dynamics; 10. The Monte Carlo method; 11. Transfer matrix and diagonalisation of spin chains; 12. Quantum Monte Carlo methods; 13. The infinite element method for partial differential equations; 14. The lattice Boltzmann method for fluid dynamics; 15. Computational methods for lattice field theories; 16. High performance computing and parallelism; Appendix A. Numerical methods; Appendix B. Random number generators; References; Index.

  18. Computational Electromagnetics

    Science.gov (United States)

    2011-02-20

    (a collaboration between Caltech's postdoctoral associate N. Albin and O. Bruno) have shown that, for a variety of reasons, the first-order ... KZK approximation", Nathan Albin, Oscar P. Bruno, Theresa Y. Cheung and Robin O. Cleveland, preprint (2011); "A Spectral FC Solver for the Compressible Navier-Stokes Equations in General Domains I: Explicit Time-Stepping", Nathan Albin and Oscar P. Bruno, to appear in Journal of Computational Physics

  19. Computer files.

    Science.gov (United States)

    Malik, M

    1995-02-01

    From what has been said, several recommendations can be made for users of small personal computers regardless of which operating system they use. If your computer has a large hard disk not specially required by any single application, organize the disk into a small number of volumes. You will then be using the computer as if it had several smaller disks, which will help you to create a logical file structure. The size of individual volumes has to be selected carefully with respect to the files kept in each volume. Otherwise, it may be that you will have too much space in one volume and not enough in another. In each volume, organize the structure of directories and subdirectories logically so that they correspond to the logic of your file content. Be aware of the fact that the directories suggested as default when installing new software are often not the optimum. For instance, it is better to put different graphics packages under a common subdirectory rather than to install them at the same level as all other packages, including statistics, text processors, etc. Create a special directory for each task for which you use the computer. Note that it is bad practice to keep many different and logically unsorted files in the root directory of any of your volumes. Only system and important service files should be kept there. Although any file may be written all over the disk, access to it will be faster if it is written over the minimum number of cylinders. From time to time, use special programs that reorganize your files in this way. (ABSTRACT TRUNCATED AT 250 WORDS)
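
    The directory-per-task recommendation can be illustrated with a minimal sketch; the task names and the parent directory are invented for the example.

```python
# A small sketch of the organization recommended above: one directory per
# task, grouped under a common parent, rather than scattering files in the
# root directory. Directory names are illustrative only.
from pathlib import Path

TASKS = ["graphics", "statistics", "text_processing", "correspondence"]

def build_task_tree(root="work"):
    """Create one subdirectory per task under a common parent directory."""
    base = Path(root)
    for task in TASKS:
        (base / task).mkdir(parents=True, exist_ok=True)
    return sorted(p.as_posix() for p in base.iterdir())

if __name__ == "__main__":
    print(build_task_tree())
```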

  20. Everything Computes

    Institute of Scientific and Technical Information of China (English)

    Bill; Hofmann

    1999-01-01

    Dear American Professor, I am a student in Beijing. At the beginning of last semester, we four roommates gathered some 10,000 yuan (a big sum here, approximately 1,150 USD) and bought a computer, which is our joint property. Since the computer came into our room, it has been used round the clock except when we are having classes. So even at midnight, when I woke up from a dream, I could still see

  1. Computer Spectrometers

    Science.gov (United States)

    Dattani, Nikesh S.

    2017-06-01

    Ideally, the cataloguing of spectroscopic linelists would not demand laborious and expensive experiments. Whatever an experiment might achieve, the same information would be attainable by running a calculation on a computer. Kolos and Wolniewicz were the first to demonstrate that calculations on a computer can outperform even the most sophisticated molecular spectroscopic experiments of the time, when their 1964 calculations of the dissociation energies of H_2 and D_{2} were found to be more than 1 cm^{-1} larger than the best experiments by Gerhard Herzberg, suggesting the experiment violated a strict variational principle. As explained in his Nobel Lecture, it took 5 more years for Herzberg to perform an experiment which caught up to the accuracy of the 1964 calculations. Today, numerical solutions to the Schrödinger equation, supplemented with relativistic and higher-order quantum electrodynamics (QED) corrections can provide ro-vibrational spectra for molecules that we strongly believe to be correct, even in the absence of experimental data. Why do we believe these calculated spectra are correct if we do not have experiments against which to test them? All evidence seen so far suggests that corrections due to gravity or other forces are not needed for a computer simulated QED spectrum of ro-vibrational energy transitions to be correct at the precision of typical spectrometers. Therefore a computer-generated spectrum can be considered to be as good as one coming from a more conventional spectrometer, and this has been shown to be true not just for the H_2 energies back in 1964, but now also for several other molecules. So are we at the stage where we can launch an array of calculations, each with just the atomic number changed in the input file, to reproduce the NIST energy level databases? Not quite. But I will show that for the 6e^- molecule Li_2, we have reproduced the vibrational spacings to within 0.001 cm^{-1} of the experimental spectrum, and I will
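
    The kind of comparison described for Li_2 can be sketched as follows; the level values below are made-up placeholders rather than the data from the talk, and only the spacing-residual check is illustrated.

```python
# Hedged illustration of the comparison described above: given computed and
# measured vibrational term values (cm^-1), compare the level spacings.
# The numbers are invented placeholders, not the Li2 data from the talk.
import numpy as np

calculated_levels = np.array([0.0, 351.4, 697.1, 1037.0])          # hypothetical G(v)
experimental_levels = np.array([0.0, 351.4004, 697.1009, 1037.0006])

calc_spacings = np.diff(calculated_levels)
expt_spacings = np.diff(experimental_levels)
residuals = calc_spacings - expt_spacings

print("spacing residuals (cm^-1):", residuals)
print("all within 0.001 cm^-1:", bool(np.all(np.abs(residuals) < 1e-3)))
```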

  2. Customizable computing

    CERN Document Server

    Chen, Yu-Ting; Gill, Michael; Reinman, Glenn; Xiao, Bingjun

    2015-01-01

    Since the end of Dennard scaling in the early 2000s, improving the energy efficiency of computation has been the main concern of the research community and industry. The large energy efficiency gap between general-purpose processors and application-specific integrated circuits (ASICs) motivates the exploration of customizable architectures, where one can adapt the architecture to the workload. In this Synthesis lecture, we present an overview and introduction of the recent developments on energy-efficient customizable architectures, including customizable cores and accelerators, on-chip memory

  3. Computer vision

    Science.gov (United States)

    Gennery, D.; Cunningham, R.; Saund, E.; High, J.; Ruoff, C.

    1981-01-01

    The field of computer vision is surveyed and assessed, key research issues are identified, and possibilities for a future vision system are discussed. The problems of descriptions of two and three dimensional worlds are discussed. The representation of such features as texture, edges, curves, and corners are detailed. Recognition methods are described in which cross correlation coefficients are maximized or numerical values for a set of features are measured. Object tracking is discussed in terms of the robust matching algorithms that must be devised. Stereo vision, camera control and calibration, and the hardware and systems architecture are discussed.
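
    The correlation-based recognition mentioned above can be illustrated with a minimal normalized cross-correlation template matcher; this is a generic NumPy sketch, not the surveyed systems' code, and it is only practical for small images.

```python
# Sketch of the correlation-based recognition idea: slide a template over an
# image and pick the location that maximizes the normalized cross-correlation
# coefficient. Pure NumPy, brute force, small images only.
import numpy as np

def normalized_cross_correlation(image, template):
    th, tw = template.shape
    t = template - template.mean()
    t_norm = np.linalg.norm(t)
    best_score, best_pos = -1.0, (0, 0)
    for y in range(image.shape[0] - th + 1):
        for x in range(image.shape[1] - tw + 1):
            patch = image[y:y + th, x:x + tw]
            p = patch - patch.mean()
            denom = np.linalg.norm(p) * t_norm
            score = float((p * t).sum() / denom) if denom > 0 else 0.0
            if score > best_score:
                best_score, best_pos = score, (y, x)
    return best_pos, best_score

if __name__ == "__main__":
    img = np.zeros((20, 20))
    img[5:9, 7:11] = 1.0                       # a bright square "object"
    pos, score = normalized_cross_correlation(img, img[5:9, 7:11])
    print(pos, round(score, 3))                # expected: (5, 7) 1.0
```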

  4. Tensor computations in computer algebra systems

    CERN Document Server

    Korolkova, A V; Sevastyanov, L A

    2014-01-01

    This paper considers three types of tensor computations. On their basis, we attempt to formulate criteria that must be satisfied by a computer algebra system dealing with tensors. We briefly overview the current state of tensor computations in different computer algebra systems. The tensor computations are illustrated with appropriate examples implemented in specific systems: Cadabra and Maxima.
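
    The paper's examples are implemented in Cadabra and Maxima; as a language-neutral, purely numerical analogue of the kind of operation such systems must support, the sketch below performs an index lowering and a full contraction with numpy.einsum (the metric chosen is an assumption for the example).

```python
# Numeric analogue of a symbolic tensor computation: lower the indices of a
# rank-2 tensor with a metric and take the full contraction.
import numpy as np

g = np.diag([-1.0, 1.0, 1.0, 1.0])       # Minkowski metric g_{mu nu} (assumed example)
T = np.arange(16.0).reshape(4, 4)        # some tensor T^{mu nu}

T_lower = np.einsum("ma,nb,ab->mn", g, g, T)   # T_{mu nu} = g_{mu a} g_{nu b} T^{ab}
scalar = np.einsum("ab,ab->", g, T)            # g_{mu nu} T^{mu nu}

print(T_lower)
print("g_{mu nu} T^{mu nu} =", scalar)
```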

  5. Computational crystallization.

    Science.gov (United States)

    Altan, Irem; Charbonneau, Patrick; Snell, Edward H

    2016-07-15

    Crystallization is a key step in macromolecular structure determination by crystallography. While a robust theoretical treatment of the process is available, due to the complexity of the system, the experimental process is still largely one of trial and error. In this article, efforts in the field are discussed together with a theoretical underpinning using a solubility phase diagram. Prior knowledge has been used to develop tools that computationally predict the crystallization outcome and define mutational approaches that enhance the likelihood of crystallization. For the most part these tools are based on binary outcomes (crystal or no crystal), and the full information contained in an assembly of crystallization screening experiments is lost. The potential of this additional information is illustrated by examples where new biological knowledge can be obtained and where a target can be sub-categorized to predict which class of reagents provides the crystallization driving force. Computational analysis of crystallization requires complete and correctly formatted data. While massive crystallization screening efforts are under way, the data available from many of these studies are sparse. The potential for this data and the steps needed to realize this potential are discussed.

  6. Development of a research prototype computer 'Wearables' that one can wear on his or her body. Minitsukeru computer 'Wearables' kenkyuyo shisakuki wo kaihatsu

    Energy Technology Data Exchange (ETDEWEB)

    1999-02-01

    Development has been made on a prototype of a wearable computer 'Wearables' that makes the present notebook type PC still smaller in size, can be worn on human body for utilization at any time and from anywhere, and aims at realizing a social infrastructure. Using the company's portable PC, Libretto as the base, the keyboard and the liquid crystal display panel were removed. To replace these functions, a voice inputting microphone, and various types of head mounting type displays (glasses type) mounted on a head to see images are connected. Provided as the means for information communication between the prototype computer and outside environments are infrared ray interface and data communication function using wireless (electric wave) communications. The wireless desk area network (DAN) technology that can structure dynamically a network between multiple number of computers has realized smooth communications with external environments. The voice recognition technology that can work efficiently against noise has realized keyboard-free operation that gives no neural stress to users. The 'wearable computer' aims at not only users utilizing it simply wearing it, but also providing a new perception ability that could not have been seen or heard directly to date, that is realizing the digital sensation. With the computer, a society will be structured in which people can live comfortably and safely, maintaining conversations between the users and the computers, and interactions between the surrounding environment and the social infrastructures, with protection of individual privacy and information security taken into consideration. The company is working with the Massachusetts Institute of Technology (MIT) for research and development of the 'wearable computer' as to how it can be utilized and basic technologies that will be required in the future. (translated by NEDO)

  7. 品牌形象对消费者购买行为的影响研究--以笔记本电脑行业为例 - The Influence of Brand Image on Consumer Purchase Behavior: A Study of the Notebook Computer Industry

    Institute of Scientific and Technical Information of China (English)

    庞磊; 阳晓伟

    2014-01-01

    With continuing progress and development, notebook computers are used more and more widely. Faced with a variety of brands and rich feature sets, consumers evaluate and choose differently. Against this background, this paper takes the notebook computer industry as an example, applies the Biel brand-image model, collects data through a market survey, and designs three statistical scales covering user image, corporate image and product image. Analysis of variance and principal component analysis are then used to examine the influence of brand image on consumer purchase behavior. The conclusion is that product image, corporate image and user image all have a significant influence on consumer purchase behavior, which provides a reference for the marketing strategies of notebook computer vendors.
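
    A hedged sketch of the kind of analysis described (principal component analysis on standardized survey scales) is given below; the data are randomly generated stand-ins, not the study's survey responses, and the grouping into three items per scale is assumed.

```python
# Sketch of PCA on Likert-style survey items grouped into user-image,
# corporate-image and product-image scales. Random stand-in data only.
import numpy as np

rng = np.random.default_rng(0)
n_respondents, n_items = 200, 9          # 3 items per image scale (assumption)
X = rng.integers(1, 6, size=(n_respondents, n_items)).astype(float)

# Standardize, then PCA via eigendecomposition of the correlation matrix.
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
corr = np.corrcoef(Xs, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(corr)
order = np.argsort(eigvals)[::-1]
explained = eigvals[order] / eigvals.sum()

print("variance explained by first 3 components:", np.round(explained[:3], 3))
```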

  8. Computer Tree

    Directory of Open Access Journals (Sweden)

    Onur AĞAOĞLU

    2014-12-01

    Full Text Available It is crucial that gifted and talented students be supported by educational methods suited to their interests and skills. The science and arts centres (gifted centres) provide the Supportive Education Program for these students with an interdisciplinary perspective. In line with the program, an ICT lesson entitled “Computer Tree” serves to identify learner readiness levels and to define the basic conceptual framework. A language teacher also contributes to the process, since it caters for the creative function of basic linguistic skills. The teaching technique is applied to students aged 9-11. The lesson introduces an evaluation process covering the basic knowledge, skills, and interests of the target group. Furthermore, it includes an observation process by way of peer assessment. The lesson is considered a good example of planning for any subject, given the unexpected convergence of visual and technical abilities with linguistic abilities.

  9. computer networks

    Directory of Open Access Journals (Sweden)

    N. U. Ahmed

    2002-01-01

    Full Text Available In this paper, we construct a new dynamic model for the Token Bucket (TB) algorithm used in computer networks and use a systems approach for its analysis. This model is then augmented by adding a dynamic model for a multiplexor at an access node where the TB exercises a policing function. In the model, traffic policing, multiplexing and network utilization are formally defined. Based on the model, we study such issues as quality of service (QoS), traffic sizing and network dimensioning. We also propose an algorithm using feedback control to improve QoS and network utilization. Applying MPEG video traces as the input traffic to the model, we verify the usefulness and effectiveness of our model.
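
    For readers unfamiliar with the policing function being modeled, a minimal token-bucket sketch follows; the rate, depth and time source are illustrative choices, not the paper's dynamic model.

```python
# Minimal token-bucket policer sketch: tokens refill at `rate` per second up
# to `capacity`; a packet conforms only if enough tokens are available.
import time

class TokenBucket:
    def __init__(self, rate, capacity):
        self.rate = float(rate)           # token refill rate (tokens per second)
        self.capacity = float(capacity)   # bucket depth
        self.tokens = float(capacity)
        self.last = time.monotonic()

    def allow(self, packet_size):
        """Return True if the packet conforms (enough tokens), else False."""
        now = time.monotonic()
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= packet_size:
            self.tokens -= packet_size
            return True
        return False

if __name__ == "__main__":
    tb = TokenBucket(rate=1000, capacity=1500)   # ~1000 byte-tokens/s, 1500-byte burst
    print([tb.allow(500) for _ in range(5)])     # later packets are non-conforming
```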

  10. Social Computing

    CERN Document Server

    CERN. Geneva

    2011-01-01

    The past decade has witnessed a momentous transformation in the way people interact with each other. Content is now co-produced, shared, classified, and rated by millions of people, while attention has become the ephemeral and valuable resource that everyone seeks to acquire. This talk will describe how social attention determines the production and consumption of content within both the scientific community and social media, how its dynamics can be used to predict the future and the role that social media plays in setting the public agenda. About the speaker Bernardo Huberman is a Senior HP Fellow and Director of the Social Computing Lab at Hewlett Packard Laboratories. He received his Ph.D. in Physics from the University of Pennsylvania, and is currently a Consulting Professor in the Department of Applied Physics at Stanford University. He originally worked in condensed matter physics, ranging from superionic conductors to two-dimensional superfluids, and made contributions to the theory of critical p...

  11. Computational neuroscience

    CERN Document Server

    Blackwell, Kim L

    2014-01-01

    Progress in Molecular Biology and Translational Science provides a forum for discussion of new discoveries, approaches, and ideas in molecular biology. It contains contributions from leaders in their fields and abundant references. This volume brings together different aspects of, and approaches to, molecular and multi-scale modeling, with applications to a diverse range of neurological diseases. Mathematical and computational modeling offers a powerful approach for examining the interaction between molecular pathways and ionic channels in producing neuron electrical activity. It is well accepted that non-linear interactions among diverse ionic channels can produce unexpected neuron behavior and hinder a deep understanding of how ion channel mutations bring about abnormal behavior and disease. Interactions with the diverse signaling pathways activated by G protein coupled receptors or calcium influx adds an additional level of complexity. Modeling is an approach to integrate myriad data sources into a cohesiv...

  12. Brain computer

    Directory of Open Access Journals (Sweden)

    Sarah N. Abdulkader

    2015-07-01

    Full Text Available Brain computer interface technology represents a highly growing field of research with application systems. Its contributions in medical fields range from prevention to neuronal rehabilitation for serious injuries. Mind reading and remote communication have their unique fingerprint in numerous fields such as educational, self-regulation, production, marketing, security as well as games and entertainment. It creates a mutual understanding between users and the surrounding systems. This paper shows the application areas that could benefit from brain waves in facilitating or achieving their goals. We also discuss major usability and technical challenges that face brain signals utilization in various components of BCI system. Different solutions that aim to limit and decrease their effects have also been reviewed.

  13. Computational micromechanics

    Science.gov (United States)

    Ortiz, M.

    1996-09-01

    Selected issues in computational micromechanics are reviewed, with particular emphasis on multiple-scale problems and micromechanical models of material behavior. Examples considered include: the bridging of atomistic and continuum scales, with application to nanoindentation and the brittle-to-ductile transition; the development of dislocation-based constitutive relations for pure metallic crystals and intermetallic compounds, with applications to fracture of single crystals and bicrystals; the simulation of non-planar three-dimensional crack growth at the microscale, with application to mixed mode I III effective behavior and crack trapping and bridging in fiber-reinforced composites; and the direct micromechanical simulation of fragmentation of brittle solids and subsequent flow of the comminuted phase.

  14. Chautauqua notebook: appropriate technology on radio

    Energy Technology Data Exchange (ETDEWEB)

    Renz, B.

    1981-01-01

    Experiences in establishing and maintaining a regional call-in information-exchange radio show (Chautauqua) on energy conservation, appropriate technology, renewable energy sources, and self-reliance are discussed. Information is presented on: appropriate technology; the Chautauquaa concept; topics discussed; research performed; guests; interviewing tips; types of listeners; program features; where to find help; promotion and publicity; the technical and engineering aspects; the budget and funding; and station policies. (MCW)

  15. Notebook paper: TNO instance search submission 2011

    NARCIS (Netherlands)

    Schavemaker, J.G.M.; Eendebak, P.T.; Staalduinen, M. van; Kraaij, W.

    2011-01-01

    The TNO instance search submission to TRECVID 2011 consisted of three different runs: one is using an exhaustive keypoint search, one is using a bag-of-visual-words approach and one is using open-source face-recognition software. Our run approaches: Briefly, what approach or combination of approache
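
    The bag-of-visual-words run can be pictured with the rough pipeline below: cluster local descriptors into a visual vocabulary, then represent an image as a histogram of visual-word assignments. The descriptors here are random stand-ins and the vocabulary size is arbitrary, so this is only a sketch of the general technique, not TNO's system.

```python
# Rough bag-of-visual-words sketch: build a visual vocabulary with k-means,
# then describe an image as a normalized histogram of visual-word counts.
import numpy as np

def kmeans(data, k, iters=20, seed=0):
    rng = np.random.default_rng(seed)
    centers = data[rng.choice(len(data), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((data[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = data[labels == j].mean(axis=0)
    return centers

def bovw_histogram(descriptors, vocabulary):
    words = np.argmin(((descriptors[:, None, :] - vocabulary[None]) ** 2).sum(-1), axis=1)
    hist = np.bincount(words, minlength=len(vocabulary)).astype(float)
    return hist / hist.sum()

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    training_descriptors = rng.normal(size=(500, 32))   # stand-in for SIFT-like features
    vocab = kmeans(training_descriptors, k=16)
    print(bovw_histogram(rng.normal(size=(80, 32)), vocab).round(3))
```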

  16. Idea Notebook. Quick Activities for Every Teacher.

    Science.gov (United States)

    Ajello, Tracey S.; And Others

    1997-01-01

    Teachers' ideas for quick classroom activities include creating a garden-in-winter bulletin board, writing a science story, playing a Valentine's game, graphing vowels, averaging students' sizes, creating lifesize figures of historical people, making picture books, creating an idiom bulletin board, and sending school valentines to local hospitals.…

  17. Notebook on Nonacoustic Detection of Submarines

    Science.gov (United States)

    1980-11-12

    triangular envelope whose vertex is over the submarine and which forms an angle of approximately 39°. The amplitudes of the waves along the center line of ... about ... of this is emitted as sound. To a first approximation, a submarine may be considered as a point source of sound whose power output ranges ... have been calculated as resulting from vortices from the negative "lift" of control surfaces for a 5.5-kn submarine operating at a keel depth of

  18. Hanford Site Waste Storage Tank Information Notebook

    Energy Technology Data Exchange (ETDEWEB)

    Husa, E.I.; Raymond, R.E.; Welty, R.K.; Griffith, S.M.; Hanlon, B.M.; Rios, R.R.; Vermeulen, N.J.

    1993-07-01

    This report provides summary data on the radioactive waste stored in underground tanks in the 200 East and West Areas at the Hanford Site. The summary data covers each of the existing 161 Series 100 underground waste storage tanks (500,000 gallons and larger). It also contains information on the design and construction of these tanks. The information in this report is derived from existing reports that document the status of the tanks and their materials. This report also contains interior, surface photographs of each of the 54 Watch List tanks, which are those tanks identified as Priority I Hanford Site Tank Farm Safety Issues in accordance with Public Law 101-510, Section 3137*.

  19. The kids got game: Computer/video games, gender and learning outcomes in science classrooms

    Science.gov (United States)

    Anderson, Janice Lyn

    In recent years educators have begun to explore how to purposively design computer/video games to support student learning. This interest in video games has arisen in part because educational video games appear to have the potential to improve student motivation and interest in technology, and engage students in learning through the use of a familiar medium (Squire, 2005; Shaffer, 2006; Gee, 2005). The purpose of this dissertation research is to specifically address the issue of student learning through the use of educational computer/video games. Using the Quest Atlantis computer game, this study involved a mixed model research strategy that allowed for both broad understandings of classroom practices and specific analysis of outcomes through the themes that emerged from the case studies of the gendered groups using the game. Specifically, this study examined how fifth-grade students' learning about science concepts, such as water quality and ecosystems, unfolds over time as they participate in the Quest Atlantis computer game. Data sources included classroom observations and video, pre- and post-written assessments, pre- and post-student content interviews, student field notebooks, field reports and the field notes of the researcher. To make sense of how students' learning unfolded, video was analyzed using a framework of interaction analysis and small group interactions (Jordan & Henderson, 1995; Webb, 1995). These coded units were then examined with respect to student artifacts and assessments, and patterns of learning trajectories were analyzed. The analysis revealed that overall, student learning outcomes improved from pre- to post-assessments for all students. While there were no observable gendered differences with respect to the test scores and content interviews, there were gendered differences with respect to game play. Implications for game design, use of external scaffolds, games as tools for learning and gendered findings are discussed.

  20. Die Casting Mold Design of the Thin-walled Aluminum Case by Computational Solidification Simulation

    Institute of Scientific and Technical Information of China (English)

    Young-Chan Kim; Chang-Seog Kang; Jae-Ik Cho; Chang-Yeol Jeong; Se-Weon Choi; Sung-Kil Hong

    2008-01-01

    Recently, demand for lightweight alloys in electric/electronic housings has greatly increased. However, among the lightweight alloys, aluminum alloy thin-walled die casting is problematic because it is quite difficult to achieve sufficient fluidity and feedability to fill the thin cavity once the wall thickness becomes less than 1 mm. Therefore, in this study, thin-walled die casting of aluminum (Al-Si-Cu alloy: ALDC 12) at the size of a notebook computer housing and a thickness of 0.8 mm was investigated by solidification simulation (MAGMA soft) and actual casting experiments (Buhler Evolution B 53D). Three different gating designs, finger, tangential and split type with 6 vertical runners, were simulated. The results showed that sound thin-walled die casting was possible with the tangential and split type gating designs, because those gates allowed the aluminum melt to flow into the thin cavity uniformly, and the split type gating system was preferable to the tangential type from the point of view of casting soundness and of the distortion generated after solidification. The solidification simulation also agreed well with the actual die casting, and the casting showed no defects or distortion.

  1. Use of tablet personal computers for sensitive patient-reported information.

    Science.gov (United States)

    Dupont, Alexandra; Wheeler, Jane; Herndon, James E; Coan, April; Zafar, S Yousuf; Hood, Linda; Patwardhan, Meenal; Shaw, Heather S; Lyerly, H Kim; Abernethy, Amy P

    2009-01-01

    Notebook-style computers (e/Tablets) are increasingly replacing paper methods for collecting patient-reported information. Discrepancies in data between these methods have been found in oncology for sexuality-related questions. A study was performed to formulate hypotheses regarding causes for discrepant responses and to analyze whether electronic data collection adds value over paper-based methods when collecting data on sensitive topics. A total of 56 breast cancer patients visiting Duke Breast Clinic (North Carolina) participated by responding to 12 subscales of 5 survey instruments in electronic (e/Tablet) format and to a paper version of 1 of these surveys, at each visit. Twenty-one participants (38%) provided dissimilar responses on paper and electronic surveys to one item of the Functional Assessment of Cancer Therapy-General (FACT-G) Social Well-Being scale that asked patients to rate their satisfaction with their current sex life. Among these 21 patients were 8 patients who answered the question in the electronic environment, and 13 patients who answered both paper and electronic versions but with different responses. Eleven patients (29%) did not respond to the item on either e/Tablet or paper; 45 patients (80%) answered it on e/Tablet; and 37 patients (66%) responded on the paper version. The e/Tablet electronic system may provide a "safer" environment than paper questionnaires for cancer patients to answer private or highly personal questions on sensitive topics such as sexuality.

  2. Experimental DNA computing

    NARCIS (Netherlands)

    Henkel, Christiaan

    2005-01-01

    Because of their information storing and processing capabilities, nucleic acids are interesting building blocks for molecular scale computers. Potential applications of such DNA computers range from massively parallel computation to computational gene therapy. In this thesis, several implementations

  3. Analog and hybrid computing

    CERN Document Server

    Hyndman, D E

    2013-01-01

    Analog and Hybrid Computing focuses on the operations of analog and hybrid computers. The book first outlines the history of computing devices that influenced the creation of analog and digital computers. The types of problems to be solved on computers, computing systems, and digital computers are discussed. The text looks at the theory and operation of electronic analog computers, including linear and non-linear computing units and use of analog computers as operational amplifiers. The monograph examines the preparation of problems to be deciphered on computers. Flow diagrams, methods of ampl

  4. Study of Quantum Computing

    Directory of Open Access Journals (Sweden)

    Prashant Anil Patil

    2012-04-01

    Full Text Available This paper gives detailed information about quantum computers, the differences between quantum computers and traditional computers, and the basis of quantum computers, which are in some ways similar to but still different from traditional computers. Many research groups are working towards the highly technological goal of building a quantum computer, which would dramatically improve computational power for particular tasks. Quantum computers are very useful for computation in science and research: large amounts of data and information can be computed, processed, stored, retrieved, transmitted and displayed in less time and with an accuracy that traditional computers do not provide.

  5. Computing handbook computer science and software engineering

    CERN Document Server

    Gonzalez, Teofilo; Tucker, Allen

    2014-01-01

    Overview of Computer Science: Structure and Organization of Computing (Peter J. Denning); Computational Thinking (Valerie Barr). Algorithms and Complexity: Data Structures (Mark Weiss); Basic Techniques for Design and Analysis of Algorithms (Edward Reingold); Graph and Network Algorithms (Samir Khuller and Balaji Raghavachari); Computational Geometry (Marc van Kreveld); Complexity Theory (Eric Allender, Michael Loui, and Kenneth Regan); Formal Models and Computability (Tao Jiang, Ming Li, and Bala

  6. Kayseri Ve Yöresi Tarih Araştırmaları Merkezi’nde Bulunan Ebrû Kaplı Defterler - Marbled Paper in the Notebooks Held at the History of Kayseri and Neighboring Areas Research and Implementation Center

    Directory of Open Access Journals (Sweden)

    Neval SAKİN

    2013-09-01

    Full Text Available Manuscript libraries, museums and archives hold rich collections of literary works decorated with traditional arts such as binding, calligraphy, ornamentation, miniature and Marbled Paper. One of the archives that includes samples from the history of Marbled Paper, which is one of our traditional arts, is the History of Kayseri and Neighboring Areas Research and Implementation Center (KAYTAM). Manuscripts on various topics and official documents held in archives, museums and libraries were decorated according to the artistic sense of their period, both to ornament and to protect them. Accordingly, in our study, among the notebooks of various topics found in the archive of the History of Kayseri and Neighboring Areas Research and Implementation Centre, covering the last 100 years of the Ottoman Empire and the early years of the Turkish Republic, those covered with Marbled Paper were studied. The study found that, out of 153 notebooks with Marbled Paper, 25 were severely damaged. In the remaining 128 notebooks covered with Marbled Paper, two different kinds of Marbling were detected: 26 Battal Marbled Papers and 102 Wavy (Dalgalı) Marbled Papers. In our study, only 20 Marbled Paper samples were chosen, after eliminating the other Marbling samples that had similar patterns. Of the 20 notebooks with Marbling, 10 had Battal Marbled Paper and the remaining 10 had Wavy Marbled Paper. The colours used in these Marbling samples are Lahore blue, indigo, yellow oxide, red oxide, dark brown, brown and maroon. Black, blue, light blue, green oxide and grey appear in only one or two samples. Even though dating Marbled Paper is generally difficult, the notebooks containing the Marbling samples we studied cover the periods 1842-1876 and 1886-1889. Manuscript libraries, museums and archives have rich collections of works decorated with our traditional arts such as binding, calligraphy, illumination, miniature and marbling. Traditional arts

  7. Computing with functionals—computability theory or computer science?

    OpenAIRE

    Normann, Dag

    2006-01-01

    We review some of the history of the computability theory of functionals of higher types, and we will demonstrate how contributions from logic and theoretical computer science have shaped this still active subject.

  8. Program Facilitates Distributed Computing

    Science.gov (United States)

    Hui, Joseph

    1993-01-01

    KNET computer program facilitates distribution of computing between UNIX-compatible local host computer and remote host computer, which may or may not be UNIX-compatible. Capable of automatic remote log-in. User communicates interactively with remote host computer. Data output from remote host computer directed to local screen, to local file, and/or to local process. Conversely, data input from keyboard, local file, or local process directed to remote host computer. Written in ANSI standard C language.
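
    KNET itself is the program described above; as a loose analogue of the behaviour it provides (run work on a remote host and direct the output to the local screen and/or a local file), the hedged sketch below uses a plain ssh subprocess. The host name, command and log file are placeholders.

```python
# Loose sketch of local/remote distribution via ssh: run a command remotely,
# echo its output to the local screen, and optionally log it to a local file.
import subprocess

def run_remote(host, command, logfile=None):
    """Run `command` on `host` via ssh; echo output locally and optionally log it."""
    proc = subprocess.Popen(["ssh", host, command],
                            stdout=subprocess.PIPE, stderr=subprocess.STDOUT, text=True)
    log = open(logfile, "w") if logfile else None
    try:
        for line in proc.stdout:
            print(line, end="")          # local screen
            if log:
                log.write(line)          # local file
    finally:
        if log:
            log.close()
    return proc.wait()

if __name__ == "__main__":
    # "remote.example.org" is a placeholder host; replace with a reachable machine.
    run_remote("remote.example.org", "uname -a", logfile="remote_output.log")
```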

  9. Applied Parallel Computing Industrial Computation and Optimization

    DEFF Research Database (Denmark)

    Madsen, Kaj; NA NA NA Olesen, Dorte

    Proceedings of the Third International Workshop on Applied Parallel Computing in Industrial Problems and Optimization (PARA96)

  10. Further computer appreciation

    CERN Document Server

    Fry, T F

    2014-01-01

    Further Computer Appreciation is a comprehensive cover of the principles and aspects in computer appreciation. The book starts by describing the development of computers from the first to the third computer generations, to the development of processors and storage systems, up to the present position of computers and future trends. The text tackles the basic elements, concepts and functions of digital computers, computer arithmetic, input media and devices, and computer output. The basic central processor functions, data storage and the organization of data by classification of computer files,

  11. Democratizing Computer Science

    Science.gov (United States)

    Margolis, Jane; Goode, Joanna; Ryoo, Jean J.

    2015-01-01

    Computer science programs are too often identified with a narrow stratum of the student population, often white or Asian boys who have access to computers at home. But because computers play such a huge role in our world today, all students can benefit from the study of computer science and the opportunity to build skills related to computing. The…

  13. Computational thinking and thinking about computing.

    Science.gov (United States)

    Wing, Jeannette M

    2008-10-28

    Computational thinking will influence everyone in every field of endeavour. This vision poses a new educational challenge for our society, especially for our children. In thinking about computing, we need to be attuned to the three drivers of our field: science, technology and society. Accelerating technological advances and monumental societal demands force us to revisit the most basic scientific questions of computing.

  14. Computational Intelligence, Cyber Security and Computational Models

    CERN Document Server

    Anitha, R; Lekshmi, R; Kumar, M; Bonato, Anthony; Graña, Manuel

    2014-01-01

    This book contains cutting-edge research material presented by researchers, engineers, developers, and practitioners from academia and industry at the International Conference on Computational Intelligence, Cyber Security and Computational Models (ICC3) organized by PSG College of Technology, Coimbatore, India during December 19–21, 2013. The materials in the book include theory and applications for design, analysis, and modeling of computational intelligence and security. The book will be useful material for students, researchers, professionals, and academicians. It will help in understanding current research trends and findings and future scope of research in computational intelligence, cyber security, and computational models.

  15. Soft computing in computer and information science

    CERN Document Server

    Fray, Imed; Pejaś, Jerzy

    2015-01-01

    This book presents a carefully selected and reviewed collection of papers presented during the 19th Advanced Computer Systems conference ACS-2014. The Advanced Computer Systems conference concentrated from its beginning on methods and algorithms of artificial intelligence. Further future brought new areas of interest concerning technical informatics related to soft computing and some more technological aspects of computer science such as multimedia and computer graphics, software engineering, web systems, information security and safety or project management. These topics are represented in the present book under the categories Artificial Intelligence, Design of Information and Multimedia Systems, Information Technology Security and Software Technologies.

  16. Cloud Computing (4)

    Institute of Scientific and Technical Information of China (English)

    Wang Bai; Xu Liutong

    2010-01-01

    8 Case Study Cloud computing is still a new phenomenon. Although many IT giants are developing their own cloud computing infrastructures, platforms, software, and services, few have really succeeded in becoming cloud computing providers.

  17. PR Educators Stress Computers.

    Science.gov (United States)

    Fleming, Charles A.

    1988-01-01

    Surveys the varied roles computers play in public relations education. Asserts that, because computers are used extensively in the public relations field, students should become acquainted with the varied capabilities of computers and their role in public relations practice. (MM)

  18. ADVANCED COMPUTATIONAL METHODS IN DOSE MODELING: APPLICATION OF COMPUTATIONAL BIOPHYSICAL TRANSPORT, COMPUTATIONAL CHEMISTRY, AND COMPUTATIONAL BIOLOGY

    Science.gov (United States)

    Computational toxicology (CompTox) leverages the significant gains in computing power and computational techniques (e.g., numerical approaches, structure-activity relationships, bioinformatics) realized over the last few years, thereby reducing costs and increasing efficiency i...

  19. DNA computing models

    CERN Document Server

    Ignatova, Zoya; Zimmermann, Karl-Heinz

    2008-01-01

    In this excellent text, the reader is given a comprehensive introduction to the field of DNA computing. The book emphasizes computational methods to tackle central problems of DNA computing, such as controlling living cells, building patterns, and generating nanomachines.

  20. Avoiding Computer Viruses.

    Science.gov (United States)

    Rowe, Joyce; And Others

    1989-01-01

    The threat of computer sabotage is a real concern to business teachers and others responsible for academic computer facilities. Teachers can minimize the possibility. Eight suggestions for avoiding computer viruses are given. (JOW)

  1. Computer Viruses: An Overview.

    Science.gov (United States)

    Marmion, Dan

    1990-01-01

    Discusses the early history and current proliferation of computer viruses that occur on Macintosh and DOS personal computers, mentions virus detection programs, and offers suggestions for how libraries can protect themselves and their users from damage by computer viruses. (LRW)

  2. Computed Tomography (CT) -- Head

    Medline Plus

    Full Text Available Computed Tomography (CT) - Head. Computed tomography (CT) of the head uses special ... What is CT Scanning of the Head? Computed tomography, more commonly known as a CT ...

  3. O caderno de uma professora-aluna e as propostas para o ensino da aritmética na escola ativa (Minas Gerais, década de 1930) - A teacher’s notebook and the proposals for teaching arithmetic in active school (Minas Gerais, 1930s)

    Directory of Open Access Journals (Sweden)

    Diogo Alves de Faria Reis

    2014-01-01

    Full Text Available O artigo versa sobre o caderno de Metodologia da Aritmética de Imene Guimarães, aluna da professora Alda Lodi (1898-2002 na segunda turma da Escola de Aperfeiçoamento de Minas Gerais. Alda Lodi participou do grupo de docentes enviadas pelo governo mineiro ao Teacher’s College, nos Estados Unidos, para se prepararem para atuar na formação de professoras primárias em exercício no contexto das reformas educacionais de 1927-1928. Considerando a relevância, as potencialidades e os limites dos cadernos escolares como fonte, os registros desse caderno de 1932 são estudados e cotejados com outros materiais, em busca de uma compreensão inicial dos modos de apropriação das propostas para o ensino da aritmética no momento da adesão ao ideário da escola ativaem Minas Gerais.Palavras-chave: cadernos escolares, metodologia da aritmética, Escola de Aperfeiçoamento de Minas Gerais, Alda Lodi, história da educação matemática brasileira.A TEACHER’S NOTEBOOK AND THE PROPOSALS FOR TEACHING ARITHMETIC IN ACTIVE SCHOOL (MINAS GERAIS, 1930AbstractThe article focuses on a notebook which belonged to Imene Guimarães, a student of professor Alda Lodi (1898-2002 in Escola de Aperfeiçoamento, an institution of continuing education for teachers created by educational reforms promoted by the government of the state of Minas Gerais in 1927-1928. Alda Lodi taught Methodology of Arithmetic in this institution. Considering the relevance, potentialities and limitations of school notebooks as a source for the history of education, the records of this notebook of 1932 are studied and compared with other materials for the purpose of an initial understanding of the modes of appropriation of proposals for renovating the teaching of arithmetic according to the ideas associated to active school in Minas Gerais.Keywords: school notebooks, methodology of arithmetic, Escola de Aperfeiçoamento de Minas Gerais, Alda Lodi, history of mathematics education in

  4. Distributed Computing: An Overview

    OpenAIRE

    Md. Firoj Ali; Rafiqul Zaman Khan

    2015-01-01

    Decreases in hardware costs and advances in computer networking technologies have led to increased interest in the use of large-scale parallel and distributed computing systems. Distributed computing systems offer the potential for improved performance and resource sharing. In this paper we provide an overview of distributed computing, studying the difference between parallel and distributed computing, the terminology used in distributed computing, task allocation in distribute...

  5. Introduction to computers

    OpenAIRE

    Rajaraman, A

    1995-01-01

    An article on computer application for knowledge processing, intended to generate awareness among librarians of the possibilities offered by ICT to improve services. Compares computers and the human brain, provides a historical perspective of the development of computer technology, explains the components of the computer and the computer languages, and identifies the areas where computers can be applied and their benefits. Explains available storage systems and the database management process. Points out ...

  6. A Review on Modern Distributed Computing Paradigms: Cloud Computing, Jungle Computing and Fog Computing

    OpenAIRE

    Hajibaba, Majid; Gorgin, Saeid

    2014-01-01

    Distributed computing attempts to improve performance on large-scale computing problems through resource sharing. Moreover, rising low-cost computing power, coupled with advances in communications/networking and the advent of big data, now enables new distributed computing paradigms such as Cloud, Jungle and Fog computing. Cloud computing brings a number of advantages to consumers in terms of accessibility and elasticity. It is based on the centralization of resources that possess huge processing po...

  7. Cloud Computing (1)

    Institute of Scientific and Technical Information of China (English)

    Wang Bai; Xu Liutong

    2010-01-01

    Editor's Desk: Cloud computing is a topic of intense interest in the Internet field. Major IT giants have launched their own cloud computing products. This four-part lecture series will discuss cloud computing technology in the following aspects: The first part provides a brief description of the origin and characteristics of cloud computing from the user's point of view; the other parts introduce typical applications of cloud computing, technically analyze the specific content within the cloud, its components, architecture and computational paradigm, compare cloud computing to other distributed computing technologies, and discuss its successful cases, commercial models, related technical and economic issues, and development trends.

  8. Cloud Computing (2)

    Institute of Scientific and Technical Information of China (English)

    Wang Bai; Xu Liutong

    2010-01-01

    Editor's Desk: Cloud computing is a topic of intense interest in the Internet field. Major IT giants have launched their own cloud computing products. This four-part lecture series discusses cloud computing technology in the following aspects: The first part provided a brief description of the origin and characteristics of cloud computing from the user's point of view; the other parts introduce typical applications of cloud computing, technically analyze the specific content within the cloud, its components, architecture and computational paradigm, compare cloud computing to other distributed computing technologies, and discuss its successful cases, commercial models, related technical and economic issues, and development trends.

  9. Computer hardware fault administration

    Science.gov (United States)

    Archer, Charles J.; Megerian, Mark G.; Ratterman, Joseph D.; Smith, Brian E.

    2010-09-14

    Computer hardware fault administration carried out in a parallel computer, where the parallel computer includes a plurality of compute nodes. The compute nodes are coupled for data communications by at least two independent data communications networks, where each data communications network includes data communications links connected to the compute nodes. Typical embodiments carry out hardware fault administration by identifying a location of a defective link in the first data communications network of the parallel computer and routing communications data around the defective link through the second data communications network of the parallel computer.
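
    The routing idea in the abstract can be sketched as follows: if a route in the first network would traverse a link identified as defective, fall back to a route computed over the second, independent network. The tiny topologies below are invented for illustration and are not the patent's actual network structure.

```python
# Sketch of routing around a defective link: try the first network while
# avoiding the bad link, then fall back to the independent second network.
from collections import deque

def bfs_path(adjacency, src, dst, bad_links=frozenset()):
    """Breadth-first search for a path, skipping any edge listed in bad_links."""
    queue, seen = deque([[src]]), {src}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == dst:
            return path
        for nxt in adjacency.get(node, []):
            edge = frozenset((node, nxt))
            if nxt not in seen and edge not in bad_links:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

net_a = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}      # first data communications network
net_b = {0: [3], 3: [0, 2], 2: [3, 1], 1: [2]}      # independent second network
defective = {frozenset((1, 2))}                      # fault located in net A

route = bfs_path(net_a, 0, 3, defective) or bfs_path(net_b, 0, 3)
print("route:", route)
```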

  10. Computers and data processing

    CERN Document Server

    Deitel, Harvey M

    1985-01-01

    Computers and Data Processing provides information pertinent to the advances in the computer field. This book covers a variety of topics, including the computer hardware, computer programs or software, and computer applications systems.Organized into five parts encompassing 19 chapters, this book begins with an overview of some of the fundamental computing concepts. This text then explores the evolution of modern computing systems from the earliest mechanical calculating devices to microchips. Other chapters consider how computers present their results and explain the storage and retrieval of

  11. Computer jargon explained

    CERN Document Server

    Enticknap, Nicholas

    2014-01-01

    Computer Jargon Explained is a feature in Computer Weekly publications that discusses 68 of the most commonly used technical computing terms. The book explains what the terms mean and why the terms are important to computer professionals. The text also discusses how the terms relate to the trends and developments that are driving the information technology industry. Computer jargon irritates non-computer people and in turn causes problems for computer people. The technology and the industry are changing so rapidly; it is very hard even for professionals to keep updated. Computer people do not

  12. Cloud Computing Quality

    Directory of Open Access Journals (Sweden)

    Anamaria Şiclovan

    2013-02-01

    Full Text Available Cloud computing was, and will remain, a new way of providing Internet services and computing. This computing approach builds on many existing services, such as the Internet, grid computing and Web services. Cloud computing as a system aims to provide on-demand services that are more acceptable in terms of price and infrastructure. It is precisely the transition from the computer to a service offered to consumers as a product delivered online. This paper describes the quality of cloud computing services, analyzing the advantages and characteristics it offers. It is a theoretical paper. Keywords: Cloud computing, QoS, quality of cloud computing

  13. Joint Computing Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Raised Floor Computer Space for High Performance Computing The ERDC Information Technology Laboratory (ITL) provides a robust system of IT facilities to develop and...

  14. Opportunities and Needs for Mobile-Computing Technology to Support U.S. Geological Survey Fieldwork

    Science.gov (United States)

    Wood, Nathan J.; Halsing, David L.

    2006-01-01

    To assess the opportunities and needs for mobile-computing technology at the U.S. Geological Survey (USGS), we conducted an internal, Internet-based survey of bureau scientists whose research includes fieldwork. In summer 2005, 144 survey participants answered 65 questions about fieldwork activities and conditions, technology to support field research, and postfieldwork data processing and analysis. Results suggest that some types of mobile-computing technology are already commonplace, such as digital cameras and Global Positioning System (GPS) receivers, whereas others are not, such as personal digital assistants (PDAs) and tablet-based personal computers (tablet PCs). The potential for PDA use in the USGS is high: 97 percent of respondents record field observations (primarily environmental conditions and water-quality data), and 87 percent take field samples (primarily water-quality data, water samples, and sediment/soil samples). The potential for tablet PC use in the USGS is also high: 59 percent of respondents map environmental features in the field, primarily by sketching in field notebooks, on aerial photographs, or on topographic-map sheets. Results also suggest that efficient mobile-computing-technology solutions could benefit many USGS scientists because most respondents spend at least 1 week per year in the field, conduct field sessions that are least 1 week in duration, have field crews of one to three people, and typically travel on foot about 1 mi from their field vehicles. By allowing researchers to enter data directly into digital databases while in the field, mobile-computing technology could also minimize postfieldwork data processing: 93 percent of respondents enter collected field data into their office computers, and more than 50 percent spend at least 1 week per year on postfieldwork data processing. Reducing postfieldwork data processing could free up additional time for researchers and result in cost savings for the bureau. Generally

  15. Advances in unconventional computing

    CERN Document Server

    2017-01-01

    Unconventional computing is a niche for interdisciplinary science, a cross-breed of computer science, physics, mathematics, chemistry, electronic engineering, biology, material science and nanotechnology. The aims of this book are to uncover and exploit principles and mechanisms of information processing in, and functional properties of, physical, chemical and living systems, in order to develop efficient algorithms, design optimal architectures and manufacture working prototypes of future and emergent computing devices. This first volume presents the theoretical foundations of future and emergent computing paradigms and architectures. The topics covered are computability, (non-)universality and complexity of computation; physics of computation, analog and quantum computing; reversible and asynchronous devices; cellular automata and other mathematical machines; P-systems and cellular computing; infinity and spatial computation; chemical and reservoir computing. The book is the encyclopedia, the first ever complete autho...

  16. Development traumatic brain injury computer user interface for disaster area in Indonesia supported by emergency broadband access network.

    Science.gov (United States)

    Sutiono, Agung Budi; Suwa, Hirohiko; Ohta, Toshizumi; Arifin, Muh Zafrullah; Kitamura, Yohei; Yoshida, Kazunari; Merdika, Daduk; Qiantori, Andri; Iskandar

    2012-12-01

    Disasters have negative impacts on the environment and human life. One common cause of critical conditions is traumatic brain injury (TBI), namely epidural (EDH) and subdural hematoma (SDH), caused by falling hard objects during an earthquake. We proposed a TBI computer interface and analyzed the responses of its users, namely neurosurgeons, general doctors/surgeons and nurses. The communication system was supported by TBI web-based applications over an emergency broadband access network with a tethered balloon, simulated in a field trial to evaluate the coverage area. The interface consisted of demographic data and multiple tabs for anamnesis, treatment, follow-up and teleconference. It allows neurosurgeons, surgeons/general doctors and nurses to enter EDH and SDH patients' data while referring them during the emergency simulation, and it was evaluated based on the time needed and the users' understanding. With a Lenovo T500 notebook and a mouse, the average entry time was 8-10 min for neurosurgeons, 12-15 min for surgeons/general doctors and 15-19 min for nurses. With a ThinkPad X201 Tablet, it was 5-7 min for neurosurgeons, 7-10 min for surgeons/general doctors and 12-16 min for nurses. We observed that the difference depended on the computer type, the user's computer literacy and their understanding of traumatic brain injury, particularly for the nurses. In conclusion, a simple TBI GUI comprises five data classifications: 1) demographics, 2) specific anamnesis for EDH and SDH, 3) TBI treatment and medication, 4) follow-up data display and 5) teleneurosurgery for streaming video consultation. The tablet PC in particular was more convenient and faster for data entry than the notebook with mouse and touchpad. An emergency broadband access network using a tethered balloon can be employed to cover the communication systems in

  17. A new computational approach to cracks quantification from 2D image analysis: Application to micro-cracks description in rocks

    Science.gov (United States)

    Arena, Alessio; Delle Piane, Claudio; Sarout, Joel

    2014-05-01

    In this paper we propose a crack quantification method based on 2D image analysis. The technique is applied to gray-level Scanning Electron Microscope (SEM) images, segmented and converted into black-and-white (B/W) images using the Trainable Segmentation plugin of Fiji. The resulting images are processed by a novel Matlab script composed of three algorithms: a separation algorithm, a filtering and quantification algorithm, and an orientation algorithm. Initially the input image is enhanced via five morphological processes. The resulting lattice is “cut” into single cracks using 1-pixel-wide bisector lines originating from every node. Cracks are labeled using the connected-component method, and the script then computes geometrical parameters such as width, length, area, aspect ratio and orientation. Filtering is performed using a user-defined aspect-ratio threshold, followed by a statistical analysis of the remaining cracks. In the last part of the paper we discuss the efficiency of the script, presenting an example analysis of two datasets with different dimensions and resolutions; these analyses are performed on a notebook and on a high-end professional desktop, in order to simulate different working environments.
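
    The connected-component labeling and aspect-ratio filtering step described above can be sketched in a few lines of Python; the snippet below is a minimal illustration under assumed inputs (a tiny synthetic binary image and an arbitrary aspect-ratio threshold), not the authors' Matlab script.

        # Minimal sketch of connected-component labeling and aspect-ratio
        # filtering on a binary (B/W) crack image, loosely following the
        # workflow described above. Not the authors' Matlab script; the
        # synthetic image and the threshold are illustrative assumptions.
        import numpy as np
        from scipy import ndimage

        # Tiny synthetic binary image: 1 = crack pixel, 0 = background.
        image = np.array([
            [0, 1, 1, 1, 1, 0, 0],
            [0, 0, 0, 0, 0, 0, 0],
            [0, 0, 1, 0, 0, 0, 0],
            [0, 0, 1, 0, 0, 1, 1],
        ], dtype=int)

        # Label connected components (8-connectivity, common for crack maps).
        labels, n = ndimage.label(image, structure=np.ones((3, 3)))

        min_aspect_ratio = 2.0  # user-defined filter value (assumed here)
        kept = []
        for i, slc in enumerate(ndimage.find_objects(labels), start=1):
            height = slc[0].stop - slc[0].start
            width = slc[1].stop - slc[1].start
            length, thickness = max(height, width), min(height, width)  # bounding-box estimates
            aspect_ratio = length / thickness
            area = int((labels[slc] == i).sum())
            if aspect_ratio >= min_aspect_ratio:
                kept.append({"length": length, "width": thickness,
                             "area": area, "aspect_ratio": aspect_ratio})

        print(f"{n} components found, {len(kept)} kept after filtering")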

  18. Computer Viruses. Technology Update.

    Science.gov (United States)

    Ponder, Tim, Comp.; Ropog, Marty, Comp.; Keating, Joseph, Comp.

    This document provides general information on computer viruses, how to help protect a computer network from them, and measures to take if a computer becomes infected. Highlights include the origins of computer viruses; virus contraction; a description of some common virus types (File Virus, Boot Sector/Partition Table Viruses, Trojan Horses, and…

  19. Great Principles of Computing

    OpenAIRE

    Denning, Peter J.

    2008-01-01

    The Great Principles of Computing is a framework for understanding computing as a field of science. The website ... April 2008 (Rev. 8/31/08)

  20. The Computer Manpower Evolution

    Science.gov (United States)

    Rooney, Joseph J.

    1975-01-01

    Advances and employment outlook in the field of computer science are discussed as well as the problems related to improving the quality of computer education. Specific computer jobs discussed include: data processing machine repairers, systems analysts, programmers, computer and peripheral equipment operators, and keypunch operators. (EA)

  1. Elementary School Computer Literacy.

    Science.gov (United States)

    New York City Board of Education, Brooklyn, NY.

    This curriculum guide presents lessons for computer literacy instruction in the elementary grades. The first section of the guide includes 22 lessons on hardware, covering such topics as how computers work, keyboarding, word processing, and computer peripherals. The 13 lessons in the second section cover social topics related to the computer,…

  2. My Computer Romance

    Science.gov (United States)

    Campbell, Gardner

    2007-01-01

    In this article, the author relates the big role of computers in his life as a writer. He has been using a computer for nearly twenty years now and says that computers have set his writing free. When he started writing, he used only an electric typewriter. He also relates that his romance with computers is also a…

  3. Computability and unsolvability

    CERN Document Server

    Davis, Martin

    1985-01-01

    ""A clearly written, well-presented survey of an intriguing subject."" - Scientific American. Classic text considers general theory of computability, computable functions, operations on computable functions, Turing machines self-applied, unsolvable decision problems, applications of general theory, mathematical logic, Kleene hierarchy, computable functionals, classification of unsolvable decision problems and more.

  4. Students’ Choice for Computers

    Institute of Scientific and Technical Information of China (English)

    Cai; Wei

    2015-01-01

    Nowadays, computers are widely used as useful tools in our daily life, so you can see students using computers everywhere. The purpose of our survey is to find answers to the following questions: 1. What brand of computers do students often choose? 2. What do students consider the most important factor when choosing a computer? 3. What do students want to do with computers most? After that, we hope students will know what kind of computer they really need and which factors must be considered when buying one.

  5. Study on Parallel Computing

    Institute of Scientific and Technical Information of China (English)

    Guo-Liang Chen; Guang-Zhong Sun; Yun-Quan Zhang; Ze-Yao Mo

    2006-01-01

    In this paper, we present a general survey on parallel computing. The main contents include the parallel computer system, which is the hardware platform of parallel computing; parallel algorithms, which are its theoretical basis; and parallel programming, which is its software support. After that, we also introduce some parallel applications and enabling technologies. We argue that parallel computing research should form an integrated methodology of "architecture - algorithm - programming - application". Only in this way can parallel computing research develop continuously and remain realistic.

  6. Roadmap to greener computing

    CERN Document Server

    Nguemaleu, Raoul-Abelin Choumin

    2014-01-01

    A concise and accessible introduction to green computing and green IT, this book addresses how computer science and the computer infrastructure affect the environment and presents the main challenges in making computing more environmentally friendly. The authors review the methodologies, designs, frameworks, and software development tools that can be used in computer science to reduce energy consumption and still compute efficiently. They also focus on Computer Aided Design (CAD) and describe what design engineers and CAD software applications can do to support new streamlined business directi

  7. Computer mathematics for programmers

    CERN Document Server

    Abney, Darrell H; Sibrel, Donald W

    1985-01-01

    Computer Mathematics for Programmers presents the mathematics that is essential to the computer programmer. The book comprises 10 chapters. The first chapter introduces several computer number systems. Chapter 2 shows how to perform arithmetic operations using the number systems introduced in Chapter 1. The third chapter covers the way numbers are stored in computers, how the computer performs arithmetic on real numbers and integers, and how round-off errors are generated in computer programs. Chapter 4 details the use of algorithms and flowcharting as problem-solving tools for computer p
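
    Two of the chapter topics, computer number systems and round-off error, can be illustrated with a few lines of Python; the examples below are generic illustrations and are not taken from the book.

        # Generic illustrations of two topics the book covers (not taken from it):
        # converting between number systems, and the round-off error inherent in
        # binary floating-point arithmetic.

        # Number systems: binary 1010, octal 52 and hexadecimal 2A as decimal values.
        print(int("1010", 2), int("52", 8), int("2A", 16))   # -> 10 42 42

        # Converting back: decimal 42 rendered in binary, octal and hexadecimal.
        print(bin(42), oct(42), hex(42))                      # -> 0b101010 0o52 0x2a

        # Round-off error: 0.1 has no exact binary representation, so the
        # "obvious" equality fails and a tolerance-based comparison is needed.
        print(0.1 + 0.2 == 0.3)                               # -> False
        print(abs((0.1 + 0.2) - 0.3) < 1e-9)                  # -> True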

  8. Parallel computing works

    Energy Technology Data Exchange (ETDEWEB)

    1991-10-23

    An account of the Caltech Concurrent Computation Program (C³P), a five year project that focused on answering the question: "Can parallel computers be used to do large-scale scientific computations?" As the title indicates, the question is answered in the affirmative, by implementing numerous scientific applications on real parallel computers and doing computations that produced new scientific results. In the process of doing so, C³P helped design and build several new computers, designed and implemented basic system software, developed algorithms for frequently used mathematical computations on massively parallel machines, devised performance models and measured the performance of many computers, and created a high performance computing facility based exclusively on parallel computers. While the initial focus of C³P was the hypercube architecture developed by C. Seitz, many of the methods developed and lessons learned have been applied successfully on other massively parallel architectures.

  9. Computation in Classical Mechanics

    CERN Document Server

    Timberlake, Todd

    2007-01-01

    There is a growing consensus that physics majors need to learn computational skills, but many departments are still devoid of computation in their physics curriculum. Some departments may lack the resources or commitment to create a dedicated course or program in computational physics. One way around this difficulty is to include computation in a standard upper-level physics course. An intermediate classical mechanics course is particularly well suited for including computation. We discuss the ways we have used computation in our classical mechanics courses, focusing on how computational work can improve students' understanding of physics as well as their computational skills. We present examples of computational problems that serve these two purposes. In addition, we provide information about resources for instructors who would like to include computation in their courses.

  10. Research on Comparison of Cloud Computing and Grid Computing

    OpenAIRE

    Liu Yuxi; Wang Jianhua

    2012-01-01

    The development of the computer industry has been promoted by progress in distributed computing, parallel computing and grid computing, giving rise to the cloud computing movement. This study describes the types of cloud computing services and the similarities and differences between cloud computing and grid computing, discusses the respects in which cloud computing improves on grid computing, and reviews the common problems faced by both paradigms as well as some security issues.

  11. Cloud Computing (3)

    Institute of Scientific and Technical Information of China (English)

    Wang Bai; Xu Liutong

    2010-01-01

    Editor's Desk: In the preceding two parts of this series, several aspects of cloud computing-including definition, classification, characteristics, typical applications, and service levels-were discussed. This part continues with a discussion of Cloud Computing Open Architecture and the Market-Oriented Cloud. A comparison is made between cloud computing and other distributed computing technologies, and Google's cloud platform is analyzed to determine how distributed computing is implemented in its particular model.

  12. Distributed computing in bioinformatics.

    Science.gov (United States)

    Jain, Eric

    2002-01-01

    This paper provides an overview of methods and current applications of distributed computing in bioinformatics. Distributed computing is a strategy of dividing a large workload among multiple computers to reduce processing time, or to make use of resources such as programs and databases that are not available on all computers. Participating computers may be connected either through a local high-speed network or through the Internet.
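
    As a minimal illustration of the divide-the-workload strategy described above, the sketch below spreads a toy sequence-analysis task (GC content) over local worker processes with Python's multiprocessing module; a real distributed setup would farm the chunks out to other machines, and the sequences here are made-up examples.

        # Toy illustration of dividing a workload among multiple workers.
        # Real bioinformatics pipelines would distribute work across machines;
        # here the "work" is the GC content of a few made-up DNA sequences.
        from multiprocessing import Pool

        def gc_content(seq: str) -> float:
            """Fraction of G and C bases in a DNA sequence."""
            return (seq.count("G") + seq.count("C")) / len(seq)

        if __name__ == "__main__":
            sequences = ["ATGCGC", "AATT", "GGGCCC", "ATATGC"]   # toy workload
            with Pool(processes=4) as pool:
                results = pool.map(gc_content, sequences)        # divided among processes
            for seq, gc in zip(sequences, results):
                print(f"{seq}: GC fraction = {gc:.2f}")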

  13. The digital computer

    CERN Document Server

    Parton, K C

    2014-01-01

    The Digital Computer focuses on the principles, methodologies, and applications of the digital computer. The publication takes a look at the basic concepts involved in using a digital computer, simple autocode examples, and examples of working advanced design programs. Discussions focus on transformer design synthesis program, machine design analysis program, solution of standard quadratic equations, harmonic analysis, elementary wage calculation, and scientific calculations. The manuscript then examines commercial and automatic programming, how computers work, and the components of a computer

  14. Toward Cloud Computing Evolution

    OpenAIRE

    Susanto, Heru; Almunawar, Mohammad Nabil; Kang, Chen Chin

    2012-01-01

    Information Technology (IT) has shaped the success of organizations, giving them a solid foundation that increases both their efficiency and their productivity. The computing industry is witnessing a paradigm shift in the way computing is performed worldwide. There is a growing awareness among consumers and enterprises of accessing their IT resources extensively through a "utility" model known as "cloud computing." Cloud computing was initially rooted in distributed grid-based computing. ...

  15. Ion Trap Quantum Computing

    Science.gov (United States)

    2011-12-01

    In an inspiring speech at the MIT Physics of Computation 1st Conference in 1981, Feynman proposed the development of a computer that would obey the... on ion trap based quantum computing for physics and computer science students would include lecture notes, slides, lesson plans, a syllabus... reading lists, videos, demonstrations, and laboratories. LIST OF REFERENCES [1] R. P. Feynman, "Simulating physics with computers," Int. J

  16. Algorithmically specialized parallel computers

    CERN Document Server

    Snyder, Lawrence; Gannon, Dennis B

    1985-01-01

    Algorithmically Specialized Parallel Computers focuses on the concept and characteristics of an algorithmically specialized computer.This book discusses the algorithmically specialized computers, algorithmic specialization using VLSI, and innovative architectures. The architectures and algorithms for digital signal, speech, and image processing and specialized architectures for numerical computations are also elaborated. Other topics include the model for analyzing generalized inter-processor, pipelined architecture for search tree maintenance, and specialized computer organization for raster

  17. Heterogeneous Distributed Computing for Computational Aerosciences

    Science.gov (United States)

    Sunderam, Vaidy S.

    1998-01-01

    The research supported under this award focuses on heterogeneous distributed computing for high-performance applications, with particular emphasis on computational aerosciences. The overall goal of this project was to investigate issues in, and develop solutions to, the efficient execution of computational aeroscience codes in heterogeneous concurrent computing environments. In particular, we worked in the context of the PVM[1] system and, subsequent to detailed conversion efforts and performance benchmarking, devised novel techniques to increase the efficacy of heterogeneous networked environments for computational aerosciences. Our work has been based upon the NAS Parallel Benchmark suite, but has also recently expanded in scope to include the NAS I/O benchmarks as specified in the NHT-1 document. In this report we summarize our research accomplishments under the auspices of the grant.

  18. Understanding Student Computational Thinking with Computational Modeling

    CERN Document Server

    Aiken, John M; Douglas, Scott S; Burk, John B; Scanlon, Erin M; Thoms, Brian D; Schatz, Michael F

    2012-01-01

    Recently, the National Research Council's framework for next generation science standards highlighted "computational thinking" as one of its "fundamental practices". Students taking a physics course that employed the Arizona State University's Modeling Instruction curriculum were taught to construct computational models of physical systems. Student computational thinking was assessed using a proctored programming assignment, written essay, and a series of think-aloud interviews, where the students produced and discussed a computational model of a baseball in motion via a high-level programming environment (VPython). Roughly a third of the students in the study were successful in completing the programming assignment. Student success on this assessment was tied to how students synthesized their knowledge of physics and computation. On the essay and interview assessments, students displayed unique views of the relationship between force and motion; those who spoke of this relationship in causal (rather than obs...

  19. Cloud Computing for radiologists.

    Science.gov (United States)

    Kharat, Amit T; Safvi, Amjad; Thind, Ss; Singh, Amarjit

    2012-07-01

    Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources, such as computer software and hardware, on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as applications, client, infrastructure, storage, services, and processing power, Cloud computing can help imaging units rapidly scale and descale operations and avoid huge spending on maintenance of costly applications and storage. Cloud computing allows flexibility in imaging. It frees radiology from the confines of a hospital and creates a virtual mobile office. The downsides to Cloud computing involve security and privacy issues which need to be addressed to ensure the success of Cloud computing in the future.

  20. DNA based computers II

    CERN Document Server

    Landweber, Laura F; Baum, Eric B

    1998-01-01

    The fledgling field of DNA computers began in 1994 when Leonard Adleman surprised the scientific community by using DNA molecules, protein enzymes, and chemicals to solve an instance of a hard computational problem. This volume presents results from the second annual meeting on DNA computers held at Princeton only one and one-half years after Adleman's discovery. By drawing on the analogy between DNA computing and cutting-edge fields of biology (such as directed evolution), this volume highlights some of the exciting progress in the field and builds a strong foundation for the theory of molecular computation. DNA computing is a radically different approach to computing that brings together computer science and molecular biology in a way that is wholly distinct from other disciplines. This book outlines important advances in the field and offers comprehensive discussion on potential pitfalls and the general practicality of building DNA based computers.

  1. Duality quantum computing

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    In this article, we review the development of a newly proposed quantum computer, the duality computer (or duality quantum computer), and the duality mode of quantum computers. The duality computer is based on the particle-wave duality principle of quantum mechanics. Compared to an ordinary quantum computer, the duality quantum computer is a quantum computer on the move, passing through a multi-slit. It offers more computing operations than is possible with an ordinary quantum computer. The two most distinctive operations are the quantum division operation and the quantum combiner operation. The division operation divides the wave function of a quantum computer into many attenuated, identical parts. The combiner operation combines the wave functions in different parts into a single part. The duality mode is a way in which a quantum computer with some extra qubit resources simulates a duality computer. The main structure of the duality quantum computer and the duality mode, their mathematical description, and algorithm designs are reviewed.

  2. Computers for imagemaking

    CERN Document Server

    Clark, D

    1981-01-01

    Computers for Image-Making tells the computer non-expert all he needs to know about Computer Animation. In the hands of expert computer engineers, computer picture-drawing systems have, since the earliest days of computing, produced interesting and useful images. As a result of major technological developments since then, it no longer requires the expert's skill to draw pictures; anyone can do it, provided they know how to use the appropriate machinery. This collection of specially commissioned articles reflects the diversity of user applications in this expanding field

  3. Language and Computers

    CERN Document Server

    Dickinson, Markus; Meurers, Detmar

    2012-01-01

    Language and Computers introduces students to the fundamentals of how computers are used to represent, process, and organize textual and spoken information. Concepts are grounded in real-world examples familiar to students’ experiences of using language and computers in everyday life. A real-world introduction to the fundamentals of how computers process language, written specifically for the undergraduate audience, introducing key concepts from computational linguistics. Offers a comprehensive explanation of the problems computers face in handling natural language. Covers a broad spectru

  4. Computer techniques for electromagnetics

    CERN Document Server

    Mittra, R

    1973-01-01

    Computer Techniques for Electromagnetics discusses the ways in which computer techniques solve practical problems in electromagnetics. It discusses the impact of the emergence of high-speed computers in the study of electromagnetics. This text provides a brief background on the approaches used by mathematical analysts in solving integral equations. It also demonstrates how to use computer techniques in computing current distribution, radar scattering, and waveguide discontinuities, and inverse scattering. This book will be useful for students looking for a comprehensive text on computer techni

  5. Polymorphous computing fabric

    Science.gov (United States)

    Wolinski, Christophe Czeslaw; Gokhale, Maya B.; McCabe, Kevin Peter

    2011-01-18

    Fabric-based computing systems and methods are disclosed. A fabric-based computing system can include a polymorphous computing fabric that can be customized on a per application basis and a host processor in communication with said polymorphous computing fabric. The polymorphous computing fabric includes a cellular architecture that can be highly parameterized to enable a customized synthesis of fabric instances for a variety of enhanced application performances thereof. A global memory concept can also be included that provides the host processor random access to all variables and instructions associated with the polymorphous computing fabric.

  6. Explorations in quantum computing

    CERN Document Server

    Williams, Colin P

    2011-01-01

    By the year 2020, the basic memory components of a computer will be the size of individual atoms. At such scales, the current theory of computation will become invalid. "Quantum computing" is reinventing the foundations of computer science and information theory in a way that is consistent with quantum physics - the most accurate model of reality currently known. Remarkably, this theory predicts that quantum computers can perform certain tasks breathtakingly faster than classical computers -- and, better yet, can accomplish mind-boggling feats such as teleporting information, breaking suppos

  7. Reversible computing fundamentals, quantum computing, and applications

    CERN Document Server

    De Vos, Alexis

    2010-01-01

    Written by one of the few top internationally recognized experts in the field, this book concentrates on those topics that will remain fundamental, such as low power computing, reversible programming languages, and applications in thermodynamics. It describes reversible computing from various points of view: Boolean algebra, group theory, logic circuits, low-power electronics, communication, software, quantum computing. It is this multidisciplinary approach that makes it unique.Backed by numerous examples, this is useful for all levels of the scientific and academic community, from undergr

  8. Computing networks from cluster to cloud computing

    CERN Document Server

    Vicat-Blanc, Pascale; Guillier, Romaric; Soudan, Sebastien

    2013-01-01

    "Computing Networks" explores the core of the new distributed computing infrastructures we are using today:  the networking systems of clusters, grids and clouds. It helps network designers and distributed-application developers and users to better understand the technologies, specificities, constraints and benefits of these different infrastructures' communication systems. Cloud Computing will give the possibility for millions of users to process data anytime, anywhere, while being eco-friendly. In order to deliver this emerging traffic in a timely, cost-efficient, energy-efficient, and

  9. Computer Intrusions and Attacks.

    Science.gov (United States)

    Falk, Howard

    1999-01-01

    Examines some frequently encountered unsolicited computer intrusions, including computer viruses, worms, Java applications, trojan horses or vandals, e-mail spamming, hoaxes, and cookies. Also discusses virus-protection software, both for networks and for individual users. (LRW)

  10. The Global Computer

    DEFF Research Database (Denmark)

    Sharp, Robin

    2002-01-01

    This paper describes a Danish project, involving partners from Copenhagen University, DTU, the University of Southern Denmark, Aalborg University, Copenhagen Business School and UNI-C, for exploiting Grid technology to provide computer resources for applications with very large computational...

  11. Optical Quantum Computing

    National Research Council Canada - National Science Library

    Jeremy L. O'Brien

    2007-01-01

    In 2001, all-optical quantum computing became feasible with the discovery that scalable quantum computing is possible using only single-photon sources, linear optical elements, and single-photon detectors...

  12. Computed Tomography (CT) -- Sinuses

    Medline Plus

    Full Text Available ... What is CT (Computed Tomography) of the Sinuses? Computed tomography, more commonly known as a ... What are some common uses of the procedure? CT of the sinuses is primarily used ...

  13. Applications of computer algebra

    CERN Document Server

    1985-01-01

    Today, certain computer software systems exist which surpass the computational ability of researchers when their mathematical techniques are applied to many areas of science and engineering. These computer systems can perform a large portion of the calculations seen in mathematical analysis. Despite this massive power, thousands of people use these systems as a routine resource for everyday calculations. These software programs are commonly called "Computer Algebra" systems. They have names such as MACSYMA, MAPLE, muMATH, REDUCE and SMP. They are receiving credit as a computational aid with increasing regularity in articles in the scientific and engineering literature. When most people think about computers and scientific research these days, they imagine a machine grinding away, processing numbers arithmetically. It is not generally realized that, for a number of years, computers have been performing non-numeric computations. This means, for example, that one inputs an equation and obtains a closed for...
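
    The closing idea, that one inputs an equation and obtains a closed-form result, is easy to demonstrate with a modern open-source system; the SymPy snippet below is a generic example, not output from the systems named above.

        # Generic "input an equation, obtain a closed-form result" example using
        # the open-source SymPy library (not one of the systems named above).
        import sympy as sp

        x = sp.symbols("x")

        # Solve a quadratic equation symbolically.
        print(sp.solve(sp.Eq(x**2 - 5*x + 6, 0), x))    # -> [2, 3]

        # Closed-form integral and derivative of sin(x)*exp(x).
        print(sp.integrate(sp.sin(x) * sp.exp(x), x))   # -> exp(x)*sin(x)/2 - exp(x)*cos(x)/2
        print(sp.diff(sp.sin(x) * sp.exp(x), x))        # -> exp(x)*sin(x) + exp(x)*cos(x)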

  14. Cognitive Computing for Security.

    Energy Technology Data Exchange (ETDEWEB)

    Debenedictis, Erik [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Rothganger, Fredrick [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Aimone, James Bradley [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Marinella, Matthew [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Evans, Brian Robert [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Warrender, Christina E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Mickel, Patrick [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-12-01

    Final report for Cognitive Computing for Security LDRD 165613. It reports on the development of a hybrid general-purpose/neuromorphic computer architecture, with an emphasis on potential implementation with memristors.

  15. Computed Tomography (CT) -- Sinuses

    Medline Plus

    Full Text Available ... the limitations of CT of the Sinuses? What is CT (Computed Tomography) of the Sinuses? Computed tomography, ... the body being studied. How is the procedure performed? The technologist begins by positioning ...

  16. Applying Computational Intelligence

    CERN Document Server

    Kordon, Arthur

    2010-01-01

    Offers guidelines on creating value from the application of computational intelligence methods. This work introduces a methodology for effective real-world application of computational intelligence while minimizing development cost, and outlines the critical, underestimated technology marketing efforts required

  17. ICASE Computer Science Program

    Science.gov (United States)

    1985-01-01

    The Institute for Computer Applications in Science and Engineering computer science program is discussed in outline form. Information is given on such topics as problem decomposition, algorithm development, programming languages, and parallel architectures.

  18. Computational Science Facility (CSF)

    Data.gov (United States)

    Federal Laboratory Consortium — PNNL Institutional Computing (PIC) is focused on meeting DOE's mission needs and is part of PNNL's overarching research computing strategy. PIC supports large-scale...

  19. Computational Continuum Mechanics

    CERN Document Server

    Shabana, Ahmed A

    2011-01-01

    This text presents the theory of continuum mechanics using computational methods. Ideal for students and researchers, the second edition features a new chapter on computational geometry and finite element analysis.

  20. Computed Tomography (CT) -- Sinuses

    Medline Plus

    Full Text Available ... images. These images can be viewed on a computer monitor, printed on film or transferred to a ... other in a ring, called a gantry. The computer workstation that processes the imaging information is located ...

  1. Computed Tomography (CT) -- Head

    Medline Plus

    Full Text Available ... images. These images can be viewed on a computer monitor, printed on film or transferred to a ... other in a ring, called a gantry. The computer workstation that processes the imaging information is located ...

  2. The Global Computer

    DEFF Research Database (Denmark)

    Sharp, Robin

    2002-01-01

    This paper describes a Danish project, involving partners from Copenhagen University, DTU, the University of Southern Denmark, Aalborg University, Copenhagen Business School and UNI-C, for exploiting Grid technology to provide computer resources for applications with very large computational...

  3. Computed Tomography (CT) -- Head

    Medline Plus

    Full Text Available ... images. These images can be viewed on a computer monitor, printed on film or transferred to a ... other in a ring, called a gantry. The computer workstation that processes the imaging information is located ...

  4. Computed Tomography (CT) -- Sinuses

    Medline Plus

    Full Text Available ... images. These images can be viewed on a computer monitor, printed on film or transferred to a ... other in a ring, called a gantry. The computer workstation that processes the imaging information is located ...

  5. Computational Science Facility (CSF)

    Data.gov (United States)

    Federal Laboratory Consortium — PNNL Institutional Computing (PIC) is focused on meeting DOE's mission needs and is part of PNNL's overarching research computing strategy. PIC supports large-scale...

  6. Cloud Computing Quality

    Directory of Open Access Journals (Sweden)

    Anamaria Şiclovan

    2013-02-01

    Full Text Available

    Cloud computing was, and will continue to be, a new way of providing Internet services and computing. This approach builds on many existing technologies, such as the Internet, grid computing and Web services. Cloud computing as a system aims to provide on-demand services that are more acceptable in terms of price and infrastructure. It is precisely the transition from the computer as a product to a service delivered online to consumers. This paper describes the quality of cloud computing services, analyzing the advantages and characteristics it offers. It is a theoretical paper.

    Keywords: Cloud computing, QoS, quality of cloud computing

  7. Intelligent Computer Graphics 2012

    CERN Document Server

    Miaoulis, Georgios

    2013-01-01

    In Computer Graphics, the use of intelligent techniques started more recently than in other research areas. However, during the last two decades, the use of intelligent Computer Graphics techniques has grown year after year and more and more interesting techniques have been presented in this area. The purpose of this volume is to present current work of the Intelligent Computer Graphics community, a community growing year after year. This volume is a kind of continuation of the previously published Springer volumes “Artificial Intelligence Techniques for Computer Graphics” (2008), “Intelligent Computer Graphics 2009” (2009), “Intelligent Computer Graphics 2010” (2010) and “Intelligent Computer Graphics 2011” (2011). Usually, this kind of volume contains, every year, selected extended papers from the corresponding 3IA Conference of the year. However, the current volume is made from directly reviewed and selected papers, submitted for publication in the volume “Intelligent Computer Gr...

  8. Joint Computing Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Raised Floor Computer Space for High Performance Computing. The ERDC Information Technology Laboratory (ITL) provides a robust system of IT facilities to develop and...

  9. Computed Tomography (CT) -- Sinuses

    Medline Plus

    Full Text Available ... What is CT (Computed Tomography) of the Sinuses? Computed tomography, more commonly known as a ... What are some common uses of the procedure? CT of the sinuses is primarily used ...

  10. Computer Vision Syndrome.

    Science.gov (United States)

    Randolph, Susan A

    2017-07-01

    With the increased use of electronic devices with visual displays, computer vision syndrome is becoming a major public health issue. Improving the visual status of workers using computers results in greater productivity in the workplace and improved visual comfort.

  11. Book Review: Computational Topology

    DEFF Research Database (Denmark)

    Raussen, Martin

    2011-01-01

    Computational Topology by Herbert Edelsbrunner and John L. Harer. American Mathematical Society, 2010 - ISBN 978-0-8218-4925-5

  12. COMPUTATIONAL SCIENCE CENTER

    Energy Technology Data Exchange (ETDEWEB)

    DAVENPORT,J.

    2004-11-01

    The Brookhaven Computational Science Center brings together researchers in biology, chemistry, physics, and medicine with applied mathematicians and computer scientists to exploit the remarkable opportunities for scientific discovery which have been enabled by modern computers. These opportunities are especially great in computational biology and nanoscience, but extend throughout science and technology and include for example, nuclear and high energy physics, astrophysics, materials and chemical science, sustainable energy, environment, and homeland security.

  13. Computer system identification

    OpenAIRE

    Lesjak, Borut

    2008-01-01

    The concept of computer system identity in computer science bears just as much importance as does the identity of an individual in a human society. Nevertheless, the identity of a computer system is incomparably harder to determine, because there is no standard system of identification we could use and, moreover, a computer system during its life-time is quite indefinite, since all of its regular and necessary hardware and software upgrades soon make it almost unrecognizable: after a number o...

  14. Quantum computing and probability.

    Science.gov (United States)

    Ferry, David K

    2009-11-25

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.

  15. Introduction to Quantum Computation

    Science.gov (United States)

    Ekert, A.

    A computation is a physical process. It may be performed by a piece of electronics or on an abacus, or in your brain, but it is a process that takes place in nature and as such it is subject to the laws of physics. Quantum computers are machines that rely on characteristically quantum phenomena, such as quantum interference and quantum entanglement in order to perform computation. In this series of lectures I want to elaborate on the computational power of such machines.

  16. Computational intelligence in optimization

    CERN Document Server

    Tenne, Yoel

    2010-01-01

    This volume presents a collection of recent studies covering the spectrum of computational intelligence applications with emphasis on their application to challenging real-world problems. Topics covered include: Intelligent agent-based algorithms, Hybrid intelligent systems, Cognitive and evolutionary robotics, Knowledge-Based Engineering, fuzzy sets and systems, Bioinformatics and Bioengineering, Computational finance and Computational economics, Data mining, Machine learning, and Expert systems. "Computational Intelligence in Optimization" is a comprehensive reference for researchers, prac

  17. Computational physics an introduction

    CERN Document Server

    Vesely, Franz J

    1994-01-01

    Author Franz J. Vesely offers students an introductory text on computational physics, providing them with the important basic numerical/computational techniques. His unique text sets itself apart from others by focusing on specific problems of computational physics. The author also provides a selection of modern fields of research. Students will benefit from the appendixes which offer a short description of some properties of computing and machines and outline the technique of 'Fast Fourier Transformation.'

  18. Applications of membrane computing

    CERN Document Server

    Ciobanu, Gabriel; Păun, Gheorghe

    2006-01-01

    Membrane computing is a branch of natural computing which investigates computing models abstracted from the structure and functioning of living cells and from their interactions in tissues or higher-order biological structures. The models considered, called membrane systems (P systems), are parallel, distributed computing models, processing multisets of symbols in cell-like compartmental architectures. In many applications membrane systems have considerable advantages - among these are their inherently discrete nature, parallelism, transparency, scalability and nondeterminism. In dedicated cha

  19. Biomolecular computation for bionanotechnology

    CERN Document Server

    Liu, Jian-Qin

    2006-01-01

    Computers built with moleware? The drive toward non-silicon computing is underway, and this first-of-its-kind guide to molecular computation gives researchers a firm grasp of the technologies, biochemical details, and theoretical models at the cutting edge. It explores advances in molecular biology and nanotechnology and illuminates how the convergence of various technologies is propelling computational capacity beyond the limitations of traditional hardware technology and into the realm of moleware.

  20. Computably regular topological spaces

    OpenAIRE

    Weihrauch, Klaus

    2013-01-01

    This article continues the study of computable elementary topology started by the author and T. Grubba in 2009 and extends the author's 2010 study of axioms of computable separation. Several computable T3- and Tychonoff separation axioms are introduced and their logical relation is investigated. A number of implications between these axioms are proved and several implications are excluded by counter examples, however, many questions have not yet been answered. Known results on computable metr...

  1. Mobile collaborative cloudless computing

    OpenAIRE

    Cruz, Nuno Miguel Machado, 1978-

    2015-01-01

    Doctoral thesis, Informatics (Informatics Engineering), Universidade de Lisboa, Faculdade de Ciências, 2015. Although the computational power of mobile devices has been increasing, it is still not enough for some classes of applications. At present, these applications delegate the computational burden to servers located on the Internet. This model assumes always-on Internet connectivity and implies non-negligible latency. Cloud computing is an innovative computing paradigm wh...

  2. Approximation and Computation

    CERN Document Server

    Gautschi, Walter; Rassias, Themistocles M

    2011-01-01

    Approximation theory and numerical analysis are central to the creation of accurate computer simulations and mathematical models. Research in these areas can influence the computational techniques used in a variety of mathematical and computational sciences. This collection of contributed chapters, dedicated to renowned mathematician Gradimir V. Milovanović, represents the recent work of experts in the fields of approximation theory and numerical analysis. These invited contributions describe new trends in these important areas of research including theoretic developments, new computational alg

  3. Space Spurred Computer Graphics

    Science.gov (United States)

    1983-01-01

    Dicomed Corporation was asked by NASA in the early 1970s to develop processing capabilities for recording images sent from Mars by Viking spacecraft. The company produced a film recorder which increased the intensity levels and the capability for color recording. This development led to a strong technology base resulting in sophisticated computer graphics equipment. Dicomed systems are used to record CAD (computer aided design) and CAM (computer aided manufacturing) equipment, to update maps and produce computer generated animation.

  4. People Shaping Educational Computing.

    Science.gov (United States)

    Blair, Marjorie; Lobello, Sharon

    1984-01-01

    Discusses contributions to educational computing of Seymour Papert, LOGO creator; Irwin Hoffman, first school-based computer education program developer; Dorothy Deringer, National Science Foundation's monitor and supporter of educational computing projects; Sherwin Steffin, educational software company vice-president; and Jessie Muse, National…

  5. Advances in physiological computing

    CERN Document Server

    Fairclough, Stephen H

    2014-01-01

    This edited collection will provide an overview of the field of physiological computing, i.e. the use of physiological signals as input for computer control. It will cover a breadth of current research, from brain-computer interfaces to telemedicine.

  6. Women and Computer Science.

    Science.gov (United States)

    Breene, L. Anne

    1992-01-01

    Discusses issues concerning women in computer science education and in the workplace, and sex bias in the computer science curriculum. Concludes that the computing environment has not improved for women over the last 20 years. Warns that, although the number of white males entering college is declining, the need for scientists and engineers is not. (NB)

  7. Computers at the Crossroads.

    Science.gov (United States)

    Ediger, Marlow

    1988-01-01

    Discusses reasons for the lack of computer and software use in the classroom, especially on the elementary level. Highlights include deficiencies in available software, including lack of interaction and type of feedback; philosophies of computer use; the psychology of learning and computer use; and suggestions for developing quality software. (4…

  8. Computer Training at Harwell

    Science.gov (United States)

    Hull, John

    1969-01-01

    By using teletypewriters connected to the Harwell multi-access computing system, lecturers can easily demonstrate the operation of the computer in the classroom; this saves time and eliminates errors and staff can carry out exercises using the main computer. (EB)

  9. Designing with computational intelligence

    CERN Document Server

    Lopes, Heitor; Mourelle, Luiza

    2017-01-01

    This book discusses a number of real-world applications of computational intelligence approaches. Using various examples, it demonstrates that computational intelligence has become a consolidated methodology for automatically creating new competitive solutions to complex real-world problems. It also presents a concise and efficient synthesis of different systems using computationally intelligent techniques.

  10. Computer applications in bioprocessing.

    Science.gov (United States)

    Bungay, H R

    2000-01-01

    Biotechnologists have stayed at the forefront for practical applications for computing. As hardware and software for computing have evolved, the latest advances have found eager users in the area of bioprocessing. Accomplishments and their significance can be appreciated by tracing the history and the interplay between the computing tools and the problems that have been solved in bioprocessing.

  11. Computing environment logbook

    Science.gov (United States)

    Osbourn, Gordon C; Bouchard, Ann M

    2012-09-18

    A computing environment logbook logs events occurring within a computing environment. The events are displayed as a history of past events within the logbook of the computing environment. The logbook provides search functionality to search through the history of past events to find one or more selected past events, and further, enables an undo of the one or more selected past events.
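
    A minimal sketch of the behavior described above (record events, search the history, undo selected events) is given below; the class and method names are illustrative assumptions, not the patented implementation.

        # Minimal sketch of a computing-environment logbook: record events,
        # search the history, and undo selected past events. Names are assumptions.
        from dataclasses import dataclass, field
        from typing import Callable, List

        @dataclass
        class Event:
            description: str
            undo_action: Callable[[], None]   # callback that reverses the event
            undone: bool = False

        @dataclass
        class EnvironmentLogbook:
            history: List[Event] = field(default_factory=list)

            def log(self, description: str, undo_action: Callable[[], None]) -> None:
                self.history.append(Event(description, undo_action))

            def search(self, keyword: str) -> List[Event]:
                return [e for e in self.history if keyword.lower() in e.description.lower()]

            def undo(self, events: List[Event]) -> None:
                for event in events:
                    if not event.undone:
                        event.undo_action()
                        event.undone = True

        # Usage: log two events, search for one, undo it.
        settings = {"theme": "dark"}
        book = EnvironmentLogbook()
        book.log("set theme to dark", undo_action=lambda: settings.update(theme="light"))
        book.log("opened project X", undo_action=lambda: None)
        book.undo(book.search("theme"))
        print(settings)   # -> {'theme': 'light'}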

  12. Education for Computers

    Science.gov (United States)

    Heslep, Robert D.

    2012-01-01

    The computer engineers who refer to the education of computers do not have a definite idea of education and do not bother to justify the fuzzy ones to which they allude. Hence, they logically cannot specify the features a computer must have in order to be educable. This paper puts forth a non-standard, but not arbitrary, concept of education that…

  13. Computer-assisted instruction

    NARCIS (Netherlands)

    Voogt, J.; Fisser, P.; Wright, J.D.

    2015-01-01

    Since the early days of computer technology in education in the 1960s, it was claimed that computers can assist instructional practice and hence improve student learning. Since then computer technology has developed, and its potential for education has increased. In this article, we first discuss th

  14. Computational Thinking Patterns

    Science.gov (United States)

    Ioannidou, Andri; Bennett, Vicki; Repenning, Alexander; Koh, Kyu Han; Basawapatna, Ashok

    2011-01-01

    The iDREAMS project aims to reinvent Computer Science education in K-12 schools, by using game design and computational science for motivating and educating students through an approach we call Scalable Game Design, starting at the middle school level. In this paper we discuss the use of Computational Thinking Patterns as the basis for our…

  15. Ethics and Computer Scientists.

    Science.gov (United States)

    Pulliam, Sylvia Clark

    The purpose of this study was to explore the perceptions that computer science educators have about computer ethics. The study focused on four areas: (1) the extent to which computer science educators believe that ethically inappropriate practices are taking place (both on campus and throughout society); (2) perceptions of such educators about…

  16. Deductive Computer Programming. Revision

    Science.gov (United States)

    1989-09-30

    Lecture Notes in Computer Science 354...automata", In Temporal Logic in Specification, Lecture Notes in Computer Science 398, Springer-Verlag, 1989, pp. 124-164. *[MP4] Z. Manna and A. Pnueli... Notes in Computer Science 372, Springer-Verlag, 1989, pp. 534-558. CONTRIBUTION TO BOOKS [MP5] Z. Manna and A. Pnueli, "An exercise in the

  17. The Next Computer Revolution.

    Science.gov (United States)

    Peled, Abraham

    1987-01-01

    Discusses some of the future trends in the use of the computer in our society, suggesting that computing is now entering a new phase in which it will grow exponentially more powerful, flexible, and sophisticated in the next decade. Describes some of the latest breakthroughs in computer hardware and software technology. (TW)

  18. Computational Social Creativity.

    Science.gov (United States)

    Saunders, Rob; Bown, Oliver

    2015-01-01

    This article reviews the development of computational models of creativity where social interactions are central. We refer to this area as computational social creativity. Its context is described, including the broader study of creativity, the computational modeling of other social phenomena, and computational models of individual creativity. Computational modeling has been applied to a number of areas of social creativity and has the potential to contribute to our understanding of creativity. A number of requirements for computational models of social creativity are common in artificial life and computational social science simulations. Three key themes are identified: (1) computational social creativity research has a critical role to play in understanding creativity as a social phenomenon and advancing computational creativity by making clear epistemological contributions in ways that would be challenging for other approaches; (2) the methodologies developed in artificial life and computational social science carry over directly to computational social creativity; and (3) the combination of computational social creativity with individual models of creativity presents significant opportunities and poses interesting challenges for the development of integrated models of creativity that have yet to be realized.

  19. Mixing Computations and Proofs

    Directory of Open Access Journals (Sweden)

    Michael Beeson

    2016-01-01

    Full Text Available We examine the relationship between proof and computation in mathematics, especially in formalized mathematics. We compare the various approaches to proofs with a significant computational component, including (i) verifying the algorithms, (ii) verifying the results of the unverified algorithms, and (iii) trusting an external computation.

  20. How Computers Work: Computational Thinking for Everyone

    Directory of Open Access Journals (Sweden)

    Rex Page

    2013-01-01

    Full Text Available What would you teach if you had only one course to help students grasp the essence of computation and perhaps inspire a few of them to make computing a subject of further study? Assume they have the standard college prep background. This would include basic algebra, but not necessarily more advanced mathematics. They would have written a few term papers, but would not have written computer programs. They could surf and twitter, but could not exclusive-or and nand. What about computers would interest them or help them place their experience in context? This paper provides one possible answer to this question by discussing a course that has completed its second iteration. Grounded in classical logic, elucidated in digital circuits and computer software, it expands into areas such as CPU components and massive databases. The course has succeeded in garnering the enthusiastic attention of students with a broad range of interests, exercising their problem solving skills, and introducing them to computational thinking.
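
    The remark that students "could not exclusive-or and nand" points at the gate-level ideas such a course builds on; the sketch below is a generic illustration (not course material) of NAND as a universal gate from which XOR can be constructed.

        # Generic illustration (not course material): NAND is a universal gate,
        # so exclusive-or can be built from four NANDs.
        def nand(a: int, b: int) -> int:
            return 0 if (a and b) else 1

        def xor(a: int, b: int) -> int:
            # Classic four-NAND construction of exclusive-or.
            n1 = nand(a, b)
            return nand(nand(a, n1), nand(b, n1))

        for a in (0, 1):
            for b in (0, 1):
                print(f"{a} XOR {b} = {xor(a, b)}")
        # Prints the truth table 0, 1, 1, 0, matching exclusive-or.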

  1. The science of computing - Parallel computation

    Science.gov (United States)

    Denning, P. J.

    1985-01-01

    Although parallel computation architectures have been known for computers since the 1920s, it was only in the 1970s that microelectronic components technologies advanced to the point where it became feasible to incorporate multiple processors in one machine. Concomitantly, the development of algorithms for parallel processing also lagged due to hardware limitations. The speed of computing with solid-state chips is limited by gate switching delays. The physical limit implies that a 1 Gflop operational speed is the maximum for sequential processors. A computer recently introduced features a 'hypercube' architecture with 128 processors connected in networks at 5, 6 or 7 points per grid, depending on the design choice. Its computing speed rivals that of supercomputers, but at a fraction of the cost. The added speed with less hardware is due to parallel processing, which utilizes algorithms representing different parts of an equation that can be broken into simpler statements and processed simultaneously. Present, highly developed computer languages like FORTRAN, PASCAL, COBOL, etc., rely on sequential instructions. Thus, increased emphasis will now be directed at parallel processing algorithms to exploit the new architectures.
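
    The core idea, splitting a computation into independent pieces that run simultaneously and then combining the partial results, can be sketched briefly; the example below is an illustration written for this summary, not code from the article.

        # Illustration of splitting one computation into chunks evaluated in
        # parallel and then combined (not code from the article).
        from concurrent.futures import ProcessPoolExecutor

        def partial_sum(bounds):
            lo, hi = bounds
            return sum(range(lo, hi))

        if __name__ == "__main__":
            n, workers = 10_000_000, 4
            step = n // workers
            chunks = [(i * step, n if i == workers - 1 else (i + 1) * step)
                      for i in range(workers)]
            with ProcessPoolExecutor(max_workers=workers) as pool:
                total = sum(pool.map(partial_sum, chunks))
            print(total == sum(range(n)))   # -> True: matches the sequential sum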

  2. Neural Computation and the Computational Theory of Cognition

    Science.gov (United States)

    Piccinini, Gualtiero; Bahar, Sonya

    2013-01-01

    We begin by distinguishing computationalism from a number of other theses that are sometimes conflated with it. We also distinguish between several important kinds of computation: computation in a generic sense, digital computation, and analog computation. Then, we defend a weak version of computationalism--neural processes are computations in the…

  3. Scalable distributed computing hierarchy: cloud, fog and dew computing

    OpenAIRE

    Skala, Karolj; Davidović, Davor; Afgan, Enis; Sović, Ivan; Šojat, Zorislav

    2015-01-01

    The paper considers a conceptual approach for organizing the vertical hierarchical links between the scalable distributed computing paradigms: Cloud Computing, Fog Computing and Dew Computing. Dew Computing is described and recognized as a new structural layer in the existing distributed computing hierarchy, where it is positioned as the ground level for the Cloud and Fog computing paradigms. Vertical, complementary, hier...

  4. Computer algebra and operators

    Science.gov (United States)

    Fateman, Richard; Grossman, Robert

    1989-01-01

    The symbolic computation of operator expansions is discussed. Some of the capabilities that prove useful when performing computer algebra computations involving operators are considered. These capabilities may be broadly divided into three areas: the algebraic manipulation of expressions from the algebra generated by operators; the algebraic manipulation of the actions of the operators upon other mathematical objects; and the development of appropriate normal forms and simplification algorithms for operators and their actions. Brief descriptions are given of the computer algebra computations that arise when working with various operators and their actions.
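
    The first capability listed, manipulating expressions in an algebra generated by operators, can be illustrated with noncommutative symbols in SymPy; this is a generic sketch of the idea, not the system the authors describe.

        # Sketch: expanding expressions over noncommuting operators, where
        # the order of factors must be preserved (A*B is not B*A).
        from sympy import symbols, expand

        A, B = symbols("A B", commutative=False)

        print(expand((A + B)**2))   # A**2 + A*B + B*A + B**2
        print(expand(A*B - B*A))    # the commutator is not simplified to zero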

  5. Computer Security Handbook

    CERN Document Server

    Bosworth, Seymour; Whyne, Eric

    2012-01-01

    The classic and authoritative reference in the field of computer security, now completely updated and revised With the continued presence of large-scale computers; the proliferation of desktop, laptop, and handheld computers; and the vast international networks that interconnect them, the nature and extent of threats to computer security have grown enormously. Now in its fifth edition, Computer Security Handbook continues to provide authoritative guidance to identify and to eliminate these threats where possible, as well as to lessen any losses attributable to them. With seventy-seven chapter

  6. Perspectives in Computation

    CERN Document Server

    Geroch, Robert

    2009-01-01

    Computation is the process of applying a procedure or algorithm to the solution of a mathematical problem. Mathematicians and physicists have been occupied for many decades pondering which problems can be solved by which procedures, and, for those that can be solved, how this can most efficiently be done. In recent years, quantum mechanics has augmented our understanding of the process of computation and of its limitations. Perspectives in Computation covers three broad topics: the computation process and its limitations, the search for computational efficiency, and the role of quantum mechani

  7. Rough-Granular Computing

    Institute of Scientific and Technical Information of China (English)

    Andrzej Skowron

    2006-01-01

    Solving complex problems by multi-agent systems in distributed environments requires new approximate reasoning methods based on new computing paradigms. One such recently emerging computing paradigm is Granular Computing (GC). We discuss the Rough-Granular Computing (RGC) approach to modeling of computations in complex adaptive systems and multi-agent systems, as well as for approximate reasoning about the behavior of such systems. The RGC methods have been successfully applied for solving complex problems in areas such as identification of objects or behavioral patterns by autonomous systems, web mining, and sensor fusion.

  8. COMPUTATIONAL SCIENCE CENTER

    Energy Technology Data Exchange (ETDEWEB)

    DAVENPORT, J.

    2005-11-01

    The Brookhaven Computational Science Center brings together researchers in biology, chemistry, physics, and medicine with applied mathematicians and computer scientists to exploit the remarkable opportunities for scientific discovery which have been enabled by modern computers. These opportunities are especially great in computational biology and nanoscience, but extend throughout science and technology and include, for example, nuclear and high energy physics, astrophysics, materials and chemical science, sustainable energy, environment, and homeland security. To achieve our goals we have established a close alliance with applied mathematicians and computer scientists at Stony Brook and Columbia Universities.

  9. Computing meaning v.4

    CERN Document Server

    Bunt, Harry; Pulman, Stephen

    2013-01-01

    This book is a collection of papers by leading researchers in computational semantics. It presents a state-of-the-art overview of recent and current research in computational semantics, including descriptions of new methods for constructing and improving resources for semantic computation, such as WordNet, VerbNet, and semantically annotated corpora. It also presents new statistical methods in semantic computation, such as the application of distributional semantics in the compositional calculation of sentence meanings. Computing the meaning of sentences, texts, and spoken or texted dialogue i
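
    The "compositional calculation of sentence meanings" from distributional semantics can be caricatured in a few lines: represent each word as a vector, compose a sentence by vector addition, and compare sentences by cosine similarity. The vocabulary and vectors below are invented purely for illustration.

        # Toy compositional distributional semantics: sentence vector =
        # sum of (made-up) word vectors; similarity = cosine of the angle.
        import math

        word_vec = {
            "dog":   [0.9, 0.1, 0.0],
            "cat":   [0.8, 0.2, 0.0],
            "barks": [0.1, 0.9, 0.0],
            "meows": [0.1, 0.8, 0.1],
        }

        def sentence_vec(words):
            return [sum(vals) for vals in zip(*(word_vec[w] for w in words))]

        def cosine(u, v):
            dot = sum(a * b for a, b in zip(u, v))
            return dot / (math.hypot(*u) * math.hypot(*v))

        print(round(cosine(sentence_vec(["dog", "barks"]),
                           sentence_vec(["cat", "meows"])), 3))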

  10. Cloud Computing Bible

    CERN Document Server

    Sosinsky, Barrie

    2010-01-01

    The complete reference guide to the hot technology of cloud computing. Its potential for lowering IT costs makes cloud computing a major force for both IT vendors and users; it is expected to gain momentum rapidly with the launch of Office Web Apps later this year. Because cloud computing involves various technologies, protocols, platforms, and infrastructure elements, this comprehensive reference is just what you need if you'll be using or implementing cloud computing. Cloud computing offers significant cost savings by eliminating upfront expenses for hardware and software; its growing popularit

  11. Replacing the computer mouse

    OpenAIRE

    Dernoncourt, Franck

    2014-01-01

    In a few months the computer mouse will be half-a-century-old. It is known to have many drawbacks, the main ones being: loss of productivity due to constant switching between keyboard and mouse, and health issues such as RSI. Like the keyboard, it is an unnatural human-computer interface. However the vast majority of computer users still use computer mice nowadays. In this article, we explore computer mouse alternatives. Our research shows that moving the mouse cursor can be done efficiently ...

  12. Computation over Mismatched Channels

    CERN Document Server

    Karamchandani, Nikhil; Diggavi, Suhas

    2012-01-01

    We consider the problem of distributed computation of a target function over a multiple-access channel. If the target and channel functions are matched (i.e., compute the same function), significant performance gains can be obtained by jointly designing the computation and communication tasks. However, in most situations there is mismatch between these two functions. In this work, we analyze the impact of this mismatch on the performance gains achievable with joint computation and communication designs over separation-based designs. We show that for most pairs of target and channel functions there is no such gain, and separation of computation and communication is optimal.
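
    The matched case can be made concrete with a toy example: if the target function is the sum of the source values and the multiple-access channel superimposes (adds) whatever the nodes transmit, then a single channel use already delivers the target, whereas a separation-based scheme must convey each value before summing. The numbers and the additive-channel model below are assumptions for illustration only.

        # Toy "matched" pair: target = sum of sources, channel = addition.
        sources = [3, 7, 2, 5]            # one (made-up) value per transmitter

        def additive_mac(transmitted):
            return sum(transmitted)       # the channel superimposes the inputs

        received = additive_mac(sources)  # joint design: one use computes the sum
        assert received == sum(sources)

        # Separation-based design: deliver every value, then sum at the receiver.
        delivered = list(sources)
        assert sum(delivered) == received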

  13. Essential numerical computer methods

    CERN Document Server

    Johnson, Michael L

    2010-01-01

    The use of computers and computational methods has become ubiquitous in biological and biomedical research. During the last 2 decades most basic algorithms have not changed, but what has changed is the huge increase in computer speed and ease of use, along with the corresponding orders-of-magnitude decrease in cost. A general perception exists that the only applications of computers and computer methods in biological and biomedical research are either basic statistical analysis or the searching of DNA sequence databases. While these are important applications, they only scratch the surface

  14. Theory of computation

    CERN Document Server

    Tourlakis, George

    2012-01-01

    Learn the skills and acquire the intuition to assess the theoretical limitations of computer programming Offering an accessible approach to the topic, Theory of Computation focuses on the metatheory of computing and the theoretical boundaries between what various computational models can do and not do—from the most general model, the URM (Unbounded Register Machines), to the finite automaton. A wealth of programming-like examples and easy-to-follow explanations build the general theory gradually, which guides readers through the modeling and mathematical analysis of computational pheno
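
    The most restricted model mentioned, the finite automaton, is easy to sketch. The machine below is a toy deterministic finite automaton of my own choosing (it accepts binary strings with an even number of 1s), not an example taken from the book.

        # Toy DFA: accepts binary strings containing an even number of 1s.
        TRANSITIONS = {("even", "0"): "even", ("even", "1"): "odd",
                       ("odd",  "0"): "odd",  ("odd",  "1"): "even"}

        def accepts(word: str) -> bool:
            state = "even"                      # start state
            for symbol in word:
                state = TRANSITIONS[(state, symbol)]
            return state == "even"              # accepting state

        assert accepts("1001")                  # two 1s
        assert not accepts("10110")             # three 1s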

  15. Analogue computing methods

    CERN Document Server

    Welbourne, D

    1965-01-01

    Analogue Computing Methods presents the field of analogue computation and simulation in a compact and convenient form, providing an outline of models and analogues that have been produced to solve physical problems for the engineer and how to use and program the electronic analogue computer. This book consists of six chapters. The first chapter provides an introduction to analogue computation and discusses certain mathematical techniques. The electronic equipment of an analogue computer is covered in Chapter 2, while its use to solve simple problems, including the method of scaling is elaborat

  16. Secure cloud computing

    CERN Document Server

    Jajodia, Sushil; Samarati, Pierangela; Singhal, Anoop; Swarup, Vipin; Wang, Cliff

    2014-01-01

    This book presents a range of cloud computing security challenges and promising solution paths. The first two chapters focus on practical considerations of cloud computing. In Chapter 1, Chandramouli, Iorga, and Chokani describe the evolution of cloud computing and the current state of practice, followed by the challenges of cryptographic key management in the cloud. In Chapter 2, Chen and Sion present a dollar cost model of cloud computing and explore the economic viability of cloud computing with and without security mechanisms involving cryptographic mechanisms. The next two chapters addres

  17. Topology for computing

    CERN Document Server

    Zomorodian, Afra J

    2005-01-01

    The emerging field of computational topology utilizes theory from topology and the power of computing to solve problems in diverse fields. Recent applications include computer graphics, computer-aided design (CAD), and structural biology, all of which involve understanding the intrinsic shape of some real or abstract space. A primary goal of this book is to present basic concepts from topology and Morse theory to enable a non-specialist to grasp and participate in current research in computational topology. The author gives a self-contained presentation of the mathematical concepts from a comp

  18. Trust Based Pervasive Computing

    Institute of Scientific and Technical Information of China (English)

    LI Shiqun; Shane Balfe; ZHOU Jianying; CHEN Kefei

    2006-01-01

    A pervasive computing environment is a distributed and mobile space. Trust relationships must be established and ensured between devices and the systems in the pervasive computing environment. The trusted computing (TC) technology introduced by the Trusted Computing Group is a distributed-system-wide approach to the provision of integrity protection of resources. The TC's notion of trust and security can be described as conformant system behaviors of a platform environment, such that this conformance can be attested to a remote challenger. In this paper the trust requirements in a pervasive/ubiquitous environment are analyzed. Then security schemes for pervasive computing are proposed using primitives offered by TC technology.
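
    The attestation idea, a platform demonstrating conformant behavior to a remote challenger, can be caricatured as hashing a measured configuration and comparing it against a known-good reference. This is only a schematic sketch of the concept, not any actual TCG protocol; the measured strings and names are invented.

        # Schematic remote attestation: report a digest of the measured
        # platform state; the challenger compares it with a reference value.
        import hashlib

        def measure(platform_state: bytes) -> str:
            return hashlib.sha256(platform_state).hexdigest()

        KNOWN_GOOD = measure(b"bootloader-v2|kernel-5.10|policy-A")

        def challenger_accepts(reported: str) -> bool:
            return reported == KNOWN_GOOD

        assert challenger_accepts(measure(b"bootloader-v2|kernel-5.10|policy-A"))
        assert not challenger_accepts(measure(b"bootloader-v2|kernel-tampered|policy-A"))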

  19. Cloud Computing Technologies

    Directory of Open Access Journals (Sweden)

    Sean Carlin

    2012-06-01

    Full Text Available This paper outlines the key characteristics that cloud computing technologies possess and illustrates the cloud computing stack containing the three essential services (SaaS, PaaS and IaaS) that have come to define the technology and its delivery model. The underlying virtualization technologies that make cloud computing possible are also identified and explained. The various challenges that face cloud computing technologies today are investigated and discussed. The future of cloud computing technologies, along with its various applications and trends, is also explored, giving a brief outlook of where and how the technology will progress into the future.

  20. Richard Feynman and computation

    Science.gov (United States)

    Hey, Tony

    1999-04-01

    The enormous contribution of Richard Feynman to modern physics is well known, both to teaching through his famous Feynman Lectures on Physics, and to research with his Feynman diagram approach to quantum field theory and his path integral formulation of quantum mechanics. Less well known perhaps is his long-standing interest in the physics of computation and this is the subject of this paper. Feynman lectured on computation at Caltech for most of the last decade of his life, first with John Hopfield and Carver Mead, and then with Gerry Sussman. The story of how these lectures came to be written up as the Feynman Lectures on Computation is briefly recounted. Feynman also discussed the fundamentals of computation with other legendary figures of the computer science and physics community such as Ed Fredkin, Rolf Landauer, Carver Mead, Marvin Minsky and John Wheeler. He was also instrumental in stimulating developments in both nanotechnology and quantum computing. During the 1980s Feynman re-visited long-standing interests both in parallel computing with Geoffrey Fox and Danny Hillis, and in reversible computation and quantum computing with Charles Bennett, Norman Margolus, Tom Toffoli and Wojciech Zurek. This paper records Feynman's links with the computational community and includes some reminiscences about his involvement with the fundamentals of computing.