WorldWideScience

Sample records for kaihatsu computer manikin

  1. Fiscal 1997 report on the results of the international standardization R and D. International standards for computers/manikins; 1997 nendo seika hokokusho kokusai hyojun soseigata kenkyu kaihatsu. Computer manikin ni kansuru kokusai hyojun kikaku

    NONE

    1998-03-01

    Through the development of computer manikins (CM), which assess human adaptability to products and environments, a draft international standard was worked out for proposal to ISO. The draft was prepared through the development of a `structure model` that changes based on human attributes, a study of a `motion model` enabling changes in posture and movement, a study of an `evaluation model` assessing reach ranges and ergonomic loads, and the development of `computer functions` realizing the above. The CM developed has the following characteristics: a function to reproduce the `structure model` from the body dimension measurements specified in ISO 7250; a function to change posture and movement based on joint range-of-motion data; and a function to evaluate geometrical human adaptability such as reach ranges. These functions were implemented as a plug-in to Autodesk Mechanical Desktop 2.0, and a modular platform was constructed that enables wide-ranging cross-industry adoption and functional expansion as CM technology advances. 7 refs., 41 figs., 18 tabs.

  2. The Effect of Instructional Method on Cardiopulmonary Resuscitation Skill Performance: A Comparison Between Instructor-Led Basic Life Support and Computer-Based Basic Life Support With Voice-Activated Manikin.

    Wilson-Sands, Cathy; Brahn, Pamela; Graves, Kristal

    2015-01-01

    Validating participants' ability to correctly perform cardiopulmonary resuscitation (CPR) skills during basic life support courses can be a challenge for nursing professional development specialists. This study compares two methods of basic life support training, instructor-led and computer-based learning with voice-activated manikins, to determine whether one method is more effective for performance of CPR skills. The findings suggest that a computer-based learning course with voice-activated manikins is a more effective method of training for improved CPR performance.

  3. Manikin Testing on LASA Suit

    Durnford, W; Potter, P

    2006-01-01

    As part of a BL2 with the Directorate of Aerospace Engineering Support (DAES), DRDC Toronto required testing to be conducted on a thermal immersion manikin to evaluate the thermal resistance of the NBC…

  4. Breathing thermal manikin for indoor environment assessment: Important characteristics and requirements

    Melikov, Arsen Krikor

    2003-01-01

    Recently, breathing thermal manikins have been developed and used for indoor environment measurement, evaluation and optimization, as well as validation of Computational Fluid Dynamics (CFD) predictions of airflow around a human body. Advances in the assessment of occupants' thermal comfort … and shape of body segments, control mode, breathing simulation, etc. are discussed and specified in this paper.

  5. Deformation of a sound field caused by a manikin

    Weinrich, Søren G.

    1981-01-01

    … around the head at distances of 1 cm to 2 m, measured from the tip of the nose. The signals were pure tones at 1, 2, 4, 6, 8, and 10 kHz. It was found that the presence of the manikin caused changes in the SPL of the sound field of at most ±2.5 dB at a distance of 1 m from the surface of the manikin. Only over an interval of approximately 20° behind the manikin (i.e., opposite the sound source) did the manikin cause much larger changes, up to 9 dB. These changes are caused by destructive interference between sounds coming from opposite sides of the manikin. In front of the manikin, the changes …

  6. Manikin families representing obese airline passengers in the US.

    Park, Hanjun; Park, Woojin; Kim, Yongkang

    2014-01-01

    Aircraft passenger spaces designed without proper anthropometric analyses can create serious problems for obese passengers, including: possible denial of boarding, excessive body pressures and contact stresses, postural fixity and related health hazards, and increased risks of emergency evacuation failure. In order to help address the obese passenger's accommodation issues, this study developed male and female manikin families that represent obese US airline passengers. Anthropometric data of obese individuals obtained from the CAESAR anthropometric database were analyzed through PCA-based factor analyses. For each gender, a 99% enclosure cuboid was constructed, and a small set of manikins was defined on the basis of each enclosure cuboid. Digital human models (articulated human figures) representing the manikins were created using a human CAD software program. The manikin families were utilized to develop design recommendations for selected aircraft seat dimensions. The manikin families presented in this study would greatly facilitate anthropometrically accommodating large airline passengers.

  7. Natural convection heat transfer coefficient for newborn baby - Thermal manikin assessed convective heat losses

    Ostrowski, Ziemowit; Rojczyk, Marek

    2017-11-01

    The energy balance and heat exchange for a newborn baby in a radiant warmer environment are considered. The present study was performed to assess the body dry heat loss from an infant in a radiant warmer, using a copper cast anthropomorphic thermal manikin and a controlled climate chamber laboratory setup. The total body dry heat losses were measured for varying manikin surface temperatures (nine levels between 32.5 °C and 40.1 °C) and ambient air temperatures (five levels between 23.5 °C and 29.7 °C). Radiant heat losses were estimated based on measured climate chamber wall temperatures. After subtracting the radiant part, the resulting convective heat losses were compared with computed ones (based on Nu correlations for common geometries). The simplified geometry of the newborn baby was represented as: (a) a single cylinder and (b) a weighted sum of 5 cylinders and a sphere. The predicted values significantly overestimate the measured ones, by 28.8% (SD 23.5%) for (a) and 40.9% (SD 25.2%) for (b). This showed that the use of adopted general-purpose correlations to approximate the convective heat losses of a newborn baby can lead to substantial errors. Hence, a new Nu correlation is proposed. The mean error introduced by the proposed correlation was reduced to 1.4% (SD 11.97%), i.e. no significant overestimation. The thermal manikin appears to provide a precise method for the noninvasive assessment of thermal conditions in neonatal care.
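    The kind of general-purpose correlation the study tested can be illustrated with a minimal sketch. The snippet below estimates the natural-convection heat loss of a single-cylinder body approximation using the Churchill-Chu correlation for a horizontal cylinder; the dimensions, temperatures, and air properties are hypothetical round numbers, not values from the paper.

```python
import math

def churchill_chu_nu(ra, pr):
    """Churchill-Chu correlation: mean Nu for natural convection
    from a horizontal cylinder (valid for Ra up to about 1e12)."""
    return (0.60 + 0.387 * ra**(1/6)
            / (1 + (0.559 / pr)**(9/16))**(8/27))**2

# Approximate properties of air near 30 degC (hypothetical round values)
g = 9.81           # m/s^2
beta = 1 / 303.0   # 1/K, thermal expansion coefficient
nu_air = 1.6e-5    # m^2/s, kinematic viscosity
alpha = 2.2e-5     # m^2/s, thermal diffusivity
k_air = 0.026      # W/(m K), conductivity
pr = nu_air / alpha

# Hypothetical single-cylinder newborn model
d, length = 0.10, 0.50        # m, diameter and length
t_skin, t_air = 36.0, 26.0    # degC

ra = g * beta * (t_skin - t_air) * d**3 / (nu_air * alpha)
nu = churchill_chu_nu(ra, pr)
h = nu * k_air / d                                     # W/(m^2 K)
q_conv = h * math.pi * d * length * (t_skin - t_air)   # W, convective loss
print(round(h, 2), round(q_conv, 2))
```

    A correlation fitted to manikin data, as the paper proposes, would replace `churchill_chu_nu` while the rest of the calculation stays the same.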

  8. Modelling flow and heat transfer around a seated human body by computational fluid dynamics

    Sørensen, Dan Nørtoft; Voigt, Lars Peter Kølgaard

    2003-01-01

    A database (http://www.ie.dtu.dk/manikin) containing a detailed representation of the surface geometry of a seated female human body was created from a surface scan of a thermal manikin (minus clothing and hair). The radiative heat transfer coefficient and the natural convection flow around … of the computational manikin has all surface features of a human being; (2) the geometry is an exact copy of an experimental thermal manikin, enabling detailed comparisons between calculations and experiments.

  9. CFD Modeling of Thermal Manikin Heat Loss in a Comfort Evaluation Benchmark Test

    Nilsson, Håkan O.; Brohus, Henrik; Nielsen, Peter V.

    2007-01-01

    … for comfort evaluation. The main idea is to focus on people. It is the comfort requirements of occupants that decide what thermal climate will prevail. It is therefore important to use comfort simulation methods that originate from people, not just temperatures on surfaces and air. … Computer simulated persons (CSPs) today differ in many ways, reflecting various software possibilities and limitations as well as different research interests. Unfortunately, too few of the theories behind thermal manikin simulations are available in the public domain. Many researchers …

  10. Simulating Physiological Response with a Passive Sensor Manikin and an Adaptive Thermal Manikin to Predict Thermal Sensation and Comfort

    Rugh, John P [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Chaney, Larry [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Hepokoski, Mark [ThermoAnalytics Inc.; Curran, Allen [ThermoAnalytics Inc.; Burke, Richard [Measurement Technology NW; Maranville, Clay [Ford Motor Company

    2015-04-14

    Reliable assessment of occupant thermal comfort can be difficult to obtain within automotive environments, especially under transient and asymmetric heating and cooling scenarios. Evaluation of HVAC system performance in terms of comfort commonly requires human subject testing, which may involve multiple repetitions, as well as multiple test subjects. Instrumentation (typically comprised of an array of temperature sensors) is usually only sparsely applied across the human body, significantly reducing the spatial resolution of available test data. Further, since comfort is highly subjective in nature, a single test protocol can yield a wide variation in results which can only be overcome by increasing the number of test replications and subjects. In light of these difficulties, various types of manikins are finding use in automotive testing scenarios. These manikins can act as human surrogates from which local skin and core temperatures can be obtained, which are necessary for accurately predicting local and whole body thermal sensation and comfort using a physiology-based comfort model (e.g., the Berkeley Comfort Model). This paper evaluates two different types of manikins, i) an adaptive sweating thermal manikin, which is coupled with a human thermoregulation model, running in real-time, to obtain realistic skin temperatures; and, ii) a passive sensor manikin, which is used to measure boundary conditions as they would act on a human, from which skin and core temperatures can be predicted using a thermophysiological model. The simulated physiological responses and comfort obtained from both of these manikin-model coupling schemes are compared to those of a human subject within a vehicle cabin compartment transient heat-up scenario.

  11. The thermal insulation difference of clothing ensembles on the dry and perspiration manikins

    Xiaohong, Zhou; Chunqin, Zheng; Yingming, Qiang; Holmér, Ingvar; Gao, Chuansi; Kuklane, Kalev

    2010-01-01

    There are about a hundred manikin users around the world. Some use manikins such as 'Walter' and 'Tore' to evaluate the comfort of clothing ensembles according to their thermal insulation and moisture resistance. The 'Walter' manikin is made of water and a waterproof breathable fabric 'skin', which simulates the characteristics of human perspiration, so evaporation, condensation, or sorption and desorption always accompany its heat transfer. The 'Tore' manikin has only dry heat exchange, by conduction, radiation and convection, from the manikin through the clothing ensemble to the environment. It is an ideal apparatus for measuring the thermal insulation of clothing ensembles and allows evaluation of thermal comfort. This paper compares thermal insulation measured with the dry 'Tore' and the sweating 'Walter' manikins. The clothing ensembles consisted of permeable and impermeable clothes. The results showed that the clothes covering the 'Walter' manikin absorbed the moisture evaporated from the manikin. When moisture transferred through the permeable clothing ensembles, the heat of condensation could be neglected, but heavy condensation was observed when impermeable clothes were tested on the 'Walter' manikin. This resulted in a difference in the thermal insulation of clothing ensembles measured on the dry and perspiring manikins. The thermal insulation obtained from the 'Walter' manikin has to be corrected when heavy condensation occurs; the correction equation is derived in this study.
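    The dry measurement described for the 'Tore'-type manikin yields total thermal insulation via the standard relation I_t = (t_skin - t_air) * A / H. A minimal sketch with hypothetical whole-body numbers (not values from this study):

```python
# Hypothetical whole-body dry-manikin measurement
t_skin, t_air = 34.0, 20.0  # degC, surface and ambient temperatures
area = 1.8                  # m^2, manikin surface area
power = 80.0                # W, heating power = dry heat loss at steady state

i_total = (t_skin - t_air) * area / power  # m^2 K / W, total insulation
clo = i_total / 0.155                      # 1 clo = 0.155 m^2 K / W
print(round(i_total, 3), round(clo, 2))    # 0.315 2.03
```

    On a sweating manikin such as 'Walter', the measured power also contains evaporative and condensation heat flows, which is why a correction is needed when heavy condensation occurs.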

  12. The Force-Displacement Relationship in Commonly Used Resuscitation Manikins: Not Very Human

    Thomsen, Jakob E; Stærk, Mathilde; Løfgren, Bo

    2017-01-01

    Introduction: Manikins are widely used for CPR training and designed to simulate a human in cardiac arrest. Previous studies show a non-linear force-displacement relationship in the human chest. This may not be the case for resuscitation manikins. The aim of this study was to investigate the force-displacement relationship in commonly used resuscitation manikins. Methods: Commonly used infant and adult manikins for resuscitation training were included in the study. Manikins were tested by placing them in a material testing machine (ProLine Z050, Zwick/Roell, Ulm, Germany). A piston was placed on the lower half … (Laerdal) and CPR Anytime® Infant (inflatable; American Heart Association) and five adult manikins: Mini Anne (inflatable), Little Anne®, Resusci Anne, Resusci Anne Advanced (Laerdal) and Ambu® Man (Ambu). Infant manikins required a force of 57 N and 34 N to compress the chest 3 cm. The force required …

  13. The comfort, measured by means of a sweating manikin (Walter™)

    With the growing importance of clothing comfort in South African and overseas markets for locally produced clothing, the Council for Scientific and Industrial Research (CSIR) acquired an advanced sweating fabric manikin for measuring clothing comfort. This preliminary investigation covers the comfort-related properties, as …

  14. Plastic with personality: Increasing student engagement with manikins.

    Power, Tamara; Virdun, Claudia; White, Haidee; Hayes, Carolyn; Parker, Nicola; Kelly, Michelle; Disler, Rebecca; Cottle, Amanda

    2016-03-01

    Simulation allows students to practice key psychomotor skills and gain technical proficiency, fostering the development of clinical reasoning and student confidence in a low-risk environment. Manikins are a valuable learning tool; yet there is a distinct lack of empirical research investigating how to enhance engagement between nursing students and manikins. To describe student perspectives of a layered, technology-enhanced approach to improve the simulation learning experience. Tanner's Model of Clinical Judgment underpins the entire curriculum. This study additionally drew on the principles of narrative pedagogy. Across ten teaching weeks, five separate case studies were introduced to students through short vignettes. Students viewed the vignettes prior to their laboratory class. In the labs, manikins were dressed in the props used in the vignettes. The innovation was trialed in a second-year core subject of a Bachelor of Nursing program in a large urban university in the autumn semester of 2014. Following ethics approval, students were emailed a participant information sheet. A focus group of nine students was held. The discussion was digitally recorded and transcribed verbatim before being subjected to thematic analysis. Students' comments (143) about the vignettes in their standard subject-specific student feedback surveys were also considered as data. Four themes were identified: getting past the plastic; knowing what to say; connecting and caring; and embracing diversity. The feedback indicated that these measures increased students' ability to suspend disbelief, feel connected to, and approach the manikins in a more understanding and empathetic fashion. In addition to achieving increased engagement with manikins, other advantages were realized, such as students reflecting on their own values and pre-conceived notions of people from diverse backgrounds. Copyright © 2016 Elsevier Ltd. All rights reserved.

  15. Development of a research prototype computer `Wearables` that one can wear on his or her body; Minitsukeru computer `Wearables` kenkyuyo shisakuki wo kaihatsu

    NONE

    1999-02-01

    Development has been made on a prototype wearable computer, 'Wearables', which shrinks the present notebook PC further in size, can be worn on the body for use anytime and anywhere, and aims at becoming part of the social infrastructure. The company's portable PC, the Libretto, was used as the base, with the keyboard and liquid-crystal display panel removed. To replace these functions, a voice-input microphone and various types of head-mounted (glasses-type) displays are connected. Infrared interfaces and wireless (radio) data communication are provided as the means of information exchange between the prototype and its surroundings. A wireless desk area network (DAN) technology, which can dynamically form a network among multiple computers, realizes smooth communication with external environments. Noise-robust voice recognition realizes keyboard-free operation that places no mental stress on users. The wearable computer aims not merely at being worn and used, but at providing a new perceptual ability, realizing a 'digital sensation' of things that could not previously be seen or heard directly. With such computers, a society can be structured in which people live comfortably and safely, through conversation between users and computers and interaction between the surrounding environment and the social infrastructure, with individual privacy and information security taken into consideration. The company is working with the Massachusetts Institute of Technology (MIT) on research and development of the wearable computer, covering how it can be utilized and the basic technologies that will be required in the future. (translated by NEDO)

  16. A thermal manikin with human thermoregulatory control: implementation and validation.

    Foda, Ehab; Sirén, Kai

    2012-09-01

    Tens of different sorts of thermal manikins are employed worldwide, mainly in the evaluation of clothing thermal insulation and thermal environments. They are regulated thermally using simplified control modes. This paper reports on the implementation and validation of a new thermoregulatory control mode for thermal manikins. The new control mode is based on a multi-segmental Pierce (MSP) model. In this study, the MSP control mode was implemented, using the LabVIEW platform, onto the control system of the thermal manikin 'Therminator'. The MSP mode was then used to estimate the segmental equivalent temperature (t(eq)) along with constant surface temperature (CST) mode under two asymmetric thermal conditions. Furthermore, subjective tests under the same two conditions were carried out using 17 human subjects. The estimated segmental t(eq) from the experiments with the two modes and from the subjective assessment were compared in order to validate the use of the MSP mode for the estimation of t(eq). The results showed that the t(eq) values estimated by the MSP mode were closer to the subjective mean votes under the two test conditions for most body segments and compared favourably with values estimated by the CST mode.
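    For a constant-surface-temperature segment, the equivalent temperature both control modes estimate is commonly derived as t_eq = t_s - Q / h_cal (in the style of EN ISO 14505-2), where h_cal is the segment's dry heat transfer coefficient determined in a calibration chamber. The numbers below are hypothetical, not from the study.

```python
def equivalent_temperature(t_surface, heat_flux, h_cal):
    """t_eq for one manikin segment: surface temperature minus
    measured dry heat flux divided by the calibration coefficient."""
    return t_surface - heat_flux / h_cal

# Hypothetical segment: 34 degC surface, 60 W/m^2 loss, h_cal = 8 W/(m^2 K)
print(equivalent_temperature(34.0, 60.0, 8.0))  # 26.5
```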

  17. A Comparative Introduction on Sweating Thermal Manikin “Newton” and “Walter”

    Wang, Faming

    2008-01-01

    Recently, thermal manikins have frequently been used for testing and product development in sports science and human exercise research, in the building industry, and in the automobile industry for evaluating the performance of heating and ventilation systems. The multi-segment thermal manikin “Newton” and the one-segment thermal manikin “Walter” are described in this paper. Their thermal insulation and moisture vapor resistance measurements are briefly introduced. The advantages and disadvantages of thos…

  18. Report of Investigation Committee on Programs for Research and Development of Strategic Software for Advanced Computing; Kodo computing yo senryakuteki software no kenkyu kaihatsu program kento iinkai hokokusho

    NONE

    2000-12-26

    The committee met on December 26, 2000, with 32 people in attendance. Discussion covered the results of surveys conducted for the development of strategic software for advanced computing and candidate projects for strategic software development. Eight subjects were taken up at the meeting: the interim report on the survey results, a semiconductor TCAD (technology computer-aided design) system, a nanodevice surface analysis system, a network-distributed parallel processing platform (tentative name), a fatigue simulation system, a chemical reaction simulator, a protein structure analysis system, and a next-generation fluid analysis system. In this report, the author arranges the discussion results into four categories: (1) a strategic software development system, (2) popularization methods and a maintenance system, (3) handling of the results, and (4) evaluation of the research and development program. In relation to category (1), it is stated that software grows with the passage of time, that the software is a commercial program, and that in developing a commercial software program the process from basic study up to the preparation of a prototype should be completely separated from the process of its completion. (NEDO)

  19. Pediatric Basic Life Support Self-training is Comparable to Instructor-led Training: A randomized manikin study

    Vestergaard, L. D.; Løfgren, Bo; Jessen, C.

    2011-01-01

    Pediatric Basic Life Support Self-training is comparable to Instructor-led Training: A randomized manikin study.

  20. Manikin for assessment of MP3 player exposure

    Hammershøi, Dorte

    2007-01-01

    Contemporary personal stereo players are compact, easy to use, and provide intense, high-quality sound that can be heard anywhere, anytime. The players are reasonable in price, and have become very popular among children and adolescents. Little is known about listening habits among children … in the belly, and a display on the front that shows the listening level (LAeq) according to ISO 11904-2. The scale also indicates how long one can listen at that given level without exceeding a workday exposure level of 80 dBA. The manikin has proven useful as a tool for mediation, and may even (in a revised version …
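    The time budget such a display can show follows from the equal-energy principle (3 dB exchange rate) behind the 80 dBA workday criterion: every 3 dB above the limit halves the allowed duration. A minimal sketch (the function name is illustrative, not from the device):

```python
def allowed_hours(laeq_dba, limit_lex8h=80.0):
    """Daily listening hours before the 8-h equivalent exposure level
    reaches limit_lex8h, under the equal-energy (3 dB) principle."""
    return 8.0 / 10 ** ((laeq_dba - limit_lex8h) / 10)

print(allowed_hours(80))  # 8.0 hours at the limit itself
print(allowed_hours(83))  # ~4 hours: +3 dB halves the budget
print(allowed_hours(92))  # ~0.5 hours (about 30 minutes)
```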

  1. The Importance of a Thermal Manikin as Source and Obstacle in Full-Scale Experiments

    Nielsen, Peter V.

    The thermal manikin is normally introduced in indoor environmental measurements to obtain detailed information on thermal comfort and air quality around a person. This paper deals with the opposite situation, where manikins are introduced as sources and obstacles in order to obtain reasonable … boundary conditions in experiments with the indoor environment. In other words, how will people influence the surroundings instead of how will the surroundings influence people? The use of thermal manikins in an experiment will of course take both situations into account; however, in some experiments …

  2. Voice advisory manikin versus instructor facilitated training in cardiopulmonary resuscitation

    Isbye, Dan L; Høiby, Pernilla; Rasmussen, Maria B

    2008-01-01

    BACKGROUND: Training of healthcare staff in cardiopulmonary resuscitation (CPR) is time-consuming and costly. It has been suggested to replace instructor-facilitated (IF) training with an automated voice advisory manikin (VAM), which increases skill level by continuous verbal feedback during … individual training. AIMS: To compare a VAM (ResusciAnne CPR skills station, Laerdal Medical A/S, Norway) with IF training in CPR using a bag-valve-mask (BVM) in terms of skills retention after 3 months. METHODS: Forty-three second-year medical students were included and CPR performance (ERC Guidelines … for Resuscitation 2005) was assessed in a 2 min test before randomisation to either IF training in groups of 8 or individual VAM training. Immediately after training and after 3 months, CPR performance was assessed in identical 2 min tests. Laerdal PC Skill Reporting System 2.0 was used to collect data. To quantify …

  3. Comparative Assessment of Torso and Seat Mounted Restraint Systems using Manikins on the Horizontal Impulse Accelerator

    2017-11-01

    … accelerations tested, and this was especially evident at impact accelerations greater than 10 G. The LOIS manikin head Ry angular acceleration was greater than …

  4. Evaluation of the Efficiency of Liquid Cooling Garments using a Thermal Manikin

    Xu, Xiaojiang; Endrusick, Thomas; Gonzalez, Julio; Laprise, Brad; Teal, Walter; Santee, William; Kolka, Margaret

    2005-01-01

    … personal protective equipment), and environmental conditions. Thermal manikins (TM) have been used to evaluate the performance of LCG systems and to determine the amount of heat that a LCG can extract from a TM …

  5. Comparative Assessment of Torso and Seat Mounted Restraint Systems using Manikins on the Vertical Deceleration Tower

    2017-03-01

    AFRL-RH-WP-TR-2017-0044; Contract FA8650-14-D-6500-0001. … An experimental effort involving a series of +z-axis impact tests was conducted on the 711th Human Performance Wing's Vertical Deceleration Tower (VDT …

  6. Pengembangan Pintu Air Irigasi Pintar Berbasis Arduino untuk Daerah Irigasi Manikin

    Laumal, Folkes Eduward; Hattu, Edwin P; Nope, Kusa B. N

    2017-01-01

    In general, the irrigation watergates in the Manikin Irrigation Area are supporting tools for agricultural activities that implement a primary-secondary-tertiary channel system. The Manikin irrigation watergates are made of iron plates of a certain size, operated by moving up/down or rotating. This mechanism has led to service dissatisfaction among farmers. This study developed a smart Arduino-based irrigation watergate by replacing the lifter/rotator with a DC motor t…

  8. A virtual reality dental simulator predicts performance in an operative dentistry manikin course.

    Imber, S; Shapira, G; Gordon, M; Judes, H; Metzger, Z

    2003-11-01

    This study was designed to test the ability of a virtual reality dental simulator to predict the performance of students in a traditional operative dentistry manikin course. Twenty-six dental students were pre-tested on the simulator, prior to the course. They were briefly instructed and asked to prepare 12 class I cavities which were automatically graded by the simulator. The instructors in the manikin course that followed were unaware of the students' performances in the simulator pre-test. The scores achieved by each student in the last six simulator cavities were compared to their final comprehensive grades in the manikin course. Class standing of the students in the simulator pre-test positively correlated with their achievements in the manikin course with a correlation coefficient of 0.49 (P = 0.012). Eighty-nine percent of the students in the lower third of the class in the pre-test remained in the low performing half of the class in the manikin course. These results indicate that testing students in a dental simulator, prior to a manikin course, may be an efficient way to allow early identification of those who are likely to perform poorly. This in turn could enable early allocation of personal tutors to these students in order to improve their chances of success.

  9. Report on evaluation of research and development of superhigh-function electronic computers; Chokoseino denshi keisanki no kenkyu kaihatsu ni kansuru hyoka hokokusho

    NONE

    1973-02-20

    Described herein is the development of superhigh-function electronic computers. This project was implemented as a 6-year joint project, beginning in FY 1966, by government, industry and academia, with the objective of developing standard large-size computers comparable with the world's most capable machines by the beginning of the 1970s. The computers developed by this project met almost all of the specifications of the world's representative large-size commercial computers, partly surpassing them. In particular, the integration of virtual memory, buffer memory and multi-processor functions, which were considered the central technical features of next-generation computers, into one system was a concept unique to Japan, not seen in other countries. Other developments considered to have great ripple effects are the LSIs and the techniques for utilizing and mounting them and for improving their reliability. The development of magnetic discs is another notable result for the peripheral devices. Development of the input/output devices was started to support inputting, outputting and reading Chinese characters, which are characteristic of Japanese text. The software developed has sufficient functions for common use and is considered a world-leading large-size operating system, although its evaluation largely awaits actual application results. (NEDO)

  10. Measuring the thermal insulation and evaporative resistance of sleeping bags using a supine sweating fabric manikin

    Wu, Y S; Fan, Jintu

    2009-01-01

    For testing the thermal insulation of sleeping bags, standard test methods and procedures using heated manikins are provided in ASTM F1720-06 and EN 13537:2002. However, with regard to the evaporative resistance of sleeping bags, no instrument or test method has so far been established to give a direct measurement. In this paper, we report on a novel supine sweating fabric manikin system for directly measuring the evaporative resistance of sleeping bags. Eleven sleeping bags were tested using the manikin under the isothermal condition, namely, both the mean skin temperature of the manikin and that of the environment were controlled to be the same at 35 °C, with the wind speed and ambient relative humidity at 0.3 m/s and 50%, respectively. The results showed that the novel supine sweating fabric manikin is reproducible and accurate in directly measuring the evaporative resistance of sleeping bags, and the measured evaporative resistance can be combined with thermal insulation to calculate the moisture permeability index of sleeping bags.
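    Under the isothermal condition described (skin and environment both at 35 °C), all supplied heating power goes to evaporation, so the total evaporative resistance follows as Re_t = (p_skin - p_air) * A / H_e. A minimal sketch with hypothetical numbers (not measurements from the paper):

```python
import math

def p_sat_kpa(t_c):
    """Saturation water-vapour pressure in kPa (Tetens approximation)."""
    return 0.6105 * math.exp(17.27 * t_c / (t_c + 237.3))

# Isothermal test: skin and ambient both 35 degC, ambient RH 50%
t = 35.0
p_skin = p_sat_kpa(t)        # kPa, saturated at the sweating surface
p_air = 0.5 * p_sat_kpa(t)   # kPa, 50% relative humidity
area = 1.7                   # m^2, hypothetical manikin surface area
power = 100.0                # W, hypothetical heating power = evaporative loss

re_total = (p_skin - p_air) * 1000 * area / power  # Pa m^2 / W
print(round(re_total, 1))
```

    Combining Re_t with the dry thermal insulation then gives the moisture permeability index mentioned in the abstract.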

  11. Comparison of fabric skins for the simulation of sweating on thermal manikins

    Koelblen, Barbara; Psikuta, Agnes; Bogdan, Anna; Annaheim, Simon; Rossi, René M.

    2017-09-01

    Sweating is an important thermoregulatory process helping to dissipate heat and, thus, to prevent overheating of the human body. Simulations of human thermo-physiological responses in hot conditions or during exercising are helpful for assessing heat stress; however, realistic sweating simulation and evaporative cooling is needed. To this end, thermal manikins dressed with a tight fabric skin can be used, and the properties of this skin should help human-like sweat evaporation simulation. Four fabrics, i.e., cotton with elastane, polyester, polyamide with elastane, and a skin provided by a manikin manufacturer (Thermetrics) were compared in this study. The moisture management properties of the fabrics have been investigated in basic tests with regard to all phases of sweating relevant for simulating human thermo-physiological responses, namely, onset of sweating, fully developed sweating, and drying. The suitability of the fabrics for standard tests, such as clothing evaporative resistance measurements, was evaluated based on tests corresponding to the middle phase of sweating. Simulations with a head manikin coupled to a thermo-physiological model were performed to evaluate the overall performance of the skins. The results of the study showed that three out of four evaluated fabrics have adequate moisture management properties with regard to the simulation of sweating, which was confirmed in the coupled simulation with the head manikin. The presented tests are helpful for comparing the efficiency of different fabrics to simulate sweat-induced evaporative cooling on thermal manikins.

  12. The Inflatable Mini Anne® Manikin May be Used as an Inexpensive Alternative to a Standard Life-size Resuscitation Manikin During Instructor-led BLS/AED Training - A Randomized Controlled Study

    Bang, Camilla; Cordsen, Anna-Sophie N; Hoe, Masja B

    2017-01-01

    -led BLS/AED training. All participants underwent an end-of-course test on an AMBU® Man-manikin (AMBU). The primary endpoint: performing all steps of the European Resuscitation Council BLS/AED algorithm correctly (passing the test). Secondary endpoints: CPR quality parameters and manikin preference...

  13. Accurate feedback of chest compression depth on a manikin on a soft surface with correction for total body displacement

    Beesems, Stefanie G.; Koster, Rudolph W.

    2014-01-01

    TrueCPR is a new real-time compression depth feedback device that measures changes in magnetic field strength between a back pad and a chest pad. We determined its accuracy with a manikin on a test bench and on various surfaces. First, calibration and accuracy of the manikin and TrueCPR was verified

  14. Evaluation report on research and development of high-speed computation system for technological use; Kagaku gijutsuyo kosoku keisan system no kenkyu kaihatsu ni kansuru hyoka hokokusho

    NONE

    1990-08-01

    The above-named project is an effort implemented under the large-scale industrial technology research and development system through the cooperation of industrial, academic, and governmental circles in the nine-year-long period beginning in fiscal 1981. The project aims to establish technologies required for putting to practical use a high-speed computation system capable of speedily dealing with huge technological problems which the computers available at the commencement of the project failed to solve. The goals set for new devices and comprehensive systems were sufficiently challenging in view of the technological level of those days, and are still at the highest level in the world. It is judged that the goals were set with reason and appropriateness. The liaison council for the implementation of the project is constituted of people of experience or academic standing, entrusted research and development activities, Ministry of International Trade and Industry bureaus concerned, and the Electrotechnical Laboratory of the same ministry. Discussion, coordination, and communication on concrete matters are under way between the constituent members, contributing to the enhancement of research and development. The liaison council activities are evaluated to be appropriate and effective. (NEDO)

  15. Influence of geometry of thermal manikins on concentration distribution and personal exposure

    Melikov, Arsen Krikor; Kaczmarczyk, Jan

    2007-01-01

    The analyses performed in this paper reveal that a breathing thermal manikin with realistic simulation of respiration including breathing cycle, pulmonary ventilation rate, frequency and breathing mode, gas concentration, humidity and temperature of exhaled air and human body shape and surface temperature is sensitive enough to perform reliable measurement of characteristics of air as inhaled by occupants. The temperature, humidity, and pollution concentration in the inhaled air can be measured accurately with a thermal manikin without breathing simulation if they are measured at the upper lip...... Proper simulation of breathing, especially of exhalation, is needed for studying the transport of exhaled air between occupants. A method...

  16. Manikin-Based Size-Resolved Penetrations of CE-marked Filtering Facepiece Respirators.

    Serfozo, N.; Ondráček, Jakub; Otáhal, P.; Lazaridis, M.; Ždímal, Vladimír

    2017-01-01

    Roč. 14, č. 12 (2017), s. 965-974 ISSN 1545-9624 EU Projects: European Commission(XE) 315760 - HEXACOMM Institutional support: RVO:67985858 Keywords : size-resolved penetration * manikin-based study * CE-marked respirator Subject RIV: CF - Physical ; Theoretical Chemistry OBOR OECD: Physical chemistry Impact factor: 1.200, year: 2016

  17. Meet TOM - the world's first open chest paediatric/adult manikin.

    Dix, Ann

    2016-07-20

    Aged 15, TOM is a model patient. He has suffered more than his fair share of life-threatening events and never complains. But then TOM is not your average sick teenager - he is the world's first open chest paediatric/adult manikin.

  18. Measurement and prediction of indoor air quality using a breathing thermal manikin.

    Melikov, A; Kaczmarczyk, J

    2007-02-01

    The analyses performed in this paper reveal that a breathing thermal manikin with realistic simulation of respiration including breathing cycle, pulmonary ventilation rate, frequency and breathing mode, gas concentration, humidity and temperature of exhaled air and human body shape and surface temperature is sensitive enough to perform reliable measurement of characteristics of air as inhaled by occupants. The temperature, humidity, and pollution concentration in the inhaled air can be measured accurately with a thermal manikin without breathing simulation if they are measured at the upper lip at a distance of measured inhaled air parameters. Proper simulation of breathing, especially of exhalation, is needed for studying the transport of exhaled air between occupants. A method for predicting air acceptability based on inhaled air parameters and known exposure-response relationships established in experiments with human subjects is suggested. Recommendations for optimal simulation of human breathing by means of a breathing thermal manikin when studying pollution concentration, temperature and humidity of the inhaled air as well as the transport of exhaled air (which may carry infectious agents) between occupants are outlined. In order to compare results obtained with breathing thermal manikins, their nose and mouth geometry should be standardized.

  19. Interpersonal Transport of Droplet Nuclei among Three Manikins in a Full-Scale Test Room

    Liu, Li; Nielsen, Peter Vilhelm; Jensen, Rasmus Lund

    2014-01-01

    This study focuses on occupants’ exposure to droplet nuclei exhaled by one susceptible person in a full-scale test room. Three breathing thermal manikins are standing in the middle of the room, and both the processes in the microenvironment and in the macroenvironment are considered. A diffusive ceiling has been...

  20. Interpersonal Transport of Expiratory Aerosols among Three Manikins in a Full-Scale Test Room

    Liu, Li; Nielsen, Peter Vilhelm; Jensen, Rasmus Lund

    2014-01-01

    This study focuses on occupants’ exposure to aerosols exhaled by one susceptible person in a full-scale test room. Three breathing thermal manikins are standing in the middle of the room, and both the processes in the microenvironment and in the macroenvironment are considered. A diffusive ceiling has been...

  1. What is the best clothing to prevent heat and cold stress? Experiences with thermal manikin.

    Magyar, Z; Tamas, R

    2013-02-01

    The present study summarizes the current knowledge of heat and cold stress, which can significantly affect military activities and can also occur among travellers who are not well adapted to weather variations during their journey. The selection of the best clothing is a very important factor in preserving thermal comfort. Our experience with a thermal manikin is also presented in this paper.

  2. Airflow characteristics and pollution distribution around a thermal manikin - Impact of specific personal and indoor environmental factors

    Licina, Dusan; Tham, Kwok Wai; Melikov, Arsen Krikor

    2016-01-01

    This study presents a summary of experimental measurements on the airflow characteristics and pollution distribution around a non-breathing thermal manikin. The two objectives are: (1) to examine the extent to which personal (body posture, clothing insulation, table positioning) and environmental factors (room air temperature and ventilation flow) affect the airflow characteristics (velocity and temperature) around the thermal manikin and (2) to examine the pollution distribution within the convective boundary layer (CBL) around a thermal manikin and personal exposure to two types of airborne...... and ventilation flow considerably affected airflow characteristics and pollution distribution around the thermal manikin. Under the specific set of conditions studied, the most favorable airflow pattern in preventing the feet pollution from reaching the breathing zone was transverse flow from the front.

  3. Evaluating local and overall thermal comfort in buildings using thermal manikins

    Foda, E.

    2012-07-01

    Evaluation methods of human thermal comfort that are based on whole-body heat balance with its surroundings may not be adequate for evaluations in non-uniform thermal conditions. Under these conditions, the human body's segments may experience a wide range of room physical parameters, and the evaluation of local (segmental) thermal comfort becomes necessary. In this work, subjective measurements of skin temperature were carried out to investigate the human body's local responses due to a step change in room temperature, the variability in the body's local temperatures under different indoor conditions and exposures, and the physiological steady-state local temperatures. A multi-segmental model of human thermoregulation was then developed based on these findings to predict the local skin temperatures of individuals' body segments with good accuracy. The model predictability of skin temperature was verified for steady-state and dynamic conditions using measured data at uniform neutral, cold, and warm as well as different asymmetric thermal conditions. The model showed very good predictability, with average absolute deviation ranging from 0.3 to 0.8 K. The model was then implemented in the control system of the thermal manikin 'THERMINATOR' to adjust the segmental skin temperature set-points based on the indoor conditions. This new control for the manikin was experimentally validated for the prediction of local and overall thermal comfort using the equivalent temperature measure. THERMINATOR with the new control mode was then employed in the evaluation of localized floor-heating system variants towards maximum energy efficiency. This aimed at illustrating a design strategy using the thermal manikin to find the optimum geometry and surface area of a floor heater for a single seated person. Furthermore, a psychological comfort model that is based on local skin temperature was adapted for use with the model of human...

  4. Development of a research prototype computer 'Wearables' that one can wear on his or her body. Minitsukeru computer 'Wearables' kenkyuyo shisakuki wo kaihatsu

    1999-02-01

    Development has been made on a prototype of a wearable computer, 'Wearables', which makes the present notebook PC still smaller, can be worn on the body for use at any time and from anywhere, and aims at realizing a social infrastructure. Using the company's portable PC, the Libretto, as the base, the keyboard and the liquid crystal display panel were removed. To replace these functions, a voice-input microphone and various head-mounted (glasses-type) displays worn on the head to view images are connected. Provided as the means of information communication between the prototype computer and outside environments are an infrared interface and a data communication function using wireless (radio wave) communications. The wireless desk area network (DAN) technology, which can dynamically structure a network between multiple computers, has realized smooth communications with external environments. The voice recognition technology, which works efficiently against noise, has realized keyboard-free operation that places no neural stress on users. The 'wearable computer' aims not only at users simply wearing it, but also at providing a new perception ability for what could not have been seen or heard directly to date, that is, realizing digital sensation. With the computer, a society will be structured in which people can live comfortably and safely, maintaining conversations between users and computers, and interactions between the surrounding environment and social infrastructures, with protection of individual privacy and information security taken into consideration. The company is working with the Massachusetts Institute of Technology (MIT) on research and development of the 'wearable computer', regarding how it can be utilized and the basic technologies that will be required in the future. (translated by NEDO)

  5. Measurement and prediction of indoor air quality using a breathing thermal manikin

    Melikov, Arsen Krikor; Kaczmarczyk, J.

    2007-01-01

    The analyses performed in this paper reveal that a breathing thermal manikin with realistic simulation of respiration including breathing cycle, pulmonary ventilation rate, frequency and breathing mode, gas concentration, humidity and temperature of exhaled air and human body shape and surface temperature is sensitive enough to perform reliable measurement of characteristics of air as inhaled by occupants. The temperature, humidity, and pollution concentration in the inhaled air can be measured accurately with a thermal manikin without breathing simulation if they are measured at the upper lip...... Proper simulation of breathing, especially of exhalation, is needed for studying the transport of exhaled air between occupants. A method......

  6. Laypersons may learn basic life support in 24min using a personal resuscitation manikin

    Isbye, Dan Lou; Rasmussen, Lars Simon; Lippert, Freddy Knudsen

    2006-01-01

    ......and the challenge is to find the most efficient one. AIMS: To compare the efficiency of a 24 min instruction using a DVD-based self-training BLS course combined with a simple, take-home resuscitation manikin to a conventional 6 h course for teaching BLS to laypersons. METHODS: In total, 238 laypersons (age 21...... -training. The second group attended a conventional 6 h BLS course (6 HR). After 3 months, BLS skills were assessed on a Laerdal ResusciAnne manikin using the Laerdal PC Skill Reporting System, and a total score was calculated. RESULTS: There was no significant difference between groups in BLS performance using...... Assessed after 3 months, a 24 min DVD-based instruction plus subsequent self-training in BLS appears equally effective compared to a 6 h BLS course and hence is more efficient. (Publication date: June 2006)

  7. The impact of chest compression rates on quality of chest compressions : a manikin study

    Field, Richard A.; Soar, Jasmeet; Davies, Robin P.; Akhtar, Naheed; Perkins, Gavin D.

    2012-01-01

    Purpose: Chest compressions are often performed at a variable rate during cardiopulmonary resuscitation (CPR). The effect of compression rate on other chest compression quality variables (compression depth, duty-cycle, leaning, performance decay over time) is unknown. This randomised controlled cross-over manikin study examined the effect of different compression rates on the other chest compression quality variables. Methods: Twenty healthcare professionals performed two minutes of co...

  8. Polymeric Materials Models in the Warrior Injury Assessment Manikin (WIAMan) Anthropomorphic Test Device (ATD) Tech Demonstrator

    2017-01-01

    ARL-TR-7927, January 2017, US Army Research Laboratory: Polymeric Materials Models in the Warrior Injury Assessment Manikin (WIAMan)...... The analytical model currently used by military vehicle analysts has been continuously updated to address the model's inherent deficiencies and make the...... model is a hyperelastic polymer model based upon statistical mechanics and the finite extensibility of a polymer chain. Its rheological......

  9. MANIKIN DEMONSTRATION IN TEACHING CONSERVATIVE MANAGEMENT OF POSTPARTUM HAEMORRHAGE: A COMPARISON WITH CONVENTIONAL METHODS

    Sathi Mangalam Saraswathi

    2016-07-01

    BACKGROUND: Even though there are many innovative methods to make classes more interesting and effective, in my department, topics are taught mainly by didactic lectures. This study attempts to compare the effectiveness of manikin demonstration and didactic lectures in teaching conservative management of post-partum haemorrhage. OBJECTIVE: To compare the effectiveness of manikin demonstration and didactic lectures in teaching conservative management of postpartum haemorrhage. MATERIALS AND METHODS: This is an observational study. Eighty-four ninth-semester MBBS students posted in the Department of Obstetrics and Gynaecology, Government Medical College, Kottayam were selected. They were divided into 2 groups by lottery method. A pre-test was conducted for both groups. Group A was taught by manikin demonstration. Group B was taught by didactic lecture. Feedback responses collected from the students after the demonstration class were analysed. A post-test was conducted for both groups after one week. Gain in knowledge for both groups was calculated from pre-test and post-test scores and compared by independent-samples t test. RESULTS: The mean gain in knowledge in group A was 6.4 compared to 4.3 in group B, and the difference was found to be statistically significant. All of the students in group A felt satisfied and more confident after the class and wanted more topics to be taught by demonstration. CONCLUSION: The manikin demonstration class is more effective in teaching conservative management of post-partum haemorrhage, and this method can be adopted to teach similar topics in clinical subjects.
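The analysis described above — per-student gain scores (post minus pre) compared between two groups with an independent-samples t test — can be sketched as follows. The scores are invented for illustration; only the pooled-variance t statistic is the point:

```python
# Hypothetical sketch of a gain-score comparison between two teaching
# groups using a pooled-variance independent-samples t test.
# All scores below are invented, not the study's data.
import math

def gain_scores(pre, post):
    """Per-student knowledge gain: post-test minus pre-test."""
    return [b - a for a, b in zip(pre, post)]

def pooled_t(x, y):
    """Pooled-variance two-sample t statistic and degrees of freedom."""
    nx, ny = len(x), len(y)
    mx, my = sum(x) / nx, sum(y) / ny
    vx = sum((v - mx) ** 2 for v in x) / (nx - 1)
    vy = sum((v - my) ** 2 for v in y) / (ny - 1)
    sp2 = ((nx - 1) * vx + (ny - 1) * vy) / (nx + ny - 2)
    se = math.sqrt(sp2 * (1 / nx + 1 / ny))
    return (mx - my) / se, nx + ny - 2

# Group A (manikin demonstration) vs Group B (didactic lecture), invented
gain_a = gain_scores([4, 5, 3, 6, 4, 5], [11, 12, 9, 12, 10, 11])
gain_b = gain_scores([4, 6, 3, 5, 4, 5], [8, 10, 7, 9, 9, 10])
t, df = pooled_t(gain_a, gain_b)
```

The t statistic is then compared against the t distribution with the returned degrees of freedom to obtain the p value (e.g. via `scipy.stats.t.sf`).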

  10. Development of MATLAB Scripts for the Calculation of Thermal Manikin Regional Resistance Values

    2016-01-01

    USARIEM Technical Note TN16-1, January 2016: Development of MATLAB® Scripts for the Calculation of Thermal Manikin Regional Resistance Values. EXECUTIVE SUMMARY: A software tool has been developed via MATLAB® scripts to reduce the amount of repetitive and time-consuming calculations that are

  11. Disseminating cardiopulmonary resuscitation training by distributing 9,200 personal manikins.

    de Paiva, Edison Ferreira; Padilha, Roberto de Queiroz; Sgobero, Jenny Karol Gomes Sato; Ganem, Fernando; Cardoso, Luiz Francisco

    2014-08-01

    Community members should be trained so that witnesses of cardiac arrests are able to trigger the emergency system and perform adequate resuscitation. In this study, the authors evaluated the results of cardiopulmonary resuscitation (CPR) training of communities in four Brazilian cities, using personal resuscitation manikins. In total, 9,200 manikins were distributed in Apucarana, Itanhaém, Maringá, and São Carlos, cities whose populations range from 80,000 to 325,000 inhabitants. Elementary and secondary school teachers were trained on how to identify a cardiac arrest, trigger the emergency system, and perform chest compressions. The teachers were to transfer the training to their students, who would then train their families and friends. In total, 49,131 individuals were trained (6.7% of the population), but the original strategy of using teachers and students as multipliers was responsible for only 27.9% of the training. A total of 508 teachers were trained, and only 88 (17.3%) transferred the training to the students. Furthermore, the students trained only 45 individuals in the population. In Maringá and São Carlos, the strategy was changed: professionals in the primary health care system were prepared and used as multipliers. This strategy proved extremely effective, especially in Maringá, where 39,041 individuals were trained (79.5% of the total number of trainings). Community health care providers were more effective in passing the training to students than the teachers (odds ratio [OR] = 7.12; 95% confidence interval [CI] = 4.74 to 10.69; p ......). Disseminating CPR training using personal manikins via professionals in the primary health care system seems to be a more efficient strategy for training the community than creating a training network in the schools.
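The odds-ratio comparison reported above (OR = 7.12, 95% CI 4.74 to 10.69) is the standard 2×2-table calculation with a Woolf log-scale confidence interval. A sketch with invented counts (the study's underlying table is not given in the abstract):

```python
# Odds ratio with Woolf 95% confidence interval for a 2x2 table.
# The counts below are invented for illustration only.
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """OR and (lo, hi) CI for table [[a, b], [c, d]], Woolf method."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of ln(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# e.g. multipliers who did vs did not pass on training, by group (invented)
or_, lo, hi = odds_ratio_ci(80, 20, 30, 70)
```

An OR whose confidence interval excludes 1 (as in the study) indicates a statistically significant association.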

  12. A new suction mask to reduce leak during neonatal resuscitation: a manikin study.

    Lorenz, Laila; Maxfield, Dominic A; Dawson, Jennifer A; Kamlin, C Omar F; McGrory, Lorraine; Thio, Marta; Donath, Susan M; Davis, Peter G

    2016-09-01

    Leak around the face mask is a common problem during neonatal resuscitation. A newly designed face mask using a suction system to enhance contact between the mask and the infant's face might reduce leak and improve neonatal resuscitation. The aim of the study was to determine whether leak is reduced using the suction mask (Resusi-sure mask) compared with a conventional mask (Laerdal Silicone mask) in a manikin model. Sixty participants from different professional categories (neonatal consultants, fellows, registrars, nurses, midwives and students) used each face mask in a random order to deliver 2 min of positive pressure ventilation to a manikin. Delivered airway pressures were measured using a pressure line. Inspiratory and expiratory flows were measured using a flow sensor, and expiratory tidal volumes and mask leaks were derived from these values. A median (IQR) leak of 12.1 (0.6-39.0)% was found with the conventional mask compared with 0.7 (0.2-4.6)% using the suction mask (p=0.002). 50% of the participants preferred the suction mask and 38% preferred the conventional mask. There was no correlation between leak and operator experience. A new neonatal face mask based on a suction system reduced leak in a manikin model. Clinical studies to test the safety and effectiveness of this mask are needed.
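Deriving mask leak from the flow-sensor volumes, as described above, typically means comparing the inspiratory and expiratory tidal volume of each inflation: leak% = (Vt_insp − Vt_exp) / Vt_insp × 100, summarised as a median across inflations. A minimal sketch with invented volumes:

```python
# Mask leak per inflation from inspiratory/expiratory tidal volumes,
# summarised as a median. The volumes below are invented, not study data.

def mask_leak_percent(vt_insp_ml, vt_exp_ml):
    """Percentage of the delivered volume lost around the mask."""
    return (vt_insp_ml - vt_exp_ml) / vt_insp_ml * 100.0

def median(values):
    s = sorted(values)
    n = len(s)
    mid = n // 2
    return s[mid] if n % 2 else (s[mid - 1] + s[mid]) / 2.0

# (inspiratory, expiratory) tidal volumes in mL for a series of inflations
inflations = [(22.0, 19.5), (21.0, 20.4), (23.0, 18.0), (22.5, 22.0)]
leaks = [mask_leak_percent(vi, ve) for vi, ve in inflations]
med_leak = median(leaks)
```

The median (with IQR) is preferred over the mean here because leak distributions across operators are strongly skewed.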

  13. A comparison of four techniques of emergency transcricoid oxygenation in a manikin.

    Salah, Nazar

    2012-02-01

    Cricothyroidotomy is the final rescue maneuver in difficult airway management. We compared 4 techniques of oxygenation via the cricothyroid membrane in a manikin. The techniques were wire guided, trocar, cannula with jet ventilation, and blade technique (scalpel with endotracheal tube). In the wire-guided group, the time taken to ventilation was slower on all attempts, and there were no successful attempts in <40 seconds. There were no differences between the other groups at any time. Time to ventilation improved with repetition in all groups. Skills were retained at 1 month.

  14. A comparison of four techniques of emergency transcricoid oxygenation in a manikin.

    Salah, Nazar

    2010-04-01

    Cricothyroidotomy is the final rescue maneuver in difficult airway management. We compared 4 techniques of oxygenation via the cricothyroid membrane in a manikin. The techniques were wire guided, trocar, cannula with jet ventilation, and blade technique (scalpel with endotracheal tube). In the wire-guided group, the time taken to ventilation was slower on all attempts, and there were no successful attempts in <40 seconds. There were no differences between the other groups at any time. Time to ventilation improved with repetition in all groups. Skills were retained at 1 month.

  15. Measurement of Indoor Air Quality by Means of a Breathing Thermal Manikin

    Brohus, Henrik

    When a person is located in a contaminant field with significant gradients, the contaminant distribution is modified locally due to the entrainment and transport of room air in the human convective boundary layer, as well as due to the effect of the person acting as an obstacle to the flow field, etc....... The local modification of the concentration distribution may affect the personal exposure significantly and, thus, the indoor air quality actually experienced. In this paper, measurements of indoor air quality by means of a Breathing Thermal Manikin (BTM) are presented.

  16. Calibration in a manikin of high dose rate remote afterloading equipment

    Alfonso La Guardia, Rodolfo; Toledo Jimenez, Pablo; Pich Leon, Victor

    1996-01-01

    The use of High Dose Rate brachytherapy in Cuba has been limited to AGAT-V Soviet installations. In order to calibrate one of these installations for clinical use, a procedure was developed based on direct measurement of the dose absorbed at reference point B of a paraffin manikin. The results obtained from the calibration are shown. Based on these results, the effective doses administered at prescription point A were evaluated using the linear quadratic model.

  17. Influence of chest compression rate guidance on the quality of cardiopulmonary resuscitation performed on manikins.

    Jäntti, H; Silfvast, T; Turpeinen, A; Kiviniemi, V; Uusaro, A

    2009-04-01

    An adequate chest compression rate during CPR is associated with improved haemodynamics and primary survival. To explore whether the use of a metronome would also affect chest compression depth besides the rate, we evaluated CPR quality using a metronome in a simulated CPR scenario. Forty-four experienced intensive care unit nurses participated in two-rescuer basic life support given to manikins in 10-min scenarios. The target chest compression to ventilation ratio was 30:2, performed with bag and mask ventilation. The rescuer performing the compressions was changed every 2 min. CPR was performed first without and then with a metronome that beeped 100 times per minute. The quality of CPR was analysed with manikin software. The effect of rescuer fatigue on CPR quality was analysed separately. The mean compression rate between ventilation pauses was 137±18 compressions per minute (cpm) without and 98±2 cpm with metronome guidance (p......); ...... with metronome guidance (p=0.09). The total number of chest compressions performed was 1022 without metronome guidance, 42% at the correct depth, and 780 with metronome guidance, 61% at the correct depth (p=0.09 for the difference in percentage of compressions at the correct depth). Metronome guidance corrected chest compression rates for each compression cycle to within guideline recommendations, but did not affect chest compression quality or rescuer fatigue.

  18. Comparison of face masks in the bag-mask ventilation of a manikin.

    Redfern, D; Rassam, S; Stacey, M R; Mecklenburgh, J S

    2006-02-01

    We conducted a study investigating the effectiveness of four face mask designs in the bag-mask ventilation of a special manikin adapted to simulate a difficult airway. Forty-eight anaesthetists volunteered to bag-mask ventilate the manikin for 3 min with four different face masks. The primary outcome of the study was to calculate mean percentage leak from the face masks over 3 min. Anaesthetists were also asked to rate the face masks using a visual analogue score. The single-use scented intersurgical face mask had the lowest mean leak (20%). This was significantly lower than the mean leak from the single-use, cushioned 7,000 series Air Safety Ltd. face mask (24%) and the reusable silicone Laerdal face mask (27%) but not significantly lower than the mean leak from the reusable anatomical intersurgical face mask (23%). There was a large variation in both performance and satisfaction between anaesthetists with each design. This highlights the importance of having a variety of face masks available for emergency use.

  19. An assessment of the realism of digital human manikins used for simulation in ergonomics.

    Nérot, Agathe; Skalli, Wafa; Wang, Xuguang

    2015-01-01

    In this study, the accuracy of the joint centres of the manikins generated by RAMSIS and Human Builder (HB), two digital human modelling (DHM) systems widely used in industry for virtual ergonomics simulation, was investigated. Eighteen variously sized females and males were generated from external anthropometric dimensions, and six joint centres (knee, hip and four spine joints) were compared with their anatomical locations obtained from three-dimensional bone reconstructions from a low-dose X-ray system. Both RAMSIS and HB could correctly reproduce external anthropometric dimensions, while the estimation of internal joint centre locations presented an average error of 27.6 mm for HB and 38.3 mm for RAMSIS. Differences between the manikins showed that a more realistic kinematic linkage led to better accuracy in joint location. This study opens the way to further research on the relationship between the external body geometry and the internal skeleton in order to improve the realism of the internal skeleton of DHMs, especially for biomechanical analyses requiring information on joint load and muscle force estimation. This study assessed two digital human modelling (DHM) systems widely used in industry for virtual ergonomics. Results support the need for more realistic human modelling, especially for biomechanical analysis, and for standardisation of DHMs.

  20. Development of an Arduino-based Smart Irrigation Watergate for the Manikin Irrigation Area (Pengembangan Pintu Air Irigasi Pintar berbasis Arduino untuk Daerah Irigasi Manikin)

    Folkes Eduward Laumal

    2017-12-01

    In general, the irrigation watergates in the Manikin Irrigation Area are supporting tools for agricultural activities that implement a primary-secondary-tertiary channel system. The Manikin irrigation watergates are made of iron plates of a certain size and are operated by moving up/down or by rotation. This mechanism has led to service-dissatisfaction problems among farmers. This study developed an Arduino-based smart irrigation watergate by replacing the lifter/rotator with a DC motor that works automatically based on a real-time clock (RTC) sensor. The sensor sends the time to the Arduino, which uses it as the reference to open or close the watergate. The design method included interconnecting the real-time clock sensor and the Arduino, building the control program, building the DC drive system on the watergate, connecting it to the control system, and testing. The test results show that the watergate opens and closes every 2 hours based on the time from the real-time clock, works with a 12-hour time format, and operates on a 2.7 A current.
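The control logic described (an RTC supplies the time, and the Arduino alternates the gate every 2 hours) can be sketched as follows. On the real device this would be Arduino C++ reading a DS1307/DS3231-style RTC; here it is Python, and the alternating-by-2-hour rule and motor interface are assumptions for illustration:

```python
# Sketch of RTC-driven watergate scheduling: the gate alternates every
# 2 hours. The open/close rule and drive_motor interface are assumed,
# not taken from the paper.

def gate_command(hour):
    """Alternate the gate every 2 hours: open on even 2-hour blocks."""
    return "OPEN" if (hour // 2) % 2 == 0 else "CLOSE"

def drive_motor(command):
    # Stand-in for energising the DC motor via a driver (e.g. an H-bridge):
    # +1 raises the gate, -1 lowers it.
    return {"OPEN": +1, "CLOSE": -1}[command]

# One day's schedule, sampled at each 2-hour boundary
schedule = [gate_command(h) for h in range(0, 24, 2)]
```

In the embedded version, the Arduino loop would poll the RTC, call the equivalent of `gate_command`, and only drive the motor when the command changes.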

  1. Videolaryngoscopes differ substantially in illumination of the oral cavity: A manikin study

    Barbe MA Pieters

    2016-01-01

    Background and Aims: Insufficient illumination of the oral cavity during endotracheal intubation may result in suboptimal conditions. Consequently, suboptimal illumination and laryngoscopy may lead to unwanted trauma to soft tissues of the pharyngeal mucosa. We investigated illumination of the oral cavity by different videolaryngoscopes (VLS) in a manikin model. Methods: We measured light intensity from the mouth opening of a Laerdal intubation trainer, comparing different direct and indirect VLS on three occasions resembling optimal to less-than-optimal intubation conditions: in a photographer's dark room, in an operating theatre, and outdoors in bright sunlight. Results: Substantial differences in luminance were detected between VLS. The use of LED light significantly improved light production. All VLS produced substantially higher luminance values in a well-illuminated environment compared with the dark photographer's room. The experiments outdoors, in bright sunlight, were interfered with by direct sunlight penetrating the synthetic material of the manikin, making correct measurement of luminance in the oropharynx invalid. Conclusion: Illumination of the oral cavity differs widely among direct and indirect VLS. The clinician should be aware of the possibility of suboptimal illumination of the oral cavity and the potential risk this poses to the patient.

  2. Performance study of protective clothing against hot water splashes: from bench scale test to instrumented manikin test.

    Lu, Yehu; Song, Guowen; Wang, Faming

    2015-03-01

Hot liquid hazards present in work environments pose a considerable risk to industrial workers. In this study, the predicted protection of fabrics was assessed by a modified hot liquid splash tester, under conditions with and without an air spacer. The protective performance of a garment exposed to hot water spray was investigated with a spray manikin evaluation system. A three-dimensional body scanning technique was used to characterize the air gap between the protective clothing and the manikin skin. The relationship between the bench scale test and the manikin test was discussed, and a regression model was established to predict the overall percentage of skin burn while wearing protective clothing. The results demonstrated strong correlations between the bench scale test and the manikin test. Based on these studies, the overall performance of protective clothing against hot water spray can be estimated from the results of the bench scale hot water splash test together with the air gap size entrapped in the clothing. The findings provide effective guidance for design and material selection when developing high-performance protective clothing. Published by Oxford University Press on behalf of the British Occupational Hygiene Society 2014.

  3. Disseminating cardiopulmonary resuscitation training by distributing 35,000 personal manikins among school children

    Isbye, Dan L; Rasmussen, Lars S; Ringsted, Charlotte

    2007-01-01

BACKGROUND: Because most cardiac arrests occur at home, widespread training is needed to increase the incidence of cardiopulmonary resuscitation (CPR) by lay persons. The aim of this study was to evaluate the effect of mass distribution of CPR instructional materials among schoolchildren. METHODS AND RESULTS: We distributed 35,002 resuscitation manikins to pupils (12 to 14 years of age) at 806 primary schools. Using the enclosed 24-minute instructional DVD, they trained in CPR and subsequently used the kit to train family and friends (second tier). They completed a questionnaire on who had trained in CPR using the kit. Teachers also were asked to evaluate the project. The incidence of bystander CPR in out-of-hospital cardiac arrest in the months following the project was compared with the previous year. In total, 6947 questionnaires (19.8%) were returned. The 6947 kits had been used to train 17...

  4. Computational modeling of particle transport and distribution emitted from a Laserjet printer in a ventilated room with different ventilation configurations

    Ansaripour, Mehrzad; Abdolzadeh, Morteza; Sargazizadeh, Saleh

    2016-01-01

Highlights: • The distribution of particles emitted from a Laserjet printer was studied in the breathing zone. • Effects of different ventilation configurations on the breathing zone concentration were investigated. • The mixing ventilation system has a low mean particle concentration in the breathing zone. - Abstract: In the present research, computational modeling of the transport and distribution of particles emitted from a Laserjet printer was carried out in a ventilated room. A seated manikin was integrated into the study room and evaluated in two cases: heated and unheated. Effects of different ventilation configurations of the room on the particle distribution were studied, including three displacement ventilation systems and a mixing ventilation system. The printer was located on different sides of the manikin, and the particle concentrations in the breathing zone of the manikin due to the printer's particles were evaluated in all the ventilation configurations. The averaged particle concentration in the breathing zone of the manikin was calculated and validated against the experimental and numerical data available in the literature. The results of the present study showed that in the case of the heated manikin, the particle concentration due to the printer pollutants is significant in the breathing zone. The results also showed that when the printer is located on the front side of the manikin, the particle concentration in the breathing zone is quite high in most of the ventilation configurations used. Furthermore, it was found that the mixing ventilation system has a lower mean particle concentration in the breathing zone compared to most of the displacement ventilation systems.

  5. The effect of differing support surfaces on the efficacy of chest compressions using a resuscitation manikin model.

    Tweed, M; Tweed, C; Perkins, G D

    2001-11-01

External chest compression (ECC) efficacy is influenced by factors including the surface supporting the patient. Air-filled support surfaces are deflated for cardiopulmonary resuscitation, with little evidence to substantiate this. We investigated the effect that differing support surfaces had on ECC efficacy using a CPR manikin model. Four participants carried out four cycles of ECC with an assistant ventilating. The subjects were blinded to the seven support surfaces and the order was randomised. For each participant/surface combination, ECC variables and the participants' perceptions were measured. Participants produced effective ECC with the manikin on the floor (mean proportion correct, 94.5%; mean depth, 42.5 mm). Compared with the floor, the proportion of correct ECC was lower for the inflated overlay (P […] CPR.

  6. An Appropriate Compression Pace is Important for Securing the Quality of Hands-only CPR : A manikin study

    Shimizu, Yoshitaka; Tanigawa, Koichi; Ishikawa, Masami; Ouhara, Kazuhisa; Oue, Kana; Yoshinaka, Taiga; Kurihara, Hidemi; Irifune, Masahiro

    2014-01-01

    It is important to implement good quality chest compressions for cardiopulmonary resuscitation (CPR). This manikin study examined the effects of different compression rates on chest compression depth variables using a metronome sound guide. Fifty sixth-year dentistry students participated in the study. Each participant performed CPR at 3 different compression rates, 110, 100, and 90 compressions per min (pace-110-g, pace-100-g, and pace-90-g) for 2 consecutive one-minute sets with a ten-secon...

  7. Two-thumb technique is superior to two-finger technique during lone rescuer infant manikin CPR.

    Udassi, Sharda; Udassi, Jai P; Lamb, Melissa A; Theriaque, Douglas W; Shuster, Jonathan J; Zaritsky, Arno L; Haque, Ikram U

    2010-06-01

Infant CPR guidelines recommend two-finger chest compression with a lone rescuer and two-thumb with two rescuers. Two-thumb provides better chest compression but is perceived to be associated with increased ventilation hands-off time. We hypothesized that lone-rescuer two-thumb CPR is associated with increased ventilation cycle time, decreased ventilation quality and fewer chest compressions compared to two-finger CPR in an infant manikin model. Crossover observational study randomizing 34 healthcare providers to perform 2 min of CPR at a compression rate of 100 min(-1) using a 30:2 compression:ventilation ratio, comparing two-thumb vs. two-finger techniques. A Laerdal Baby ALS Trainer manikin was modified to digitally record compression rate, compression depth, compression pressure and ventilation cycle time (two mouth-to-mouth breaths). Manikin chest rise with breaths was video recorded and later reviewed by two blinded CPR instructors for percent effective breaths. Data (mean ± SD) were analyzed using a two-tailed paired t-test. Significance was defined as p […] CPR, but there was no significant difference in percent effective breaths delivered between the two techniques. Two-thumb CPR had 4 fewer delivered compressions per minute, which may be offset by far more effective compression depth and compression pressure compared to the two-finger technique. Copyright 2010 Elsevier Ireland Ltd. All rights reserved.
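The analysis step named above (a two-tailed paired t-test on mean ± SD data) can be reproduced with a short stdlib-only sketch; the sample depths below are synthetic, not the study's recordings:

```python
from math import sqrt
from statistics import mean, stdev


def paired_t(a, b):
    """Two-tailed paired t-test statistic and degrees of freedom.

    Returns (t, df); the p-value lookup against Student's t distribution
    is omitted to keep the sketch stdlib-only.
    """
    diffs = [x - y for x, y in zip(a, b)]
    n = len(diffs)
    t = mean(diffs) / (stdev(diffs) / sqrt(n))
    return t, n - 1


# Synthetic compression depths in mm (NOT the study's data):
two_thumb = [40.0, 42.0, 41.0, 43.0]
two_finger = [35.0, 36.0, 34.0, 37.0]
t_stat, df = paired_t(two_thumb, two_finger)  # large positive t: two-thumb deeper
```

With real data one would compare `t_stat` against the t distribution with `df` degrees of freedom (e.g. via `scipy.stats.ttest_rel`).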

  8. The impact of chest compression rates on quality of chest compressions - a manikin study.

    Field, Richard A; Soar, Jasmeet; Davies, Robin P; Akhtar, Naheed; Perkins, Gavin D

    2012-03-01

    Chest compressions are often performed at a variable rate during cardiopulmonary resuscitation (CPR). The effect of compression rate on other chest compression quality variables (compression depth, duty-cycle, leaning, performance decay over time) is unknown. This randomised controlled cross-over manikin study examined the effect of different compression rates on the other chest compression quality variables. Twenty healthcare professionals performed 2 min of continuous compressions on an instrumented manikin at rates of 80, 100, 120, 140 and 160 min(-1) in a random order. An electronic metronome was used to guide compression rate. Compression data were analysed by repeated measures ANOVA and are presented as mean (SD). Non-parametric data was analysed by Friedman test. At faster compression rates there were significant improvements in the number of compressions delivered (160(2) at 80 min(-1) vs. 312(13) compressions at 160 min(-1), P<0.001); and compression duty-cycle (43(6)% at 80 min(-1) vs. 50(7)% at 160 min(-1), P<0.001). This was at the cost of a significant reduction in compression depth (39.5(10)mm at 80 min(-1) vs. 34.5(11)mm at 160 min(-1), P<0.001); and earlier decay in compression quality (median decay point 120 s at 80 min(-1) vs. 40s at 160 min(-1), P<0.001). Additionally not all participants achieved the target rate (100% at 80 min(-1) vs. 70% at 160 min(-1)). Rates above 120 min(-1) had the greatest impact on reducing chest compression quality. For Guidelines 2005 trained rescuers, a chest compression rate of 100-120 min(-1) for 2 min is feasible whilst maintaining adequate chest compression quality in terms of depth, duty-cycle, leaning, and decay in compression performance. Further studies are needed to assess the impact of the Guidelines 2010 recommendation for deeper and faster chest compressions. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  9. Comparison of the quality of chest compressions on a dressed versus an undressed manikin: A controlled, randomised, cross-over simulation study

    Brindley Peter G

    2010-03-01

Abstract Background Undressing the chest of a cardiac arrest victim may delay the initiation of chest compressions. Furthermore, expecting laypeople to undress the chest may increase bystander reluctance to perform cardiopulmonary resuscitation (CPR). Both of these factors might conceivably decrease survival following cardiac arrest. Therefore, the aim of this study was to examine whether the presence or absence of clothes affected the quality of chest compressions during CPR on a simulator manikin. Methods Thirty laypeople and 18 firefighters were randomised to start CPR on the thorax of a manikin that was either clothed (three layers) or not. Data were obtained via recordings from the manikin and audio and video recordings. Measurements were: maximum compression depth; compression rate; percentage of compressions with correct hand positioning; percentage of compressions with complete release (≤ 10 mm); and percentage of compressions of the correct depth (range 40-50 mm). Laypeople were given a four-hour European Resuscitation Council standardised course in basic life support and tested immediately after. Firefighters were tested without additional training. Mock cardiac arrest scenarios consisted of three minutes of CPR separated by 15 minutes of rest. Results No significant differences were found between CPR performed on an undressed manikin compared to a dressed manikin, for laypeople or firefighters. However, undressing the manikin was associated with a mean delay in the initiation of chest compressions by laypeople of 23 seconds (N = 15; 95% CI: 19 to 27). Conclusions In this simulator manikin study, there was no benefit gained in terms of how well CPR was performed by undressing the thorax. Furthermore, undressing the thorax delayed initiation of CPR by laypeople, which might be clinically detrimental for survival.

  10. Feasibility of Augmented Reality in Clinical Simulations: Using Google Glass With Manikins.

Chaballout, Basil; Molloy, Margory; Vaughn, Jacqueline; Brisson III, Raymond; Shaw, Ryan

    2016-03-07

    Studies show that students who use fidelity-based simulation technology perform better and have higher retention rates than peers who learn in traditional paper-based training. Augmented reality is increasingly being used as a teaching and learning tool in a continual effort to make simulations more realistic for students. The aim of this project was to assess the feasibility and acceptability of using augmented reality via Google Glass during clinical simulation scenarios for training health science students. Students performed a clinical simulation while watching a video through Google Glass of a patient actor simulating respiratory distress. Following participation in the scenarios students completed two surveys and were questioned if they would recommend continued use of this technology in clinical simulation experiences. We were able to have students watch a video in their field of vision of a patient who mimicked the simulated manikin. Students were overall positive about the implications for being able to view a patient during the simulations, and most students recommended using the technology in the future. Overall, students reported perceived realism with augmented reality using Google Glass. However, there were technical and usability challenges with the device. As newer portable and consumer-focused technologies become available, augmented reality is increasingly being used as a teaching and learning tool to make clinical simulations more realistic for health science students. We found Google Glass feasible and acceptable as a tool for augmented reality in clinical simulations.

  11. Optimal chest compression rate in cardiopulmonary resuscitation: a prospective, randomized crossover study using a manikin model.

    Lee, Seong Hwa; Ryu, Ji Ho; Min, Mun Ki; Kim, Yong In; Park, Maeng Real; Yeom, Seok Ran; Han, Sang Kyoon; Park, Seong Wook

    2016-08-01

When performing cardiopulmonary resuscitation (CPR), the 2010 American Heart Association guidelines recommend a chest compression rate of at least 100 min(-1), whereas the 2010 European Resuscitation Council guidelines recommend a rate of between 100 and 120 min(-1). The aim of this study was to examine the rate of chest compression that fulfilled various quality indicators, thereby determining the optimal rate of compression. Thirty-two trainee emergency medical technicians and six paramedics were enrolled in this study. All participants had been trained in basic life support. Each participant performed 2 min of continuous compressions on a skill reporter manikin, while listening to a metronome sound at rates of 100, 120, 140, and 160 beats/min, in a random order. Mean compression depth, incomplete chest recoil, and the proportion of correctly performed chest compressions during the 2 min were measured and recorded. The rate of incomplete chest recoil was lower at compression rates of 100 and 120 min(-1) compared with that at 160 min(-1) (P=0.001). The number of compressions that fulfilled the criteria for high-quality CPR at a rate of 120 min(-1) was significantly higher than that at 100 min(-1) (P=0.016). The number of high-quality CPR compressions was highest at a compression rate of 120 min(-1), and incomplete recoil increased with increasing compression rate. However, further studies are needed to confirm the results.

  12. Tracheal intubation by inexperienced medical residents using the Airtraq and Macintosh laryngoscopes--a manikin study.

    Maharaj, Chrisen H

    2006-11-01

    The Airtraq laryngoscope is a novel intubation device that may possess advantages over conventional direct laryngoscopes for use by personnel that are infrequently required to perform tracheal intubation. We conducted a prospective study in 20 medical residents with little prior airway management experience. After brief didactic instruction, each participant took turns performing laryngoscopy and intubation using the Macintosh (Welch Allyn, Welch Allyn, NY) and Airtraq (Prodol Ltd. Vizcaya, Spain) devices, in 3 laryngoscopy scenarios in a Laerdal Intubation Trainer (Laerdal, Stavanger, Norway) and 1 scenario in a Laerdal SimMan manikin (Laerdal, Kent, UK). They then performed tracheal intubation of the normal airway a second time to characterize the learning curve. In all scenarios tested, the Airtraq decreased the duration of intubation attempts, reduced the number of optimization maneuvers required, and reduced the potential for dental trauma. The residents found the Airtraq easier to use in all scenarios compared with the Macintosh laryngoscope. The Airtraq may constitute a superior device for use by personnel infrequently required to perform tracheal intubation.

  13. Evaluation report on research and development of a database system for mutual computer operation; Denshi keisanki sogo un'yo database system no kenkyu kaihatsu ni kansuru hyoka hokokusho

    NONE

    1992-03-01

This paper describes an evaluation of the research and development of a database system for mutual computer operation, with respect to distributed database technology, multimedia technology, high-reliability technology, and mutual-operation network system technology. A large number of forward-looking research results were derived, such as the issues of distribution and utilization patterns of the distributed database, structuring of data for multimedia information, retrieval systems, flexible and high-level utilization of the network, and the issues in database protection. These achievements are widely disclosed to the public. The largest feature of this project is its aim of forming a network system that can be operated mutually in a multi-vendor environment. Therefore, the research and development were executed in a spirit of openness to the public and international cooperation. These efforts are represented by the organization of the rule establishment committee, the execution of mutual interconnection experiments (including demonstration evaluation), and the development of implementation rules based on the ISO's Open Systems Interconnection (OSI). The results are compiled in the JIS as the basic reference model for open systems interconnection, and the targets shown in the basic plan have been achieved sufficiently. (NEDO)


  15. Nasogastric tube placement with video-guided laryngoscope: A manikin simulator study.

    Lee, Xiao-Lun; Yeh, Li-Chun; Jin, Yau-Dung; Chen, Chun-Chih; Lee, Ming-Ho; Huang, Ping-Wun

    2017-08-01

This study aimed to investigate video-guided laryngoscopy for nasogastric tube placement. This was an observational comparative study performed in a hospital. The participants included volunteers from the medical staff (physicians and nurses) experienced with nasogastric intubation, and non-medical staff (medical students, pharmacists and emergency medical technicians) with knowledge of nasogastric intubation but lacking procedural experience. Medical and non-medical hospital staff performed manual, laryngoscope-assisted and video-guided laryngoscope nasogastric intubation, both in the presence and in the absence of an endotracheal tube, using a manikin. Nasogastric intubation times were compared between groups and methods. Using the video-guided laryngoscope resulted in a significantly shorter intubation time compared to the other two methods, both with and without an endotracheal tube, for medical and non-medical staff alike (all p […]). For medical staff, nasogastric intubation time was significantly shorter using the video-guided laryngoscope without endotracheal intubation, the direct laryngoscope with endotracheal intubation and the video-guided laryngoscope with endotracheal intubation compared to manual intubation without endotracheal intubation (0.49, 0.63 and 0.72 vs. 5.63 min, respectively, p ≤ 0.008). For non-medical staff, nasogastric intubation time was significantly shorter using the video-guided laryngoscope without endotracheal intubation, the direct laryngoscope with endotracheal intubation and the video-guided laryngoscope with endotracheal intubation compared to manual intubation without endotracheal intubation (1.67, 1.58 and 0.95 vs. 6.9 min, respectively, p ≤ 0.002). The mean nasogastric intubation time for video-guided laryngoscope endotracheal intubation was significantly shorter for medical staff than for non-medical staff (0.49 vs. 1.67 min, respectively, p = 0.041). The video-guided laryngoscope reduces nasogastric intubation time compared to manual and direct laryngoscope intubation, which promotes a consistent technique when performed by

  16. Metronome improves compression and ventilation rates during CPR on a manikin in a randomized trial.

    Kern, Karl B; Stickney, Ronald E; Gallison, Leanne; Smith, Robert E

    2010-02-01

We hypothesized that a unique tock-and-voice metronome could prevent both suboptimal chest compression rates and hyperventilation. A prospective, randomized, parallel design study involving 34 pairs of paid firefighter/emergency medical technicians (EMTs) performing two-rescuer CPR using a Laerdal SkillReporter Resusci Anne manikin with and without metronome guidance was performed. Each CPR session consisted of 2 min of 30:2 CPR with an unsecured airway, then 4 min of CPR with a secured airway (continuous compressions at 100 min(-1) with 8-10 ventilations/min), repeated after the rescuers switched roles. The metronome provided "tock" prompts for compressions, transition prompts between compressions and ventilations, and a spoken "ventilate" prompt. During CPR with a bag/valve/mask, the target compression rate of 90-110 min(-1) was achieved in 5/34 CPR sessions (15%) for the control group and 34/34 sessions (100%) for the metronome group (p […]) metronome or control group during CPR with a bag/valve/mask. During CPR with a bag/endotracheal tube, the target of both a compression rate of 90-110 min(-1) and a ventilation rate of 8-11 min(-1) was achieved in 3/34 CPR sessions (9%) for the control group and 33/34 sessions (97%) for the metronome group (p […]). Metronome use with the secured airway scenario significantly decreased the incidence of over-ventilation (11/34 EMT pairs vs. 0/34 EMT pairs; p […]). The metronome was effective at directing correct chest compression and ventilation rates both before and after intubation. Copyright 2009 Elsevier Ireland Ltd. All rights reserved.
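The metronome behaviour described ("tock" compression prompts at 100 min(-1) plus spoken "ventilate" prompts roughly 10 times per minute with a secured airway) can be sketched as a prompt timeline generator. This is an illustrative reconstruction of the cueing schedule, not the commercial device's actual logic:

```python
def metronome_prompts(duration_s: float, comp_rate: int = 100, vent_rate: int = 10):
    """Timeline of metronome prompts for the secured-airway scenario.

    'tock' cues continuous compressions at comp_rate per minute; a spoken
    'ventilate' cue fires vent_rate times per minute. Illustrative only;
    the real device also issues transition prompts, omitted here.
    """
    n_comp = int(duration_s * comp_rate / 60)
    n_vent = int(duration_s * vent_rate / 60)
    events = [(i * 60.0 / comp_rate, "tock") for i in range(n_comp)]
    events += [(i * 60.0 / vent_rate, "ventilate") for i in range(n_vent)]
    return sorted(events)  # merged, time-ordered prompt schedule


# One minute of guidance: 100 tocks and 10 ventilate prompts
prompts = metronome_prompts(60.0)
```

In a real implementation each event would trigger a sound at its timestamp; here the schedule itself is the output.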

  17. Training mothers in infant cardiopulmonary resuscitation with an instructional DVD and manikin.

    Barr, Gavin C; Rupp, Valerie A; Hamilton, Kimberly M; Worrilow, Charles C; Reed, James F; Friel, Kristin S; Dusza, Stephen W; Greenberg, Marna Rayl

    2013-07-01

Classes in infant cardiopulmonary resuscitation (CPR) can be time consuming and costly. To determine whether mothers in an obstetric unit could learn infant CPR by using a 22-minute instructional kit, and to assess the value and confidence they gained by learning CPR. Quasi-experimental study with enrollment between January and December 2008. Obstetric unit in Lehigh Valley Hospital, a suburban teaching hospital in Allentown, Pennsylvania. Mothers at least 18 years old who had given birth within the previous 24 hours. The experimental group included mothers without prior CPR training who watched a 22-minute instructional DVD and practiced on a manikin. The control group included mothers with prior conventional CPR training. In both groups, knowledge and proficiency were assessed with written and practical examinations developed by certified CPR instructors. Participant surveys were conducted at three time points: immediately before dissemination of course materials, within 24 hours after the mother agreed to participate in the study, and 6 months after the initial evaluation. A total of 126 mothers were enrolled in the study: 79 in the experimental group, 25 in the control group, and 22 who withdrew from the study. Written and practical examinations were used to determine proficiency, and composite scores were generated, with a maximum composite score of 12. The composite scores were statistically significantly higher in the experimental group than in the control group, with median scores of 10 and 7, respectively (P […] CPR training. In the experimental group, 76 mothers (96%) felt more confident as caregivers after learning CPR. Before training, in both groups, 84 mothers (81%) stated that learning CPR was extremely important, compared with 100 mothers (96%) after training (P=.001). Use of an instructional kit is an effective method of teaching CPR to new mothers. Mothers reported that learning CPR is extremely important and that it increases their confidence as caregivers.

  18. Decay in chest compression quality due to fatigue is rare during prolonged advanced life support in a manikin model

    Bjørshol Conrad A

    2011-08-01

Abstract Background The aim of this study was to measure chest compression decay during simulated advanced life support (ALS) in a cardiac arrest manikin model. Methods 19 paramedic teams, each consisting of three paramedics, performed ALS for 12 minutes with the same paramedic providing all chest compressions. The patient was a resuscitation manikin found in ventricular fibrillation (VF). The first shock terminated the VF and the patient remained in pulseless electrical activity (PEA) throughout the scenario. Average chest compression depth and rate were measured each minute for 12 minutes and divided into three groups based on chest compression quality: good (compression depth ≥ 40 mm and compression rate 100-120/minute for each minute of CPR), bad (initial compression depth < 40 mm or compression rate < 100 or > 120/minute), or decay (change from good to bad during the 12 minutes). Changes over time in the no-flow ratio (NFR), defined as the time without chest compressions divided by the total time of the ALS scenario, were also measured. Results Based on compression depth, 5 (26%), 9 (47%) and 5 (26%) were good, bad and with decay, respectively. Only one paramedic experienced decay within the first two minutes. Based on compression rate, 6 (32%), 6 (32%) and 7 (37%) were good, bad and with decay, respectively. NFR was 22% in both the 1-3 and 4-6 minute periods, but decreased to 14% in the 7-9 minute period (P = 0.002) and to 10% in the 10-12 minute period (P […]). Conclusions In this simulated cardiac arrest manikin study, only half of the providers achieved the guideline-recommended compression depth during prolonged ALS. Large inter-individual differences in chest compression quality were already present from the initiation of CPR. Chest compression decay, and thereby fatigue, within the first two minutes was rare.
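The no-flow ratio (NFR) used above is simply hands-off time divided by total scenario time; a small helper illustrates the computation (the pause intervals below are made up for the example, chosen to reproduce the 22% reported for the 1-3 minute period):

```python
def no_flow_ratio(pauses, total_time_s):
    """NFR = time without chest compressions / total scenario time.

    `pauses` is a list of (start_s, end_s) intervals with no compressions.
    """
    hands_off = sum(end - start for start, end in pauses)
    return hands_off / total_time_s


# Example: a 3-minute (180 s) period with 39.6 s of pauses -> NFR = 22%
pauses_1_3 = [(0.0, 10.0), (60.0, 75.0), (120.0, 134.6)]
nfr_example = no_flow_ratio(pauses_1_3, 180.0)
```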

  19. Poor chest compression quality with mechanical compressions in simulated cardiopulmonary resuscitation: a randomized, cross-over manikin study.

    Blomberg, Hans; Gedeborg, Rolf; Berglund, Lars; Karlsten, Rolf; Johansson, Jakob

    2011-10-01

Mechanical chest compression devices are being implemented as an aid in cardiopulmonary resuscitation (CPR), despite lack of evidence of improved outcome. This manikin study evaluates the CPR performance of ambulance crews who had had a mechanical chest compression device implemented in their routine clinical practice 8 months previously. The objectives were to evaluate time to first defibrillation, no-flow time, and to estimate the quality of compressions. The performance of 21 ambulance crews (ambulance nurse and emergency medical technician) with authorization to perform advanced life support was studied in an experimental, randomized cross-over study in a manikin setup. Each crew performed two identical CPR scenarios, with and without the aid of the mechanical compression device LUCAS. A computerized manikin was used for data sampling. There were no substantial differences in time to first defibrillation or no-flow time until first defibrillation. However, the fraction of adequate compressions in relation to total compressions was remarkably low in LUCAS-CPR (58%) compared to manual CPR (88%) (95% confidence interval for the difference: 13-50%). Only 12 of the 21 ambulance crews (57%) applied the mandatory stabilization strap on the LUCAS device. The use of a mechanical compression aid was not associated with substantial differences in time to first defibrillation or no-flow time in the early phase of CPR. However, constant but poor chest compressions due to failure to recognize and correct a malposition of the device may counteract a potential benefit of mechanical chest compressions. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
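A 95% confidence interval for the difference between two adequate-compression fractions (58% vs. 88% above) can be computed with a standard Wald interval for a difference of proportions. The denominators below are illustrative, since the per-crew compression counts are not given in the abstract:

```python
from math import sqrt


def diff_proportion_ci(x1, n1, x2, n2, z=1.96):
    """Wald confidence interval for p2 - p1 (difference of two proportions).

    z = 1.96 gives the usual 95% interval.
    """
    p1, p2 = x1 / n1, x2 / n2
    se = sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    diff = p2 - p1
    return diff - z * se, diff + z * se


# Illustrative counts: 58/100 adequate with LUCAS vs. 88/100 with manual CPR
lo, hi = diff_proportion_ci(58, 100, 88, 100)
```

With the study's real (clustered, per-crew) data, a method accounting for within-crew correlation would be preferred over this simple Wald interval.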

  20. Placement of the Left Side AED Pad is Poor: Training on the Left Compared to the Right Side of a Manikin Does Not Improve Pad Placement

    Stærk, Mathilde; Bødtker, Henrik; Rahbek, Søren

    2015-01-01

[…] participating in a first aid course were randomized to learn automated external defibrillation sitting on the left or right side of a manikin during AED training. After course completion, participants operated a training AED (Lifepak® CR-T AED Trainer, Physio-Control) and placed AED pads according to instructions. […] volunteers were included and randomized to AED training on the left (n=14, 43% male, age: 47.9 years) and right (n=16, 25% male, age: 46.7 years) side of a manikin. There was no difference in left pad placement when trained on the left or right side (distance to recommended left apical pad position (mean...

  1. Industrial and scientific technology research and development project fiscal 1997 commissioned by the New Energy and Industrial Technology Development Organization. Report on research and development of a brain type computer architecture `trial fabrication and high-level evaluation on chips for a large-scale artificial neural system` and on results of evaluation studies on the chips; 1997 nendo sangyo kagaku gijutsu kenkyu kaihatsu jigyo Shin energy Sangyo Gijutsu Sogo Kaihatsu Kiko itaku. Nogata computer architecture no kenkyu kaihatsu `daikibo jinko shinkei kairo system yo chip no shisaku to kodo hyoka` chip hyoka kenkyu seika hokokusho

    NONE

    1998-03-01

    In order to develop a brain type computer architecture, a brain mimic processor (BMP) has been developed (fabricated as an LSI in fiscal 1997), and studies are being made on developing its applications. An element has been developed in which one million linkages among one thousand neural cells can be realized. Evaluation substrates mounted with the BMP, control software, and compilers were provided to 15 organizations, including national research institutes, universities, corporations, and other societies (for 19 themes), to evaluate the capability of the LSI and its application development. All of the research organizations are using these items, utilizing features of the LSI or the network for such purposes as learning, storage, recognition and control. Applicable themes include infrared spectrum pattern recognition and domain division of document images using neural network functions. They also include structural analysis of mass spectrum molecules, time series pattern recognition, location of corresponding points in images, estimation of moving images, satellite control, character recognition, short-term storage, long-term association memory models, and invention process studies, all utilizing the functions of the BMP. 71 refs., 89 figs., 9 tabs.

  2. Thermal comfort sustained by cold protective clothing in Arctic open-pit mining-a thermal manikin and questionnaire study.

    Jussila, Kirsi; Rissanen, Sirkka; Aminoff, Anna; Wahlström, Jens; Vaktskjold, Arild; Talykova, Ljudmila; Remes, Jouko; Mänttäri, Satu; Rintamäki, Hannu

    2017-12-07

    Workers in Arctic open-pit mines are exposed to harsh weather conditions. Employers are required to provide protective clothing for workers. This can be the outer layer, but sometimes inner or middle layers are provided as well. This study aimed to determine how Arctic open-pit miners protect themselves against cold, whether the garments are sufficient, and on what criteria they are selected. Workers' cold experiences and the clothing used in four Arctic open-pit mines in Finland, Sweden, Norway and Russia were evaluated by a questionnaire (n=1,323). Basic thermal insulation (Icl) of the reported clothing was estimated (ISO 9920). The Icl of clothing from the mines was also measured by thermal manikin (standing/walking) in 0.3 and 4.0 m/s wind. The questionnaire showed that the Icl of the selected clothing was on average 1.2 and 1.5 clo in mild (-5 to +5°C) and dry cold (-20 to -10°C) conditions, respectively. The Icl of the clothing measured by thermal manikin was 1.9-2.3 clo. The results show that the Arctic open-pit miners selected their clothing based on occupational (time outdoors), environmental (temperature, wind, moisture) and individual factors (cold sensitivity, general health). However, the selected clothing was not sufficient to prevent cooling completely at ambient temperatures below -10°C.
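A minimal sketch of the additive basic-insulation estimate, in the spirit of the ISO 9920 approach this study used; the garment names and clo values are illustrative assumptions, not the questionnaire data, and 1 clo = 0.155 m²·K/W.

```python
# Illustrative sketch of estimating basic clothing insulation (I_cl) by
# summing per-garment clo values, as ISO 9920 tables allow. The garment
# values here are assumptions, not the study's questionnaire data.

CLO_TO_SI = 0.155  # 1 clo = 0.155 m^2*K/W

def ensemble_clo(garment_clos):
    """Additive estimate; ISO 9920 also provides regression corrections."""
    return sum(garment_clos)

outfit = {"thermal underwear": 0.30, "fleece jacket": 0.35,
          "insulated overalls": 0.90}
i_cl = ensemble_clo(outfit.values())
print(f"{i_cl:.2f} clo = {i_cl * CLO_TO_SI:.3f} m2*K/W")  # → 1.55 clo = 0.240 m2*K/W
```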

  3. Neopuff T-piece resuscitator mask ventilation: Does mask leak vary with different peak inspiratory pressures in a manikin model?

    Maheshwari, Rajesh; Tracy, Mark; Hinder, Murray; Wright, Audrey

    2017-08-01

    The aim of this study was to compare mask leak with three different peak inspiratory pressure (PIP) settings during T-piece resuscitator (TPR; Neopuff) mask ventilation on a neonatal manikin model. Participants were neonatal unit staff members. They were instructed to provide mask ventilation with a TPR with three PIP settings (20, 30, 40 cmH2O) chosen in a random order. Each episode was for 2 min with a 2-min rest period. Flow rate and positive end-expiratory pressure (PEEP) were kept constant. Airway pressure, inspiratory and expiratory tidal volumes, mask leak, respiratory rate and inspiratory time were recorded. Repeated measures analysis of variance was used for statistical analysis. A total of 12 749 inflations delivered by 40 participants were analysed. There were no statistically significant differences (P > 0.05) in the mask leak with the three PIP settings. No statistically significant differences were seen in respiratory rate and inspiratory time with the three PIP settings. There was a significant rise in PEEP as the PIP increased. Failure to achieve the desired PIP was observed especially at the higher settings. In a neonatal manikin model, the mask leak does not vary as a function of the PIP when the flow rate is constant. With a fixed rate and inspiratory time, there seems to be a rise in PEEP with increasing PIP. © 2017 Paediatrics and Child Health Division (The Royal Australasian College of Physicians).
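Mask leak in such manikin studies is commonly derived from the recorded inspiratory and expiratory tidal volumes; a sketch under that assumption (the record does not spell out the exact formula), with made-up sample volumes:

```python
# Common definition of mask leak from respiratory function monitor data:
# the share of inspired volume that does not return on expiration.
# Assumed here for illustration; the sample volumes are made up.

def mask_leak_percent(vti_ml, vte_ml):
    """Leak (%) from inspiratory (VTi) and expiratory (VTe) tidal volume."""
    return (vti_ml - vte_ml) / vti_ml * 100.0

print(round(mask_leak_percent(10.0, 7.0), 1))  # → 30.0
```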

  4. Mask leak increases and minute ventilation decreases when chest compressions are added to bag ventilation in a neonatal manikin model.

    Tracy, Mark B; Shah, Dharmesh; Hinder, Murray; Klimek, Jan; Marceau, James; Wright, Audrey

    2014-05-01

    To determine changes in respiratory mechanics when chest compressions are added to mask ventilation, as recommended by the International Liaison Committee on Resuscitation (ILCOR) guidelines for newborn infants. Using a Laerdal Advanced Life Support leak-free baby manikin and a 240-mL self-inflating bag, 58 neonatal staff members were randomly paired to provide mask ventilation, followed by mask ventilation with chest compressions with a 1:3 ratio, for two minutes each. A Florian respiratory function monitor was used to measure respiratory mechanics, including mask leak. The addition of chest compressions to mask ventilation led to a significant reduction in inflation rate, from 63.9 to 32.9 breaths per minute (p mask leak of 6.8% (p mask ventilation, in accordance with the ILCOR guidelines, in a manikin model is associated with a significant reduction in delivered ventilation and increase in mask leak. If similar findings occur in human infants needing an escalation in resuscitation, there is a potential risk of either delay in recovery or inadequate response to resuscitation. ©2014 Foundation Acta Paediatrica. Published by John Wiley & Sons Ltd.

  5. Segmental equivalent temperature determined by means of a thermal manikin: A method for correcting errors due to incomplete contact of the body with a surface

    Melikov, Arsen Krikor; Janieas, N.R.D.J.; Silva, M.C.G.

    2004-01-01

    of the thermal manikins used at present is not as flexible as the human body and is divided into body segments with a surface area that differs from that of the human body in contact with a surface. The area of the segment in contact with a surface will depend on the shape and flexibility of the surface...

  6. Acquisition and retention of basic life support skills in an untrained population using a personal resuscitation manikin and video self-instruction (VSI)

    Nielsen, Anne Møller; Henriksen, Mikael Johannes Vuokko; Isbye, Dan Lou

    2010-01-01

    Video-based self-instruction (VSI) with a 24-min DVD and a personal resuscitation manikin solves some of the barriers associated with traditional basic life support (BLS) courses. No accurate assessment of the actual improvement in skills after attending a VSI course has been determined...

  7. The Impact of Learning Style on Healthcare Providers' Preference for Voice Advisory Manikins versus Live Instructors in Basic Life Support Training

    DiGiovanni, Lisa Marie

    2013-01-01

    The American Heart Association's HeartCode[TM] Healthcare Provider (HCP) Basic Life Support (BLS) e-learning program with voice-advisory manikins was implemented in an acute care hospital as the only teaching method offered for BLS certification. On course evaluations, healthcare provider staff commented that the VAM technology for skills practice…

  8. An appropriate compression pace is important for securing the quality of hands-only CPR--a manikin study.

    Shimizu, Yoshitaka; Tanigawa, Koichi; Ishikawa, Masami; Ouhara, Kazuhisa; Oue, Kana; Yoshinaka, Taiga; Kurihara, Hidemi; Irifune, Masahiro

    2014-09-01

    It is important to implement good quality chest compressions for cardiopulmonary resuscitation (CPR). This manikin study examined the effects of different compression rates on chest compression depth variables using a metronome sound guide. Fifty sixth-year dentistry students participated in the study. Each participant performed CPR at 3 different compression rates, 110, 100, and 90 compressions per min (pace-110-g, pace-100-g, and pace-90-g) for 2 consecutive one-minute sets with a ten-second break between the sets. The percentage of compressions deeper than 5 cm at pace-110-g decreased significantly from 22.1 ± 4.7% in the first set to 16.7 ± 4.4%* in the second set (*p CPR.

  9. C-MAC compared with direct laryngoscopy for intubation in patients with cervical spine immobilization: A manikin trial.

    Smereka, Jacek; Ladny, Jerzy R; Naylor, Amanda; Ruetzler, Kurt; Szarpak, Lukasz

    2017-08-01

    The aim of this study was to compare C-MAC videolaryngoscopy with direct laryngoscopy for intubation in simulated cervical spine immobilization conditions. The study was designed as a prospective randomized crossover manikin trial. 70 paramedics with immobilization (Scenario A); manual inline cervical immobilization (Scenario B); cervical immobilization using cervical extraction collar (Scenario C). Scenario A: Nearly all participants performed successful intubations with both MAC and C-MAC on the first attempt (95.7% MAC vs. 100% C-MAC), with similar intubation times (16.5s MAC vs. 18s C-MAC). Scenario B: The results with C-MAC were significantly better than those with MAC (pimmobilization. Copyright © 2017 Elsevier Inc. All rights reserved.

  10. Assessment of body mapping sportswear using a manikin operated in constant temperature mode and thermoregulatory model control mode

    Wang, Faming; Del Ferraro, Simona; Molinaro, Vincenzo; Morrissey, Matthew; Rossi, René

    2014-09-01

    Regional sweating patterns and body surface temperature differences exist between genders. Traditional sportswear made from one material and/or one fabric structure has a limited ability to provide athletes sufficient local wear comfort. Body mapping sportswear consists of one piece of multiple knit structure fabric or of different fabric pieces that may provide athletes better wear comfort. In this study, the 'modular' body mapping sportswear was designed and subsequently assessed on a 'Newton' type sweating manikin that operated in both constant temperature mode and thermophysiological model control mode. The performance of the modular body mapping sportswear kit and commercial products were also compared. The results demonstrated that such a modular body mapping sportswear kit can meet multiple wear/thermal comfort requirements in various environmental conditions. All body mapping clothing (BMC) presented limited global thermophysiological benefits for the wearers. Nevertheless, BMC showed evident improvements in adjusting local body heat exchanges and local thermal sensations.

  11. Effect of socioemotional stress on the quality of cardiopulmonary resuscitation during advanced life support in a randomized manikin study.

    Bjørshol, Conrad Arnfinn; Myklebust, Helge; Nilsen, Kjetil Lønne; Hoff, Thomas; Bjørkli, Cato; Illguth, Eirik; Søreide, Eldar; Sunde, Kjetil

    2011-02-01

    The aim of this study was to evaluate whether socioemotional stress affects the quality of cardiopulmonary resuscitation during advanced life support in a simulated manikin model. A randomized crossover trial with advanced life support performed in two different conditions, with and without exposure to socioemotional stress. The study was conducted at the Stavanger Acute Medicine Foundation for Education and Research simulation center, Stavanger, Norway. Paramedic teams, each consisting of two paramedics and one assistant, employed at Stavanger University Hospital, Stavanger, Norway. A total of 19 paramedic teams performed advanced life support twice in a randomized fashion, one control condition without socioemotional stress and one experimental condition with exposure to socioemotional stress. The socioemotional stress consisted of an upset friend of the simulated patient who was a physician, spoke a foreign language, was unfamiliar with current Norwegian resuscitation guidelines, supplied irrelevant clinical information, and repeatedly cast doubt on the paramedics' resuscitation efforts. Aural distractions were supplied by television and cell telephone. The primary outcome was the quality of cardiopulmonary resuscitation: chest compression depth, chest compression rate, time without chest compressions (no-flow ratio), and ventilation rate after endotracheal intubation. As a secondary outcome, the socioemotional stress impact was evaluated through the paramedics' subjective workload, frustration, and feeling of realism. There were no significant differences in chest compression depth (39 vs. 38 mm, p = .214), compression rate (113 vs. 116 min⁻¹, p = .065), no-flow ratio (0.15 vs. 0.15, p = .618), or ventilation rate (8.2 vs. 7.7 min⁻¹, p = .120) between the two conditions. There was a significant increase in the subjective workload, frustration, and feeling of realism when the paramedics were exposed to socioemotional stress. In this advanced life

  12. The effects of non-invasive respiratory support on oropharyngeal temperature and humidity: a neonatal manikin study.

    Roberts, Calum T; Kortekaas, Rebecca; Dawson, Jennifer A; Manley, Brett J; Owen, Louise S; Davis, Peter G

    2016-05-01

    Heating and humidification of inspired gases is routine during neonatal non-invasive respiratory support. However, little is known about the temperature and humidity delivered to the upper airway. The International Standards Organization (ISO) specifies that for all patients with an artificial airway humidifiers should deliver ≥33 g/m(3) absolute humidity (AH). We assessed the oropharyngeal temperature and humidity during different non-invasive support modes in a neonatal manikin study. Six different modes of non-invasive respiratory support were applied at clinically relevant settings to a neonatal manikin, placed in a warmed and humidified neonatal incubator. Oropharyngeal temperature and relative humidity (RH) were assessed using a thermohygrometer. AH was subsequently calculated. Measured temperature and RH varied between devices. Bubble and ventilator continuous positive airway pressure (CPAP) produced temperatures >34°C and AH >38 g/m(3). Variable flow CPAP resulted in lower levels of AH than bubble or ventilator CPAP, and AH decreased with higher gas flow. High-flow (HF) therapy delivered by Optiflow Junior produced higher AH with higher gas flow, whereas with Vapotherm HF the converse was true. Different non-invasive devices deliver inspiratory gases of variable temperature and humidity. Most AH levels were above the ISO recommendation; however, with some HF and variable flow CPAP devices at higher gas flow this was not achieved. Clinicians should be aware of differences in the efficacy of heating and humidification when choosing modes of non-invasive respiratory support. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
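Converting a thermohygrometer's temperature and relative humidity readings to the absolute humidity (AH) reported in this record is a standard calculation; the sketch below uses the Magnus saturation vapour pressure approximation (an assumption; the record does not state its exact conversion) and a made-up oropharyngeal reading.

```python
import math

# Absolute humidity from temperature and relative humidity, using the
# Magnus approximation for saturation vapour pressure. The conversion
# choice and the sample reading are assumptions for illustration.

def absolute_humidity(temp_c, rh_percent):
    """Absolute humidity (g/m^3) from air temperature (degC) and RH (%)."""
    svp_hpa = 6.112 * math.exp(17.62 * temp_c / (243.12 + temp_c))
    vapour_pressure_hpa = rh_percent / 100.0 * svp_hpa
    # Ideal-gas conversion from vapour pressure to mass concentration.
    return 216.7 * vapour_pressure_hpa / (273.15 + temp_c)

ah = absolute_humidity(35.0, 90.0)  # illustrative oropharyngeal reading
print(f"AH = {ah:.1f} g/m3; ISO target (>= 33 g/m3) met: {ah >= 33}")
```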

  13. COMPUTING

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  14. COMPUTING

    I. Fisk

    2011-01-01

    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  15. COMPUTING

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies ramping to 50 M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS specific tests to be included in Service Availa...

  16. COMPUTING

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and re-enforcing shift and operational procedures for data production and transfer, MC production and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity to discuss the impact of, and to address issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  17. Does a 4 diagram manual enable laypersons to operate the laryngeal mask supreme®? A pilot study in the manikin

    Schälte Gereon

    2012-03-01

    Abstract Background Bystander resuscitation plays an important role in lifesaving cardiopulmonary resuscitation (CPR). A significant reduction in the "no-flow-time", quantitatively better chest compressions and an improved quality of ventilation can be demonstrated during CPR using supraglottic airway devices (SADs). Previous studies have demonstrated the ability of inexperienced persons to operate SADs after brief instruction. The aim of this pilot study was to determine whether an instruction manual consisting of four diagrams enables laypersons to operate a Laryngeal Mask Supreme® (LMAS) in the manikin. Methods An instruction manual of four illustrations with speech bubbles displaying the correct use of the LMAS was designed. Laypersons were handed a bag containing a LMAS, a bag mask valve device (BMV), a syringe prefilled with air and the instruction sheet, and were asked to perform and ventilate the manikin as displayed. Time to ventilation was recorded and degree of success evaluated. Results A total of 150 laypersons took part. Overall 145 participants (96.7%) inserted the LMAS in the manikin in the right direction. The device was inserted inverted or twisted in 13 (8.7%) attempts. Eight (5.3%) individuals recognized this and corrected the position. Within the first 2 minutes 119 (79.3%) applicants were able to insert the LMAS and provide tidal volumes greater than 150 ml (estimated dead space). Time to insertion and first ventilation was 83.2 ± 29 s. No significant difference related to previous BLS training (P = 0.85), technical education (P = 0.07) or gender (P = 0.25) could be demonstrated. Conclusion In the manikin, laypersons could insert the LMAS in the correct direction after onsite instruction by a simple manual with a high success rate. This indicates some basic procedural understanding and intellectual transfer in principle. Operating errors (n = 91) were frequently not recognized and corrected (n = 77). Improvements in labeling and the quality of

  18. COMPUTING

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources like the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February collecting almost 500 T. Figure 3: Number of events per month (data) In LS1, our emphasis is to increase efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  19. COMPUTING

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much large at roughly 11 MB per event of RAW. The central collisions are more complex and...

  20. COMPUTING

    M. Kasemann P. McBride Edited by M-C. Sawley with contributions from: P. Kreuzer D. Bonacorsi S. Belforte F. Wuerthwein L. Bauerdick K. Lassila-Perini M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  1. Quality of closed chest compression on a manikin in ambulance vehicles and flying helicopters with a real time automated feedback.

    Havel, Christof; Schreiber, Wolfgang; Trimmel, Helmut; Malzer, Reinhard; Haugk, Moritz; Richling, Nina; Riedmüller, Eva; Sterz, Fritz; Herkner, Harald

    2010-01-01

    Automated verbal and visual feedback improves quality of resuscitation in out-of-hospital cardiac arrest and was proven to increase short-term survival. Quality of resuscitation may be hampered in more difficult situations like emergency transportation. Currently there is no evidence if feedback devices can improve resuscitation quality during different modes of transportation. To assess the effect of real time automated feedback on the quality of resuscitation in an emergency transportation setting. Randomised cross-over trial. Medical University of Vienna, Vienna Municipal Ambulance Service and Helicopter Emergency Medical Service Unit (Christophorus Flugrettungsverein) in September 2007. European Resuscitation Council (ERC) certified health care professionals performing CPR in a flying helicopter and in a moving ambulance vehicle on a manikin with human-like chest properties. CPR sessions, with real time automated feedback as the intervention and standard CPR without feedback as control. Quality of chest compression during resuscitation. Feedback resulted in less deviation from ideal compression rate 100 min(-1) (9+/-9 min(-1), ptime. Applied work was less in the feedback group compared to controls (373+/-448 cm x compression; ptime automated feedback improves certain aspects of CPR quality in flying helicopters and moving ambulance vehicles. The effect of feedback guidance was most pronounced for chest compression rate. Copyright 2009 Elsevier Ireland Ltd. All rights reserved.

  2. Comparison of blind intubation through the I-gel and ILMA Fastrach by nurses during cardiopulmonary resuscitation: a manikin study.

    Melissopoulou, Theodora; Stroumpoulis, Konstantinos; Sampanis, Michail A; Vrachnis, Nikolaos; Papadopoulos, Georgios; Chalkias, Athanasios; Xanthos, Theodoros

    2014-01-01

    To investigate whether nursing staff can successfully use the I-gel and the intubating laryngeal mask Fastrach (ILMA) during cardiopulmonary resuscitation. Although tracheal intubation is considered to be the optimal method for securing the airway during cardiopulmonary resuscitation, laryngoscopy requires a high level of skill. Forty five nurses inserted the I-gel and the ILMA in a manikin, with continuous and without chest compressions. Mean intubation times for the ILMA and I-gel without chest compressions were 20.60 ± 3.27 and 18.40 ± 3.26 s, respectively (p < 0.0005). ILMA proved more successful than the I-gel regardless of compressions. Continuation of compressions caused a prolongation in intubation times for both the I-gel (p < 0.0005) and the ILMA (p < 0.0005). In this mannequin study, nursing staff can successfully intubate using the I-gel and the ILMA as conduits with comparable success rates, regardless of whether chest compressions are interrupted or not. Copyright © 2014 Elsevier Inc. All rights reserved.

  3. Methods of evaluating protective clothing relative to heat and cold stress: thermal manikin, biomedical modeling, and human testing.

    O'Brien, Catherine; Blanchard, Laurie A; Cadarette, Bruce S; Endrusick, Thomas L; Xu, Xiaojiang; Berglund, Larry G; Sawka, Michael N; Hoyt, Reed W

    2011-10-01

    Personal protective equipment (PPE) refers to clothing and equipment designed to protect individuals from chemical, biological, radiological, nuclear, and explosive hazards. The materials used to provide this protection may exacerbate thermal strain by limiting heat and water vapor transfer. Any new PPE must therefore be evaluated to ensure that it poses no greater thermal strain than the current standard for the same level of hazard protection. This review describes how such evaluations are typically conducted. Comprehensive evaluation of PPE begins with a biophysical assessment of materials using a guarded hot plate to determine the thermal characteristics (thermal resistance and water vapor permeability). These characteristics are then evaluated on a thermal manikin wearing the PPE, since thermal properties may change once the materials have been constructed into a garment. These data may be used in biomedical models to predict thermal strain under a variety of environmental and work conditions. When the biophysical data indicate that the evaporative resistance (ratio of permeability to insulation) is significantly better than the current standard, the PPE is evaluated through human testing in controlled laboratory conditions appropriate for the conditions under which the PPE would be used if fielded. Data from each phase of PPE evaluation are used in predictive models to determine user guidelines, such as maximal work time, work/rest cycles, and fluid intake requirements. By considering thermal stress early in the development process, health hazards related to temperature extremes can be mitigated while maintaining or improving the effectiveness of the PPE for protection from external hazards.
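The screening step described above, comparing a candidate's evaporative resistance (the ratio of permeability to insulation) against the current standard before proceeding to human testing, reduces to a simple ratio test; the i_m and clo numbers below are illustrative assumptions, not measured garment data.

```python
# Illustrative screen: proceed to human testing only if the candidate
# PPE's permeability-to-insulation ratio (i_m/clo) beats the current
# standard. All numbers are made-up examples, not measured values.

def im_over_clo(i_m, clo):
    """Evaporative potential as the ratio of permeability index to insulation."""
    return i_m / clo

standard = im_over_clo(0.33, 1.8)   # current fielded PPE (assumed values)
candidate = im_over_clo(0.40, 1.7)  # new PPE under evaluation (assumed values)
print("proceed to human testing:", candidate > standard)
```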

  4. Effect of one-rescuer compression/ventilation ratios on cardiopulmonary resuscitation in infant, pediatric, and adult manikins.

    Srikantan, Shoba Krishnan; Berg, Robert A; Cox, Tim; Tice, Lisa; Nadkarni, Vinay M

    2005-05-01

    Optimal chest compression to ventilation ratio (C:V) for one-rescuer cardiopulmonary resuscitation (CPR) is not known, with current American Heart Association recommendations 3:1 for newborns, 5:1 for children, and 15:2 for adults. C:V ratios influence effectiveness of CPR, but memorizing different ratios is educationally cumbersome. We hypothesized that a 10:2 ratio might provide adequate universal application for all age arrest victims. Clinical study. Tertiary care children's hospital. Thirty-five health care providers. Thirty-five health care providers performed 5-min epochs of one-rescuer CPR at C:V ratios of 3:1, 5:1, 10:2, and 15:2 in random order on infant, pediatric, and adult manikins. Compressions were paced at 100/min by metronome. The number of effective compressions and ventilations delivered per minute was recorded by a trained basic life support instructor. Subjective assessments of fatigue (self-report) and exertion (change in rescuer pulse rate compared with baseline) were assessed. Analysis was by repeated measures analysis of variance and paired Student's t-test. Effective infant compressions per minute did not differ by C:V ratio, but ventilations per minute were greater at 3:1 vs. 5:1, 10:2, and 15:2 (p 15:2 (p educational value and technique retention.
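The arithmetic behind why the C:V ratio changes delivered compressions per minute can be sketched as below, with compressions paced at 100/min as in the study; the pause length per ventilation is a hypothetical parameter (the study measured delivered events directly rather than modelling them).

```python
# Back-of-envelope model: compressions paced at 100/min, and each
# ventilation adds a fixed pause. The pause duration is an assumed
# parameter, not a measurement from this study.

def compressions_per_minute(c, v, comp_rate=100, pause_per_vent_s=1.5):
    """Compressions actually delivered per minute for a C:V ratio of c:v."""
    cycle_s = c * 60.0 / comp_rate + v * pause_per_vent_s
    return c * 60.0 / cycle_s

for c, v in [(3, 1), (5, 1), (10, 2), (15, 2)]:
    print(f"{c}:{v} -> {compressions_per_minute(c, v):.0f} compressions/min")
```

Under these assumptions the higher ratios deliver more compressions per minute, at the cost of fewer ventilations, which mirrors the trade-off the study quantified.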

  5. Influence of mask type and mask position on the effectiveness of bag-mask ventilation in a neonatal manikin.

    Deindl, Philipp; O'Reilly, Megan; Zoller, Katharina; Berger, Angelika; Pollak, Arnold; Schwindt, Jens; Schmölzer, Georg M

    2014-01-01

    An anatomical face mask with an air cushion rim may accidentally be placed in the wrong orientation on the newborn's face, or be filled with varying amounts of air, during neonatal resuscitation. Both incorrect orientation and variable filling may reduce the tightness of the seal and therefore hamper effective positive pressure ventilation (PPV). We aimed to measure the influence of mask type and mask position on the effectiveness of PPV. Twenty neonatal staff members delivered PPV to a modified, leak-free manikin with a self-inflating bag, using an Intersurgical anatomical face mask with an air cushion rim (IS) and a size 0/1 Laerdal round face mask, while resuscitation parameters were recorded. Three positions of the IS mask were tested: correct position, and 90° and 180° rotation relative to the midline of the face. IS masks were also tested in the correct position but with different inflation of the air cushion (empty, 10, 20 and 30 mL). Mask leak was similar with mask rotation to either 90° or 180°, but increased significantly from 27 (13-73)% with an adequately filled IS mask to 52 (16-83)% with an empty air cushion rim. The anatomically shaped face mask had mask leak similar to that of the round face mask. A wrongly positioned anatomically shaped mask does not influence mask leak, but mask leak increased significantly once the air cushion rim was empty, which may cause failure of mask PPV.

  6. Results from Carbon Dioxide Washout Testing Using a Suited Manikin Test Apparatus with a Space Suit Ventilation Test Loop

    Chullen, Cinda; Conger, Bruce; McMillin, Summer; Vonau, Walt; Kanne, Bryan; Korona, Adam; Swickrath, Mike

    2016-01-01

    NASA is developing an advanced portable life support system (PLSS) to meet the needs of a new NASA advanced space suit. The PLSS is one of the most critical aspects of the space suit providing the necessary oxygen, ventilation, and thermal protection for an astronaut performing a spacewalk. The ventilation subsystem in the PLSS must provide sufficient carbon dioxide (CO2) removal and ensure that the CO2 is washed away from the oronasal region of the astronaut. CO2 washout is a term used to describe the mechanism by which CO2 levels are controlled within the helmet to limit the concentration of CO2 inhaled by the astronaut. Accumulation of CO2 in the helmet or throughout the ventilation loop could cause the suited astronaut to experience hypercapnia (excessive carbon dioxide in the blood). A suited manikin test apparatus (SMTA) integrated with a space suit ventilation test loop was designed, developed, and assembled at NASA in order to experimentally validate adequate CO2 removal throughout the PLSS ventilation subsystem and to quantify CO2 washout performance under various conditions. The test results from this integrated system will be used to validate analytical models and augment human testing. This paper presents the system integration of the PLSS ventilation test loop with the SMTA including the newly developed regenerative Rapid Cycle Amine component used for CO2 removal and tidal breathing capability to emulate the human. The testing and analytical results of the integrated system are presented along with future work.

  7. A Method for Teaching the Modeling of Manikins Suitable for Third-Person 3-D Virtual Worlds and Games

    Nick V. Flor

    2012-08-01

    Virtual worlds have the potential to transform the way people learn, work, and play. With the emerging fields of service science and design science, professors and students at universities are in a unique position to lead the research and development of innovative and value-adding virtual worlds. However, a key barrier in the development of virtual worlds, especially for business, technical, and non-artistic students, is the ability to model human figures in 3-D for use as avatars and automated characters. No articles in either research or teaching journals describe methods that non-artists can use to create 3-D human figures. This paper presents a repeatable and flexible method I have taught successfully to both art and business students, which allows them to quickly model human-like figures (manikins) sufficient for prototyping purposes, and which allows students and researchers alike to explore the development of new kinds of virtual worlds.

  8. Real-time feedback can improve infant manikin cardiopulmonary resuscitation by up to 79%--a randomised controlled trial.

    Martin, Philip; Theobald, Peter; Kemp, Alison; Maguire, Sabine; Maconochie, Ian; Jones, Michael

    2013-08-01

    European and Advanced Paediatric Life Support training courses. Sixty-nine certified CPR providers. CPR providers were randomly allocated to a 'no-feedback' or 'feedback' group, performing two-thumb and two-finger chest compressions on a "physiological", instrumented resuscitation manikin. Baseline data were recorded without feedback, before chest compressions were repeated with one group receiving feedback. Indices of chest compression quality were calculated by comparing chest wall displacement against the targets of four internationally recommended parameters: chest compression depth, release force, chest compression rate and compression duty cycle. Baseline data were consistent with other studies, with <1% of chest compressions simultaneously achieving the targets of the four internationally recommended parameters. During the 'experimental' phase, 34 CPR providers received 'real-time' feedback, which coincided with a statistically significant improvement in compression rate, depth and duty cycle quality across both compression techniques (all measures: p<0.001). Feedback enabled providers to simultaneously achieve the four targets in 75% (two-finger) and 80% (two-thumb) of chest compressions. Real-time feedback thus produced a dramatic increase in chest compression quality (from <1% to 75-80% of compressions meeting all four targets). If these results transfer to a clinical scenario, this technology could, for the first time, support providers in consistently performing accurate chest compressions during infant CPR and thus potentially improve clinical outcomes. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
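The scoring logic of the study, counting a compression as correct only when it simultaneously meets all four recommended targets, can be sketched as follows. The target windows and field names below are illustrative assumptions, not the study's exact values:

```python
# Hypothetical sketch of the "all four targets simultaneously" quality
# index described above. Target windows are assumed, not from the study.

TARGETS = {
    "depth_mm":   (34.0, 44.0),   # compression depth window (assumed)
    "rate_cpm":   (100.0, 120.0), # compression rate window (assumed)
    "duty_cycle": (0.30, 0.50),   # fraction of cycle spent compressed (assumed)
    "release_n":  (0.0, 25.0),    # residual force on release (assumed)
}

def meets_all_targets(compression: dict) -> bool:
    return all(lo <= compression[k] <= hi for k, (lo, hi) in TARGETS.items())

def percent_correct(compressions: list) -> float:
    ok = sum(meets_all_targets(c) for c in compressions)
    return 100.0 * ok / len(compressions)

sample = [
    {"depth_mm": 40, "rate_cpm": 110, "duty_cycle": 0.4, "release_n": 5},
    {"depth_mm": 28, "rate_cpm": 110, "duty_cycle": 0.4, "release_n": 5},  # too shallow
]
print(percent_correct(sample))  # → 50.0
```

Requiring all four targets at once is a much stricter criterion than any single target, which is why baseline performance was under 1%.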

  9. Laypersons can successfully place supraglottic airways with 3 minutes of training. A comparison of four different devices in the manikin

    Schälte Gereon

    2011-10-01

    Introduction: Supraglottic airway devices have frequently been shown to facilitate airway management and are implemented in the ILCOR resuscitation algorithm. Limited data exist concerning laypersons without any medical or paramedical background. We hypothesized that even laypeople would be able to operate supraglottic airway devices after a brief training session. Methods: Four supraglottic airway devices, the Laryngeal Mask Classic (LMA), Laryngeal Tube (LT), Intubating Laryngeal Mask (FT) and CobraPLA (Cobra), were tested in 141 volunteers recruited in a technical university cafeteria and in a shopping mall. All volunteers received a brief standardized training session. The primary endpoint was the time required for definitive insertion. In a short questionnaire, applicants were asked to assess the devices and to answer some general questions about BLS. Results: The longest time to insertion was observed for the Cobra (31.9 ± 27.9 s, range: 9-120 s). Conclusion: Laypersons are able to operate supraglottic airway devices in a manikin with minimal instruction. Ventilation was achieved with all devices tested after a reasonable time and with a high success rate of >95%. The use of supraglottic airway devices in first aid and BLS algorithms should be considered.

  10. In vitro comparison in a manikin model: increasing apical enlargement with K3 and K3XF rotary instruments.

    Olivieri, Juan Gonzalo; Stöber, Eva; García Font, Marc; González, Jose Antonio; Bragado, Pablo; Roig, Miguel; Duran-Sindreu, Fernando

    2014-09-01

    The aim of the study was to compare the K3 and K3XF systems (SybronEndo, Glendora, CA) after 1 and 2 uses by evaluating apical transportation, working length loss, and working time in a manikin model. The mesial canals of 40 extracted first mandibular molars were instrumented. Radiographs taken after instrumentation with #25, #30, #35, and #40 files were superimposed on the preoperative image in both mesiodistal and buccolingual angulations. AutoCAD (Autodesk Inc, San Rafael, CA) was used to measure working length loss and apical transportation at 0, 0.5, and 1 mm from the working length (WL). Working time was also measured. Group comparisons were analyzed using post hoc Tukey honestly significant difference tests (P < .05). No significant differences were found in apical transportation or working length loss between the K3 and K3XF systems, or between numbers of uses. Significant differences were found when canal enlargement was performed to a #35-40 (P < .05). K3 instrumentation was significantly faster (29.6 ± 15.4) than the K3XF system (40.2 ± 17.7) (P < .05). No differences in working time were observed when comparing numbers of uses. The K3 and R-phase K3XF rotary systems shaped curved root canals safely with minimal apical transportation, even up to a 40/04 file. Copyright © 2014 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  11. COMPUTING

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  12. COMPUTING

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave good indication about the readiness of the WLCG infrastructure with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated to be fully capable for performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  13. COMPUTING

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  14. COMPUTING

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  15. COMPUTING

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing readiness challenges (CCRC’08), the CMS computing team had achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files and with a high writing speed to tapes.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1’s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  16. COMPUTING

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  17. COMPUTING

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  18. COMPUTING

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  19. COMPUTING

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences.  Operations Office Figure 6: Transfers from all sites in the last 90 days For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  20. COMPUTING

    M. Kasemann

    Introduction A large fraction of the effort was focused during the last period on the preparation and monitoring of the February tests of the Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads and identify possible bottlenecks. Many tests are scheduled together with other VOs, allowing the full scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  1. COMPUTING

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns lead by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier1 and Tier2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking the roles of a Computing Run Coordinator and regular computing shifts monitoring the services and infrastructure as well as interfacing to the data operations tasks are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  2. COMPUTING

    P. MacBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format and the samples were run through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  3. COMPUTING

    I. Fisk

    2013-01-01

    Computing operation has been lower as the Run 1 samples are completing and smaller samples for upgrades and preparations are ramping up. Much of the computing activity is focusing on preparations for Run 2 and improvements in data access and flexibility of using resources. Operations Office Data processing was slow in the second half of 2013 with only the legacy re-reconstruction pass of 2011 data being processed at the sites.   Figure 1: MC production and processing was more in demand with a peak of over 750 Million GEN-SIM events in a single month.   Figure 2: The transfer system worked reliably and efficiently and transferred on average close to 520 TB per week with peaks at close to 1.2 PB.   Figure 3: The volume of data moved between CMS sites in the last six months   The tape utilisation was a focus for the operation teams with frequent deletion campaigns from deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...

  4. COMPUTING

    I. Fisk

    2012-01-01

      Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users, we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently.  Operations Office Figure 2: Number of events per month, for 2012 Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  5. COMPUTING

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, are provided. The GlideInWMS and components installation are now deployed at CERN, which is added to the GlideInWMS factory placed in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each others time zones by monitoring/debugging pilot jobs sent from the facto...

  6. COMPUTING

    M. Kasemann

    CMS relies on a well functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental for site commissioning in order to increase availability of more sites such that they are available to participate in CSA07 and are ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned, it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a 4 times increase in throughput with respect to LCG Resource Broker is observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and to keep exercised all data transfer routes in the CMS PhE-DEx topology. Since mid-February, a transfer volume of about 12 P...

  7. The effect of a standardised source of divided attention in airway management: A randomised, crossover, interventional manikin study.

    Prottengeier, Johannes; Petzoldt, Marlen; Jess, Nikola; Moritz, Andreas; Gall, Christine; Schmidt, Joachim; Breuer, Georg

    2016-03-01

    Dual-tasking, the need to divide attention between concurrent tasks, causes a severe increase in workload in emergency situations, and yet there is no standardised training simulation scenario for this key difficulty. We introduced and validated a quantifiable source of divided attention and investigated its effects on performance and workload in airway management. A randomised, crossover, interventional simulation study. Center for Training and Simulation, Department of Anaesthesiology, Erlangen University Hospital, Germany. One hundred and fifty volunteer medical students, paramedics and anaesthesiologists of all levels of training. Participants secured the airway of a manikin using a supraglottic airway, conventional endotracheal intubation and video-assisted endotracheal intubation, with and without the Paced Auditory Serial Addition Test (PASAT), which served as a quantifiable source of divided attention. The primary endpoint was the time to completion of each airway task. Secondary endpoints were the number of procedural mistakes made and the perceived workload as measured by the National Aeronautics and Space Administration's task load index (NASA-TLX). This is a six-dimensional questionnaire, which assesses the perception of demands, performance and frustration with respect to a task on a scale of 0 to 100. All 150 participants completed the tests. Volunteers perceived our test to be challenging (99%) and the experience of stress and distraction true to an emergency situation (80%), but still fair (98%) and entertaining (95%). The negative effects of divided attention were reproducible in participants of all levels of expertise. Time consumption and perceived workload increased, and almost half the participants made procedural mistakes under divided attention. The supraglottic airway technique was least affected by divided attention. The scenario was effective for simulation training involving divided attention in acute care medicine.
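The NASA-TLX scoring mentioned above can be sketched with the common unweighted "Raw TLX" variant, the mean of six 0-100 subscale ratings. Note that the full instrument applies pairwise-comparison weights; the simplification and the ratings below are illustrative assumptions:

```python
# Illustrative sketch of NASA-TLX scoring using the unweighted "Raw TLX"
# variant (mean of six 0-100 subscales). Ratings are invented.

DIMENSIONS = ("mental", "physical", "temporal", "performance", "effort", "frustration")

def raw_tlx(ratings: dict) -> float:
    """Mean of the six subscale ratings; higher means greater perceived workload."""
    assert set(ratings) == set(DIMENSIONS), "one rating per dimension required"
    return sum(ratings.values()) / len(DIMENSIONS)

ratings = {"mental": 70, "physical": 40, "temporal": 65,
           "performance": 30, "effort": 60, "frustration": 35}
print(raw_tlx(ratings))  # → 50.0
```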

  8. Protocol of a Multicenter International Randomized Controlled Manikin Study on Different Protocols of Cardiopulmonary Resuscitation for laypeople (MANI-CPR).

    Baldi, Enrico; Contri, Enrico; Burkart, Roman; Borrelli, Paola; Ferraro, Ottavia Eleonora; Tonani, Michela; Cutuli, Amedeo; Bertaia, Daniele; Iozzo, Pasquale; Tinguely, Caroline; Lopez, Daniel; Boldarin, Susi; Deiuri, Claudio; Dénéréaz, Sandrine; Dénéréaz, Yves; Terrapon, Michael; Tami, Christian; Cereda, Cinzia; Somaschini, Alberto; Cornara, Stefano; Cortegiani, Andrea

    2018-04-19

    Out-of-hospital cardiac arrest is one of the leading causes of death in industrialised countries. Survival depends on prompt identification of cardiac arrest and on the quality and timing of cardiopulmonary resuscitation (CPR) and defibrillation. For laypeople, there has been growing interest in hands-only CPR, meaning continuous chest compression without interruption to perform ventilations. It has been demonstrated that intentional interruptions in hands-only CPR can increase its quality. The aim of this randomised trial is to compare three CPR protocols performed with different intentional interruptions with hands-only CPR. This is a prospective randomised trial performed in eight training centres. Laypeople who passed a basic life support course will be randomised to one of four CPR protocols in an 8 min simulated cardiac arrest scenario on a manikin: (1) 30 compressions and 2 s pause; (2) 50 compressions and 5 s pause; (3) 100 compressions and 10 s pause; (4) hands-only. The calculated sample size is 552 people. The primary outcome is the percentage of chest compressions performed with correct depth, evaluated by a computerised feedback system (Laerdal QCPR). Ethics and dissemination: Due to the nature of the study, we obtained a waiver from the Ethics Committee (IRCCS Policlinico San Matteo, Pavia, Italy). All participants will sign an informed consent form before randomisation. The results of this study will be published in a peer-reviewed journal. The data collected will also be made available in a public data repository. NCT02632500. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
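The four protocols differ in how many compressions fit into the 8 min scenario. A back-of-envelope sketch, assuming a 100/min compression rate (the rate is an assumption, not specified by the protocol above):

```python
# Hypothetical arithmetic sketch of total compressions delivered over an
# 8 min (480 s) scenario under the four protocols above. Assumes a
# 100/min compression rate; pauses are as defined in the protocol.

def compressions_in(seconds: float, per_cycle: int, pause_s: float, rate: float = 100.0) -> int:
    if per_cycle == 0:                      # hands-only: continuous compressions
        return int(seconds * rate / 60.0)
    cycle = per_cycle * 60.0 / rate + pause_s   # seconds per compress-then-pause cycle
    full_cycles = int(seconds // cycle)
    remainder = seconds - full_cycles * cycle   # partial cycle at the end
    return full_cycles * per_cycle + min(per_cycle, int(remainder * rate / 60.0))

for per_cycle, pause in [(30, 2), (50, 5), (100, 10), (0, 0)]:
    label = "hands-only" if per_cycle == 0 else f"{per_cycle} + {pause} s pause"
    print(label, compressions_in(480, per_cycle, pause))
```

Under these assumptions the three interrupted protocols deliver broadly similar totals (roughly 690-720 compressions) versus 800 for hands-only; the trial's question is whether the pauses buy better compression depth.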

  9. Performance of cardiopulmonary resuscitation during prolonged basic life support in military medical university students: A manikin study

    Wang, Juan; Zhuo, Chao-nan; Zhang, Lei; Gong, Yu-shun; Yin, Chang-lin; Li, Yong-qin

    2015-01-01

    BACKGROUND: The quality of chest compressions can be significantly improved after training of rescuers according to the latest national guidelines of China. However, rescuers may be unable to maintain adequate compression or ventilation throughout a response of average emergency medical services duration because of increased rescuer fatigue. In the present study, we evaluated the performance of cardiopulmonary resuscitation (CPR) by trained military medical university students during prolonged basic life support (BLS). METHODS: A 3-hour BLS training was given to 120 military medical university students. Six months after the training, 115 students performed single-rescuer BLS on a manikin for 8 minutes. The quality of chest compressions as well as ventilations was assessed. RESULTS: The average compression depth and rate were 53.7±5.3 mm and 135.1±15.7 compressions per minute respectively. The proportion of chest compressions with appropriate depth was 71.7%±28.4%. The average ventilation volume was 847.2±260.4 mL and the proportion of students with adequate ventilation was 63.5%. Compared with male students, significantly lower compression depth (46.7±4.8 vs. 54.6±4.8 mm, P<0.001) and adequate compression rate (35.5%±26.5% vs. 76.1%±25.1%, P<0.001) were observed in female students. CPR quality was found to be related to gender, body weight, and body mass index of students in this study. The quality of chest compressions was well maintained in male students during 8 minutes of conventional CPR but declined rapidly in female students after 2 minutes according to the latest national guidelines. Physical fitness and rescuer fatigue did not affect the quality of ventilation. PMID:26401177

  11. Barriers and enablers to the use of high-fidelity patient simulation manikins in nurse education: an integrative review.

    Al-Ghareeb, Amal Z; Cooper, Simon J

    2016-01-01

    This integrative review identified, critically appraised and synthesised the existing evidence on the barriers and enablers to using high-fidelity human patient simulator manikins (HPSMs) in undergraduate nursing education. In nursing education, specifically at the undergraduate level, a range of low- to high-fidelity simulations have been used as teaching aids. However, despite the prevalence of high-fidelity HPSMs in nursing education, nursing educators encounter challenges when introducing new teaching methods or technology. An integrative review adopting a systematic approach was undertaken. Medline, CINAHL Plus, ERIC, PsycINFO, EMBASE, SCOPUS, Science Direct, the Cochrane database, the Joanna Briggs Institute, ProQuest, the California Simulation Alliance, the Simulation Innovation Resource Center and the search engine Google Scholar were searched. Keywords were selected and specific inclusion/exclusion criteria were applied. The review included all research designs for papers published between 2000 and 2015 that identified the barriers and enablers to using high-fidelity HPSMs in undergraduate nursing education. Studies were appraised using the Critical Appraisal Skills Programme criteria. Thematic analysis was undertaken and emergent themes were extracted. Twenty-one studies were included in the review. These studies adopted quasi-experimental, prospective non-experimental and descriptive designs. Ten barriers were identified, including "lack of time," "fear of technology" and "workload issues." Seven enablers were identified, including "faculty training," "administrative support" and a "dedicated simulation coordinator." Barriers to simulation relate specifically to the complex technologies inherent in high-fidelity HPSM approaches. Strategic approaches that support up-skilling and provide dedicated technological support may overcome these barriers. Copyright © 2015 Elsevier Ltd. All rights reserved.

  12. Does real-time objective feedback and competition improve performance and quality in manikin CPR training--a prospective observational study from several European EMS.

    Smart, J R; Kranz, K; Carmona, F; Lindner, T W; Newton, A

    2015-10-15

    Previous studies have reported that the quality of cardiopulmonary resuscitation (CPR) is important for patient survival. Real-time objective feedback during manikin training has been shown to improve CPR performance. Objective measurement could facilitate competition and help motivate participants to improve their CPR performance. The aims of this study were to investigate whether real-time objective feedback on manikins helps improve CPR performance and whether competition between separate European Emergency Medical Services (EMS) and between participants at each EMS helps motivation to train. Ten European EMS took part in the study, which was carried out in two stages. At Stage 1, each EMS provided 20 pre-hospital professionals. A questionnaire was completed and standardised assessment scenarios were performed for adult and infant out-of-hospital cardiac arrest (OHCA). CPR performance was objectively measured and recorded but no feedback given. Between Stages 1 and 2, each EMS was given access to manikins for 6 months and instructed on how to use them with objective real-time CPR feedback available. Stage 2 was a repeat of Stage 1 with a questionnaire containing additional questions on the usefulness of feedback and the competitive nature of the study (using a 10-point Likert score). The EMS that improved the most from Stage 1 to Stage 2 was declared the winner. An independent-samples Student t-test was used to analyse the objective CPR metrics, with the significance level taken as p < 0.05. Competition between EMS organisations recorded a mean score of 5.8 and competition between participants recorded a mean score of 6.0. The results suggest that the use of real-time objective feedback can significantly help improve CPR performance. Competition, especially between participants, appeared to encourage staff to practice, and this study suggests that competition might have a useful role in helping motivate staff to perform CPR training.
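The stage comparison above rests on an independent-samples Student t-test over the objective CPR metrics. A rough pure-Python sketch of the pooled-variance form, on invented compression-depth samples (the function name and all numbers are illustrative, not data from the study):

```python
from statistics import mean, variance

def students_t(a, b):
    """Pooled-variance independent-samples t statistic for two samples."""
    na, nb = len(a), len(b)
    # statistics.variance is the sample (n-1 denominator) variance
    sp2 = ((na - 1) * variance(a) + (nb - 1) * variance(b)) / (na + nb - 2)
    return (mean(a) - mean(b)) / (sp2 * (1 / na + 1 / nb)) ** 0.5

# Hypothetical mean compression depths (mm), Stage 1 vs Stage 2
stage1 = [41, 44, 39, 46, 42]
stage2 = [50, 53, 49, 55, 51]
print(round(students_t(stage2, stage1), 2))
```

A full analysis would convert the t statistic to a p value against the t distribution with na + nb - 2 degrees of freedom; that step is omitted here to keep the sketch dependency-free.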

  13. A suitability study of the fission product phantom and the bottle manikin absorption phantom for calibration of in vivo bioassay equipment for the DOELAP accreditation testing program

    Olsen, P.C.; Lynch, T.P.

    1991-08-01

    Pacific Northwest Laboratory (PNL) conducted an intercomparison study of the Fission Product phantom and the bottle manikin absorption (BOMAB) phantom for the US Department of Energy (DOE) to determine the consistency of calibration response of the two phantoms and their suitability for certification and use under a planned bioassay laboratory accreditation program. The study was initiated to determine calibration factors for both types of phantoms and to evaluate their suitability for use in DOE Laboratory Accreditation Program (DOELAP) round-robin testing. The BOMAB phantom was found to be more appropriate for the DOELAP testing program. 9 refs., 9 figs., 9 tabs

  14. Evaluation of a novel noninvasive continuous core temperature measurement system with a zero heat flux sensor using a manikin of the human body.

    Brandes, Ivo F; Perl, Thorsten; Bauer, Martin; Bräuer, Anselm

    2015-02-01

    Reliable continuous perioperative core temperature measurement is of major importance. The pulmonary artery catheter is currently the gold standard for measuring core temperature but is invasive and expensive. Using a manikin, we evaluated the new, noninvasive SpotOn™ temperature monitoring system (SOT). With a sensor placed on the lateral forehead, a site devoid of thermoregulatory arteriovenous shunts, SOT uses zero-heat-flux technology to noninvasively measure core temperature; a piece of bone cement served as a model of the frontal bone in this study. Bias, limits of agreement, long-term measurement stability, and the lowest measurable temperature of the device were investigated. Bias and limits of agreement of the temperature data of two SOTs and of the thermistor placed on the manikin's surface were calculated. Measurements obtained from SOTs were similar to thermistor values. The bias and limits of agreement lay within a predefined clinically acceptable range. Repeat measurements differed only slightly and stayed stable for hours. Because of its temperature range, the SOT cannot be used to monitor temperatures below 28°C. In conclusion, the new SOT could provide a reliable, less invasive and cheaper alternative for measuring perioperative core temperature in routine clinical practice. Further clinical trials are needed to evaluate these results.

  15. Optimal Chest Compression Rate and Compression to Ventilation Ratio in Delivery Room Resuscitation: Evidence from Newborn Piglets and Neonatal Manikins

    Solevåg, Anne Lee; Schmölzer, Georg M.

    2017-01-01

    Cardiopulmonary resuscitation (CPR) duration until return of spontaneous circulation (ROSC) influences survival and neurologic outcomes after delivery room (DR) CPR. High-quality chest compressions (CC) improve cerebral and myocardial perfusion. Improved myocardial perfusion increases the likelihood of a faster ROSC. Thus, optimizing CC quality may improve outcomes both by preserving cerebral blood flow during CPR and by reducing the recovery time. CC quality is determined by rate, CC to ventilation (C:V) ratio, and applied force, which are influenced by the CC provider. Thus, provider performance should be taken into account. Neonatal resuscitation guidelines recommend a 3:1 C:V ratio. CCs should be delivered at a rate of 90/min synchronized with ventilations at a rate of 30/min to achieve a total of 120 events/min. Despite the lack of scientific evidence supporting this recommendation, investigating alternative CC interventions in human neonates is ethically challenging. Also, the infrequent occurrence of extensive CPR measures in the DR makes randomized controlled trials difficult to perform. Thus, many biomechanical aspects of CC have been investigated in animal and manikin models. Despite mathematical and physiological rationales that higher rates and uninterrupted CC improve CPR hemodynamics, studies indicate that provider fatigue is more pronounced when CC are performed continuously compared to when a pause is inserted after every third CC, as currently recommended. A higher rate (e.g., 120/min) is also more fatiguing, which affects CC quality. In post-transitional piglets with asphyxia-induced cardiac arrest, there was no benefit of performing continuous CC at a rate of 90/min. Not only rate but duty cycle, i.e., the duration of CC/total cycle time, is a known determinant of CC effectiveness. However, duty cycle cannot be controlled with manual CC. Mechanical/automated CC in neonatal CPR has not been explored, and feedback systems are under-investigated in this
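The recommended 3:1 C:V ratio fixes the arithmetic quoted above: 90 compressions plus 30 interposed ventilations gives 120 events/min. A small sketch of that bookkeeping (the helper name and the non-guideline example row are illustrative assumptions):

```python
def events_per_minute(cc_rate, cv_ratio):
    """Given a compression rate (CC/min) and a C:V ratio,
    return (compressions, ventilations, total events) per minute,
    assuming ventilations are interposed rather than overlapped."""
    c, v = cv_ratio
    vents = cc_rate * v / c
    return cc_rate, vents, cc_rate + vents

# Guideline neonatal pattern: 90 CC/min at 3:1 -> 30 ventilations, 120 events/min
print(events_per_minute(90, (3, 1)))   # → (90, 30.0, 120.0)
# An illustrative (non-guideline) alternative for comparison
print(events_per_minute(120, (15, 2)))
```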

  16. Combined short- and long-axis ultrasound-guided central venous catheterization is superior to conventional techniques: A cross-over randomized controlled manikin trial.

    Jun Takeshita

    Visualizing the needle tip using the short-axis (SA) approach to ultrasound-guided central venous catheterization can be challenging. It has been suggested to start the process with the SA approach and then switch to the long-axis (LA) approach; however, to our knowledge, this combination has not been evaluated. We compared the combined short- and long-axis (SLA) approach with the SA approach in a manikin study. We performed a prospective randomised controlled cross-over study in an urban emergency department and intensive care unit. Resident physicians in post-graduate years 1-2 performed a simulated ultrasound-guided internal jugular vein puncture using the SA and SLA approaches on manikins. Twenty resident physicians were randomly assigned to two equal groups: (1) one group performed punctures using the SA approach followed by SLA; and (2) the other performed the same procedures in the opposite order. We compared the success rate and procedure duration for the two approaches. Procedural success was defined as insertion of the guide-wire into the vein while visualizing the needle tip at the time of anterior wall puncture, without penetrating the posterior wall. Six resident physicians (30%) performed both approaches successfully, while 12 (60%) performed the SLA approach, but not the SA, successfully. Those who performed the SA approach successfully also succeeded with the SLA approach. Two resident physicians (10%) failed at both approaches. The SLA approach had a significantly higher success rate than the SA approach (P < 0.001). The median (interquartile range) procedure duration was 59.5 [46.0-88.5] seconds and 45.0 [37.5-84.0] seconds for the SLA and SA approaches, respectively. The difference in duration between the two procedures was 15.5 [0-28.5] seconds. There was no significant difference in duration between the two approaches (P = 0.12). Using the SLA approach significantly improved the success rate of internal jugular vein puncture performed by

  17. Effects of breathing frequency and flow rate on the total inward leakage of an elastomeric half-mask donned on an advanced manikin headform.

    He, Xinjian; Grinshpun, Sergey A; Reponen, Tiina; McKay, Roy; Bergman, Michael S; Zhuang, Ziqing

    2014-03-01

    The objective of this study was to investigate the effects of breathing frequency and flow rate on the total inward leakage (TIL) of an elastomeric half-mask donned on an advanced manikin headform and challenged with combustion aerosols. An elastomeric half-mask respirator equipped with P100 filters was donned on an advanced manikin headform covered with life-like soft skin and challenged with aerosols originated by burning three materials: wood, paper, and plastic (polyethylene). TIL was determined as the ratio of aerosol concentrations inside (C in) and outside (C out) of the respirator (C in/C out) measured with a nanoparticle spectrometer operating in the particle size range of 20-200 nm. The testing was performed under three cyclic breathing flows [mean inspiratory flow (MIF) of 30, 55, and 85 l/min] and five breathing frequencies (10, 15, 20, 25, and 30 breaths/min). A completely randomized factorial study design was chosen with four replicates for each combination of breathing flow rate and frequency. Particle size, MIF, and combustion material had significant (P < 0.05) effects on TIL; plastic aerosol produced higher mean TIL values than wood and paper aerosols. The effect of the breathing frequency was complex. When analyzed using all combustion aerosols and MIFs (pooled data), breathing frequency did not significantly (P = 0.08) affect TIL. However, once the data were stratified according to combustion aerosol and MIF, the effect of breathing frequency became significant (P < 0.05) for the plastic combustion aerosol. The effect of breathing frequency on TIL is less significant than the effects of combustion aerosol and breathing flow rate for the tested elastomeric half-mask respirator. The greatest TIL occurred when challenged with plastic aerosol at 30 l/min and at a breathing frequency of 30 breaths/min.
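TIL is defined in the abstract as the concentration ratio C in/C out per particle-size bin. A minimal sketch of that calculation (the spectrometer readings below are invented for illustration):

```python
def total_inward_leakage(c_in, c_out):
    """TIL = C_in / C_out for one particle-size bin."""
    if c_out <= 0:
        raise ValueError("outside concentration must be positive")
    return c_in / c_out

# Hypothetical readings for a few 20-200 nm bins: size_nm -> (C_in, C_out)
bins = {20: (12.0, 950.0), 50: (30.0, 1200.0), 100: (18.0, 800.0)}
til = {size: total_inward_leakage(ci, co) for size, (ci, co) in bins.items()}
for size, value in sorted(til.items()):
    print(f"{size} nm: TIL = {value:.4f}")
```

A TIL of 0.01, for example, means 1% of the challenge aerosol concentration penetrated to the inside of the respirator in that size bin.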

  18. NEDO Forum 2001. Session on development of geothermal energy (Prospect of geothermal energy); NEDO Forum 2001. Chinetsu kaihatsu session (chinetsu energy no tenbo)

    NONE

    2001-09-20

    The presentations made at the above-named session of the NEDO (New Energy and Industrial Technology Development Organization) forum held in Tokyo on September 20, 2001, are collected in this report. Director Noda of Institute for Geo-Resources and Environment, National Institute of Advanced Industrial Science and Technology, delivered a lecture entitled 'Future course of geothermal technology development,' and Executive Director Iikura of Tokyo Toshi Kaihatsu, Inc., a lecture entitled 'Thinking of geothermal energy.' Described in an achievement report entitled 'Present state and future trend of geothermal development' were the present state of geothermal power generation and characteristics of geothermal energy, signification of the introduction of binary cycle power generation, and the promotion of the introduction of ground heat utilizing heat pump systems. Stated in a lecture entitled 'Geothermal development promotion survey' were the geothermal development promotion survey and its result and how to implement such surveys in the future. Reported in a lecture entitled 'Verification survey of geothermal energy probing technology and the like and the development of geothermal water utilizing power plant and the like' were reservoir fluctuation probing, deep-seated thermal resource probing and collecting, 10-MW class demonstration plant, Measurement While Drilling System, and a hot rock power generation system. (NEDO)

  19. Achievement report for fiscal 1984 on Sunshine Program-entrusted research and development. Research and development of amorphous solar cells (Theoretical research on amorphous silicon electronic states by computer-aided simulation); 1984 nendo amorphous taiyo denchi no kenkyu kaihatsu seika hokokusho. Keisanki simulation ni yoru amorphous silicon no denshi jotai no rironteki kenkyu

    NONE

    1985-04-01

    Research on the basic physical properties of amorphous silicon materials and for the development of materials for thermally stable amorphous silicon is conducted through theoretical reasoning and computer-aided simulation. In the effort at achieving a high conversion efficiency using an amorphous silicon alloy, a process of realizing desired photoabsorption becomes possible when the correlation between the atomic structure and the photoabsorption coefficient is clearly established and the atomic structure is manipulated. In this connection, analytical studies are conducted to determine how microscopic structures are reflected on macroscopic absorption coefficients. In the computer-aided simulation, various liquid structures and amorphous structures are worked out, which is for the atom-level characterization of structures with topological disturbances, such as amorphous structures. Glass transition is simulated using a molecular kinetic method, in particular, and the melting of crystals, crystallization of liquids, and vitrification (conversion into the amorphous state) are successfully realized, though in a computer-aided simulation, for the first time in the world. (NEDO)

  20. How do different brands of size 1 laryngeal mask airway compare with face mask ventilation in a dedicated laryngeal mask airway teaching manikin?

    Tracy, Mark Brian; Priyadarshi, Archana; Goel, Dimple; Lowe, Krista; Huvanandana, Jacqueline; Hinder, Murray

    2018-05-01

    International neonatal resuscitation guidelines recommend the use of a laryngeal mask airway (LMA) with newborn infants (≥34 weeks' gestation or >2 kg weight) when bag-mask ventilation (BMV) or tracheal intubation is unsuccessful. Previous publications do not allow broad LMA device comparison. To compare delivered ventilation of seven brands of size 1 LMA devices with two brands of face mask using a self-inflating bag (SIB). 40 experienced neonatal staff provided inflation cycles using an SIB with positive end expiratory pressure (PEEP) (5 cmH2O) to a specialised newborn/infant training manikin, randomised for each LMA and face mask. All subjects received prior education in LMA insertion and BMV. 12 415 recorded inflations for LMAs and face masks were analysed. Leak detected was lowest with the i-gel brand, with a mean of 5.7% compared with face mask (triangular 42.7%, round 35.7%) and other LMAs (45.5-65.4%) (p<0.001). Peak inspiratory pressure was higher with i-gel, with a mean of 28.9 cmH2O compared with face mask (triangular 22.8, round 25.8) and other LMAs (14.3-22.0) (p<0.001). PEEP was higher with i-gel, with a mean of 5.1 cmH2O compared with face mask (triangular 3.0, round 3.6) and other LMAs (0.6-2.6) (p<0.001). In contrast to the other LMAs examined, i-gel had no insertion failures and all users found i-gel easy to use. This study has shown dramatic performance differences in delivered ventilation, mask leak and ease of use among seven different brands of LMA tested in a manikin model. This, coupled with no partial or complete insertion failures and ease of use, suggests the i-gel LMA may have an expanded role in newborn resuscitation as a primary resuscitation device. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  1. Evaluation of intubation using the Airtraq or Macintosh laryngoscope by anaesthetists in easy and simulated difficult laryngoscopy--a manikin study.

    Maharaj, C H

    2006-05-01

    The Airtraq Laryngoscope is a novel intubation device which allows visualisation of the vocal cords without alignment of the oral, pharyngeal and tracheal axes. We compared the Airtraq with the Macintosh laryngoscope in simulated easy and difficult laryngoscopy. Twenty-five anaesthetists were allowed up to three attempts to intubate the trachea in each of three laryngoscopy scenarios using a Laerdal Intubation Trainer followed by five scenarios using a Laerdal SimMan Manikin. Each anaesthetist then performed tracheal intubation of the normal airway a second time to characterise the learning curve. In the simulated easy laryngoscopy scenarios, there was no difference between the Airtraq and the Macintosh in success of tracheal intubation. The time taken to intubate at the end of the protocol was significantly lower using the Airtraq (9.5 (6.7) vs. 14.2 (7.4) s), demonstrating a rapid acquisition of skills. In the simulated difficult laryngoscopy scenarios, the Airtraq was more successful in achieving tracheal intubation, required less time to intubate successfully, caused less dental trauma, and was considered by the anaesthetists to be easier to use.

  2. Upper limb muscular activity and perceived workload during laryngoscopy: comparison of Glidescope® and Macintosh laryngoscopy in manikin: an observational study.

    Caldiroli, D; Molteni, F; Sommariva, A; Frittoli, S; Guanziroli, E; Cortellazzi, P; Orena, E F

    2014-03-01

    The interaction between operators and their working environment during laryngoscopy is poorly understood. Numerous studies have focused on the forces applied to the patient's airway during laryngoscopy, but only a few authors have addressed operator muscle activity and workload. We tested whether different devices (Glidescope® and Macintosh) use different muscles and how these differences affect the perceived workload. Ten staff anaesthetists performed three intubations with each device on a manikin. Surface electromyography was recorded for eight single muscles of the left upper limb. The NASA Task Load Index (TLX) was administered after each experimental session to evaluate perceived workload. A consistent reduction in muscular activation occurred with Glidescope® compared with Macintosh for all muscles tested (mean effect size d=3.28), and significant differences for the upper trapezius (P=0.002), anterior deltoid (P=0.001), posterior deltoid (P=0.000), and brachioradialis (P=0.001) were observed. The overall NASA-TLX workload score was significantly lower for Glidescope® than for Macintosh (P=0.006), and the factors of physical demand (P=0.008) and effort (P=0.006) decreased significantly. Greater muscular activity and workload were observed with the Macintosh laryngoscope. Augmented vision and related postural adjustments related to using the Glidescope® may reduce activation of the operator's muscles and task workload.

  3. The effects of changes to the ERC resuscitation guidelines on no flow time and cardiopulmonary resuscitation quality: a randomised controlled study on manikins.

    Jäntti, H; Kuisma, M; Uusaro, A

    2007-11-01

    The European Resuscitation Council (ERC) guidelines changed in 2005. We investigated the impact of these changes on no-flow time and on the quality of cardiopulmonary resuscitation (CPR). Simulated cardiac arrest (CA) scenarios were managed randomly in manikins using ERC 2000 or 2005 guidelines. Pairs of paramedics/paramedic students treated 34 scenarios with 10 min of continuous ventricular fibrillation. The rhythm was analysed and defibrillation shocks were delivered with a semi-automatic defibrillator, and breathing was assisted with a bag-valve-mask; no intravenous medication was given. Time factors related to human intervention and time factors related to the device (rhythm analysis, charging and defibrillation) were analysed for their contribution to no-flow time (time without chest compression). Chest compression quality was also analysed. No-flow time (mean±S.D.) was 66±3% of CA time with ERC 2000 and 32±4% with ERC 2005 guidelines (P<0.001). Time related to human interventions did not differ between guidelines (107±4 s with ERC 2005 during the 600-s scenarios; P=0.237). Device-related interventions took longer using ERC 2000 guidelines: 290±19 s versus 92±15 s (P<0.001). More chest compressions were delivered with ERC 2005 guidelines (808±92 versus 458±90, P<0.001), but the quality of CPR did not differ between the groups. The use of a single-shock sequence with the 2005 guidelines decreased the no-flow time during CPR when compared with the 2000 guidelines with multiple shocks.

  4. An analysis of the efficacy of bag-valve-mask ventilation and chest compression during different compression-ventilation ratios in manikin-simulated paediatric resuscitation.

    Kinney, S B; Tibballs, J

    2000-01-01

    The ideal chest compression and ventilation ratio for children during performance of cardiopulmonary resuscitation (CPR) has not been determined. The efficacy of chest compression and ventilation during compression-ventilation ratios of 5:1, 10:2 and 15:2 was examined. Eighteen nurses, working in pairs, were instructed to provide chest compression and bag-valve-mask ventilation for 1 min with each ratio, in random order, on a child-sized manikin. The subjects had been taught paediatric CPR 3 or 5 months previously. The efficacy of ventilation was assessed by measurement of the expired tidal volume and the number of breaths provided. The rate of chest compression was guided by a metronome set at 100/min. The efficacy of chest compressions was assessed by measurement of the rate and depth of compression. There was no significant difference in the mean tidal volume or the percentage of effective chest compressions delivered for each compression-ventilation ratio. The number of breaths delivered was greatest with the ratio of 5:1. The percentage of effective chest compressions was equal with all three methods, but the number of effective chest compressions was greatest with a ratio of 5:1. This study supports the use of a compression-ventilation ratio of 5:1 during two-rescuer paediatric cardiopulmonary resuscitation.
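Each compression-ventilation ratio trades compression time against ventilation pauses, which is why breath and compression counts differ across 5:1, 10:2 and 15:2 at the same metronome rate. A sketch of that trade-off under stated assumptions (compressions at the metronome rate of 100/min; the 1.5 s pause per breath is an invented figure, not a value from the study):

```python
def breaths_per_minute(ratio_c, ratio_v, cc_rate=100, secs_per_breath=1.5):
    """Breaths delivered per minute for a compression:ventilation ratio,
    assuming compressions at cc_rate/min and a fixed pause per breath."""
    cycle_secs = ratio_c * 60.0 / cc_rate + ratio_v * secs_per_breath
    cycles_per_min = 60.0 / cycle_secs
    return ratio_v * cycles_per_min

for c, v in [(5, 1), (10, 2), (15, 2)]:
    print(f"{c}:{v} -> {breaths_per_minute(c, v):.1f} breaths/min")
```

Under these assumptions 15:2 yields fewer breaths per minute than 5:1, consistent in direction with the study's finding that breath counts were greatest at 5:1.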

  5. Fiscal 1997 report on the results of the international standardization R and D. Overall survey; 1997 nendo seika hokokusho kokusai hyojun soseigata kenkyu kaihatsu. Sogo chosa

    NONE

    1998-03-01

    The paper summed up the results of a survey of efforts toward international standardization by enterprises and organizations involved in standardization in the U.S., Europe and Japan, the results of R and D on 20 themes for international standardization, and the development toward international standardization. The following R and D themes were selected: development of a chemical method to analyze/evaluate metallic coatings of surface-treated steel sheets, R and D of a basic evaluation method for functional composite particles, international standards for computer manikins, basic technology of color image management, study on optimizing design/evaluation technology using quality engineering, study of international standardization by economic evaluation of environmental impacts, R and D of the standardization of a method for accelerated-life testing of phosphoric acid fuel cells, development of a test method for halogen-free flame-retardant cables and a study of the cable standards, standard measuring methods of hormone effects of chemical substances, quantification of the sensory evaluation and international standardization in the paint field, experimental study on the international standardization for immunochemical measurement of chemical substances, etc. 24 figs., 16 tabs.

  6. Warrior Injury Assessment Manikin (WIAMan) Lumbar Spine Model Validation: Development, Testing, and Analysis of Physical and Computational Models of the WIAMan Lumbar Spine Materials Demonstrator

    2016-08-01

    [No abstract available; only report fragments were extracted. They reference a lumbar spine assembly in an alignment fixture (Fig. 13), double-lap shear coupons before and after testing (Fig. 14), and strain data derived from piston displacement verified with a Vision Research Phantom v711 high-speed monochrome camera.]

  7. Dispatcher-assisted compression-only cardiopulmonary resuscitation provides best quality cardiopulmonary resuscitation by laypersons: A randomised controlled single-blinded manikin trial.

    Spelten, Oliver; Warnecke, Tobias; Wetsch, Wolfgang A; Schier, Robert; Böttiger, Bernd W; Hinkelbein, Jochen

    2016-08-01

    High-quality cardiopulmonary resuscitation (CPR) by laypersons is a key determinant of both outcome and survival for out-of-hospital cardiac arrest. Dispatcher-assisted CPR (telephone-CPR, T-CPR) increases the frequency and correctness of bystander-CPR but results in prolonged time to first chest compressions. However, it remains unclear whether instructions for rescue ventilation and/or chest compressions should be recommended for dispatcher-assisted CPR. The aim of this study was to evaluate both principles of T-CPR with respect to CPR quality. Randomised controlled single-blinded manikin trial. University Hospital of Cologne, Germany, 1 July 2012 to 30 September 2012. Sixty laypersons between 18 and 65 years. Medically educated individuals, medical professionals and pregnant women were excluded. Participants were asked to resuscitate a manikin and were randomised into three groups: non-dispatcher-assisted (uninstructed) CPR (group 1; U-CPR; n = 20), dispatcher-assisted compression-only CPR (group 2; DACO-CPR; n = 19) and full dispatcher-assisted CPR with rescue ventilation (group 3; DAF-CPR; n = 19). Specific parameters of CPR quality [i.e. no-flow time (NFT) as well as compression and ventilation parameters] were analysed. To compare the groups we used Student's t test, and P less than 0.05 was considered significant. Initial NFT was lowest in the DACO-CPR group (mean 21.3 ± 14.4%), followed by dispatcher-assisted full CPR (mean 49.1 ± 8.5%) and unassisted CPR (mean 55.0 ± 12.9%). Initial NFT covering the time of instruction was lower in DACO-CPR (12.1 ± 5.4%) as compared to dispatcher-assisted full CPR (20.7 ± 8.1%). Compression depth was similar in all three groups: 40.6 ± 13.0 mm (unassisted CPR), 41.0 ± 12.2 mm (DACO-CPR) and 38.8 ± 15.8 mm (dispatcher-assisted full CPR). Average compression frequency was highest in the DACO-CPR group (65.2 ± 22.4/min) compared with the unassisted CPR

  8. An audiovisual feedback device for compression depth, rate and complete chest recoil can improve the CPR performance of lay persons during self-training on a manikin

    Krasteva, Vessela; Jekova, Irena; Didon, Jean-Philippe

    2011-01-01

    This study aims to contribute to the scarce data available about the abilities of untrained lay persons to perform hands-only cardio-pulmonary resuscitation (CPR) on a manikin and the improvement of their skills during training with an autonomous CPR feedback device. The study focuses on the following questions: (i) Is there a need for such a CPR training device? (ii) How adequate are the embedded visual feedback and audio guidance for training of lay persons who learn and correct themselves in real time without instructor guidance? (iii) What is the achieved effect of only 3 min of training? This is a prospective study in which 63 lay persons (volunteers) received a briefing on basic life support and then performed two consecutive 3 min trials of hands-only CPR on a manikin. The pre-training skills of the lay persons were tested in trial 1. The training process with audio guidance and visual feedback from a cardio compression control device (CC-Device) was recorded in trial 2. After the initial briefing for correct chest compressions (CC) with rate 85–115 min⁻¹, depth 3.8–5.4 cm and complete recoil, in trial 1 the lay persons were able to perform CC without feedback at mean rate 95.9 ± 18.9 min⁻¹ and mean depth 4.13 ± 1.5 cm, with low proportions of 'correct depth', 'correct rate' and 'correct recoil' at 33%, 43%, 87%, resulting in the scarce proportion of 14% for compressions which simultaneously fulfill the three quality criteria ('correct all'). In trial 2, the training effect of the CC-Device was established by the significant improvement of the CC skills until the 60th second of training, when 'correct depth', 'correct rate' and 'correct recoil' attained the plateau of the highest quality at 82%, 90%, 96%, respectively, resulting in 73% 'correct all' compressions within 3 min of training. The training was associated with reduced variance of the mean rate 102.4 ± 4
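The 'correct all' metric used in this record is the conjunction of three per-compression criteria (rate 85-115 min⁻¹, depth 3.8-5.4 cm, complete recoil). A sketch of that classification with invented per-compression records (function names and data are illustrative):

```python
def classify(comp, rate_band=(85, 115), depth_band=(3.8, 5.4)):
    """Per-criterion booleans (rate_ok, depth_ok, recoil_ok) for one
    compression record comp = (rate_per_min, depth_cm, full_recoil)."""
    rate, depth, recoil = comp
    return (rate_band[0] <= rate <= rate_band[1],
            depth_band[0] <= depth <= depth_band[1],
            bool(recoil))

def quality_proportions(comps):
    """Percentage of compressions meeting each criterion and all three."""
    n = len(comps)
    flags = [classify(c) for c in comps]
    pct = lambda hits: 100.0 * sum(hits) / n
    return {"rate": pct(f[0] for f in flags),
            "depth": pct(f[1] for f in flags),
            "recoil": pct(f[2] for f in flags),
            "all": pct(all(f) for f in flags)}

# Hypothetical records: (rate/min, depth cm, complete recoil)
data = [(100, 4.5, True), (120, 4.0, True), (95, 3.0, True), (90, 5.0, False)]
print(quality_proportions(data))
```

Because 'correct all' requires every criterion to hold simultaneously, it is always at most the smallest of the three per-criterion proportions, which matches the pattern in the reported trial 1 figures (14% versus 33%, 43%, 87%).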

  9. Efficacy of metronome sound guidance via a phone speaker during dispatcher-assisted compression-only cardiopulmonary resuscitation by an untrained layperson: a randomised controlled simulation study using a manikin.

    Park, Sang O; Hong, Chong Kun; Shin, Dong Hyuk; Lee, Jun Ho; Hwang, Seong Youn

    2013-08-01

Untrained laypersons should perform compression-only cardiopulmonary resuscitation (COCPR) under a dispatcher's guidance, but the quality of the chest compressions may be suboptimal. We hypothesised that providing metronome sounds via a phone speaker may improve the quality of chest compressions during dispatcher-assisted COCPR (DA-COCPR). Untrained laypersons were allocated to either the metronome sound-guided group (MG), who performed DA-COCPR with metronome sounds (110 ticks/min), or the control group (CG), who performed conventional DA-COCPR. The participants of each group performed DA-COCPR for 4 min using a manikin with Skill-Reporter, and the data regarding chest compression quality were collected. The data from 33 cases of DA-COCPR in the MG and 34 cases in the CG were compared. The MG showed a faster compression rate than the CG (111.9 vs 96.7/min; p=0.018). A significantly higher proportion of subjects in the MG performed the DA-COCPR with an accurate chest compression rate (100-120/min) compared with the subjects in the CG (32/33 (97.0%) vs 5/34 (14.7%)). Metronome sound guidance during DA-COCPR for untrained bystanders improved the chest compression rates, but was associated with more shallow compressions than conventional DA-COCPR in a manikin model.

  10. Effect of feedback on delaying deterioration in quality of compressions during 2 minutes of continuous chest compressions: a randomized manikin study investigating performance with and without feedback

    Lyngeraa Tobias

    2012-02-01

Background: Good quality basic life support (BLS) improves outcome following cardiac arrest. As BLS performance deteriorates over time, we performed a parallel-group superiority study to investigate the effect of feedback on the quality of chest compressions, with the hypothesis that feedback delays deterioration of compression quality. Methods: Participants attending a national one-day conference on cardiac arrest and CPR in Denmark were randomized to perform single-rescuer BLS with (n = 26) or without (n = 28) verbal and visual feedback on a manikin using a ZOLL AED Plus. Data were analyzed using RescueNet Code Review. Blinding of participants was not possible, but allocation concealment was performed. The primary outcome was the proportion of delivered compressions within target depth, compared over a 2-minute period within and between the groups. The secondary outcome was the proportion of delivered compressions within target rate, compared over a 2-minute period within and between the groups. Performance variables for 30-second intervals were analyzed and compared. Results: 24 (92%) and 23 (82%) participants had CPR experience in the groups with and without feedback, respectively. 14 (54%) were CPR instructors in the feedback group and 18 (64%) in the group without feedback. Data from 26 and 28 participants were analyzed, respectively. Although median values for the proportion of delivered compressions within target depth were higher in the feedback group (0-30 s: 54.0%; 30-60 s: 88.0%; 60-90 s: 72.6%; 90-120 s: 87.0%), no significant difference was found compared to the group without feedback (0-30 s: 19.6%; 30-60 s: 33.1%; 60-90 s: 44.5%; 90-120 s: 32.7%), and no significant deterioration over time was found within the groups. In the feedback group a significant improvement was found in the proportion of delivered compressions below target depth when the subsequent intervals were compared to the first 30 seconds (0-30 s: 3.9%; 30-60 s: 0.0%; 60-90 s: 0
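The 30-second interval analysis used in the study above can be sketched as follows: bucket timestamped compressions into consecutive windows and report the in-target proportion per window. The target range and the sample events below are assumptions for illustration, not data from the study.

```python
# Bucket compressions into 30-second windows and compute the proportion
# within an assumed target depth range per window.

TARGET = (3.8, 5.4)  # cm, assumed target range

def depth_by_interval(events, window=30.0, total=120.0):
    """events: list of (time_s, depth_cm). Returns the proportion of
    in-target compressions for each consecutive window."""
    n_windows = int(total // window)
    counts = [0] * n_windows
    hits = [0] * n_windows
    for t, depth in events:
        idx = min(int(t // window), n_windows - 1)
        counts[idx] += 1
        if TARGET[0] <= depth <= TARGET[1]:
            hits[idx] += 1
    return [h / c if c else 0.0 for h, c in zip(hits, counts)]

events = [(5, 4.0), (20, 3.0), (40, 4.5), (70, 5.0), (100, 2.9), (110, 4.2)]
print(depth_by_interval(events))  # [0.5, 1.0, 1.0, 0.5]
```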

  11. Development of field navigation system; Field navigation system no kaihatsu

    Ibara, S; Minode, M; Nishioka, K [Daihatsu Motor Co. Ltd., Osaka (Japan)

    1995-04-20

    This paper describes the following matters on a field navigation system developed for the purpose of covering a field of several kilometer square. This system consists of a center system and a vehicle system, and the center system comprises a map information computer and a communication data controlling computer; since the accuracy for a vehicle position detected by a GPS is not sufficient, an attempt of increasing the accuracy of vehicle position detection is made by means of a hybrid system; the hybrid system uses a satellite navigation method of differential system in which the error components in the GPS are transmitted from the center, and also uses a self-contained navigation method which performs an auxiliary function when the accuracy in the GPS has dropped; corrected GPS values, emergency messages to all of the vehicles and data of each vehicle position are communicated by wireless transmission in two ways between the center and vehicles; and accommodation of the map data adopted a system that can respond quickly to any change in roads and facilities. 3 refs., 13 figs., 1 tab.
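The hybrid positioning idea described above combines two modes: apply the differential correction broadcast from the center to the raw GPS fix, and fall back to self-contained dead reckoning when GPS accuracy drops. A rough sketch, in which all names and numbers are illustrative assumptions rather than the paper's actual design:

```python
# Minimal sketch of a hybrid DGPS / dead-reckoning position update.
# All values are hypothetical; positions are (x, y) tuples in metres.

def hybrid_position(gps_fix, dgps_correction, gps_ok, last_pos, odometry):
    """gps_ok: whether the GPS solution is currently trustworthy."""
    if gps_ok:
        # Differential mode: subtract the error estimated at the base station.
        return (gps_fix[0] - dgps_correction[0],
                gps_fix[1] - dgps_correction[1])
    # Self-contained mode: advance the last good position by odometry.
    return (last_pos[0] + odometry[0], last_pos[1] + odometry[1])

print(hybrid_position((105.0, 52.0), (5.0, 2.0), True, (0, 0), (0, 0)))
# (100.0, 50.0) -- differential mode
print(hybrid_position((0, 0), (0, 0), False, (100.0, 50.0), (1.5, -0.5)))
# (101.5, 49.5) -- dead-reckoning fallback
```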

  12. Hands-Off Time for Endotracheal Intubation during CPR Is Not Altered by the Use of the C-MAC Video-Laryngoscope Compared to Conventional Direct Laryngoscopy. A Randomized Crossover Manikin Study.

    Philipp Schuerner

Sufficient ventilation and oxygenation through proper airway management is essential in patients undergoing cardio-pulmonary resuscitation (CPR). Although widely discussed, securing the airway using an endotracheal tube is considered the standard of care. Endotracheal intubation may be challenging and causes prolonged interruption of chest compressions. Videolaryngoscopes have been introduced to better visualize the vocal cords and accelerate intubation, which makes endotracheal intubation much safer and may contribute to intubation success. Therefore, we aimed to compare hands-off time and intubation success of direct laryngoscopy with videolaryngoscopy (C-MAC, Karl Storz, Tuttlingen, Germany) in a randomized, cross-over manikin study. Twenty-six anesthesia residents and twelve anesthesia consultants of the University Hospital Zurich were recruited through a voluntary enrolment. All participants performed endotracheal intubation using direct laryngoscopy and C-MAC in a random order during ongoing chest compressions. Participants were strictly advised to stop chest compression only if necessary. The median hands-off time was 1.9 seconds in direct laryngoscopy, compared to 3 seconds in the C-MAC group. In direct laryngoscopy 39 intubation attempts were recorded, resulting in an overall first intubation attempt success rate of 97%, compared to 38 intubation attempts and 100% overall first intubation attempt success rate in the C-MAC group. As a conclusion, the results of our manikin-study demonstrate that video laryngoscopes might not be beneficial compared to conventional, direct laryngoscopy in easily accessible airways under CPR conditions and in experienced hands. The benefits of video laryngoscopes are of course more distinct in overcoming difficult airways, as it converts a potential "blind intubation" into an intubation under visual control.

  13. Hands-Off Time for Endotracheal Intubation during CPR Is Not Altered by the Use of the C-MAC Video-Laryngoscope Compared to Conventional Direct Laryngoscopy. A Randomized Crossover Manikin Study.

    Schuerner, Philipp; Grande, Bastian; Piegeler, Tobias; Schlaepfer, Martin; Saager, Leif; Hutcherson, Matthew T; Spahn, Donat R; Ruetzler, Kurt

    2016-01-01

    Sufficient ventilation and oxygenation through proper airway management is essential in patients undergoing cardio-pulmonary resuscitation (CPR). Although widely discussed, securing the airway using an endotracheal tube is considered the standard of care. Endotracheal intubation may be challenging and causes prolonged interruption of chest compressions. Videolaryngoscopes have been introduced to better visualize the vocal cords and accelerate intubation, which makes endotracheal intubation much safer and may contribute to intubation success. Therefore, we aimed to compare hands-off time and intubation success of direct laryngoscopy with videolaryngoscopy (C-MAC, Karl Storz, Tuttlingen, Germany) in a randomized, cross-over manikin study. Twenty-six anesthesia residents and twelve anesthesia consultants of the University Hospital Zurich were recruited through a voluntary enrolment. All participants performed endotracheal intubation using direct laryngoscopy and C-MAC in a random order during ongoing chest compressions. Participants were strictly advised to stop chest compression only if necessary. The median hands-off time was 1.9 seconds in direct laryngoscopy, compared to 3 seconds in the C-MAC group. In direct laryngoscopy 39 intubation attempts were recorded, resulting in an overall first intubation attempt success rate of 97%, compared to 38 intubation attempts and 100% overall first intubation attempt success rate in the C-MAC group. As a conclusion, the results of our manikin-study demonstrate that video laryngoscopes might not be beneficial compared to conventional, direct laryngoscopy in easily accessible airways under CPR conditions and in experienced hands. The benefits of video laryngoscopes are of course more distinct in overcoming difficult airways, as it converts a potential "blind intubation" into an intubation under visual control.

  14. Optical Computing

    Woods, Damien; Naughton, Thomas J.

    2008-01-01

    We consider optical computers that encode data using images and compute by transforming such images. We give an overview of a number of such optical computing architectures, including descriptions of the type of hardware commonly used in optical computing, as well as some of the computational efficiencies of optical devices. We go on to discuss optical computing from the point of view of computational complexity theory, with the aim of putting some old, and some very recent, re...

  15. Computer group

    Bauer, H.; Black, I.; Heusler, A.; Hoeptner, G.; Krafft, F.; Lang, R.; Moellenkamp, R.; Mueller, W.; Mueller, W.F.; Schati, C.; Schmidt, A.; Schwind, D.; Weber, G.

    1983-01-01

The computer group has been reorganized to take charge of the general-purpose computers DEC10 and VAX and of the computer network (Dataswitch, DECnet, IBM connections to GSI and IPP, preparation for Datex-P). (orig.)

  16. Computer Engineers.

    Moncarz, Roger

    2000-01-01

    Looks at computer engineers and describes their job, employment outlook, earnings, and training and qualifications. Provides a list of resources related to computer engineering careers and the computer industry. (JOW)

  17. Computer Music

    Cook, Perry R.

    This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and the study of perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.).

  18. Analog computing

    Ulmann, Bernd

    2013-01-01

    This book is a comprehensive introduction to analog computing. As most textbooks about this powerful computing paradigm date back to the 1960s and 1970s, it fills a void and forges a bridge from the early days of analog computing to future applications. The idea of analog computing is not new. In fact, this computing paradigm is nearly forgotten, although it offers a path to both high-speed and low-power computing, which are in even more demand now than they were back in the heyday of electronic analog computers.

  19. Computational composites

    Vallgårda, Anna K. A.; Redström, Johan

    2007-01-01

    Computational composite is introduced as a new type of composite material. Arguing that this is not just a metaphorical maneuver, we provide an analysis of computational technology as material in design, which shows how computers share important characteristics with other materials used in design...... and architecture. We argue that the notion of computational composites provides a precise understanding of the computer as material, and of how computations need to be combined with other materials to come to expression as material. Besides working as an analysis of computers from a designer’s point of view......, the notion of computational composites may also provide a link for computer science and human-computer interaction to an increasingly rapid development and use of new materials in design and architecture....

  20. Quantum Computing

    Scarani, Valerio

    1998-01-01

    The aim of this thesis was to explain what quantum computing is. The information for the thesis was gathered from books, scientific publications, and news articles. The analysis of the information revealed that quantum computing can be broken down to three areas: theories behind quantum computing explaining the structure of a quantum computer, known quantum algorithms, and the actual physical realizations of a quantum computer. The thesis reveals that moving from classical memor...

  1. CO2 Washout Capability with Breathing Manikin

    National Aeronautics and Space Administration — Carbon Dioxide (CO2) Washout performance is a critical parameter needed to ensure proper and sufficient designs in a spacesuit and in vehicle applications such as...

  2. Modelo simulador para treinamento de punção transpedicular em vertebroplastia percutânea Manikin-type training simulator model for transpedicular puncture in percutaneous vertebroplasty

    Nitamar Abdala

    2007-08-01

manikin with an ethyl-vinyl-acetate lining so that direct visualization was not possible. A theoretical course was given to six trainees in radiology and neuroradiology who tested the models with respect to parameters of similarity with reality, performing 30 transpedicular punctures in three series of ten punctures a day, with a one-week interval between the series. RESULTS: Each student performed 30 transpedicular punctures; however, eight of these punctures were disregarded because of manufacturing defects of the dummies observed during the procedures. Similarity data forms were filled in by all of the trainees following the procedures, with 100% positive answers regarding the models' similarity with the human body. CONCLUSION: It was possible to develop a training model for transpedicular puncture with a satisfactory degree of similarity with the human body, constituting an appropriate tool for training in vertebroplasty.

  3. Computational Medicine

    Nygaard, Jens Vinge

    2017-01-01

The Health Technology Program at Aarhus University applies computational biology to investigate the heterogeneity of tumours.

  4. Grid Computing

A computing grid interconnects resources such as high-performance computers, scientific databases, and computer-controlled scientific instruments of cooperating organizations, each of which is autonomous. It precedes and is quite different from cloud computing, which provides computing resources by vendors to customers ...

  5. Green Computing

    K. Shalini

    2013-01-01

Green computing is all about using computers in a smarter and eco-friendly way. It is the environmentally responsible use of computers and related resources, which includes the implementation of energy-efficient central processing units, servers and peripherals as well as reduced resource consumption and proper disposal of electronic waste. Computers certainly make up a large part of many people's lives and traditionally are extremely damaging to the environment. Manufacturers of computers and their parts have been espousing the green cause to help protect the environment from computers and electronic waste in any way. Research continues into key areas such as making the use of computers as energy-efficient as possible, and designing algorithms and systems for efficiency-related computer technologies.

  6. Quantum computers and quantum computations

    Valiev, Kamil' A

    2005-01-01

    This review outlines the principles of operation of quantum computers and their elements. The theory of ideal computers that do not interact with the environment and are immune to quantum decohering processes is presented. Decohering processes in quantum computers are investigated. The review considers methods for correcting quantum computing errors arising from the decoherence of the state of the quantum computer, as well as possible methods for the suppression of the decohering processes. A brief enumeration of proposed quantum computer realizations concludes the review. (reviews of topical problems)
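The ideal, decoherence-free model that the review starts from can be illustrated with a minimal state-vector simulation of a single qubit: the state is a complex 2-vector and gates are 2x2 unitaries. This is a generic sketch for illustration, not code from the review.

```python
# Minimal single-qubit state-vector simulation in pure Python.
import math

def apply_gate(gate, state):
    """Multiply a 2x2 gate matrix by a length-2 state vector."""
    return [gate[0][0] * state[0] + gate[0][1] * state[1],
            gate[1][0] * state[0] + gate[1][1] * state[1]]

# Hadamard gate: maps |0> to an equal superposition of |0> and |1>.
H = [[1 / math.sqrt(2), 1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]

state = [1.0, 0.0]            # |0>
state = apply_gate(H, state)  # equal superposition
probs = [abs(a) ** 2 for a in state]
print(probs)  # ~[0.5, 0.5]

# H is its own inverse: applying it again returns (up to rounding) |0>.
state = apply_gate(H, state)
```

In this ideal picture the evolution is exactly unitary; the decohering processes the review analyses are precisely what this toy model leaves out.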

  7. Quantum Computing for Computer Architects

    Metodi, Tzvetan

    2011-01-01

    Quantum computers can (in theory) solve certain problems far faster than a classical computer running any known classical algorithm. While existing technologies for building quantum computers are in their infancy, it is not too early to consider their scalability and reliability in the context of the design of large-scale quantum computers. To architect such systems, one must understand what it takes to design and model a balanced, fault-tolerant quantum computer architecture. The goal of this lecture is to provide architectural abstractions for the design of a quantum computer and to explore

  8. Pervasive Computing

    Silvis-Cividjian, N.

    This book provides a concise introduction to Pervasive Computing, otherwise known as Internet of Things (IoT) and Ubiquitous Computing (Ubicomp) which addresses the seamless integration of computing systems within everyday objects. By introducing the core topics and exploring assistive pervasive

  9. Computational vision

    Wechsler, Harry

    1990-01-01

    The book is suitable for advanced courses in computer vision and image processing. In addition to providing an overall view of computational vision, it contains extensive material on topics that are not usually covered in computer vision texts (including parallel distributed processing and neural networks) and considers many real applications.

  10. Spatial Computation

    2003-12-01

Computation and today’s microprocessors with the approach to operating system architecture, and the controversy between microkernels and monolithic kernels... Both Spatial Computation and microkernels break away a relatively monolithic architecture into individual lightweight pieces, well specialized... for their particular functionality. Spatial Computation removes global signals and control, in the same way microkernels remove the global address

  11. Parallel computations

    1982-01-01

    Parallel Computations focuses on parallel computation, with emphasis on algorithms used in a variety of numerical and physical applications and for many different types of parallel computers. Topics covered range from vectorization of fast Fourier transforms (FFTs) and of the incomplete Cholesky conjugate gradient (ICCG) algorithm on the Cray-1 to calculation of table lookups and piecewise functions. Single tridiagonal linear systems and vectorized computation of reactive flow are also discussed.Comprised of 13 chapters, this volume begins by classifying parallel computers and describing techn

  12. Human Computation

    CERN. Geneva

    2008-01-01

    What if people could play computer games and accomplish work without even realizing it? What if billions of people collaborated to solve important problems for humanity or generate training data for computers? My work aims at a general paradigm for doing exactly that: utilizing human processing power to solve computational problems in a distributed manner. In particular, I focus on harnessing human time and energy for addressing problems that computers cannot yet solve. Although computers have advanced dramatically in many respects over the last 50 years, they still do not possess the basic conceptual intelligence or perceptual capabilities...

  13. Quantum computation

    Deutsch, D.

    1992-01-01

As computers become ever more complex, they inevitably become smaller. This leads to a need for components which are fabricated and operate on increasingly smaller size scales. Quantum theory is already taken into account in microelectronics design. This article explores how quantum theory will need to be incorporated into future computers in order for their components to function. Computation tasks which depend on quantum effects will become possible. Physicists may have to reconsider their perspective on computation in the light of understanding developed in connection with universal quantum computers. (UK)

  14. Computer software.

    Rosenthal, L E

    1986-10-01

Software is the component in a computer system that permits the hardware to perform the various functions that a computer system is capable of doing. The history of software and its development can be traced to the early nineteenth century. All computer systems are designed to utilize the "stored program concept" as first developed by Charles Babbage in the 1850s. The concept was lost until the mid-1940s, when modern computers made their appearance. Today, because of the complex and myriad tasks that a computer system can perform, there has been a differentiation of types of software. There is software designed to perform specific business applications. There is software that controls the overall operation of a computer system. And there is software that is designed to carry out specialized tasks. Regardless of types, software is the most critical component of any computer system. Without it, all one has is a collection of circuits, transistors, and silicon chips.

  15. Computer sciences

    Smith, Paul H.

    1988-01-01

    The Computer Science Program provides advanced concepts, techniques, system architectures, algorithms, and software for both space and aeronautics information sciences and computer systems. The overall goal is to provide the technical foundation within NASA for the advancement of computing technology in aerospace applications. The research program is improving the state of knowledge of fundamental aerospace computing principles and advancing computing technology in space applications such as software engineering and information extraction from data collected by scientific instruments in space. The program includes the development of special algorithms and techniques to exploit the computing power provided by high performance parallel processors and special purpose architectures. Research is being conducted in the fundamentals of data base logic and improvement techniques for producing reliable computing systems.

  16. Computer programming and computer systems

    Hassitt, Anthony

    1966-01-01

Computer Programming and Computer Systems imparts a "reading knowledge" of computer systems. This book describes the aspects of machine-language programming, monitor systems, computer hardware, and advanced programming that every thorough programmer should be acquainted with. This text discusses the automatic electronic digital computers, symbolic language, Reverse Polish Notation, and Fortran into assembly language. The routine for reading blocked tapes, dimension statements in subroutines, general-purpose input routine, and efficient use of memory are also elaborated. This publication is inten

  17. Organic Computing

    Würtz, Rolf P

    2008-01-01

    Organic Computing is a research field emerging around the conviction that problems of organization in complex systems in computer science, telecommunications, neurobiology, molecular biology, ethology, and possibly even sociology can be tackled scientifically in a unified way. From the computer science point of view, the apparent ease in which living systems solve computationally difficult problems makes it inevitable to adopt strategies observed in nature for creating information processing machinery. In this book, the major ideas behind Organic Computing are delineated, together with a sparse sample of computational projects undertaken in this new field. Biological metaphors include evolution, neural networks, gene-regulatory networks, networks of brain modules, hormone system, insect swarms, and ant colonies. Applications are as diverse as system design, optimization, artificial growth, task allocation, clustering, routing, face recognition, and sign language understanding.

  18. Computational biomechanics

    Ethier, C.R.

    2004-01-01

    Computational biomechanics is a fast-growing field that integrates modern biological techniques and computer modelling to solve problems of medical and biological interest. Modelling of blood flow in the large arteries is the best-known application of computational biomechanics, but there are many others. Described here is work being carried out in the laboratory on the modelling of blood flow in the coronary arteries and on the transport of viral particles in the eye. (author)

  19. Computational Composites

    Vallgårda, Anna K. A.

    to understand the computer as a material like any other material we would use for design, like wood, aluminum, or plastic. That as soon as the computer forms a composition with other materials it becomes just as approachable and inspiring as other smart materials. I present a series of investigations of what...... Computational Composite, and Telltale). Through the investigations, I show how the computer can be understood as a material and how it partakes in a new strand of materials whose expressions come to be in context. I uncover some of their essential material properties and potential expressions. I develop a way...

  20. GPGPU COMPUTING

    BOGDAN OANCEA

    2012-05-01

Since the first idea of using GPUs for general-purpose computing, things have evolved over the years and now there are several approaches to GPU programming. GPU computing practically began with the introduction of CUDA (Compute Unified Device Architecture) by NVIDIA and Stream by AMD. These are APIs designed by the GPU vendors to be used together with the hardware that they provide. A new emerging standard, OpenCL (Open Computing Language), tries to unify different GPU general computing API implementations and provides a framework for writing programs executed across heterogeneous platforms consisting of both CPUs and GPUs. OpenCL provides parallel computing using task-based and data-based parallelism. In this paper we will focus on the CUDA parallel computing architecture and programming model introduced by NVIDIA. We will present the benefits of the CUDA programming model. We will also compare the two main approaches, CUDA and AMD APP (Stream), and the new framework, OpenCL, that tries to unify the GPGPU computing models.
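The CUDA model referenced above launches many threads over a grid of blocks, each thread computing one element from its global index. The pure-Python sketch below emulates that indexing for a SAXPY kernel (y = a*x + y); the block size and data are arbitrary illustration values, not anything from the paper.

```python
# Emulation of CUDA-style 1-D grid/block/thread indexing in pure Python.

def saxpy_kernel(a, x, y, out, block_dim, block_idx, thread_idx):
    """One 'thread': global index = blockIdx * blockDim + threadIdx."""
    i = block_idx * block_dim + thread_idx
    if i < len(x):                     # guard against out-of-range threads
        out[i] = a * x[i] + y[i]

def launch(a, x, y, block_dim=4):
    """Emulate a 1-D grid launch by looping over blocks and threads."""
    out = [0.0] * len(x)
    grid_dim = (len(x) + block_dim - 1) // block_dim  # ceil division
    for b in range(grid_dim):
        for t in range(block_dim):
            saxpy_kernel(a, x, y, out, block_dim, b, t)
    return out

print(launch(2.0, [1, 2, 3, 4, 5], [10, 10, 10, 10, 10]))
# [12.0, 14.0, 16.0, 18.0, 20.0]
```

On a GPU all these "threads" would run concurrently; the sequential loops here only reproduce the indexing scheme, which is the part the programming model exposes.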

  1. Quantum Computing

Quantum Computing - Building Blocks of a Quantum Computer. C S Vijay and Vishal Gupta. General Article, Resonance – Journal of Science Education, Volume 5, Issue 9, September 2000, pp 69-81.

  2. Platform computing

    2002-01-01

    "Platform Computing releases first grid-enabled workload management solution for IBM eServer Intel and UNIX high performance computing clusters. This Out-of-the-box solution maximizes the performance and capability of applications on IBM HPC clusters" (1/2 page) .

  3. Quantum Computing

In the first part of this article, we looked at how quantum physics can be harnessed to make the building blocks of a quantum computer. In this concluding part, we look at algorithms which can exploit the power of this computational device, and at some practical difficulties in building such a device.

  4. Quantum computing

    Burba, M.; Lapitskaya, T.

    2017-01-01

    This article gives an elementary introduction to quantum computing. It is a draft for a book chapter of the "Handbook of Nature-Inspired and Innovative Computing", Eds. A. Zomaya, G.J. Milburn, J. Dongarra, D. Bader, R. Brent, M. Eshaghian-Wilner, F. Seredynski (Springer, Berlin Heidelberg New York, 2006).

  5. Computational Pathology

    Louis, David N.; Feldman, Michael; Carter, Alexis B.; Dighe, Anand S.; Pfeifer, John D.; Bry, Lynn; Almeida, Jonas S.; Saltz, Joel; Braun, Jonathan; Tomaszewski, John E.; Gilbertson, John R.; Sinard, John H.; Gerber, Georg K.; Galli, Stephen J.; Golden, Jeffrey A.; Becich, Michael J.

    2016-01-01

Context: We define the scope and needs within the new discipline of computational pathology, a discipline critical to the future of both the practice of pathology and, more broadly, medical practice in general. Objective: To define the scope and needs of computational pathology. Data Sources: A meeting was convened in Boston, Massachusetts, in July 2014 prior to the annual Association of Pathology Chairs meeting, and it was attended by a variety of pathologists, including individuals highly invested in pathology informatics as well as chairs of pathology departments. Conclusions: The meeting made recommendations to promote computational pathology, including clearly defining the field and articulating its value propositions; asserting that the value propositions for health care systems must include means to incorporate robust computational approaches to implement data-driven methods that aid in guiding individual and population health care; leveraging computational pathology as a center for data interpretation in modern health care systems; stating that realizing the value proposition will require working with institutional administrations, other departments, and pathology colleagues; declaring that a robust pipeline should be fostered that trains and develops future computational pathologists, for those with both pathology and non-pathology backgrounds; and deciding that computational pathology should serve as a hub for data-related research in health care systems. The dissemination of these recommendations to pathology and bioinformatics departments should help facilitate the development of computational pathology. PMID:26098131

  6. Cloud Computing

    Krogh, Simon

    2013-01-01

    with technological changes, the paradigmatic pendulum has swung between increased centralization on one side and a focus on distributed computing that pushes IT power out to end users on the other. With the introduction of outsourcing and cloud computing, centralization in large data centers is again dominating...... the IT scene. In line with the views presented by Nicolas Carr in 2003 (Carr, 2003), it is a popular assumption that cloud computing will be the next utility (like water, electricity and gas) (Buyya, Yeo, Venugopal, Broberg, & Brandic, 2009). However, this assumption disregards the fact that most IT production......), for instance, in establishing and maintaining trust between the involved parties (Sabherwal, 1999). So far, research in cloud computing has neglected this perspective and focused entirely on aspects relating to technology, economy, security and legal questions. While the core technologies of cloud computing (e...

  7. Computability theory

    Weber, Rebecca

    2012-01-01

    What can we compute--even with unlimited resources? Is everything within reach? Or are computations necessarily drastically limited, not just in practice, but theoretically? These questions are at the heart of computability theory. The goal of this book is to give the reader a firm grounding in the fundamentals of computability theory and an overview of currently active areas of research, such as reverse mathematics and algorithmic randomness. Turing machines and partial recursive functions are explored in detail, and vital tools and concepts including coding, uniformity, and diagonalization are described explicitly. From there the material continues with universal machines, the halting problem, parametrization and the recursion theorem, and thence to computability for sets, enumerability, and Turing reduction and degrees. A few more advanced topics round out the book before the chapter on areas of research. The text is designed to be self-contained, with an entire chapter of preliminary material including re...
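Turing machines, which the book explores in detail, can be made concrete with a few lines of code. The simulator and the machine below (which appends a 1 to a unary string and halts) are invented for illustration, not taken from the text.

```python
# Minimal Turing machine simulator.

def run_tm(rules, tape, state="q0", head=0, max_steps=1000):
    """rules: {(state, symbol): (write, move, next_state)}; move in {-1, +1}.
    Returns the final tape contents (blanks stripped) once the machine
    enters 'halt', or after max_steps."""
    tape = dict(enumerate(tape))        # sparse tape, indexed by cell
    for _ in range(max_steps):
        if state == "halt":
            break
        sym = tape.get(head, "_")       # '_' is the blank symbol
        write, move, state = rules[(state, sym)]
        tape[head] = write
        head += move
    return "".join(tape[i] for i in sorted(tape)).strip("_")

# Scan right over 1s, write a 1 on the first blank, then halt.
rules = {("q0", "1"): ("1", +1, "q0"),
         ("q0", "_"): ("1", +1, "halt")}
print(run_tm(rules, "111"))  # 1111
```

The `max_steps` bound is not part of the formal model; it is a practical concession to the halting problem discussed in the book, since no simulator can decide in general whether a machine will ever halt.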

  8. Computational Streetscapes

    Paul M. Torrens

    2016-09-01

    Full Text Available Streetscapes have presented a long-standing interest in many fields. Recently, there has been a resurgence of attention on streetscape issues, catalyzed in large part by computing. Because of computing, there is more understanding, vistas, data, and analysis of and on streetscape phenomena than ever before. This diversity of lenses trained on streetscapes permits us to address long-standing questions, such as how people use information while mobile, how interactions with people and things occur on streets, how we might safeguard crowds, how we can design services to assist pedestrians, and how we could better support special populations as they traverse cities. Amid each of these avenues of inquiry, computing is facilitating new ways of posing these questions, particularly by expanding the scope of what-if exploration that is possible. With assistance from computing, consideration of streetscapes now reaches across scales, from the neurological interactions that form among place cells in the brain up to informatics that afford real-time views of activity over whole urban spaces. For some streetscape phenomena, computing allows us to build realistic but synthetic facsimiles in computation, which can function as artificial laboratories for testing ideas. In this paper, I review the domain science for studying streetscapes from vantages in physics, urban studies, animation and the visual arts, psychology, biology, and behavioral geography. I also review the computational developments shaping streetscape science, with particular emphasis on modeling and simulation as informed by data acquisition and generation, data models, path-planning heuristics, artificial intelligence for navigation and way-finding, timing, synthetic vision, steering routines, kinematics, and geometrical treatment of collision detection and avoidance. I also discuss the implications that the advances in computing streetscapes might have on emerging developments in cyber

  9. COMPUTATIONAL THINKING

    Evgeniy K. Khenner

    2016-01-01

    Full Text Available Abstract. The aim of the research is to draw attention of the educational community to the phenomenon of computational thinking which actively discussed in the last decade in the foreign scientific and educational literature, to substantiate of its importance, practical utility and the right on affirmation in Russian education.Methods. The research is based on the analysis of foreign studies of the phenomenon of computational thinking and the ways of its formation in the process of education; on comparing the notion of «computational thinking» with related concepts used in the Russian scientific and pedagogical literature.Results. The concept «computational thinking» is analyzed from the point of view of intuitive understanding and scientific and applied aspects. It is shown as computational thinking has evolved in the process of development of computers hardware and software. The practice-oriented interpretation of computational thinking which dominant among educators is described along with some ways of its formation. It is shown that computational thinking is a metasubject result of general education as well as its tool. From the point of view of the author, purposeful development of computational thinking should be one of the tasks of the Russian education.Scientific novelty. The author gives a theoretical justification of the role of computational thinking schemes as metasubject results of learning. The dynamics of the development of this concept is described. This process is connected with the evolution of computer and information technologies as well as increase of number of the tasks for effective solutions of which computational thinking is required. Author substantiated the affirmation that including «computational thinking » in the set of pedagogical concepts which are used in the national education system fills an existing gap.Practical significance. New metasubject result of education associated with

  10. Computer interfacing

    Dixey, Graham

    1994-01-01

    This book explains how computers interact with the world around them and therefore how to make them a useful tool. Topics covered include descriptions of all the components that make up a computer, principles of data exchange, interaction with peripherals, serial communication, input devices, recording methods, computer-controlled motors, and printers.In an informative and straightforward manner, Graham Dixey describes how to turn what might seem an incomprehensible 'black box' PC into a powerful and enjoyable tool that can help you in all areas of your work and leisure. With plenty of handy

  11. Computational physics

    Newman, Mark

    2013-01-01

    A complete introduction to the field of computational physics, with examples and exercises in the Python programming language. Computers play a central role in virtually every major physics discovery today, from astrophysics and particle physics to biophysics and condensed matter. This book explains the fundamentals of computational physics and describes in simple terms the techniques that every physicist should know, such as finite difference methods, numerical quadrature, and the fast Fourier transform. The book offers a complete introduction to the topic at the undergraduate level, and is also suitable for the advanced student or researcher who wants to learn the foundational elements of this important field.
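Two of the techniques the book names, finite difference methods and numerical quadrature, can be sketched in a few lines of Python (the book's examples are in Python, though this particular sketch is not taken from it):

```python
import math

# Central finite difference: f'(x) ~ (f(x+h) - f(x-h)) / (2h), O(h^2) error.
def central_diff(f, x, h=1e-5):
    return (f(x + h) - f(x - h)) / (2 * h)

# Composite trapezoidal rule for the integral of f over [a, b] with n panels.
def trapezoid(f, a, b, n=1000):
    h = (b - a) / n
    s = 0.5 * (f(a) + f(b)) + sum(f(a + i * h) for i in range(1, n))
    return h * s

# Check against known results: d/dx sin(x) = cos(x); integral of sin over
# [0, pi] is exactly 2.
assert abs(central_diff(math.sin, 1.0) - math.cos(1.0)) < 1e-8
assert abs(trapezoid(math.sin, 0.0, math.pi) - 2.0) < 1e-5
```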

  12. Computational physics

    Anon.

    1987-01-15

    Computers have for many years played a vital role in the acquisition and treatment of experimental data, but they have more recently taken up a much more extended role in physics research. The numerical and algebraic calculations now performed on modern computers make it possible to explore consequences of basic theories in a way which goes beyond the limits of both analytic insight and experimental investigation. This was brought out clearly at the Conference on Perspectives in Computational Physics, held at the International Centre for Theoretical Physics, Trieste, Italy, from 29-31 October.

  13. Cloud Computing

    Baun, Christian; Nimis, Jens; Tai, Stefan

    2011-01-01

    Cloud computing is a buzz-word in today's information technology (IT) that nobody can escape. But what is really behind it? There are many interpretations of this term, but no standardized or even uniform definition. Instead, as a result of the multi-faceted viewpoints and the diverse interests expressed by the various stakeholders, cloud computing is perceived as a rather fuzzy concept. With this book, the authors deliver an overview of cloud computing architecture, services, and applications. Their aim is to bring readers up to date on this technology and thus to provide a common basis for d

  14. Computational Viscoelasticity

    Marques, Severino P C

    2012-01-01

    This text is a guide to solving problems in which viscoelasticity is present using existing commercial computational codes. The book gives information on the structure and use of the codes, on data preparation, and on output interpretation and verification. The first part of the book introduces the reader to the subject and provides the models, equations and notation to be used in the computational applications. The second part presents the most important computational techniques, the finite element and boundary element formulations, and presents solutions of viscoelastic problems with Abaqus.

  15. Optical computing.

    Stroke, G. W.

    1972-01-01

    Applications of the optical computer include an approach for increasing the sharpness of images obtained from the most powerful electron microscopes and fingerprint/credit card identification. The information-handling capability of the various optical computing processes is very great. Modern synthetic-aperture radars scan upward of 100,000 resolvable elements per second. Fields which have assumed major importance on the basis of optical computing principles are optical image deblurring, coherent side-looking synthetic-aperture radar, and correlative pattern recognition. Some examples of the most dramatic image deblurring results are shown.
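Correlative pattern recognition, one of the fields mentioned above, reduces to computing a cross-correlation between an image and a template; an optical correlator performs this in parallel with lenses, but the underlying operation can be sketched digitally (a naive illustration, not an optical implementation):

```python
# Correlative pattern recognition: slide a small template over a 2D image
# and score each offset by the sum of elementwise products; the peak marks
# the best match.

def correlate(image, template):
    ih, iw = len(image), len(image[0])
    th, tw = len(template), len(template[0])
    best, best_pos = None, None
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            score = sum(image[r + i][c + j] * template[i][j]
                        for i in range(th) for j in range(tw))
            if best is None or score > best:
                best, best_pos = score, (r, c)
    return best_pos

image = [[0, 0, 0, 0],
         [0, 1, 1, 0],
         [0, 1, 0, 0],
         [0, 0, 0, 0]]
template = [[1, 1],
            [1, 0]]
assert correlate(image, template) == (1, 1)  # pattern found at row 1, col 1
```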

  16. Computational physics

    Anon.

    1987-01-01

    Computers have for many years played a vital role in the acquisition and treatment of experimental data, but they have more recently taken up a much more extended role in physics research. The numerical and algebraic calculations now performed on modern computers make it possible to explore consequences of basic theories in a way which goes beyond the limits of both analytic insight and experimental investigation. This was brought out clearly at the Conference on Perspectives in Computational Physics, held at the International Centre for Theoretical Physics, Trieste, Italy, from 29-31 October

  17. Phenomenological Computation?

    Brier, Søren

    2014-01-01

    Open peer commentary on the article “Info-computational Constructivism and Cognition” by Gordana Dodig-Crnkovic. Upshot: The main problems with info-computationalism are: (1) its basic concept of natural computing has neither been defined theoretically nor implemented practically; (2) it cannot encompass human concepts of subjective experience and intersubjective meaningful communication, which prevents it from being genuinely transdisciplinary; (3) philosophically, it does not sufficiently accept the deep ontological differences between various paradigms such as von Foerster’s second-order...

  18. Development of an automatic emergency reporting system; Jiko jido tsuho system no kaihatsu

    Kawai, A; Sekine, M; Kodama, R; Matsumura, K [Nissan Motor Co. Ltd., Tokyo (Japan)

    1995-06-30

    This paper proposes an automatic emergency reporting system as an ASV technology for preventing secondary damage. In the event a vehicle is involved in an accident or other emergency situation, this system automatically reports the vehicle's present position along with information on the vehicle and owner to an operations center via radio signals. This makes it possible to dispatch an ambulance or other emergency vehicle more quickly. A prototype simulation system has been built consisting of a custom-designed control unit for in-vehicle use and a personal computer that simulates an operations center. The interface between the control unit and the personal computer is a wireless modem. The navigation system offered in the Cedric was modified for use as the vehicle location sensor and map database of the operations center. In experiments conducted on the system, information was transmitted from the control unit and shown on a digital map display on the personal computer screen in about ten seconds following activation of an emergency signal. 5 figs.
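The report message exchanged between the control unit and the operations center can be sketched in code. Note that the field names and the JSON encoding below are purely illustrative assumptions; the paper does not specify the wire format used by the prototype.

```python
import json
import time

# Hypothetical emergency-report message (field names and JSON encoding are
# assumptions for illustration, not the actual protocol of the prototype).

def build_report(vehicle_id, owner, lat, lon):
    """Encode an emergency report as a JSON string for transmission."""
    return json.dumps({
        "type": "EMERGENCY",
        "vehicle_id": vehicle_id,
        "owner": owner,
        "position": {"lat": lat, "lon": lon},
        "timestamp": int(time.time()),
    })

def parse_report(raw):
    """Decode a received report at the operations center."""
    msg = json.loads(raw)
    assert msg["type"] == "EMERGENCY"
    return msg

raw = build_report("ABC-123", "J. Doe", 35.68, 139.77)
msg = parse_report(raw)
assert msg["position"]["lat"] == 35.68  # the center can plot this position
```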

  19. Essentials of cloud computing

    Chandrasekaran, K

    2014-01-01

    Foreword; Preface; Computing Paradigms; Learning Objectives; Preamble; High-Performance Computing; Parallel Computing; Distributed Computing; Cluster Computing; Grid Computing; Cloud Computing; Biocomputing; Mobile Computing; Quantum Computing; Optical Computing; Nanocomputing; Network Computing; Summary; Review Points; Review Questions; Further Reading; Cloud Computing Fundamentals; Learning Objectives; Preamble; Motivation for Cloud Computing; The Need for Cloud Computing; Defining Cloud Computing; NIST Definition of Cloud Computing; Cloud Computing Is a Service; Cloud Computing Is a Platform; 5-4-3 Principles of Cloud Computing; Five Essential Charact

  20. Personal Computers.

    Toong, Hoo-min D.; Gupta, Amar

    1982-01-01

    Describes the hardware, software, applications, and current proliferation of personal computers (microcomputers). Includes discussions of microprocessors, memory, output (including printers), application programs, the microcomputer industry, and major microcomputer manufacturers (Apple, Radio Shack, Commodore, and IBM). (JN)

  1. Computational Literacy

    Chongtay, Rocio; Robering, Klaus

    2016-01-01

    In recent years, there has been a growing interest in and recognition of the importance of Computational Literacy, a skill generally considered to be necessary for success in the 21st century. While much research has concentrated on requirements, tools, and teaching methodologies for the acquisition of Computational Literacy at basic educational levels, focus on higher levels of education has been much less prominent. The present paper considers the case of courses for higher education programs within the Humanities. A model is proposed which conceives of Computational Literacy as a layered...

  2. Computing Religion

    Nielbo, Kristoffer Laigaard; Braxton, Donald M.; Upal, Afzal

    2012-01-01

    The computational approach has become an invaluable tool in many fields that are directly relevant to research in religious phenomena. Yet the use of computational tools is almost absent in the study of religion. Given that religion is a cluster of interrelated phenomena and that research concerning these phenomena should strive for multilevel analysis, this article argues that the computational approach offers new methodological and theoretical opportunities to the study of religion. We argue that the computational approach offers (1) an intermediary step between any theoretical construct and its targeted empirical space and (2) a new kind of data which allows the researcher to observe abstract constructs, estimate likely outcomes, and optimize empirical designs. Because sophisticated multilevel research is a collaborative project, we also seek to introduce to scholars of religion some...

  3. Computational Controversy

    Timmermans, Benjamin; Kuhn, Tobias; Beelen, Kaspar; Aroyo, Lora

    2017-01-01

    Climate change, vaccination, abortion, Trump: Many topics are surrounded by fierce controversies. The nature of such heated debates and their elements have been studied extensively in the social science literature. More recently, various computational approaches to controversy analysis have

  4. Grid Computing

    IAS Admin

    emergence of supercomputers led to the use of computer simulation as an .... Scientific and engineering applications (e.g., Tera grid secure gateway). Collaborative ... Encryption, privacy, protection from malicious software. Physical Layer.

  5. Computer tomographs

    Niedzwiedzki, M.

    1982-01-01

    Physical foundations and the developments in the transmission and emission computer tomography are presented. On the basis of the available literature and private communications a comparison is made of the various transmission tomographs. A new technique of computer emission tomography ECT, unknown in Poland, is described. The evaluation of two methods of ECT, namely those of positron and single photon emission tomography is made. (author)

  6. Computational sustainability

    Kersting, Kristian; Morik, Katharina

    2016-01-01

    The book at hand gives an overview of the state of the art research in Computational Sustainability as well as case studies of different application scenarios. This covers topics such as renewable energy supply, energy storage and e-mobility, efficiency in data centers and networks, sustainable food and water supply, sustainable health, industrial production and quality, etc. The book describes computational methods and possible application scenarios.

  7. Computing farms

    Yeh, G.P.

    2000-01-01

    High-energy physics, nuclear physics, space sciences, and many other fields have large challenges in computing. In recent years, PCs have achieved performance comparable to the high-end UNIX workstations, at a small fraction of the price. We review the development and broad applications of commodity PCs as the solution to CPU needs, and look forward to the important and exciting future of large-scale PC computing

  8. Computational chemistry

    Arnold, J. O.

    1987-01-01

    With the advent of supercomputers and modern computational chemistry algorithms and codes, a powerful tool was created to help fill NASA's continuing need for information on the properties of matter in hostile or unusual environments. Computational resources provided under the National Aerodynamics Simulator (NAS) program were a cornerstone for recent advancements in this field. Properties of gases, materials, and their interactions can be determined from solutions of the governing equations. In the case of gases, for example, radiative transition probabilities per particle, bond-dissociation energies, and rates of simple chemical reactions can be determined computationally as reliably as from experiment. The data are proving to be quite valuable in providing inputs to real-gas flow simulation codes used to compute aerothermodynamic loads on NASA's aeroassist orbital transfer vehicles and a host of problems related to the National Aerospace Plane Program. Although more approximate, similar solutions can be obtained for ensembles of atoms simulating small particles of materials with and without the presence of gases. Computational chemistry has application in studying catalysis and properties of polymers, all of which are of interest to various NASA missions, including those previously mentioned. In addition to discussing these applications of computational chemistry within NASA, the governing equations and the need for supercomputers for their solution are outlined.

  9. Hierarchical functional model for automobile development; Jidosha kaihatsu no tame no kaisogata kino model

    Sumida, S [U-shin Ltd., Tokyo (Japan); Nagamatsu, M; Maruyama, K [Hokkaido Institute of Technology, Sapporo (Japan); Hiramatsu, S [Mazda Motor Corp., Hiroshima (Japan)

    1997-10-01

    A new approach on modeling is put forward in order to compose the virtual prototype which is indispensable for fully computer integrated concurrent development of automobile product. A basic concept of the hierarchical functional model is proposed as the concrete form of this new modeling technology. This model is used mainly for explaining and simulating functions and efficiencies of both the parts and the total product of automobile. All engineers who engage themselves in design and development of automobile can collaborate with one another using this model. Some application examples are shown, and usefulness of this model is demonstrated. 5 refs., 5 figs.

  10. Computational creativity

    López de Mántaras Badia, Ramon

    2013-12-01

    New technologies, and in particular artificial intelligence, are drastically changing the nature of creative processes. Computers are playing very significant roles in creative activities such as music, architecture, fine arts, and science. Indeed, the computer is already a canvas, a brush, a musical instrument, and so on. However, we believe that we must aim at more ambitious relations between computers and creativity. Rather than just seeing the computer as a tool to help human creators, we could see it as a creative entity in its own right. This view has triggered a new subfield of Artificial Intelligence called Computational Creativity. This article addresses the question of the possibility of achieving computational creativity through some examples of computer programs capable of replicating some aspects of creative behavior in the fields of music and science.

  11. Fiscal 1997 R and D project on industrial science and technology under a consignment from NEDO. R and D of the ultimate manipulation technology of atoms and molecules; 1997 nendo sangyo kagaku gijutsu kenkyu kaihatsu jigyo Shin energy Sangyo gijutsu Sogo Kaihatsu Kiko itaku. Genshi bunshi kyokugen sosa gijutsu no kenkyu kaihatsu seika hokokusho (genshi bunshi kyokugen sosa gijutsu no kenkyu kaihatsu)

    NONE

    1998-03-01

    This paper describes R and D of the ultimate manipulation technology of atoms and molecules (atom technology). The R and D aims at establishing observation/manipulation technology for atoms and molecules as common basic technology in various industrial fields such as new materials, electronics, bio-technology and chemistry. It thus aims at establishing observation/manipulation of solid surfaces and DNA organic molecules, formation of fine structures of atomic surface arrangement, and calculation/simulation for predicting reaction mechanisms on atomic and molecular surfaces. In fiscal 1997, research was made on improvement and development of the computer simulation environment, and on the description of excited electronic states by the Green function. Establishment of a construction method and computation code is under investigation for a pseudopotential dependent on excitation energy. A survey was made of research trends in atom technology by visiting overseas academic societies and institutions. The International Symposium on Atom Technology was also held in Tokyo in November 1997.

  12. Quantum computing

    Steane, Andrew

    1998-01-01

    The subject of quantum computing brings together ideas from classical information theory, computer science, and quantum physics. This review aims to summarize not just quantum computing, but the whole subject of quantum information theory. Information can be identified as the most general thing which must propagate from a cause to an effect. It therefore has a fundamentally important role in the science of physics. However, the mathematical treatment of information, especially information processing, is quite recent, dating from the mid-20th century. This has meant that the full significance of information as a basic concept in physics is only now being discovered. This is especially true in quantum mechanics. The theory of quantum information and computing puts this significance on a firm footing, and has led to some profound and exciting new insights into the natural world. Among these are the use of quantum states to permit the secure transmission of classical information (quantum cryptography), the use of quantum entanglement to permit reliable transmission of quantum states (teleportation), the possibility of preserving quantum coherence in the presence of irreversible noise processes (quantum error correction), and the use of controlled quantum evolution for efficient computation (quantum computation). The common theme of all these insights is the use of quantum entanglement as a computational resource. It turns out that information theory and quantum mechanics fit together very well. In order to explain their relationship, this review begins with an introduction to classical information theory and computer science, including Shannon's theorem, error correcting codes, Turing machines and computational complexity. The principles of quantum mechanics are then outlined, and the Einstein, Podolsky and Rosen (EPR) experiment described. 
The EPR-Bell correlations, and quantum entanglement in general, form the essential new ingredient which distinguishes quantum from
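The review grounds quantum error correction in its classical counterpart. The simplest classical example, standard in information theory though not spelled out in the abstract, is the 3-bit repetition code, which protects each data bit against any single bit flip by majority vote:

```python
# 3-bit repetition code: the simplest classical error-correcting code.
# Each data bit is transmitted three times; majority vote on each triple
# corrects any single bit flip within that triple.

def encode(bits):
    return [b for bit in bits for b in (bit, bit, bit)]

def decode(codeword):
    out = []
    for i in range(0, len(codeword), 3):
        triple = codeword[i:i + 3]
        out.append(1 if sum(triple) >= 2 else 0)
    return out

data = [1, 0, 1]
sent = encode(data)          # [1,1,1, 0,0,0, 1,1,1]
sent[4] ^= 1                 # noise flips one bit in the second triple
assert decode(sent) == data  # majority vote recovers the original data
```

Quantum error correction faces the extra obstacles that qubits cannot be copied and that measurement disturbs them, which is why the quantum codes surveyed in the review are a genuinely new construction rather than a direct translation of this scheme.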

  13. Quantum computing

    Steane, Andrew [Department of Atomic and Laser Physics, University of Oxford, Clarendon Laboratory, Oxford (United Kingdom)

    1998-02-01

    The subject of quantum computing brings together ideas from classical information theory, computer science, and quantum physics. This review aims to summarize not just quantum computing, but the whole subject of quantum information theory. Information can be identified as the most general thing which must propagate from a cause to an effect. It therefore has a fundamentally important role in the science of physics. However, the mathematical treatment of information, especially information processing, is quite recent, dating from the mid-20th century. This has meant that the full significance of information as a basic concept in physics is only now being discovered. This is especially true in quantum mechanics. The theory of quantum information and computing puts this significance on a firm footing, and has led to some profound and exciting new insights into the natural world. Among these are the use of quantum states to permit the secure transmission of classical information (quantum cryptography), the use of quantum entanglement to permit reliable transmission of quantum states (teleportation), the possibility of preserving quantum coherence in the presence of irreversible noise processes (quantum error correction), and the use of controlled quantum evolution for efficient computation (quantum computation). The common theme of all these insights is the use of quantum entanglement as a computational resource. It turns out that information theory and quantum mechanics fit together very well. In order to explain their relationship, this review begins with an introduction to classical information theory and computer science, including Shannon's theorem, error correcting codes, Turing machines and computational complexity. The principles of quantum mechanics are then outlined, and the Einstein, Podolsky and Rosen (EPR) experiment described. 
The EPR-Bell correlations, and quantum entanglement in general, form the essential new ingredient which distinguishes quantum from

  14. Multiparty Computations

    Dziembowski, Stefan

    In this thesis we study the problem of doing Verifiable Secret Sharing (VSS) and Multiparty Computations in a model where private channels between the players and a broadcast channel are available. The adversary is active, adaptive and has unbounded computing power. The thesis is based on two... Up to a polynomial time black-box reduction, the complexity of adaptively secure VSS is the same as that of ordinary secret sharing (SS), where security is only required against a passive, static adversary. Previously, such a connection was only known for linear secret sharing and VSS schemes. We then show... here and discuss other problems caused by the adaptiveness. All protocols in the thesis are formally specified and the proofs of their security are given. [1] Ronald Cramer, Ivan Damgård, Stefan Dziembowski, Martin Hirt, and Tal Rabin. Efficient multiparty computations with dishonest minority...
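The ordinary secret sharing (SS) that the thesis compares against can be illustrated with Shamir's classic (t, n) scheme. The sketch below is plain SS over a prime field, with no verifiability, so it is a building block for, not an implementation of, the VSS studied in the thesis:

```python
import random

# Shamir (t, n) secret sharing over a prime field: the secret is the
# constant term of a random degree-(t-1) polynomial; any t shares
# reconstruct it by Lagrange interpolation at x = 0.

P = 2**61 - 1  # a Mersenne prime, large enough for toy secrets

def share(secret, t, n):
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    def poly(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, poly(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        # pow(den, P - 2, P) is the modular inverse of den (Fermat).
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret

shares = share(123456789, t=3, n=5)
assert reconstruct(shares[:3]) == 123456789   # any 3 shares suffice
assert reconstruct(shares[1:4]) == 123456789
```

Fewer than t shares reveal nothing about the secret; VSS additionally lets the players verify that a possibly dishonest dealer distributed consistent shares.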

  15. Scientific computing

    Trangenstein, John A

    2017-01-01

    This is the third of three volumes providing a comprehensive presentation of the fundamentals of scientific computing. This volume discusses topics that depend more on calculus than linear algebra, in order to prepare the reader for solving differential equations. This book and its companions show how to determine the quality of computational results, and how to measure the relative efficiency of competing methods. Readers learn how to determine the maximum attainable accuracy of algorithms, and how to select the best method for computing problems. This book also discusses programming in several languages, including C++, Fortran and MATLAB. There are 90 examples, 200 exercises, 36 algorithms, 40 interactive JavaScript programs, 91 references to software programs and 1 case study. Topics are introduced with goals, literature references and links to public software. There are descriptions of the current algorithms in GSLIB and MATLAB. This book could be used for a second course in numerical methods, for either ...

  16. Computational Psychiatry

    Wang, Xiao-Jing; Krystal, John H.

    2014-01-01

    Psychiatric disorders such as autism and schizophrenia arise from abnormalities in brain systems that underlie cognitive, emotional and social functions. The brain is enormously complex and its abundant feedback loops on multiple scales preclude intuitive explication of circuit functions. In close interplay with experiments, theory and computational modeling are essential for understanding how, precisely, neural circuits generate flexible behaviors and how their impairments give rise to psychiatric symptoms. This Perspective highlights recent progress in applying computational neuroscience to the study of mental disorders. We outline basic approaches, including identification of core deficits that cut across disease categories, biologically realistic modeling bridging cellular and synaptic mechanisms with behavior, and model-aided diagnosis. The need for new research strategies in psychiatry is urgent. Computational psychiatry potentially provides powerful tools for elucidating pathophysiology that may inform both diagnosis and treatment. Achieving this promise will require investment in cross-disciplinary training and research in this nascent field. PMID:25442941

  17. Computational artifacts

    Schmidt, Kjeld; Bansler, Jørgen P.

    2016-01-01

    The key concern of CSCW research is that of understanding computing technologies in the social context of their use, that is, as integral features of our practices and our lives, and to think of their design and implementation under that perspective. However, the question of the nature of that which is actually integrated in our practices is often discussed in confusing ways, if at all. The article aims to clarify the issue and in doing so revisits and reconsiders the notion of ‘computational artifact’.

  18. Computer security

    Gollmann, Dieter

    2011-01-01

    A completely up-to-date resource on computer security Assuming no previous experience in the field of computer security, this must-have book walks you through the many essential aspects of this vast topic, from the newest advances in software and technology to the most recent information on Web applications security. This new edition includes sections on Windows NT, CORBA, and Java and discusses cross-site scripting and JavaScript hacking as well as SQL injection. Serving as a helpful introduction, this self-study guide is a wonderful starting point for examining the variety of competing sec

  19. Cloud Computing

    Antonopoulos, Nick

    2010-01-01

    Cloud computing has recently emerged as a subject of substantial industrial and academic interest, though its meaning and scope is hotly debated. For some researchers, clouds are a natural evolution towards the full commercialisation of grid systems, while others dismiss the term as a mere re-branding of existing pay-per-use technologies. From either perspective, 'cloud' is now the label of choice for accountable pay-per-use access to third party applications and computational resources on a massive scale. Clouds support patterns of less predictable resource use for applications and services a

  20. Computational Logistics

    Pacino, Dario; Voss, Stefan; Jensen, Rune Møller

    2013-01-01

    This book constitutes the refereed proceedings of the 4th International Conference on Computational Logistics, ICCL 2013, held in Copenhagen, Denmark, in September 2013. The 19 papers presented in this volume were carefully reviewed and selected for inclusion in the book. They are organized in topical sections named: maritime shipping, road transport, vehicle routing problems, aviation applications, and logistics and supply chain management.

  2. Computational engineering

    2014-01-01

    The book presents state-of-the-art works in computational engineering. Focus is on mathematical modeling, numerical simulation, experimental validation and visualization in engineering sciences. In particular, the following topics are presented: constitutive models and their implementation into finite element codes, numerical models in nonlinear elasto-dynamics including seismic excitations, multiphase models in structural engineering and multiscale models of materials systems, sensitivity and reliability analysis of engineering structures, the application of scientific computing in urban water management and hydraulic engineering, and the application of genetic algorithms for the registration of laser scanner point clouds.

  3. Computer busses

    Buchanan, William

    2000-01-01

    As more and more equipment is interface- or 'bus'-driven, either through the use of controllers or directly from PCs, the question of which bus to use is becoming increasingly important, both in industry and in the office. 'Computer Busses' has been designed to help readers choose the best type of bus for a particular application. There are several books that cover individual busses, but none that provide a complete guide to computer busses. The author provides a basic theory of busses and draws examples and applications from real bus case studies. Busses are analysed using a top-down approach, helpin

  4. Reconfigurable Computing

    Cardoso, Joao MP

    2011-01-01

    As the complexity of modern embedded systems increases, it becomes less practical to design monolithic processing platforms. As a result, reconfigurable computing is being adopted widely for more flexible design. Reconfigurable Computers offer the spatial parallelism and fine-grained customizability of application-specific circuits with the postfabrication programmability of software. To make the most of this unique combination of performance and flexibility, designers need to be aware of both hardware and software issues. FPGA users must think not only about the gates needed to perform a comp

  5. Development of a new automotive navigation system; Shingata navigation system no kaihatsu

    Sone, M; Nakano, H; Nakayama, O; Tanemura, E; Yoshitsugu, N; Watanabe, M [Nissan Motor Co. Ltd., Tokyo (Japan)

    1996-01-31

    An automotive navigation system is outlined. Its features are as follows: a map display called 'Bird View', extending up to the horizon, was commercialized; the accuracy of determining the vehicle's present position was improved using a new algorithm; and automatic route selection was adopted. The human-machine interface of the system was also completely reviewed. 'Bird View' is realized by reading plane map data from CD-ROM and converting them onto the coordinates of a virtual screen placed in front of the viewpoint. Automatic route selection, which depends mostly on self-contained (dead-reckoning) navigation, verifies the computed position by comparison with GPS. To estimate the vehicle's heading, an optical-fiber gyroscope, a geomagnetic sensor, and a Kalman filter exploiting the advantages of GPS were employed, improving accuracy. For automatic distance correction, a function correcting the pulse-to-distance conversion coefficient was employed, realizing maintenance-free operation. 5 figs.
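    The 'Bird View' display described above amounts to a perspective projection of flat map coordinates onto a virtual screen seen from an elevated, forward-pitched viewpoint. A minimal sketch of that idea follows; the viewpoint height, pitch angle, and focal length are illustrative assumptions, not the system's actual parameters:

    ```python
    import math

    def bird_view(x, y, height=300.0, pitch_deg=20.0, focal=1.0):
        """Project a flat-map point (x lateral, y forward, in metres) onto a
        virtual screen viewed from `height` above the ground, pitched down by
        `pitch_deg` degrees (all parameter values are made-up illustrations)."""
        th = math.radians(pitch_deg)
        # Vector from the elevated viewpoint to the ground point is (x, y, -height);
        # depth is its component along the tilted viewing axis.
        depth = y * math.cos(th) + height * math.sin(th)
        u = focal * x / depth                                           # horizontal screen coord
        v = focal * (y * math.sin(th) - height * math.cos(th)) / depth  # vertical screen coord
        return u, v

    # Points farther ahead shrink laterally and rise toward the horizon:
    print(bird_view(100.0, 500.0))
    print(bird_view(100.0, 5000.0))
    ```

    As forward distance grows, the vertical coordinate approaches focal * tan(pitch), which is why the map appears to extend up to a horizon line.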

  6. Riemannian computing in computer vision

    Srivastava, Anuj

    2016-01-01

    This book presents a comprehensive treatise on Riemannian geometric computations and related statistical inferences in several computer vision problems. This edited volume includes chapter contributions from leading figures in the field of computer vision who are applying Riemannian geometric approaches in problems such as face recognition, activity recognition, object detection, biomedical image analysis, and structure-from-motion. Some of the mathematical entities that necessitate a geometric analysis include rotation matrices (e.g. in modeling camera motion), stick figures (e.g. for activity recognition), subspace comparisons (e.g. in face recognition), symmetric positive-definite matrices (e.g. in diffusion tensor imaging), and function-spaces (e.g. in studying shapes of closed contours). The book illustrates Riemannian computing theory on applications in computer vision, machine learning, and robotics, with an emphasis on algorithmic advances that will allow re-application in other...

  7. Statistical Computing

    Sudhakar Kunte. Elements of statistical computing are discussed in this series, covering inference and finite population sampling. ... which captain gets an option to decide whether to field first or bat first ... may of course not be fair, in the sense that the team which wins ... describe two methods of drawing a random number between 0.

  8. Computational biology

    Hartmann, Lars Røeboe; Jones, Neil; Simonsen, Jakob Grue

    2011-01-01

    Computation via biological devices has been the subject of close scrutiny since von Neumann’s early work some 60 years ago. In spite of the many relevant works in this field, the notion of programming biological devices seems to be, at best, ill-defined. While many devices are claimed or proved t...

  9. Computing News

    McCubbin, N

    2001-01-01

    We are still five years from the first LHC data, so we have plenty of time to get the computing into shape, don't we? Well, yes and no: there is time, but there's an awful lot to do! The recently-completed CERN Review of LHC Computing gives the flavour of the LHC computing challenge. The hardware scale for each of the LHC experiments is millions of 'SpecInt95' (SI95) units of cpu power and tens of PetaBytes of data storage. PCs today are about 20-30SI95, and expected to be about 100 SI95 by 2005, so it's a lot of PCs. This hardware will be distributed across several 'Regional Centres' of various sizes, connected by high-speed networks. How to realise this in an orderly and timely fashion is now being discussed in earnest by CERN, Funding Agencies, and the LHC experiments. Mixed in with this is, of course, the GRID concept...but that's a topic for another day! Of course hardware, networks and the GRID constitute just one part of the computing. Most of the ATLAS effort is spent on software development. What we ...

  10. Quantum Computation

    Quantum Computation - Particle and Wave Aspects of Algorithms. Apoorva Patel. General Article, Resonance – Journal of Science Education, Volume 16, Issue 9, September 2011, pp. 821-835.

  11. Cloud computing.

    Wink, Diane M

    2012-01-01

    In this bimonthly series, the author examines how nurse educators can use Internet and Web-based technologies such as search, communication, and collaborative writing tools; social networking and social bookmarking sites; virtual worlds; and Web-based teaching and learning programs. This article describes how cloud computing can be used in nursing education.

  12. Computer Recreations.

    Dewdney, A. K.

    1988-01-01

    Describes the creation of the computer program "BOUNCE," designed to simulate a weighted piston coming into equilibrium with a cloud of bouncing balls. The model follows the ideal gas law. Utilizes the critical event technique to create the model. Discusses another program, "BOOM," which simulates a chain reaction. (CW)
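    The critical-event technique mentioned above advances a simulation directly from one collision to the next rather than ticking a fixed clock. A minimal 1-D sketch in that spirit follows (one elastic ball bouncing between the floor and a heavy free piston; the masses and starting values are made-up illustrations, not BOUNCE's actual setup):

    ```python
    import math

    def total_energy(m, M, g, xb, vb, xp, vp):
        """Kinetic plus gravitational potential energy of ball and piston."""
        return 0.5 * m * vb * vb + 0.5 * M * vp * vp + g * (m * xb + M * xp)

    def simulate(events=2000, m=1.0, M=10.0, g=9.8):
        """Jump from critical event to critical event; between events both
        bodies fly ballistically, so no fixed time step is needed."""
        xb, vb = 1.0, 25.0   # ball height and upward velocity (illustrative)
        xp, vp = 10.0, 0.0   # piston height and velocity (illustrative)
        e0 = total_energy(m, M, g, xb, vb, xp, vp)
        mean_height = 0.0
        for _ in range(events):
            # Time until the ball hits the floor: xb + vb*t - g*t^2/2 = 0.
            t_floor = (vb + math.sqrt(vb * vb + 2.0 * g * xb)) / g
            # Ball and piston share acceleration -g, so the gap closes linearly.
            vrel = vb - vp
            t_hit = (xp - xb) / vrel if vrel > 1e-12 else math.inf
            t = min(t_floor, t_hit)
            # Advance both bodies ballistically to the event time.
            xb += vb * t - 0.5 * g * t * t
            vb -= g * t
            xp += vp * t - 0.5 * g * t * t
            vp -= g * t
            if t_floor <= t_hit:
                xb, vb = 0.0, -vb   # elastic bounce off the floor
            else:
                # Elastic 1-D collision between masses m and M.
                vb, vp = (((m - M) * vb + 2.0 * M * vp) / (m + M),
                          ((M - m) * vp + 2.0 * m * vb) / (m + M))
            mean_height += xp / events
        e1 = total_energy(m, M, g, xb, vb, xp, vp)
        return mean_height, e0, e1

    mean_height, e0, e1 = simulate()
    print(mean_height)           # piston settles around a mean height
    print(abs(e1 - e0) / e0)     # energy drift stays at floating-point level
    ```

    Because every flight and impact is exactly energy-conserving, total mechanical energy is a convenient sanity check on event-driven code of this kind.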

  13. [Grid computing

    Wolinsky, H

    2003-01-01

    "Turn on a water spigot, and it's like tapping a bottomless barrel of water. Ditto for electricity: Flip the switch, and the supply is endless. But computing is another matter. Even with the Internet revolution enabling us to connect in new ways, we are still limited to self-contained systems running locally stored software, limited by corporate, institutional and geographic boundaries" (1 page).

  14. Computational Finance

    Rasmussen, Lykke

    One of the major challenges in today's post-crisis finance environment is calculating the sensitivities of complex products for hedging and risk management. Historically, these derivatives have been determined using bump-and-revalue, but due to the increasing magnitude of these computations does...

  15. Optical Computing

    Optical computing technology is, in general, developing in two directions. One approach is ... current support in many places, with private companies as well as governments in several countries encouraging such research work. For example, much ... which enables more information to be carried and data to be processed.

  16. Development of measurement apparatus for high resolution electrical surveys; Komitsudo denki tansa sokuteiki no kaihatsu

    Moriuchi, H; Matsuda, Y; Shiokawa, Y [Sumiko Consultants Co. Ltd., Tokyo (Japan); Uchino, Y [Cosmic Co. Ltd., Tokyo (Japan)

    1996-05-01

    For the enforcement of the ρa-ρu survey method, which is a type of high-density electrical survey, a multichannel resistivity measuring instrument has been developed. In addition to the above, this instrument conducts resistivity tomography and various other kinds of high-density electrical surveys. A potential produced by a low-frequency rectangular current of 1 Hz or lower, output by the transmitter of this instrument, is received and measured by the receiver connected to electrodes positioned at 100 or fewer locations. The receiver comprises a scanner that automatically switches from electrode to electrode, a conditioner that processes signals, and a controller. A transmitter of the standard design outputs a maximum voltage of 800 V and a maximum current of 2 A, making the device suitable for probing levels 50 to several hundred metres deep. The receiver is operated by a personal computer built into the controller. The newly developed apparatus succeeded in presenting high-precision images of the result of a ρa-ρu analysis for an apparent resistivity section and of the underground structure, verifying the high quality of the data collected by this apparatus. 10 refs., 5 figs., 1 tab.

  17. Development of pressurized internally circulating fluidized bed combustion technology; Kaatsu naibu junkan ryudosho boiler no kaihatsu

    Ishihara, I [Center for Coal Utilization, Japan, Tokyo (Japan); Nagato, S; Toyoda, S [Ebara Corp., Tokyo (Japan)

    1996-09-01

    The paper introduces supporting research on the element technologies needed for the design of a hot model of the pressurized internally circulating fluidized-bed combustion boiler in fiscal 1995, and the specifications of the 4-MWt hot-model test facilities finalized after the basic plan. The supporting research was conducted as follows: (a) In the cold-model fluidization analysis test, it was confirmed that each characteristic value of the hot model exceeds the target value. Further, calculation parameters required for computer simulation were measured, and data on the design of the air diffusion nozzle for a one-chamber wind box were sampled. (b) In the CWP conveyance characteristic survey, it was confirmed that CWP with favorable properties can be produced, and that favorable conveyability is maintained even when the piping size is reduced to 25A. (c) In the gas pressure-reducing test, basic data required for the design of gas pressure-reducing equipment were sampled. The specifications of the hot-model fluidized-bed combustion boiler are as follows: evaporation amount: 3070 kg/h; steam pressure: 1.77 MPa; fuel supply: 600 kg-coal/h; boiler body: cylindrical water-tube internally circulating fluidized-bed combustion boiler. 4 refs., 4 figs.

  18. PV glass curtain walls; Kenzai ittaigata taiyo denchi gaiheki no kaihatsu (glass curtain wall eno tekiyo)

    Yoshida, T.; Iwai, T.; Ouchi, T.; Ito, T.; Nagai, T.; Shu, I. [Kajima Corp., Tokyo (Japan); Arai, T. [Showa Shell Sekiyu K.K., Tokyo (Japan); Ishikawa, N.; Tazawa, K.

    1997-12-20

    Reported in this article are PV (photovoltaic) modules now under development for integration into building walls. First, the power generating capability of PV modules and the appropriate use of the generated power are studied, and the performance that such outer wall materials must provide (resistance to fire or incombustibility, strength and durability, appearance and design, and dimensional standardization) is determined. Next, module development, installation techniques, computer graphics-aided facade design, and proof tests using real-size modules are studied before installability, the power to be generated, and designs are finalized. In the development of modules, design evaluation involves combining various kinds of glass, solar cells, back sheets, and fillers, and the importance of preventing insulation degradation around the modules is confirmed. As for the methods of installation, the gasket method and the aluminum sash method, among others, are tested. In the study of facade design, it is found that various expressions are possible by properly choosing gasket colors, module types, and the kinds of glass covering the openings. 1 ref., 6 figs., 3 tabs.

  19. Computable Frames in Computable Banach Spaces

    S.K. Kaushik

    2016-06-01

    We develop some parts of frame theory in Banach spaces from the point of view of computable analysis. We define a computable M-basis and use it to construct a computable Banach space of scalar-valued sequences. Computable Xd-frames and computable Banach frames are also defined, and computable versions of sufficient conditions for their existence are obtained.
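    For context, the classical notion being effectivized here is the Banach frame (going back to Gröchenig); a standard textbook formulation, stated as background rather than taken from the abstract, is:

    ```latex
    Let $X$ be a Banach space and $X_d$ an associated Banach sequence space.
    A sequence $\{g_i\} \subset X^*$ together with a bounded operator
    $S : X_d \to X$ is a \emph{Banach frame} for $X$ with respect to $X_d$ if
    \begin{enumerate}
      \item $\{g_i(x)\} \in X_d$ for every $x \in X$;
      \item there exist constants $0 < A \le B$ such that
            $A\,\|x\|_X \le \|\{g_i(x)\}\|_{X_d} \le B\,\|x\|_X$ for every $x \in X$;
      \item $S(\{g_i(x)\}) = x$ for every $x \in X$ \quad (reconstruction).
    \end{enumerate}
    ```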

  20. Algebraic computing

    MacCallum, M.A.H.

    1990-01-01

    The implementation of a new computer algebra system is time consuming: designers of general purpose algebra systems usually say it takes about 50 man-years to create a mature and fully functional system. Hence the range of available systems and their capabilities changes little between one general relativity meeting and the next, despite which there have been significant changes in the period since the last report. The introductory remarks aim to give a brief survey of capabilities of the principal available systems and highlight one or two trends. The reference to the most recent full survey of computer algebra in relativity and brief descriptions of the Maple, REDUCE and SHEEP and other applications are given. (author)

  1. Computational Controversy

    Timmermans, Benjamin; Kuhn, Tobias; Beelen, Kaspar; Aroyo, Lora

    2017-01-01

    Climate change, vaccination, abortion, Trump: Many topics are surrounded by fierce controversies. The nature of such heated debates and their elements have been studied extensively in the social science literature. More recently, various computational approaches to controversy analysis have appeared, using new data sources such as Wikipedia, which help us now better understand these phenomena. However, compared to what social sciences have discovered about such debates, the existing computati...

  2. Computed tomography

    Andre, M.; Resnick, D.

    1988-01-01

    Computed tomography (CT) has matured into a reliable and prominent tool for study of the musculoskeletal system. When it was introduced in 1973, it was unique in many ways and posed a challenge to interpretation. It is in these unique features, however, that its advantages lie in comparison with conventional techniques. These advantages will be described in a spectrum of important applications in orthopedics and rheumatology.

  3. Computed radiography

    Pupchek, G.

    2004-01-01

    Computed radiography (CR) is an image acquisition process that is used to create digital, 2-dimensional radiographs. CR employs a photostimulable phosphor-based imaging plate, replacing the standard x-ray film and intensifying screen combination. Conventional radiographic exposure equipment is used with no modification required to the existing system. CR can transform an analog x-ray department into a digital one and eliminates the need for chemicals, water, darkrooms and film processor headaches. (author)

  4. Computational universes

    Svozil, Karl

    2005-01-01

    Suspicions that the world might be some sort of a machine or algorithm existing 'in the mind' of some symbolic number cruncher have lingered from antiquity. Although popular at times, the most radical forms of this idea never reached mainstream. Modern developments in physics and computer science have lent support to the thesis, but empirical evidence is needed before it can begin to replace our contemporary world view

  5. Customizable computing

    Chen, Yu-Ting; Gill, Michael; Reinman, Glenn; Xiao, Bingjun

    2015-01-01

    Since the end of Dennard scaling in the early 2000s, improving the energy efficiency of computation has been the main concern of the research community and industry. The large energy efficiency gap between general-purpose processors and application-specific integrated circuits (ASICs) motivates the exploration of customizable architectures, where one can adapt the architecture to the workload. In this Synthesis lecture, we present an overview and introduction of the recent developments on energy-efficient customizable architectures, including customizable cores and accelerators, on-chip memory

  6. Computed tomography

    Wells, P.; Davis, J.; Morgan, M.

    1994-01-01

    X-ray or gamma-ray transmission computed tomography (CT) is a powerful non-destructive evaluation (NDE) technique that produces two-dimensional cross-sectional images of an object without the need to physically section it. CT is also known by the acronym CAT, for computerised axial tomography. This review article presents a brief historical perspective on CT, its current status and the underlying physics. The mathematical fundamentals of computed tomography are developed for the simplest transmission CT modality. A description of CT scanner instrumentation is provided with an emphasis on radiation sources and systems. Examples of CT images are shown indicating the range of materials that can be scanned and the spatial and contrast resolutions that may be achieved. Attention is also given to the occurrence, interpretation and minimisation of various image artefacts that may arise. A final brief section is devoted to the principles and potential of a range of more recently developed tomographic modalities including diffraction CT, positron emission CT and seismic tomography. 57 refs., 2 tabs., 14 figs
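    The mathematical fundamentals mentioned above start, for the simplest transmission modality, from the Beer-Lambert attenuation law; taking the logarithm of the measured intensity turns each ray measurement into a line integral of the attenuation coefficient, i.e. the Radon transform (standard relations, not specific to this review):

    ```latex
    I \;=\; I_0 \exp\!\Big(-\int_L \mu(x,y)\,\mathrm{d}s\Big)
    \quad\Longrightarrow\quad
    p(L) \;=\; -\ln\frac{I}{I_0} \;=\; \int_L \mu(x,y)\,\mathrm{d}s ,
    \qquad
    p(\theta,t) \;=\; \iint \mu(x,y)\,\delta(x\cos\theta + y\sin\theta - t)\,\mathrm{d}x\,\mathrm{d}y .
    ```

    Image reconstruction then amounts to inverting this transform from projections $p(\theta, t)$ measured at many angles.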

  7. Computing Services and Assured Computing

    2006-05-01

    fighters’ ability to execute the mission.” Computing Services. We run IT systems that: provide medical care; pay the warfighters; manage maintenance; ... users; 1,400 applications; 18 facilities; 180 software vendors; 18,000+ copies of executive software products; virtually every type of mainframe and...

  8. Computational neuroscience

    Blackwell, Kim L

    2014-01-01

    Progress in Molecular Biology and Translational Science provides a forum for discussion of new discoveries, approaches, and ideas in molecular biology. It contains contributions from leaders in their fields and abundant references. This volume brings together different aspects of, and approaches to, molecular and multi-scale modeling, with applications to a diverse range of neurological diseases. Mathematical and computational modeling offers a powerful approach for examining the interaction between molecular pathways and ionic channels in producing neuron electrical activity. It is well accepted that non-linear interactions among diverse ionic channels can produce unexpected neuron behavior and hinder a deep understanding of how ion channel mutations bring about abnormal behavior and disease. Interactions with the diverse signaling pathways activated by G protein coupled receptors or calcium influx adds an additional level of complexity. Modeling is an approach to integrate myriad data sources into a cohesiv...

  9. Social Computing

    CERN. Geneva

    2011-01-01

    The past decade has witnessed a momentous transformation in the way people interact with each other. Content is now co-produced, shared, classified, and rated by millions of people, while attention has become the ephemeral and valuable resource that everyone seeks to acquire. This talk will describe how social attention determines the production and consumption of content within both the scientific community and social media, how its dynamics can be used to predict the future and the role that social media plays in setting the public agenda. About the speaker Bernardo Huberman is a Senior HP Fellow and Director of the Social Computing Lab at Hewlett Packard Laboratories. He received his Ph.D. in Physics from the University of Pennsylvania, and is currently a Consulting Professor in the Department of Applied Physics at Stanford University. He originally worked in condensed matter physics, ranging from superionic conductors to two-dimensional superfluids, and made contributions to the theory of critical p...

  10. computer networks

    N. U. Ahmed

    2002-01-01

    In this paper, we construct a new dynamic model for the token bucket (TB) algorithm used in computer networks and use a systems approach for its analysis. This model is then augmented by adding a dynamic model for a multiplexor at an access node where the TB exercises a policing function. In the model, traffic policing, multiplexing, and network utilization are formally defined. Based on the model, we study such issues as quality of service (QoS), traffic sizing, and network dimensioning. We also propose an algorithm using feedback control to improve QoS and network utilization. Applying MPEG video traces as the input traffic to the model, we verify the usefulness and effectiveness of our model.
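    The token bucket policer modeled in the paper can be sketched in a few lines. This is the generic textbook algorithm, not the paper's dynamic model; the class name and parameter values are illustrative:

    ```python
    class TokenBucket:
        """Minimal token-bucket policer: tokens accrue at `rate` per second up
        to `capacity`; a packet of `size` tokens conforms only when enough
        tokens are available."""

        def __init__(self, rate, capacity):
            self.rate = rate          # token fill rate (tokens/second)
            self.capacity = capacity  # bucket depth, i.e. maximum burst size
            self.tokens = capacity    # bucket starts full
            self.last = 0.0           # timestamp of the previous update

        def allow(self, size, now):
            # Refill in proportion to elapsed time, capped at the bucket depth.
            self.tokens = min(self.capacity,
                              self.tokens + (now - self.last) * self.rate)
            self.last = now
            if self.tokens >= size:
                self.tokens -= size
                return True           # conforming: packet passes
            return False              # non-conforming: packet dropped or marked

    bucket = TokenBucket(rate=10, capacity=20)
    print(bucket.allow(15, now=0.0))  # True: the initial burst fits
    print(bucket.allow(15, now=0.0))  # False: only 5 tokens remain
    print(bucket.allow(15, now=1.0))  # True: 10 tokens refilled after 1 s
    ```

    The `capacity` bounds how bursty conforming traffic can be, while `rate` bounds its long-run average, which is exactly the policing function the multiplexor model exercises.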

  11. Computer Tree

    Onur AĞAOĞLU

    2014-12-01

    It is crucial that gifted and talented students be supported by educational methods matched to their interests and skills. The science and arts centres (gifted centres) provide the Supportive Education Program for these students with an interdisciplinary perspective. In line with the program, an ICT lesson entitled “Computer Tree” serves to identify learner readiness levels and define the basic conceptual framework. A language teacher also contributes to the process, since the lesson caters for the creative function of basic linguistic skills. The teaching technique is applied at the 9-11 age level. The lesson introduces an evaluation process covering the basic information, skills, and interests of the target group, and it includes an observation process by way of peer assessment. The lesson is considered a good sample of planning for any subject, given the unpredicted convergence of visual and technical abilities with linguistic abilities.

  12. Computed tomography

    Boyd, D.P.

    1989-01-01

    This paper reports on computed tomographic (CT) scanning, which has improved computer-assisted imaging modalities for radiologic diagnosis. The advantage of this modality is its ability to image thin cross-sectional planes of the body, thus uncovering density information in three dimensions without tissue superposition problems. Because this enables vastly superior imaging of soft tissues in the brain and body, CT scanning was immediately successful and continues to grow in importance as improvements are made in speed, resolution, and cost efficiency. CT scanners are used for general purposes, and the more advanced machines are generally preferred in large hospitals, where volume and variety of usage justify the cost. For imaging in the abdomen, a scanner with a rapid speed is preferred because peristalsis, involuntary motion of the diaphragm, and even cardiac motion are present and can significantly degrade image quality. When contrast media is used in imaging to demonstrate ... The first console controls the scanner, immediate review of images, and multiformat hardcopy production. A second console is reserved for the radiologist to read images and perform the several types of image analysis that are available. Since CT images contain quantitative information in terms of density values and contours of organs, quantitation of volumes, areas, and masses is possible. This is accomplished with region-of-interest methods, which involve electronically outlining the selected region on the television display monitor with a trackball-controlled cursor. In addition, various image-processing options, such as edge enhancement (for viewing fine details of edges) or smoothing filters (for enhancing the detectability of low-contrast lesions), are useful tools.
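    The region-of-interest quantitation described above reduces to a masked aggregation over the image: count the outlined pixels for area and average their CT numbers for density. A minimal sketch, with a hypothetical function name, pixel size, and toy data:

    ```python
    def roi_stats(image, mask, pixel_area_mm2=0.25):
        """Given a 2-D grid of CT numbers (HU) and a same-shaped 0/1 mask
        outlining a region of interest, report the region's area and mean
        density (illustrative sketch; pixel size is an assumed value)."""
        values = [v for img_row, mask_row in zip(image, mask)
                    for v, inside in zip(img_row, mask_row) if inside]
        area = len(values) * pixel_area_mm2   # pixel count times pixel size
        mean_hu = sum(values) / len(values)   # average CT number inside the ROI
        return area, mean_hu

    # Toy 3x3 "image" with a 2x2 region outlined in the mask:
    image = [[ 0,  50, 100],
             [10,  60, 110],
             [20,  70, 120]]
    mask  = [[0, 1, 1],
             [0, 1, 1],
             [0, 0, 0]]
    print(roi_stats(image, mask))  # → (1.0, 80.0)
    ```

    Volume and mass estimates follow the same pattern, summing over the ROI masks of consecutive slices.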

  13. Cloud Computing: The Future of Computing

    Aggarwal, Kanika

    2013-01-01

    Cloud computing has recently emerged as a new paradigm for hosting and delivering services over the Internet. Cloud computing is attractive to business owners as it eliminates the requirement for users to plan ahead for provisioning, and allows enterprises to start small and increase resources only when there is a rise in service demand. The basic principle of cloud computing is that computation is assigned to a great number of distributed computers, rather than the local computer ...

  14. Computer Refurbishment

    Ichiyen, Norman; Chan, Dominic; Thompson, Paul

    2004-01-01

    The major activity for the 18-month refurbishment outage at the Point Lepreau Generating Station is the replacement of all 380 fuel channel and calandria tube assemblies and the lower portion of the connecting feeder pipes. New Brunswick Power would also take advantage of this outage to conduct a number of repairs, replacements, inspections and upgrades (such as rewinding or replacing the generator, replacement of shutdown system trip computers, replacement of certain valves and expansion joints, inspection of systems not normally accessible, etc.). This would allow for an additional 25 to 30 years of operation. Among the systems to be replaced are the PDCs for both shutdown systems. Assessments have been completed for both the SDS1 and SDS2 PDCs, and it has been decided to replace the SDS2 PDCs with the same hardware and software approach that has been used successfully for the Wolsong 2, 3, and 4 and the Qinshan 1 and 2 SDS2 PDCs. For SDS1, it has been decided to use the same software development methodology that was used successfully for Wolsong and Qinshan, called the I A, and to use a new hardware platform in order to ensure successful operation over the 25-30 year station operating life. The selected supplier is Triconex, which uses a triple modular redundant architecture that will enhance the robustness/fault tolerance of the design with respect to equipment failures.

  15. Computed Tomography (CT) -- Sinuses

    Computed tomography (CT) of the sinuses ... What is CT (Computed Tomography) of the Sinuses? Computed tomography, more commonly known ...

  16. Illustrated computer tomography

    Takahashi, S.

    1983-01-01

    This book provides the following information: basic aspects of computed tomography; atlas of computed tomography of the normal adult; clinical application of computed tomography; and radiotherapy planning and computed tomography

  17. Analog and hybrid computing

    Hyndman, D E

    2013-01-01

    Analog and Hybrid Computing focuses on the operations of analog and hybrid computers. The book first outlines the history of computing devices that influenced the creation of analog and digital computers. The types of problems to be solved on computers, computing systems, and digital computers are discussed. The text looks at the theory and operation of electronic analog computers, including linear and non-linear computing units and use of analog computers as operational amplifiers. The monograph examines the preparation of problems to be deciphered on computers. Flow diagrams, methods of ampl

  18. Cloud Computing Fundamentals

    Furht, Borko

    In the introductory chapter we define the concept of cloud computing and cloud services, and we introduce layers and types of cloud computing. We discuss the differences between cloud computing and cloud services. New technologies that enabled cloud computing are presented next. We also discuss cloud computing features, standards, and security issues. We introduce the key cloud computing platforms, their vendors, and their offerings. We discuss cloud computing challenges and the future of cloud computing.

  19. Unconventional Quantum Computing Devices

    Lloyd, Seth

    2000-01-01

    This paper investigates a variety of unconventional quantum computation devices, including fermionic quantum computers and computers that exploit nonlinear quantum mechanics. It is shown that unconventional quantum computing devices can in principle compute some quantities more rapidly than `conventional' quantum computers.

  20. Computing handbook computer science and software engineering

    Gonzalez, Teofilo; Tucker, Allen

    2014-01-01

    Overview of Computer Science: Structure and Organization of Computing (Peter J. Denning); Computational Thinking (Valerie Barr); Algorithms and Complexity: Data Structures (Mark Weiss); Basic Techniques for Design and Analysis of Algorithms (Edward Reingold); Graph and Network Algorithms (Samir Khuller and Balaji Raghavachari); Computational Geometry (Marc van Kreveld); Complexity Theory (Eric Allender, Michael Loui, and Kenneth Regan); Formal Models and Computability (Tao Jiang, Ming Li, and Bala

  1. Specialized computer architectures for computational aerodynamics

    Stevenson, D. K.

    1978-01-01

    In recent years, computational fluid dynamics has made significant progress in modelling aerodynamic phenomena. Currently, one of the major barriers to future development lies in the compute-intensive nature of the numerical formulations and the relative high cost of performing these computations on commercially available general purpose computers, a cost high with respect to dollar expenditure and/or elapsed time. Today's computing technology will support a program designed to create specialized computing facilities to be dedicated to the important problems of computational aerodynamics. One of the still unresolved questions is the organization of the computing components in such a facility. The characteristics of fluid dynamic problems which will have significant impact on the choice of computer architecture for a specialized facility are reviewed.

  2. Fiscal 1998 R and D report on human feeling measurement application technology. Pt. 1. Outline; 1998 nendo ningen kankaku keisoku oyo gijutsu no kenkyu kaihatsu itaku kenkyu seika hokokusho. 1. Gaiyohen

    NONE

    1999-03-01

    This report outlines the fiscal 1998 R and D results on human feeling measurement application technology. To develop technology for assessing the impact of work fatigue on human feeling (a human feeling index) and technology for assessing the adaptability and affinity between human beings and environments or products (an environment and product adaptability index), data were accumulated and evaluated through human feeling measurement experiments, and each index was modified toward its final form on the basis of these data. Further case studies were carried out to reflect both indices in the design of living products and of residential and office environments, and new data were also collected. A database model for the effective use of previously collected human feeling data and a sweating manikin for reasonably estimating human thermal sensation were developed. In addition, a human feeling measurement manual was prepared to diffuse these technologies. The R and D system is also described. (NEDO)

  3. Applied Parallel Computing Industrial Computation and Optimization

    Madsen, Kaj; Olesen, Dorte

    Proceedings of the Third International Workshop on Applied Parallel Computing in Industrial Problems and Optimization (PARA96)

  4. Further computer appreciation

    Fry, T F

    2014-01-01

    Further Computer Appreciation is a comprehensive cover of the principles and aspects in computer appreciation. The book starts by describing the development of computers from the first to the third computer generations, to the development of processors and storage systems, up to the present position of computers and future trends. The text tackles the basic elements, concepts and functions of digital computers, computer arithmetic, input media and devices, and computer output. The basic central processor functions, data storage and the organization of data by classification of computer files,

  5. BONFIRE: benchmarking computers and computer networks

    Bouckaert, Stefan; Vanhie-Van Gerwen, Jono; Moerman, Ingrid; Phillips, Stephen; Wilander, Jerker

    2011-01-01

    The benchmarking concept is not new in the field of computing or computer networking. With “benchmarking tools”, one usually refers to a program or set of programs, used to evaluate the performance of a solution under certain reference conditions, relative to the performance of another solution. Since the 1970s, benchmarking techniques have been used to measure the performance of computers and computer networks. Benchmarking of applications and virtual machines in an Infrastructure-as-a-Servi...

  6. Democratizing Computer Science

    Margolis, Jane; Goode, Joanna; Ryoo, Jean J.

    2015-01-01

    Computer science programs are too often identified with a narrow stratum of the student population, often white or Asian boys who have access to computers at home. But because computers play such a huge role in our world today, all students can benefit from the study of computer science and the opportunity to build skills related to computing. The…

  7. Computing at Stanford.

    Feigenbaum, Edward A.; Nielsen, Norman R.

    1969-01-01

    This article provides a current status report on the computing and computer science activities at Stanford University, focusing on the Computer Science Department, the Stanford Computation Center, the recently established regional computing network, and the Institute for Mathematical Studies in the Social Sciences. Also considered are such topics…

  8. Soft computing in computer and information science

    Fray, Imed; Pejaś, Jerzy

    2015-01-01

    This book presents a carefully selected and reviewed collection of papers presented during the 19th Advanced Computer Systems conference ACS-2014. From its beginning, the Advanced Computer Systems conference concentrated on methods and algorithms of artificial intelligence. Subsequent years brought new areas of interest in technical informatics related to soft computing, as well as more technological aspects of computer science such as multimedia and computer graphics, software engineering, web systems, information security and safety, and project management. These topics are represented in the present book under the categories Artificial Intelligence, Design of Information and Multimedia Systems, Information Technology Security and Software Technologies.

  9. Computational Intelligence, Cyber Security and Computational Models

    Anitha, R; Lekshmi, R; Kumar, M; Bonato, Anthony; Graña, Manuel

    2014-01-01

    This book contains cutting-edge research material presented by researchers, engineers, developers, and practitioners from academia and industry at the International Conference on Computational Intelligence, Cyber Security and Computational Models (ICC3) organized by PSG College of Technology, Coimbatore, India during December 19–21, 2013. The materials in the book include theory and applications for design, analysis, and modeling of computational intelligence and security. The book will be useful material for students, researchers, professionals, and academicians. It will help in understanding current research trends and findings and future scope of research in computational intelligence, cyber security, and computational models.

  10. Computed Tomography (CT) -- Head

    ... When the image slices are reassembled by computer software, the result is a very detailed multidimensional view ...

  11. Computers: Instruments of Change.

    Barkume, Megan

    1993-01-01

    Discusses the impact of computers in the home, the school, and the workplace. Looks at changes in computer use by occupations and by industry. Provides information on new job titles in computer occupations. (JOW)

  12. DNA computing models

    Ignatova, Zoya; Zimmermann, Karl-Heinz

    2008-01-01

    In this excellent text, the reader is given a comprehensive introduction to the field of DNA computing. The book emphasizes computational methods to tackle central problems of DNA computing, such as controlling living cells, building patterns, and generating nanomachines.

  13. Distributed multiscale computing

    Borgdorff, J.

    2014-01-01

    Multiscale models combine knowledge, data, and hypotheses from different scales. Simulating a multiscale model often requires extensive computation. This thesis evaluates distributing these computations, an approach termed distributed multiscale computing (DMC). First, the process of multiscale

  14. Computational Modeling | Bioenergy | NREL

    NREL uses computational modeling to investigate the properties of plant cell walls, which are the source of biofuels and biomaterials. Quantum Mechanical Models: NREL studies chemical and electronic properties and processes to reduce barriers ...

  15. Computer Viruses: An Overview.

    Marmion, Dan

    1990-01-01

    Discusses the early history and current proliferation of computer viruses that occur on Macintosh and DOS personal computers, mentions virus detection programs, and offers suggestions for how libraries can protect themselves and their users from damage by computer viruses. (LRW)

  16. Computer Virus and Trends

    Tutut Handayani; Soenarto Usna, Drs. MMSI

    2004-01-01

    Since it first appeared in the mid-1980s, the computer virus has invited various controversies that last to this day. Along with the development of computer systems technology, computer viruses find new ways to spread through a variety of existing communications media. This paper discusses several topics related to computer viruses, namely: the definition and history of computer viruses; the basics of computer viruses; the current state of computer viruses; and ...

  17. FY1995 development of the landscape design studio; 1995 nendo jiritsu bunsan kyochogata 'keikan studio' no kaihatsu

    NONE

    1997-04-01

    We established a distributed design studio called 'The Landscape Studio' for collaboratively designing and analyzing landscapes in an Internet and multimedia computing environment. We first proposed a concept called 'Open Designing' for design work in an information environment, composed of three kinds of openness: open data, open processes, and open discussion. Based on this concept, a landscape studio was established as a total design system for the forthcoming network and multimedia age. In the studio, a large volume of maps, images and other data are stored in a form accessible through the Internet. Using the data of several study areas, including Omotesandou street and Block 10 of Azabu, computer simulations, design games, VRML, CAVE and many other design support tools have been developed in the studio. Furthermore, the research team joined the activities of the Angkor Wat Safeguarding Project supported by UNESCO and the Japanese government. The locations and shapes of ruins over a wide area were measured and simulated in three dimensions. The research of the Landscape Studio was reported at several exhibitions, such as 'The 2nd Exhibition on Computer Aided Architectural Design' and 'The 11th Exhibition on Architecture, City and Computer'. The multimedia systems and experiments in the studio lead the field of multimedia urban and landscape design, and the research activities have greatly contributed to education and industry in urban design. (NEDO)

  18. Plasticity: modeling & computation

    Borja, Ronaldo Israel

    2013-01-01

    "Plasticity Modeling & Computation" is a textbook written specifically for students who want to learn the theoretical, mathematical, and computational aspects of inelastic deformation in solids...

  19. Cloud Computing Quality

    Anamaria Şiclovan

    2013-02-01

    Cloud computing is, and will continue to be, a new way of providing Internet services and computing. This approach builds on many existing services, such as the Internet, grid computing, and Web services. As a system, cloud computing aims to provide on-demand services that are more acceptable in price and infrastructure. It is precisely the transition from the computer to a service offered to consumers as a product delivered online. This paper describes the quality of cloud computing services, analyzing the advantages and characteristics they offer. It is a theoretical paper. Keywords: Cloud computing, QoS, quality of cloud computing

  20. Computer hardware fault administration

    Archer, Charles J.; Megerian, Mark G.; Ratterman, Joseph D.; Smith, Brian E.

    2010-09-14

    Computer hardware fault administration carried out in a parallel computer, where the parallel computer includes a plurality of compute nodes. The compute nodes are coupled for data communications by at least two independent data communications networks, where each data communications network includes data communications links connected to the compute nodes. Typical embodiments carry out hardware fault administration by identifying a location of a defective link in the first data communications network of the parallel computer and routing communications data around the defective link through the second data communications network of the parallel computer.
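
    As an illustrative aside (a hypothetical sketch, not the patented implementation), the idea of routing around a defective link through the second network can be expressed as a search over the union of both networks, with the defective primary links removed:

    ```python
    # Hypothetical sketch (not the patented implementation): route from
    # src to dst over the union of two independent networks, skipping
    # links of the primary network that are marked defective.
    from collections import deque

    def route(src, dst, primary, secondary, defective):
        """Breadth-first search over both networks combined."""
        usable = {}
        for n, nbrs in primary.items():
            usable[n] = {m for m in nbrs if (n, m) not in defective}
        for n, nbrs in secondary.items():
            usable.setdefault(n, set()).update(nbrs)
        seen, queue = {src: None}, deque([src])
        while queue:
            n = queue.popleft()
            if n == dst:  # reconstruct the path from the parent links
                path = []
                while n is not None:
                    path.append(n)
                    n = seen[n]
                return path[::-1]
            for m in usable.get(n, ()):
                if m not in seen:
                    seen[m] = n
                    queue.append(m)
        return None  # dst unreachable even via the second network

    primary = {0: {1}, 1: {0, 2}, 2: {1}}
    secondary = {0: {2}, 2: {0}}
    # Link (1, 2) is defective, so traffic detours over the second network.
    print(route(0, 2, primary, secondary, defective={(1, 2)}))  # [0, 2]
    ```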

  1. Computer jargon explained

    Enticknap, Nicholas

    2014-01-01

    Computer Jargon Explained is a feature in Computer Weekly publications that discusses 68 of the most commonly used technical computing terms. The book explains what the terms mean and why the terms are important to computer professionals. The text also discusses how the terms relate to the trends and developments that are driving the information technology industry. Computer jargon irritates non-computer people and in turn causes problems for computer people. The technology and the industry are changing so rapidly that it is very hard even for professionals to keep updated. Computer people do not

  2. Computers and data processing

    Deitel, Harvey M

    1985-01-01

    Computers and Data Processing provides information pertinent to the advances in the computer field. This book covers a variety of topics, including the computer hardware, computer programs or software, and computer applications systems.Organized into five parts encompassing 19 chapters, this book begins with an overview of some of the fundamental computing concepts. This text then explores the evolution of modern computing systems from the earliest mechanical calculating devices to microchips. Other chapters consider how computers present their results and explain the storage and retrieval of

  3. Computers in nuclear medicine

    Giannone, Carlos A.

    1999-01-01

    This chapter covers the capture and observation of images on computers; the hardware and software used; and personal computers, networks and workstations. The use of special filters determines image quality.

  4. Advances in unconventional computing

    2017-01-01

    Unconventional computing is a niche for interdisciplinary science, a cross-breed of computer science, physics, mathematics, chemistry, electronic engineering, biology, material science and nanotechnology. The aims of this book are to uncover and exploit principles and mechanisms of information processing in, and functional properties of, physical, chemical and living systems, in order to develop efficient algorithms, design optimal architectures and manufacture working prototypes of future and emergent computing devices. This first volume presents theoretical foundations of the future and emergent computing paradigms and architectures. The topics covered are computability, (non-)universality and complexity of computation; physics of computation, analog and quantum computing; reversible and asynchronous devices; cellular automata and other mathematical machines; P-systems and cellular computing; infinity and spatial computation; chemical and reservoir computing. The book is the encyclopedia, the first ever complete autho...

  5. FY 1996 Report on the industrial science and technology research and development project. R and D of brain type computer architecture; 1996 nendo nogata computer architecture no kenkyu kaihatsu seika hokokusho

    NONE

    1997-03-01

    It is an object of this project to develop an information processing device based on a completely new architecture, in order to technologically realize human-oriented information processing mechanisms, e.g., memory, learning, association of ideas, perception, intuition and value judgement. Described herein are the FY 1996 results. For development of an LSI based on a neural network in the primary visual cortex, it is confirmed that the basic circuit structure comprising the position-signal generators, memories, signal selectors and adders is suitable for development of the LSI circuit for a neural network function (Hough transform). For development of realtime parallel distributed processor (RPDP), the basic specifications are established for, e.g., local memory capacity of RPDP, functions incorporated in RPDP and number of RPDPs incorporated in the RPDP chip, operating frequency and clock supply method, and estimated power consumption and package, in order to realize the RPDP chip. For development and advanced evaluation of large-scale neural network silicon chip, the chip developed by the advanced research project is incorporated with learning rules, cell models and failure-detection circuits, to design the evaluation substrate incorporated with the above chip. The evaluation methods and implementation procedures are drawn. (NEDO)

  6. Fiscal 2000 achievement report. Welfare technosystem research and development (Kamogawa); 2000 nendo welfare technosystem kenkyu kaihatsu (Kamogawa) seika hokokusho

    NONE

    2001-03-01

    For the purpose of assisting in-home medical care, a system enabling in-home acquisition, compression, accumulation, distribution, and playback of auscultatory sounds (cardiac sounds, respiratory sounds) was added to the telemedicine assisting system technologies that convey remotely located doctors' decisions and instructions for in-home medical care. In fiscal 2000, to improve the already-developed system, operability was improved, uptake of still images (digital camera pictures) was realized, mainly for diseases in dermatology where such images are in strong demand, and a transmitter function was augmented. The system enables the PHS (personal handyphone system) aided exchange of medical data between portable terminals on patrol and (doctors') personal computers through the personal computer server in the hospital. Doctors are able to administer remote telemedicine now that biological information on remotely located patients, such as still medical images and auscultatory sounds necessary for diagnosis, is available in the hospital. (NEDO)

  7. Development of measuring system for automobile unit assembly equipment; Jidosha buhin seisan setsubi no keisoku system kaihatsu

    Miura, M; Uchishiba, I; Fukunishi, T; Umemura, H [Toyota Motor Corp., Tokyo (Japan)

    1997-10-01

    This measuring system has two characteristics. The first is to shorten the lead time from system design to production. The second is to accomplish system construction simply, without specialized computer knowledge. The measuring system consists of hardware modules, standardized according to function, and software packages composed of various functions; the system is put together according to purpose. When it was introduced into a production line, it shortened lead time to one-third. 10 figs., 1 tab.

  8. Development of simulation technology on full auto air conditioning system; Auto eakon no simulation gijutsu no kaihatsu

    Fujita, N; Otsubo, Y; Matsumura, K; Sako, H [Mazda Motor Corp., Hiroshima (Japan)

    1997-10-01

    Mazda has developed simulation technology for the control of full auto air conditioning systems. Based on this technology, we have developed a development tool, aiming at higher controllability of the full auto air conditioning system and a shorter development period. The tool performs control simulation, on-vehicle evaluation of actual load operation, and data collection and analysis on a personal computer. This paper reports our verification results on the effectiveness of the technology and the tool. 4 refs., 9 figs.

  9. Introduction of research and development in Image Information Science Laboratory; Image joho kagaku kenkyusho ni okeru kenkyu kaihatsu no shokai

    NONE

    1999-10-10

    This paper introduces research and development at the Image Information Science Laboratory. This is a joint industry-university research institution for the purpose of making a computer recognize human non-language information, expressing and transmitting it, with the research conducted at two centers, Kanto and Kansai. The following studies are being made at the Kansai research center: man/machine interface making natural communication possible between a man and a machine, with emphasis placed on visual information; sensing technology for measuring human activity, technology for analyzing/forming human sensitivity, and technology of expression; technology by which a work is done by a computer in place of a man and reproduced on the computer, with the skill transferred to a man; and development of a spatial expression media system such as a three-dimensional display device. The Tokyo research center is participating in the following projects: committee for promoting joint industry-university research and development of virtual reality (VR); joint industry-university research, development and implementation project of advanced VR; survey on physiological psychological effect in VR system and the like; and research and development of human media. (NEDO)

  10. Computability and unsolvability

    Davis, Martin

    1985-01-01

    ""A clearly written, well-presented survey of an intriguing subject."" - Scientific American. Classic text considers general theory of computability, computable functions, operations on computable functions, Turing machines self-applied, unsolvable decision problems, applications of general theory, mathematical logic, Kleene hierarchy, computable functionals, classification of unsolvable decision problems and more.

  11. Mathematics for computer graphics

    Vince, John

    2006-01-01

    Helps you understand the mathematical ideas used in computer animation, virtual reality, CAD, and other areas of computer graphics. This work also helps you to rediscover the mathematical techniques required to solve problems and design computer programs for computer graphic applications

  12. Computations and interaction

    Baeten, J.C.M.; Luttik, S.P.; Tilburg, van P.J.A.; Natarajan, R.; Ojo, A.

    2011-01-01

    We enhance the notion of a computation of the classical theory of computing with the notion of interaction. In this way, we enhance a Turing machine as a model of computation to a Reactive Turing Machine that is an abstract model of a computer as it is used nowadays, always interacting with the user

  13. Symbiotic Cognitive Computing

    Farrell, Robert G.; Lenchner, Jonathan; Kephart, Jeffrey O.; Webb, Alan M.; Muller, Michael J.; Erickson, Thomas D.; Melville, David O.; Bellamy, Rachel K.E.; Gruen, Daniel M.; Connell, Jonathan H.; Soroker, Danny; Aaron, Andy; Trewin, Shari M.; Ashoori, Maryam; Ellis, Jason B.

    2016-01-01

    IBM Research is engaged in a research program in symbiotic cognitive computing to investigate how to embed cognitive computing in physical spaces. This article proposes 5 key principles of symbiotic cognitive computing.  We describe how these principles are applied in a particular symbiotic cognitive computing environment and in an illustrative application.  

  14. Computer scientist looks at reliability computations

    Rosenthal, A.

    1975-01-01

    Results from the theory of computational complexity are applied to reliability computations on fault trees and networks. A well-known class of problems which almost certainly have no fast solution algorithms is presented. It is shown that even approximately computing the reliability of many systems is difficult enough to fall in this class. In the face of this result, which indicates that for general systems the computation time will be exponential in the size of the system, decomposition techniques which can greatly reduce the effective size of a wide variety of realistic systems are explored.
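
    As an illustrative aside (a toy sketch of the decomposition idea, not code from the record; the function names are hypothetical), the reliability of a series-parallel system can be computed recursively from its parts, avoiding enumeration of all 2^n component states:

    ```python
    # Toy sketch of reliability decomposition (hypothetical example):
    # a series-parallel system's reliability is computed recursively
    # from subsystem reliabilities instead of enumerating every state.

    def series(rels):
        """All subsystems must work: product of reliabilities."""
        r = 1.0
        for p in rels:
            r *= p
        return r

    def parallel(rels):
        """At least one subsystem must work: complement of all failing."""
        q = 1.0
        for p in rels:
            q *= 1.0 - p
        return 1.0 - q

    # Two parallel pairs connected in series.
    r = series([parallel([0.9, 0.8]), parallel([0.95, 0.7])])
    print(round(r, 4))  # 0.9653
    ```

    The exponential blow-up the paper describes only returns when the system graph is not reducible to such series-parallel pieces.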

  15. Roadmap to greener computing

    Nguemaleu, Raoul-Abelin Choumin

    2014-01-01

    A concise and accessible introduction to green computing and green IT, this book addresses how computer science and the computer infrastructure affect the environment and presents the main challenges in making computing more environmentally friendly. The authors review the methodologies, designs, frameworks, and software development tools that can be used in computer science to reduce energy consumption and still compute efficiently. They also focus on Computer Aided Design (CAD) and describe what design engineers and CAD software applications can do to support new streamlined business directi

  16. Brief: Managing computing technology

    Startzman, R.A.

    1994-01-01

    While computing is applied widely in the production segment of the petroleum industry, its effective application is the primary goal of computing management. Computing technology has changed significantly since the 1950's, when computers first began to influence petroleum technology. The ability to accomplish traditional tasks faster and more economically probably is the most important effect that computing has had on the industry. While speed and lower cost are important, are they enough? Can computing change the basic functions of the industry? When new computing technology is introduced improperly, it can clash with traditional petroleum technology. This paper examines the role of management in merging these technologies

  17. Computer mathematics for programmers

    Abney, Darrell H; Sibrel, Donald W

    1985-01-01

    Computer Mathematics for Programmers presents the Mathematics that is essential to the computer programmer.The book is comprised of 10 chapters. The first chapter introduces several computer number systems. Chapter 2 shows how to perform arithmetic operations using the number systems introduced in Chapter 1. The third chapter covers the way numbers are stored in computers, how the computer performs arithmetic on real numbers and integers, and how round-off errors are generated in computer programs. Chapter 4 details the use of algorithms and flowcharting as problem-solving tools for computer p
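
    As a brief aside illustrating the round-off behaviour the book describes (a generic sketch, not an example taken from the book): decimal fractions like 0.1 have no exact binary representation, so floating-point arithmetic accumulates small errors, while integer arithmetic stays exact.

    ```python
    # Generic round-off sketch (not from the book): 0.1 and 0.2 have no
    # exact binary representation, so their float sum is not exactly 0.3.
    a = 0.1 + 0.2
    print(a == 0.3)              # False: both sides carry round-off error
    print(abs(a - 0.3) < 1e-9)   # True: compare floats with a tolerance

    # Integer arithmetic, by contrast, is exact within machine range.
    print(sum([1] * 1000) == 1000)  # True
    ```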

  18. Parallel computing works

    1991-10-23

    An account of the Caltech Concurrent Computation Program (C{sup 3}P), a five-year project that focused on answering the question: Can parallel computers be used to do large-scale scientific computations? As the title indicates, the question is answered in the affirmative, by implementing numerous scientific applications on real parallel computers and doing computations that produced new scientific results. In the process of doing so, C{sup 3}P helped design and build several new computers, designed and implemented basic system software, developed algorithms for frequently used mathematical computations on massively parallel machines, devised performance models and measured the performance of many computers, and created a high performance computing facility based exclusively on parallel computers. While the initial focus of C{sup 3}P was the hypercube architecture developed by C. Seitz, many of the methods developed and lessons learned have been applied successfully on other massively parallel architectures.

  19. The digital computer

    Parton, K C

    2014-01-01

    The Digital Computer focuses on the principles, methodologies, and applications of the digital computer. The publication takes a look at the basic concepts involved in using a digital computer, simple autocode examples, and examples of working advanced design programs. Discussions focus on transformer design synthesis program, machine design analysis program, solution of standard quadratic equations, harmonic analysis, elementary wage calculation, and scientific calculations. The manuscript then examines commercial and automatic programming, how computers work, and the components of a computer

  20. Cloud computing for radiologists

    Amit T Kharat; Amjad Safvi; S S Thind; Amarjit Singh

    2012-01-01

    Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources such as computer software and hardware on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as...

  1. Toward Cloud Computing Evolution

    Susanto, Heru; Almunawar, Mohammad Nabil; Kang, Chen Chin

    2012-01-01

    Information Technology (IT) shaped the success of organizations, giving them a solid foundation that increases both their level of efficiency as well as productivity. The computing industry is witnessing a paradigm shift in the way computing is performed worldwide. There is a growing awareness among consumers and enterprises to access their IT resources extensively through a "utility" model known as "cloud computing." Cloud computing was initially rooted in distributed grid-based computing. ...

  2. Algorithmically specialized parallel computers

    Snyder, Lawrence; Gannon, Dennis B

    1985-01-01

    Algorithmically Specialized Parallel Computers focuses on the concept and characteristics of an algorithmically specialized computer.This book discusses the algorithmically specialized computers, algorithmic specialization using VLSI, and innovative architectures. The architectures and algorithms for digital signal, speech, and image processing and specialized architectures for numerical computations are also elaborated. Other topics include the model for analyzing generalized inter-processor, pipelined architecture for search tree maintenance, and specialized computer organization for raster

  3. Survey and research for the enhancement of large-scale technology development 1. Japan's large-scale technology development and the effects; Ogata gijutsu kaihatsu suishin no tame no chosa kenkyu. 1. Nippon no daikibo gijutsu kaihatsu to sono koka

    NONE

    1981-03-01

    A survey is conducted into the effects of projects implemented under the large-scale industrial technology research and development system. In the development of 'ultraperformance computers,' each of the technologies is being widely utilized, and the data service system of Nippon Telegraph and Telephone Public Corporation and the large computer (HITAC8800) owe much for their success to the fruits of the development endeavor. In the development of the 'desulfurization technology,' the fruits are in use by Tokyo Electric Power Co., Inc., and Chubu Electric Power Co., Inc., incorporated into their desulfurization systems. Although there is no practical plant based on the 'great-depth remotely controlled submarine oil drilling rig,' yet oceanic technologies and control methods are being utilized in various fields. The 'seawater desalination and by-product utilization' technologies have enabled the establishment of technologies of the top level in the world thanks to the resultant manufacture of concrete evaporator and related technologies. Eleven plants have been completed utilizing the fruits of the development. In the field of 'electric vehicle,' there is no commercialization in progress due to problems in cost effectiveness though remarkable improvement has been achieved in terms of performance. Technologies about weight reduction, semiconductor devices, battery parts and components, etc., are being utilized in many fields. (NEDO)

  4. Synthetic Computation: Chaos Computing, Logical Stochastic Resonance, and Adaptive Computing

    Kia, Behnam; Murali, K.; Jahed Motlagh, Mohammad-Reza; Sinha, Sudeshna; Ditto, William L.

    Nonlinearity and chaos can illustrate numerous behaviors and patterns, and one can select different patterns from this rich library of patterns. In this paper we focus on synthetic computing, a field that engineers and synthesizes nonlinear systems to obtain computation. We explain the importance of nonlinearity, and describe how nonlinear systems can be engineered to perform computation. More specifically, we provide an overview of chaos computing, a field that manually programs chaotic systems to build different types of digital functions. Also we briefly describe logical stochastic resonance (LSR), and then extend the approach of LSR to realize combinational digital logic systems via suitable concatenation of existing logical stochastic resonance blocks. Finally we demonstrate how a chaotic system can be engineered and mated with different machine learning techniques, such as artificial neural networks, random searching, and genetic algorithm, to design different autonomous systems that can adapt and respond to environmental conditions.
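
    As an illustrative aside (the parameters, thresholds, and function names below are hypothetical, not taken from the paper), the chaos-computing idea of programming one chaotic element into different logic gates can be sketched with a logistic map: the two logic inputs are encoded as offsets to the map's state, the map is iterated once, and the output is read by thresholding.

    ```python
    # Hypothetical sketch of chaos computing: a single chaotic element
    # (the fully chaotic logistic map) is "programmed" into different
    # digital gates purely by changing its control parameter x0.

    def logistic(x, r=4.0):
        """One iteration of the fully chaotic logistic map."""
        return r * x * (1.0 - x)

    def chaotic_gate(a, b, x0, delta=0.25, threshold=0.8):
        """Encode logic inputs as state offsets, iterate once, threshold."""
        x = x0 + delta * (a + b)
        return 1 if logistic(x) > threshold else 0

    inputs = [(0, 0), (0, 1), (1, 0), (1, 1)]
    AND = [chaotic_gate(a, b, x0=0.0) for a, b in inputs]
    NOR = [chaotic_gate(a, b, x0=0.5) for a, b in inputs]
    print(AND)  # [0, 0, 0, 1]
    print(NOR)  # [1, 0, 0, 0]
    ```

    Changing only the bias x0 morphs the same chaotic element from an AND gate into a NOR gate, which is the sense in which chaotic systems can be manually programmed to build different digital functions.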

  5. Advanced Dynamic Anthropomorphic Manikin (ADAM) Final Design Report

    1990-03-01

    During an ejection sequence, the human body is subjected to numerous dynamic loadings from the catapult and sustaining rocket, as well as from wind blast.

  6. Adapting the ADAM Manikin Technology for Injury Probability Assessment

    1992-02-19

    Personnel Caused by Ejection from Navy Aircraft. In: Medical-Legal Aspects of Aviation. Neuilly-sur-Seine, France: Advisory Group on Aerospace... Gierke, H.E., Kaleps, I. Biodynamic Motion and Injury Prediction. Separata Revista Portuguesa Medicina Militar, 33(1), 1-3 (1985). 926. Vulcan, A.P

  7. Simulation of Human Respiration with Breathing Thermal Manikin

    Bjørn, Erik

    The human respiration contains carbon dioxide, bioeffluents, and perhaps virus or bacteria. People may also indulge in activities that produce contaminants, as for example tobacco smoking. For these reasons, the human respiration remains one of the main contributors to contamination of the indoor...

  8. Construction requirements for full-term newborn simulation manikin

    Thielen, M.W.H.; Bovendeerd, P.H.M.; Neto Fonseca, L.T.; van der Hout-van der Jagt, M.B.

    2015-01-01

    Introduction: In the Netherlands, approximately 4500 newborns are admitted each year to the Neonatal Intensive Care Unit (NICU). In order to determine and practice optimal treatment for these fragile patients, clinicians increasingly use educative simulation. However, a high-fidelity simulation of

  9. Future Computer Requirements for Computational Aerodynamics

    1978-01-01

    Recent advances in computational aerodynamics are discussed as well as motivations for and potential benefits of a National Aerodynamic Simulation Facility having the capability to solve fluid dynamic equations at speeds two to three orders of magnitude faster than presently possible with general computers. Two contracted efforts to define processor architectures for such a facility are summarized.

  10. Computers and Computation. Readings from Scientific American.

    Fenichel, Robert R.; Weizenbaum, Joseph

    A collection of articles from "Scientific American" magazine has been put together at this time because the current period in computer science is one of consolidation rather than innovation. A few years ago, computer science was moving so swiftly that even the professional journals were more archival than informative; but today it is…

  11. Know Your Personal Computer Introduction to Computers

    Know Your Personal Computer: Introduction to Computers. Siddhartha Kumar Ghoshal. Series Article. Resonance – Journal of Science Education, Volume 1, Issue 1, January 1996, pp. 48-55.

  12. Heterotic computing: exploiting hybrid computational devices.

    Kendon, Viv; Sebald, Angelika; Stepney, Susan

    2015-07-28

    Current computational theory deals almost exclusively with single models: classical, neural, analogue, quantum, etc. In practice, researchers use ad hoc combinations, realizing only recently that they can be fundamentally more powerful than the individual parts. A Theo Murphy meeting brought together theorists and practitioners of various types of computing, to engage in combining the individual strengths to produce powerful new heterotic devices. 'Heterotic computing' is defined as a combination of two or more computational systems such that they provide an advantage over either substrate used separately. This post-meeting collection of articles provides a wide-ranging survey of the state of the art in diverse computational paradigms, together with reflections on their future combination into powerful and practical applications. © 2015 The Author(s) Published by the Royal Society. All rights reserved.

  13. Development of regeneration technique for diesel particulate filter made of porous metal; Kinzoku takotai DPF no saisei gijutsu no kaihatsu

    Yoro, K; Ban, S; Ooka, T; Saito, H; Oji, M; Nakajima, S; Okamoto, S [Sumitomo Electric Industries, Ltd., Osaka (Japan)

    1997-10-01

    We have developed a diesel particulate filter (DPF) that uses porous metal for the filter element, because of its high thermal conductivity, and a radiation heater for the regeneration device, because of its uniform thermal distribution. Where high trapping efficiency is required, the filter must be thick, but a thicker filter is harder to regenerate because of the thermal gradient across its thickness. To improve regeneration efficiency, we used computer simulation to design a filter-heater construction that achieves uniform thermal distribution, and we confirmed good regeneration efficiency experimentally. 4 refs., 14 figs., 1 tab.

  14. Cloud Computing for radiologists.

    Kharat, Amit T; Safvi, Amjad; Thind, Ss; Singh, Amarjit

    2012-07-01

    Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources such as computer software, hardware, on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as applications, client, infrastructure, storage, services, and processing power, Cloud computing can help imaging units rapidly scale and descale operations and avoid huge spending on maintenance of costly applications and storage. Cloud computing allows flexibility in imaging. It sets free radiology from the confines of a hospital and creates a virtual mobile office. The downsides to Cloud computing involve security and privacy issues which need to be addressed to ensure the success of Cloud computing in the future.

  15. Cloud Computing for radiologists

    Kharat, Amit T; Safvi, Amjad; Thind, SS; Singh, Amarjit

    2012-01-01

    Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources such as computer software, hardware, on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as applications, client, infrastructure, storage, services, and processing power, Cloud computing can help imaging units rapidly scale and descale operations and avoid huge spending on maintenance of costly applications and storage. Cloud computing allows flexibility in imaging. It sets free radiology from the confines of a hospital and creates a virtual mobile office. The downsides to Cloud computing involve security and privacy issues which need to be addressed to ensure the success of Cloud computing in the future.

  16. Cloud computing for radiologists

    Amit T Kharat

    2012-01-01

    Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources such as computer software, hardware, on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as applications, client, infrastructure, storage, services, and processing power, Cloud computing can help imaging units rapidly scale and descale operations and avoid huge spending on maintenance of costly applications and storage. Cloud computing allows flexibility in imaging. It sets free radiology from the confines of a hospital and creates a virtual mobile office. The downsides to Cloud computing involve security and privacy issues which need to be addressed to ensure the success of Cloud computing in the future.

  17. Review of quantum computation

    Lloyd, S.

    1992-01-01

    Digital computers are machines that can be programmed to perform logical and arithmetical operations. Contemporary digital computers are "universal," in the sense that a program that runs on one computer can, if properly compiled, run on any other computer that has access to enough memory space and time. Any one universal computer can simulate the operation of any other; and the set of tasks that any such machine can perform is common to all universal machines. Since Bennett's discovery that computation can be carried out in a non-dissipative fashion, a number of Hamiltonian quantum-mechanical systems have been proposed whose time-evolutions over discrete intervals are equivalent to those of specific universal computers. The first quantum-mechanical treatment of computers was given by Benioff, who exhibited a Hamiltonian system with a basis whose members corresponded to the logical states of a Turing machine. In order to make the Hamiltonian local, in the sense that its structure depended only on the part of the computation being performed at that time, Benioff found it necessary to make the Hamiltonian time-dependent. Feynman discovered a way to make the computational Hamiltonian both local and time-independent by incorporating the direction of computation in the initial condition. In Feynman's quantum computer, the program is a carefully prepared wave packet that propagates through different computational states. Deutsch presented a quantum computer that exploits the possibility of existing in a superposition of computational states to perform tasks that a classical computer cannot, such as generating purely random numbers, and carrying out superpositions of computations as a method of parallel processing. In this paper, we show that such computers, by virtue of their common function, possess a common form for their quantum dynamics.

  18. Fiscal 1999 achievement report on interuniversity coordination type research and development of industrial technology. Research and development of highly functional material designing platform; 1999 nendo kokino zairyo sekkei platform no kenkyu kaihatsu seika hokokusho. Energy shiyo gorika gijutsu kaihatsu

    NONE

    2000-03-01

    For the purpose of enabling prediction of the structures and properties of polymeric materials by computer-assisted experiments, studies were conducted to put new simulation technologies to practical use. The coarse-grained molecular dynamics working group developed a general and extensible coarse-grained molecular dynamics engine, so that an optimum coarse-grained model can be chosen and constructed for any given purpose; its installation was almost completed. The dynamic mean-field method working group fabricated a new general-purpose simulator with a simplified interface and improved functions, and developed a simulation technique to be incorporated into it. The dispersed-structure simulation working group developed a group of class libraries for use in continuum simulation. The platform working group, making use of Java for its generality, built a function-verification platform that runs on Windows 95/98/NT. (NEDO)

  19. Development of offroad unmanned dump truck navigation system. Dump truck mujin soko system no kaihatsu ni tsuite

    Horii, Z [Nittetsu Mining Co. Ltd., Tokyo (Japan)

    1992-08-25

    A large offroad unmanned dump truck navigation system has been developed and is in practical operation on dump trucks at the Torigatayama Limestone Quarry of Nittetsu Mining Company. The system offers a manual navigation mode, a wireless navigation mode, and an unmanned control mode. The unmanned control mode further includes a mode that navigates the truck along a predetermined course whose data have been input into a computer, and a mode in which, after the truck is driven over a course under wireless control, the computer learns the course and drives the truck autonomously thereafter. The safety measures are divided into hardware safety functions, which detect abnormalities in brakes and other vehicle parts, and software safety functions covering data communications, sensor checks, and prevention of collisions between trucks. The system has achieved an average one-way travel distance of 345 m and an average unmanned navigation cycle time of 9 minutes 26 seconds, for a transport efficiency of 541 t/hour per truck, reaching at least the level of manned operation. 4 figs., 1 tab.

  20. Computers for imagemaking

    Clark, D

    1981-01-01

    Computers for Image-Making tells the computer non-expert everything they need to know about computer animation. In the hands of expert computer engineers, computer picture-drawing systems have, since the earliest days of computing, produced interesting and useful images. As a result of major technological developments since then, drawing pictures no longer requires an expert's skill; anyone can do it, provided they know how to use the appropriate machinery. This collection of specially commissioned articles reflects the diversity of user applications in this expanding field.

  1. Computer Lexis and Terminology

    Gintautas Grigas

    2011-04-01

    The computer has become a widely used tool in everyday work and at home. Every computer user sees texts on the screen containing many words that name new concepts. Those words come from the terminology used by specialists, and a common vocabulary shared by computer terminology and everyday language comes into existence. The article deals with the part of computer terminology that passes into everyday usage and with the influence of ordinary language on computer terminology. The relation between English and Lithuanian computer terminology and the construction and pronunciation of acronyms are discussed as well.

  2. Computations in plasma physics

    Cohen, B.I.; Killeen, J.

    1984-01-01

    A review of computer applications in plasma physics is presented. The computer's contribution to the investigation of magnetic and inertial confinement of plasmas and of charged-particle beam propagation is described. Typical uses of computers for simulation and control of laboratory and cosmic plasma experiments, and for data accumulation in these experiments, are considered. Basic computational methods applied in plasma physics are discussed. Future trends in computer use in plasma research are considered in terms of the increasing role of microprocessors and high-speed data plotters and the need for more powerful computers.

  3. Quantum computer science

    Lanzagorta, Marco

    2009-01-01

    In this text we present a technical overview of the emerging field of quantum computation along with new research results by the authors. What distinguishes our presentation from that of others is our focus on the relationship between quantum computation and computer science. Specifically, our emphasis is on the computational model of quantum computing rather than on the engineering issues associated with its physical implementation. We adopt this approach for the same reason that a book on computer programming doesn't cover the theory and physical realization of semiconductors. Another distin

  4. Explorations in quantum computing

    Williams, Colin P

    2011-01-01

    By the year 2020, the basic memory components of a computer will be the size of individual atoms. At such scales, the current theory of computation will become invalid. ""Quantum computing"" is reinventing the foundations of computer science and information theory in a way that is consistent with quantum physics - the most accurate model of reality currently known. Remarkably, this theory predicts that quantum computers can perform certain tasks breathtakingly faster than classical computers -- and, better yet, can accomplish mind-boggling feats such as teleporting information, breaking suppos

  5. Physics vs. computer science

    Pike, R.

    1982-01-01

    With computers becoming more frequently used in theoretical and experimental physics, physicists can no longer afford to be ignorant of the basic techniques and results of computer science. Computing principles belong in a physicist's tool box, along with experimental methods and applied mathematics, and the easiest way to educate physicists in computing is to provide, as part of the undergraduate curriculum, a computing course designed specifically for physicists. As well, the working physicist should interact with computer scientists, giving them challenging problems in return for their expertise. (orig.)

  6. Polymorphous computing fabric

    Wolinski, Christophe Czeslaw [Los Alamos, NM; Gokhale, Maya B [Los Alamos, NM; McCabe, Kevin Peter [Los Alamos, NM

    2011-01-18

    Fabric-based computing systems and methods are disclosed. A fabric-based computing system can include a polymorphous computing fabric that can be customized on a per application basis and a host processor in communication with said polymorphous computing fabric. The polymorphous computing fabric includes a cellular architecture that can be highly parameterized to enable a customized synthesis of fabric instances for a variety of enhanced application performances thereof. A global memory concept can also be included that provides the host processor random access to all variables and instructions associated with the polymorphous computing fabric.

  7. Computer ray tracing speeds.

    Robb, P; Pawlowski, B

    1990-05-01

    The results of measuring the ray-trace speed and compilation speed of thirty-nine computers in fifty-seven configurations, ranging from personal computers to supercomputers, are described. Ray-trace speed has been correlated with the LINPACK benchmark, which allows ray-trace speed to be estimated from LINPACK performance data. The results indicate that the latest generation of workstations, using CPUs based on RISC (Reduced Instruction Set Computer) technology, are as fast as or faster than mainframe computers in compute-bound situations.

  8. Computing networks from cluster to cloud computing

    Vicat-Blanc, Pascale; Guillier, Romaric; Soudan, Sebastien

    2013-01-01

    "Computing Networks" explores the core of the new distributed computing infrastructures we are using today:  the networking systems of clusters, grids and clouds. It helps network designers and distributed-application developers and users to better understand the technologies, specificities, constraints and benefits of these different infrastructures' communication systems. Cloud Computing will give the possibility for millions of users to process data anytime, anywhere, while being eco-friendly. In order to deliver this emerging traffic in a timely, cost-efficient, energy-efficient, and

  9. Computing Nash equilibria through computational intelligence methods

    Pavlidis, N. G.; Parsopoulos, K. E.; Vrahatis, M. N.

    2005-03-01

    Nash equilibrium constitutes a central solution concept in game theory. Detecting the Nash equilibria of a finite strategic game remains a challenging problem to date. This paper investigates the effectiveness of three computational intelligence techniques, namely covariance matrix adaptation evolution strategy, particle swarm optimization, and differential evolution, in computing Nash equilibria of finite strategic games as global minima of a real-valued, nonnegative function. An issue of particular interest is detecting more than one Nash equilibrium of a game. The performance of the considered computational intelligence methods on this problem is investigated using multistart and deflection.
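The reformulation used in the abstract — a Nash equilibrium as a global minimizer of a real-valued, nonnegative function that vanishes exactly at equilibria — can be sketched for a 2x2 bimatrix game. The objective below is a standard regret-style function; plain random search stands in for the paper's CMA-ES/PSO/DE, and the game (Matching Pennies), seed, and sample count are our own illustrative choices.

```python
import random

# Matching Pennies: row payoffs A, column payoffs B = -A (zero-sum).
A = [[1, -1], [-1, 1]]
B = [[-1, 1], [1, -1]]

def objective(p, q):
    """Nonnegative function that is zero iff (p, q) is a Nash equilibrium.

    p and q are the probabilities of the first pure strategy for the
    row and column player. Each term is the squared positive part of
    the gain a player obtains by deviating to a pure strategy.
    """
    x, y = [p, 1 - p], [q, 1 - q]
    u = sum(x[i] * A[i][j] * y[j] for i in range(2) for j in range(2))
    v = sum(x[i] * B[i][j] * y[j] for i in range(2) for j in range(2))
    f = 0.0
    for i in range(2):   # row player's pure-strategy deviation gains
        gain = sum(A[i][j] * y[j] for j in range(2)) - u
        f += max(0.0, gain) ** 2
    for j in range(2):   # column player's pure-strategy deviation gains
        gain = sum(x[i] * B[i][j] for i in range(2)) - v
        f += max(0.0, gain) ** 2
    return f

random.seed(0)
best = min(((random.random(), random.random()) for _ in range(20000)),
           key=lambda pq: objective(*pq))
print(best)  # close to (0.5, 0.5), the game's unique mixed equilibrium
```

The same objective works unchanged with any global optimizer; the paper's interest in multistart and deflection comes from the fact that games may have several equilibria, i.e. several distinct global minima of this function.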

  10. Reversible computing fundamentals, quantum computing, and applications

    De Vos, Alexis

    2010-01-01

    Written by one of the few top internationally recognized experts in the field, this book concentrates on those topics that will remain fundamental, such as low power computing, reversible programming languages, and applications in thermodynamics. It describes reversible computing from various points of view: Boolean algebra, group theory, logic circuits, low-power electronics, communication, software, quantum computing. It is this multidisciplinary approach that makes it unique.Backed by numerous examples, this is useful for all levels of the scientific and academic community, from undergr

  11. Computing in high energy physics

    Watase, Yoshiyuki

    1991-09-15

    The increasingly important role played by computing and computers in high energy physics is displayed in the 'Computing in High Energy Physics' series of conferences, bringing together experts in different aspects of computing - physicists, computer scientists, and vendors.

  12. Searching with Quantum Computers

    Grover, Lov K.

    2000-01-01

    This article introduces quantum computation by analogy with probabilistic computation. A basic description of the quantum search algorithm is given by representing the algorithm as a C program in a novel way.
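In the spirit of the abstract's program-based presentation, here is a short classical state-vector simulation of the quantum search algorithm (our own Python sketch, not the article's C program): the oracle flips the sign of the marked item's amplitude, and the diffusion step reflects all amplitudes about their mean.

```python
import math

def grover_search(n_qubits, target):
    """Classically simulate Grover's algorithm on n_qubits qubits.

    Returns the probability of measuring `target` after the optimal
    number of oracle/diffusion iterations, about (pi/4) * sqrt(N).
    """
    n = 2 ** n_qubits
    amp = [1.0 / math.sqrt(n)] * n          # uniform superposition
    iterations = int(math.pi / 4 * math.sqrt(n))
    for _ in range(iterations):
        amp[target] = -amp[target]          # oracle: phase-flip the target
        mean = sum(amp) / n
        amp = [2 * mean - a for a in amp]   # diffusion: invert about the mean
    return amp[target] ** 2

print(grover_search(3, 5))  # ~0.945 for N = 8 after 2 iterations
```

With N = 8 the marked item is found with probability about 0.945 after only two oracle calls, versus an expected N/2 = 4 queries for classical exhaustive search — the quadratic speedup the algorithm is known for.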

  13. Book Review: Computational Topology

    Raussen, Martin

    2011-01-01

    Computational Topology by Herbert Edelsbrunner and John L. Harer. American Mathematical Society, 2010. ISBN 978-0-8218-4925-5.

  14. Essential numerical computer methods

    Johnson, Michael L

    2010-01-01

    The use of computers and computational methods has become ubiquitous in biological and biomedical research. During the last two decades most basic algorithms have not changed, but what has changed is the huge increase in computer speed and ease of use, along with the corresponding orders-of-magnitude decrease in cost. A general perception exists that the only applications of computers and computer methods in biological and biomedical research are either basic statistical analysis or the searching of DNA sequence data bases. While these are important applications, they only scratch the surface of the current and potential applications of computers and computer methods in biomedical research. The various chapters within this volume include a wide variety of applications that extend far beyond this limited perception. As part of the Reliable Lab Solutions series, Essential Numerical Computer Methods brings together chapters from volumes 210, 240, 321, 383, 384, 454, and 467 of Methods in Enzymology. These chapters provide ...

  15. Know Your Personal Computer


  16. Computed Tomography (CT) -- Head

    ... images. These images can be viewed on a computer monitor, printed on film or transferred to a ... other in a ring, called a gantry. The computer workstation that processes the imaging information is located ...

  17. SSCL computer planning

    Price, L.E.

    1990-01-01

    The SSC Laboratory is in the process of planning the acquisition of a substantial computing system to support the design of detectors. Advice has been sought from users and computer experts in several stages. This paper discusses this process.

  18. Computed Tomography (CT) -- Sinuses

    ... images. These images can be viewed on a computer monitor, printed on film or transferred to a ... other in a ring, called a gantry. The computer workstation that processes the imaging information is located ...

  19. Computational Science Facility (CSF)

    Federal Laboratory Consortium — PNNL Institutional Computing (PIC) is focused on meeting DOE's mission needs and is part of PNNL's overarching research computing strategy. PIC supports large-scale...

  20. Quantum Computer Science

    Mermin, N. David

    2007-08-01

    Preface; 1. Cbits and Qbits; 2. General features and some simple examples; 3. Breaking RSA encryption with a quantum computer; 4. Searching with a quantum computer; 5. Quantum error correction; 6. Protocols that use just a few Qbits; Appendices; Index.

  1. Computer Vision Syndrome.

    Randolph, Susan A

    2017-07-01

    With the increased use of electronic devices with visual displays, computer vision syndrome is becoming a major public health issue. Improving the visual status of workers using computers results in greater productivity in the workplace and improved visual comfort.

  2. Computed Tomography (CT) -- Sinuses

    ... are the limitations of CT of the Sinuses? What is CT (Computed Tomography) of the Sinuses? Computed ... nasal cavity by small openings. What are some common uses of the procedure? CT ...

  3. Computer Technology Directory.

    Exceptional Parent, 1990

    1990-01-01

    This directory lists approximately 300 commercial vendors that offer computer hardware, software, and communication aids for children with disabilities. The company listings indicate computer compatibility and specific disabilities served by their products. (JDD)

  4. My Computer Is Learning.

    Good, Ron

    1986-01-01

    Describes instructional uses of computer programs found in David Heiserman's book "Projects in Machine Intelligence for Your Home Computer." The programs feature "creatures" of various colors that move around within a rectangular white border. (JN)

  5. What is Computed Tomography?

    What is Computed Tomography? ... Computed Tomography (CT): Although also based on the variable absorption ...

  6. Joint Computing Facility

    Federal Laboratory Consortium — Raised Floor Computer Space for High Performance Computing. The ERDC Information Technology Laboratory (ITL) provides a robust system of IT facilities to develop and...

  7. Computing for Belle

    CERN. Geneva

    2004-01-01

    ... cm^-2 s^-1, 10 times as much as we obtain now. This presentation describes Belle's efficient computing operations, its struggles to manage large amounts of raw and physics data, and plans for Belle computing for Super KEKB/Belle.

  8. Computational Continuum Mechanics

    Shabana, Ahmed A

    2011-01-01

    This text presents the theory of continuum mechanics using computational methods. Ideal for students and researchers, the second edition features a new chapter on computational geometry and finite element analysis.

  9. Applications of computer algebra

    1985-01-01

    Today, certain computer software systems exist which surpass the computational ability of researchers when their mathematical techniques are applied to many areas of science and engineering. These computer systems can perform a large portion of the calculations seen in mathematical analysis. Despite this massive power, thousands of people use these systems as a routine resource for everyday calculations. These software programs are commonly called "Computer Algebra" systems. They have names such as MACSYMA, MAPLE, muMATH, REDUCE and SMP. They are receiving credit as a computational aid with increasing regularity in articles in the scientific and engineering literature. When most people think about computers and scientific research these days, they imagine a machine grinding away, processing numbers arithmetically. It is not generally realized that, for a number of years, computers have been performing non-numeric computations. This means, for example, that one inputs an equation and obtains a closed for...

  10. ICASE Computer Science Program

    1985-01-01

    The Institute for Computer Applications in Science and Engineering computer science program is discussed in outline form. Information is given on such topics as problem decomposition, algorithm development, programming languages, and parallel architectures.

  11. Computed Tomography (CT) -- Sinuses

    ... ring, called a gantry. The computer workstation that processes the imaging information is located in a separate ... follows a spiral path. A special computer program processes this large volume of data to create two- ...

  12. Computed Tomography (CT) -- Head

    ... ring, called a gantry. The computer workstation that processes the imaging information is located in a separate ... follows a spiral path. A special computer program processes this large volume of data to create two- ...

  13. Computed Tomography (CT) -- Head


  14. Intimacy and Computer Communication.

    Robson, Dave; Robson, Maggie

    1998-01-01

    Addresses the relationship between intimacy and communication that is based on computer technology. Discusses definitions of intimacy and the nature of intimate conversations that use computers as a communications medium. Explores implications for counseling. (MKA)

  15. Computed Tomography (CT) -- Sinuses

    ... other in a ring, called a gantry. The computer workstation that processes the imaging information is located ... ray beam follows a spiral path. A special computer program processes this large volume of data to ...

  16. Cognitive Computing for Security.

    Debenedictis, Erik [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Rothganger, Fredrick [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Aimone, James Bradley [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Marinella, Matthew [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Evans, Brian Robert [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Warrender, Christina E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Mickel, Patrick [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-12-01

    Final report for Cognitive Computing for Security LDRD 165613. It reports on the development of a hybrid general-purpose/neuromorphic computer architecture, with an emphasis on potential implementation with memristors.

  17. Fiscal 1997 project on the R and D of industrial scientific technology under consignment from NEDO. Report on the results of the R and D of new software structuring models (R and D of micromachine cooperative control use software); 1997 nendo sangyo kagaku gijutsu kenkyu kaihatsu jigyo Shin Energy Sangyo Gijutsu Sogo Kaihatsu Kiko itaku. Shin software kozoka model no kenkyu kaihatsu (bisho kikai kyocho seigyoyo software no kenkyu kaihatsu) seika hokokusho

    NONE

    1998-03-01

    An R and D effort was conducted on software structuring models that ease the development and maintenance of software systems and meet diversifying needs. In the study of the programming language for cooperative control, R and D on the agent-oriented language Flage was carried out, covering expansion of the language's functions, provision of network functions, and development of exercises. For the formulation of agent knowledge, processes to derive a program from its specifications were proposed, together with EVA, a mechanism that responds to changes in the specifications of existing programs. Regarding the basic theory of cooperation systems, the study centered on object-oriented attribute grammar (OOAG), a model that represents cooperative computation in the software process as a group of rules. Concerning the study of the situation-recognition mechanism, models of communication and reasoning among cooperating agents were researched. 187 refs., 107 figs., 23 tabs.

  18. Report on the achievements in the projects subsidized by the Sunshine Project in fiscal 1981. Data 3. Development of a coal liquefaction technology - development of a solvent extraction and liquefaction technology - 'development of a brown coal based solvent extraction plant' (Development of a 50-t/d pilot plant); 1981 nendo sekitan ekika gijutsu no kaihatsu seika hokokusho (shiryo 3). Yozai chushutsu ekika gijutsu no kaihatsu (kattankei yozai chushutsu plant no kaihatsu (50ton/nichi pilot plant no kaihatsu))

    NONE

    1982-03-01

    Developmental research was carried out on a liquefaction plant for Victorian brown coal produced in Australia (a 50-t/d pilot plant). In fiscal 1981, detailed design was performed on the primary hydrogenation system using the process conception and the design data obtained in the element studies. Part of the equipment was procured, and site construction was begun. The present data is a collection of drawings related to the instrumentation design, such as meter specifications, front views of meter panels, drawings of the panel arrangement in the central control room, a computer-room layout drawing, control-system explanatory drawings, interlock diagrams, and instrumentation power-supply diagrams. (NEDO)

  19. Computation: A New Open Access Journal of Computational Chemistry, Computational Biology and Computational Engineering

    Karlheinz Schwarz; Rainer Breitling; Christian Allen

    2013-01-01

    Computation (ISSN 2079-3197; http://www.mdpi.com/journal/computation) is an international scientific open access journal focusing on fundamental work in the field of computational science and engineering. Computational science has become essential in many research areas by contributing to solving complex problems in fundamental science all the way to engineering. The very broad range of application domains suggests structuring this journal into three sections, which are briefly characterized ...

  20. Nanoelectronics: Metrology and Computation

    Lundstrom, Mark; Clark, Jason V.; Klimeck, Gerhard; Raman, Arvind

    2007-01-01

    Research in nanoelectronics poses new challenges for metrology, but advances in theory, simulation and computing and networking technology provide new opportunities to couple simulation and metrology. This paper begins with a brief overview of current work in computational nanoelectronics. Three examples of how computation can assist metrology will then be discussed. The paper concludes with a discussion of how cyberinfrastructure can help connect computing and metrology using the nanoHUB (www.nanoHUB.org) as a specific example

  1. Foundations of Neuromorphic Computing

    2013-05-01

    Final technical report (May 2013) on the foundations of neuromorphic computing, covering in-house work from 2009 to September 2012; approved for public release. The report contrasts two paradigms, few sensors with complex computation versus many sensors with simple computation, and discusses the challenges raised by a wide variety of nano-enabled neuromorphic chips.

  2. Approximation and Computation

    Gautschi, Walter; Rassias, Themistocles M

    2011-01-01

    Approximation theory and numerical analysis are central to the creation of accurate computer simulations and mathematical models. Research in these areas can influence the computational techniques used in a variety of mathematical and computational sciences. This collection of contributed chapters, dedicated to the renowned mathematician Gradimir V. Milovanović, represents the recent work of experts in the fields of approximation theory and numerical analysis. These invited contributions describe new trends in these important areas of research, including theoretical developments and new computational algorithms.

  3. Computed tomography for radiographers

    Brooker, M.

    1986-01-01

    Computed tomography is regarded by many as a complicated union of sophisticated x-ray equipment and computer technology. This book unravels these complexities. The rigid technicalities of the machinery and the clinical aspects of computed tomography are discussed, including the preparation of patients, both physically and mentally, for scanning. The author also explains how to set up and run a computed tomography department, including advice on how the room should be designed.

  4. Quantum computing and probability.

    Ferry, David K

    2009-11-25

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.

  5. Quantum computing and probability

    Ferry, David K

    2009-01-01

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction. (viewpoint)

  6. Quantum mechanics and computation

    Cirac Sasturain, J. I.

    2000-01-01

    We review how some of the basic principles of Quantum Mechanics can be used in the field of computation. In particular, we explain why a quantum computer can perform certain tasks in a much more efficient way than the computers we have available nowadays. We give the requirements for a quantum system to be able to implement a quantum computer and illustrate these requirements in some particular physical situations. (Author) 16 refs

  7. COMPUTATIONAL SCIENCE CENTER

    DAVENPORT,J.

    2004-11-01

    The Brookhaven Computational Science Center brings together researchers in biology, chemistry, physics, and medicine with applied mathematicians and computer scientists to exploit the remarkable opportunities for scientific discovery which have been enabled by modern computers. These opportunities are especially great in computational biology and nanoscience, but extend throughout science and technology and include for example, nuclear and high energy physics, astrophysics, materials and chemical science, sustainable energy, environment, and homeland security.

  8. COMPUTER GAMES AND EDUCATION

    Sukhov, Anton

    2018-01-01

    This paper is devoted to the educational resources and possibilities of modern computer games. The “internal” educational aspects of computer games include the educational mechanism (a separate or integrated “tutorial”) and the representation of a real or even fantastic educational process within virtual worlds. The “external” dimension represents the educational opportunities of computer games for personal and professional development across different genres of computer games (various transport, so...

  9. Man and computer

    Fischbach, K.F.

    1981-01-01

    The discussion of the cultural and sociological consequences of computer evolution is hindered by human prejudice. For example, the sentence 'a computer is at best as intelligent as its programmer' veils actual developments. The theoretical limits of computer intelligence are the limits of intelligence in general. Modern computer systems replace not only human labour but also human decision making, and thereby human responsibility. The historical situation is unique: human head-work is being automated, and man is losing function. (orig.) [de]

  10. Computational physics an introduction

    Vesely, Franz J

    1994-01-01

    Author Franz J. Vesely offers students an introductory text on computational physics, providing them with the important basic numerical/computational techniques. His unique text sets itself apart from others by focusing on specific problems of computational physics. The author also provides a selection of modern fields of research. Students will benefit from the appendixes which offer a short description of some properties of computing and machines and outline the technique of 'Fast Fourier Transformation.'

  11. Computing environment logbook

    Osbourn, Gordon C; Bouchard, Ann M

    2012-09-18

    A computing environment logbook logs events occurring within a computing environment. The events are displayed as a history of past events within the logbook of the computing environment. The logbook provides search functionality to search through the history of past events to find one or more selected past events, and further, enables an undo of the one or more selected past events.
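
    As a rough illustration of the mechanism described (the class, method names, and storage scheme below are invented for this sketch, not taken from the record), an undo-capable event logbook might look like:

    ```python
    from dataclasses import dataclass
    from typing import Callable, List

    @dataclass
    class Event:
        description: str          # what happened in the computing environment
        undo: Callable[[], None]  # action that reverses the event

    class Logbook:
        """Logs events, searches the history, and undoes selected past events."""
        def __init__(self) -> None:
            self.history: List[Event] = []

        def log(self, description: str,
                undo: Callable[[], None] = lambda: None) -> None:
            self.history.append(Event(description, undo))

        def search(self, term: str) -> List[Event]:
            return [e for e in self.history if term in e.description]

        def undo_events(self, events: List[Event]) -> None:
            for e in reversed(events):   # undo newest first
                e.undo()
                self.history.remove(e)

    env = {"files": ["a.txt"]}
    book = Logbook()
    env["files"].append("b.txt")
    book.log("created b.txt", undo=lambda: env["files"].remove("b.txt"))
    book.undo_events(book.search("b.txt"))
    print(env["files"])  # back to ['a.txt']
    ```

    The design choice worth noting is that each logged event carries its own inverse action, so undo needs no knowledge of the environment beyond the closures stored in the history.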

  12. The Computer Revolution.

    Berkeley, Edmund C.

    "The Computer Revolution", a part of the "Second Industrial Revolution", is examined with reference to the social consequences of computers. The subject is introduced in an opening section which discusses the revolution in the handling of information and the history, powers, uses, and working s of computers. A second section examines in detail the…

  13. Advances in physiological computing

    Fairclough, Stephen H

    2014-01-01

    This edited collection will provide an overview of the field of physiological computing, i.e. the use of physiological signals as input for computer control. It will cover a breadth of current research, from brain-computer interfaces to telemedicine.

  14. Physics of quantum computation

    Belokurov, V.V.; Khrustalev, O.A.; Sadovnichij, V.A.; Timofeevskaya, O.D.

    2003-01-01

    In the paper, the modern status of the theory of quantum computation is considered. The fundamental principles of quantum computers and their basic notions such as quantum processors and computational basis states of the quantum Turing machine as well as the quantum Fourier transform are discussed. Some possible experimental realizations on the basis of NMR methods are given
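
    The quantum Fourier transform mentioned in the abstract has a compact matrix form, F[j,k] = ω^(jk)/√N with ω = e^(2πi/N); a minimal numerical sketch (illustrative, not from the paper):

    ```python
    import numpy as np

    def qft_matrix(n_qubits: int) -> np.ndarray:
        """Dense 2^n x 2^n quantum Fourier transform matrix."""
        N = 2 ** n_qubits
        omega = np.exp(2j * np.pi / N)
        j, k = np.meshgrid(np.arange(N), np.arange(N))
        return omega ** (j * k) / np.sqrt(N)

    F = qft_matrix(3)
    # Unitarity: F times its conjugate transpose is the identity
    assert np.allclose(F @ F.conj().T, np.eye(8))
    # Acting on the basis state |0> yields the uniform superposition
    state = np.zeros(8); state[0] = 1.0
    print(np.round(np.abs(F @ state), 4))  # eight equal amplitudes of 1/sqrt(8)
    ```

    Real quantum hardware implements this with O(n²) gates rather than a dense matrix; the dense form is only for checking the transform's defining properties.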

  15. Quantum walk computation

    Kendon, Viv

    2014-01-01

    Quantum versions of random walks have diverse applications that are motivating experimental implementations as well as theoretical studies. Recent results showing quantum walks are “universal for quantum computation” relate to algorithms, to be run on quantum computers. We consider whether an experimental implementation of a quantum walk could provide useful computation before we have a universal quantum computer

  16. The Challenge of Computers.

    Leger, Guy

    Computers may change teachers' lifestyles, teaching styles, and perhaps even their personal values. A brief survey of the history of computers demonstrates the incredible pace at which computer technology is moving ahead. The cost and size of microchips will continue to decline dramatically over the next 20 years, while the capability and variety…

  17. Visitor's Computer Guidelines | CTIO

    Guidelines for visiting astronomers' computers and network connection requests at CTIO.

  18. Medical Computational Thinking

    Musaeus, Peter; Tatar, Deborah Gail; Rosen, Michael A.

    2017-01-01

    Computational thinking (CT) in medicine means deliberating when to pursue computer-mediated solutions to medical problems and evaluating when such solutions are worth pursuing in order to assist in medical decision making. Teaching computational thinking (CT) at medical school should be aligned...

  19. Computed Tomography (CT) -- Head

    Computed tomography (CT) of the head uses special x-ray equipment. Computed tomography is more commonly known as a CT or CAT scan.

  20. Emission computed tomography

    Ott, R.J.

    1986-01-01

    Emission Computed Tomography is a technique used for producing single or multiple cross-sectional images of the distribution of radionuclide labelled agents in vivo. The techniques of Single Photon Emission Computed Tomography (SPECT) and Positron Emission Tomography (PET) are described with particular regard to the function of the detectors used to produce images and the computer techniques used to build up images. (UK)
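
    The image-building step can be illustrated with the simplest reconstruction technique, unfiltered back-projection; this toy sketch (not from the article) uses only two view angles on a 5×5 grid:

    ```python
    import numpy as np

    act = np.zeros((5, 5))
    act[1, 3] = 1.0               # point source of radionuclide activity

    # Parallel projections at 0 and 90 degrees (line sums through the object)
    p0 = act.sum(axis=0)          # column sums
    p90 = act.sum(axis=1)         # row sums

    # Back-projection: smear each projection across the image and add
    recon = np.tile(p0, (5, 1)) + np.tile(p90, (5, 1)).T
    peak = np.unravel_index(recon.argmax(), recon.shape)
    print(tuple(int(i) for i in peak))  # the peak recovers the source at (1, 3)
    ```

    Clinical SPECT and PET use many angles and filtered or iterative reconstruction, but the principle of accumulating smeared projections is the same.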

  1. Computed Tomography (CT) -- Sinuses

    Computed tomography (CT) of the sinuses uses special x-ray equipment.

  2. Computed Tomography (CT) -- Head

    Computed tomography (CT) of the head uses special x-ray equipment.

  3. Beyond the Computer Literacy.

    Streibel, Michael J.; Garhart, Casey

    1985-01-01

    Describes the approach taken in an education computing course for pre- and in-service teachers. Outlines the basic operational, analytical, and evaluation skills that are emphasized in the course, suggesting that these skills go beyond the attainment of computer literacy and can assist in the effective use of computers. (ML)

  4. Computer algebra applications

    Calmet, J.

    1982-01-01

    A survey of applications based either on fundamental algorithms in computer algebra or on the use of a computer algebra system is presented. Recent work in biology, chemistry, physics, mathematics and computer science is discussed. In particular, applications in high energy physics (quantum electrodynamics), celestial mechanics and general relativity are reviewed. (Auth.)

  5. Computer-assisted instruction

    Voogt, J.; Fisser, P.; Wright, J.D.

    2015-01-01

    Since the early days of computer technology in education in the 1960s, it was claimed that computers can assist instructional practice and hence improve student learning. Since then computer technology has developed, and its potential for education has increased. In this article, we first discuss

  6. Designing with computational intelligence

    Lopes, Heitor; Mourelle, Luiza

    2017-01-01

    This book discusses a number of real-world applications of computational intelligence approaches. Using various examples, it demonstrates that computational intelligence has become a consolidated methodology for automatically creating new competitive solutions to complex real-world problems. It also presents a concise and efficient synthesis of different systems using computationally intelligent techniques.

  7. A new computing principle

    Fatmi, H.A.; Resconi, G.

    1988-01-01

    In 1954 while reviewing the theory of communication and cybernetics the late Professor Dennis Gabor presented a new mathematical principle for the design of advanced computers. During our work on these computers it was found that the Gabor formulation can be further advanced to include more recent developments in Lie algebras and geometric probability, giving rise to a new computing principle

  8. Computers and Information Flow.

    Patrick, R. L.

    This paper is designed to fill the need for an easily understood introduction to the computing and data processing field for the layman who has, or can expect to have, some contact with it. Information provided includes the unique terminology and jargon of the field, the various types of computers and the scope of computational capabilities, and…

  9. FY 1999 Report on research and development of power generation by solid electrolyte fuel cell. Research and development of solid electrolyte fuel cell; 1999 nendo nenryo denchi hatsuden gijutsu kaihatsu kotai denkaishitsugata nenryo denchi no kenkyu kaihatsu kenkyu seika

    NONE

    2000-06-01

    This project is aimed at establishing the basic module technology and commercializing the solid electrolyte fuel cell at an early stage through the design, construction, operation and performance evaluation of a several-kW-class module incorporating cylindrical cells fabricated by the wet process. The FY 1999 R and D efforts include (1) a cell performance demonstration study: the cylindrical single cell fabricated by the wet process is demonstration-tested to determine its initial performance and durability in continuous operation, comparing external and internal reforming in terms of output, with the internal reforming rate as the parameter; (2) development of a several-kW-class module: an adequate cell arrangement within the module is studied by computer-aided simulation, thermal cycle durability tests of the modified bundle are conducted using the module power generation unit, and the several-kW-class module is tested; and (3) development of the technology for designing a thermally supported module: the effects of, e.g., air and fuel supply conditions on module performance are analyzed using the analytical model as the base. Based on these results, expansion of the module-level model to a process simulation model has been completed. (NEDO)

  10. Achievement report for fiscal 1998. Research and development of synergy ceramics (Research and development of anti-corrosion technologies for oil production system); 1998 nendo seika hokokusho. Shinaji ceramics no kenkyu kaihatsu (sekiyu seisan system fushoku boshi gijutsu kenkyu kaihatsu)

    NONE

    1999-03-01

    A computer simulation-aided design technology is completed to help develop synergy ceramics, which are complicated in phase constitution and material structure. A program that simulates sintering and particle growth for multiple solid-state and solid-state/liquid-phase systems has been built in the form of an integrated micro/nano-level simulation technology based on the Monte Carlo method. The new program applies to systems consisting of more than three different phases, and deals in a uniform way with the design of the various textures formable by transfer of matter in solid-state particle growth, Ostwald ripening, solid-state sintering, liquid-phase sintering, and additive reaction. The Monte Carlo program is applied to systems based on AlN, Si₃N₄, and Al₂O₃, and a good result is achieved in each case. The molecular dynamics method is mainly used in atom-level simulation for application to ZrO₂-Y₂O₃ and Si-C-N, and a good result is achieved in each case. (NEDO)
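
    Monte Carlo sintering and grain-growth simulations of the kind described are commonly built on a Metropolis (Potts-model) scheme; the following is a minimal sketch of that scheme, where the lattice size, number of orientations Q, and temperature are illustrative assumptions, not values from the report:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    N, Q = 32, 8                            # lattice size and grain orientations
    grid = rng.integers(0, Q, size=(N, N))  # random initial microstructure

    def neighbors(i, j):
        return [((i + 1) % N, j), ((i - 1) % N, j),
                (i, (j + 1) % N), (i, (j - 1) % N)]

    def site_energy(i, j, s):
        """Number of unlike bonds at site (i, j) if it had orientation s."""
        return sum(s != grid[x, y] for x, y in neighbors(i, j))

    def boundary_energy():
        """Total unlike nearest-neighbor pairs (grain-boundary energy)."""
        return int((grid != np.roll(grid, 1, 0)).sum()
                   + (grid != np.roll(grid, 1, 1)).sum())

    def mc_step(kT=0.1):
        """One Metropolis attempt: copy a random neighbor's orientation."""
        i, j = rng.integers(0, N, 2)
        x, y = neighbors(i, j)[rng.integers(0, 4)]
        dE = site_energy(i, j, grid[x, y]) - site_energy(i, j, grid[i, j])
        if dE <= 0 or rng.random() < np.exp(-dE / kT):
            grid[i, j] = grid[x, y]

    e0 = boundary_energy()
    for _ in range(20000):
        mc_step()
    print(boundary_energy() < e0)  # boundary energy drops as grains coarsen
    ```

    Production codes extend this kernel to multiple phases and liquid-phase transport, but the accept/reject step driven by local boundary energy is the common core.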

  11. FY 1998 annual report on the development of fuel cell power generation techniques. Research and development of solid electrolyte fuel cells (research results); 1998 nendo nenryo denchi gijutsu kaihatsu. Kotai denkaishitsugata nenryo denchi no kenkyu kaihatsu

    NONE

    1999-03-01

    Described herein are the FY 1998 research and development results of solid electrolyte fuel cells. For R and D of the tubular type cell by the wet processing technique, the tests are conducted to evaluate the initial performance and long-term durability for continuous operation of the single tubular cell. For development of the several-kW class modules, computer-aided simulations are conducted. For R and D of material and substrate techniques, the thermal cycle characteristics, cell characteristics and stress of the cell modules are evaluated, in order to evaluate their reliability. The thermal cycle test results indicate that performance of the single-stage cell is unaffected by the thermal cycles. It is found by the stress evaluation that use of the separator plate having a higher thermal expansion coefficient than the electrolyte plate and use of the sealant having a thermal expansion coefficient close to that of the electrolyte plate are effective means to reduce stresses. For the research to reduce costs of the cell materials, their chemical, mechanical and thermal characteristics are evaluated. For the system research, the areas for which the compact systems are suitable and their optimization are studied. (NEDO)

  12. Computer naratology: narrative templates in computer games

    Praks, Vítězslav

    2009-01-01

    Relations and interactions between literature and computer games are examined. The study contains a theoretical analysis of the game as an aesthetic artefact. To play a game means to leave the practical world for the sake of a fictional world. Artistic communication has more similarities with game communication than with normal, practical communication. Game study can help us understand basic concepts of artistic communication (game rules - poetic rules, game world - fiction, function in game - meaning in art). Compute...

  13. Neural Computation and the Computational Theory of Cognition

    Piccinini, Gualtiero; Bahar, Sonya

    2013-01-01

    We begin by distinguishing computationalism from a number of other theses that are sometimes conflated with it. We also distinguish between several important kinds of computation: computation in a generic sense, digital computation, and analog computation. Then, we defend a weak version of computationalism--neural processes are computations in the…

  14. Quantum computing and spintronics

    Kantser, V.

    2007-01-01

    Attempts to build a computer that can operate according to quantum laws have led to the concepts of quantum computing algorithms and hardware. In this review, after some general considerations concerning quantum information science and a set of basic requirements for any quantum computer proposal, we highlight recent developments that point the way to quantum computing on the basis of solid-state nanostructures. One major direction of research on the way to quantum computing is to exploit the spin (in addition to the orbital) degree of freedom of the electron, giving birth to the field of spintronics. We address a semiconductor approach based on spin-orbit coupling in semiconductor nanostructures. (authors)

  15. Theory of computation

    Tourlakis, George

    2012-01-01

    Learn the skills and acquire the intuition to assess the theoretical limitations of computer programming. Offering an accessible approach to the topic, Theory of Computation focuses on the metatheory of computing and the theoretical boundaries between what various computational models can and cannot do—from the most general model, the URM (Unbounded Register Machines), to the finite automaton. A wealth of programming-like examples and easy-to-follow explanations build the general theory gradually, guiding readers through the modeling and mathematical analysis of computational phenomena.

  16. Computer Security Handbook

    Bosworth, Seymour; Whyne, Eric

    2012-01-01

    The classic and authoritative reference in the field of computer security, now completely updated and revised. With the continued presence of large-scale computers; the proliferation of desktop, laptop, and handheld computers; and the vast international networks that interconnect them, the nature and extent of threats to computer security have grown enormously. Now in its fifth edition, Computer Security Handbook continues to provide authoritative guidance to identify and to eliminate these threats where possible, as well as to lessen any losses attributable to them. With seventy-seven chapters…

  17. Secure cloud computing

    Jajodia, Sushil; Samarati, Pierangela; Singhal, Anoop; Swarup, Vipin; Wang, Cliff

    2014-01-01

    This book presents a range of cloud computing security challenges and promising solution paths. The first two chapters focus on practical considerations of cloud computing. In Chapter 1, Chandramouli, Iorga, and Chokani describe the evolution of cloud computing and the current state of practice, followed by the challenges of cryptographic key management in the cloud. In Chapter 2, Chen and Sion present a dollar cost model of cloud computing and explore the economic viability of cloud computing with and without security mechanisms involving cryptographic mechanisms. The next two chapters address…

  18. Scalable optical quantum computer

    Manykin, E A; Mel'nichenko, E V [Institute for Superconductivity and Solid-State Physics, Russian Research Centre 'Kurchatov Institute', Moscow (Russian Federation)]

    2014-12-31

    A way of designing a scalable optical quantum computer based on the photon echo effect is proposed. Individual rare earth ions Pr³⁺, regularly located in the lattice of the orthosilicate (Y₂SiO₅) crystal, are suggested to be used as optical qubits. Operations with qubits are performed using coherent and incoherent laser pulses. The operation protocol includes both the method of measurement-based quantum computations and the technique of optical computations. Modern hybrid photon echo protocols, which provide a sufficient quantum efficiency when reading recorded states, are considered as most promising for quantum computations and communications. (quantum computer)

  19. Computing meaning v.4

    Bunt, Harry; Pulman, Stephen

    2013-01-01

    This book is a collection of papers by leading researchers in computational semantics. It presents a state-of-the-art overview of recent and current research in computational semantics, including descriptions of new methods for constructing and improving resources for semantic computation, such as WordNet, VerbNet, and semantically annotated corpora. It also presents new statistical methods in semantic computation, such as the application of distributional semantics in the compositional calculation of sentence meanings. Computing the meaning of sentences, texts, and spoken or texted dialogue i

  20. Scalable optical quantum computer

    Manykin, E A; Mel'nichenko, E V

    2014-01-01

    A way of designing a scalable optical quantum computer based on the photon echo effect is proposed. Individual rare earth ions Pr³⁺, regularly located in the lattice of the orthosilicate (Y₂SiO₅) crystal, are suggested to be used as optical qubits. Operations with qubits are performed using coherent and incoherent laser pulses. The operation protocol includes both the method of measurement-based quantum computations and the technique of optical computations. Modern hybrid photon echo protocols, which provide a sufficient quantum efficiency when reading recorded states, are considered as most promising for quantum computations and communications. (quantum computer)

  1. COMPUTATIONAL SCIENCE CENTER

    DAVENPORT, J.

    2005-11-01

    The Brookhaven Computational Science Center brings together researchers in biology, chemistry, physics, and medicine with applied mathematicians and computer scientists to exploit the remarkable opportunities for scientific discovery which have been enabled by modern computers. These opportunities are especially great in computational biology and nanoscience, but extend throughout science and technology and include, for example, nuclear and high energy physics, astrophysics, materials and chemical science, sustainable energy, environment, and homeland security. To achieve our goals we have established a close alliance with applied mathematicians and computer scientists at Stony Brook and Columbia Universities.

  2. Computer algebra and operators

    Fateman, Richard; Grossman, Robert

    1989-01-01

    The symbolic computation of operator expansions is discussed. Some of the capabilities that prove useful when performing computer algebra computations involving operators are considered. These capabilities may be broadly divided into three areas: the algebraic manipulation of expressions from the algebra generated by operators; the algebraic manipulation of the actions of the operators upon other mathematical objects; and the development of appropriate normal forms and simplification algorithms for operators and their actions. Brief descriptions are given of the computer algebra computations that arise when working with various operators and their actions.

  3. Cloud Computing Bible

    Sosinsky, Barrie

    2010-01-01

    The complete reference guide to the hot technology of cloud computing. Its potential for lowering IT costs makes cloud computing a major force for both IT vendors and users; it is expected to gain momentum rapidly with the launch of Office Web Apps later this year. Because cloud computing involves various technologies, protocols, platforms, and infrastructure elements, this comprehensive reference is just what you need if you'll be using or implementing cloud computing. Cloud computing offers significant cost savings by eliminating upfront expenses for hardware and software; its growing popularity…

  4. Design of Computer Experiments

    Dehlendorff, Christian

    The main topic of this thesis is the design and analysis of computer and simulation experiments, dealt with in six papers and a summary report. Simulation and computer models have in recent years received increasingly more attention due to their increasing complexity and usability. Software packages make the development of rather complicated computer models using predefined building blocks possible. This implies that the range of phenomena that are analyzed by means of a computer model has expanded significantly. As the complexity grows, so does the need for efficient experimental designs and analysis methods, since complex computer models are often expensive to use in terms of computer time. The choice of performance parameter is an important part of the analysis of computer and simulation models, and Paper A introduces a new statistic for waiting times in health care units. The statistic…

  5. Computer in radiology

    Kuesters, H.

    1985-01-01

    With this publication, the author presents the requirements that user-specific software should fulfil to achieve effective practice rationalisation through computer usage, and the hardware configuration necessary as basic equipment. This should make it more difficult in the future for sales representatives to sell radiologists unusable computer systems. Furthermore, questions are answered that were asked by computer-interested radiologists during system presentations. On the one hand, there still exists a prejudice against standard-text programmes; on the other, there are undefined fears that handling a computer is too difficult and that one first has to learn a computer language to be able to work with computers. Finally, it is pointed out which real competitive advantages can be obtained through computer usage. (orig.) [de]

  6. Programming in biomolecular computation

    Hartmann, Lars Røeboe; Jones, Neil; Simonsen, Jakob Grue

    2011-01-01

    Our goal is to provide a top-down approach to biomolecular computation. In spite of widespread discussion about connections between biology and computation, one question seems notable by its absence: Where are the programs? We identify a number of common features in programming that seem conspicuously absent from the literature on biomolecular computing; to partially redress this absence, we introduce a model of computation that is evidently programmable, by programs reminiscent of low-level computer machine code, and at the same time biologically plausible: its functioning is defined by a single and relatively small set of chemical-like reaction rules. Further properties: the model is stored-program: programs are the same as data, so programs are not only executable, but are also compilable and interpretable. It is universal: all computable functions can be computed (in natural ways…

  7. Computer assisted radiology

    Lemke, H.U.; Jaffe, C.C.; Felix, R.

    1993-01-01

    The proceedings of the CAR'93 symposium present the 126 oral papers and the 58 posters contributed to the four technical sessions entitled: (1) Image Management, (2) Medical Workstations, (3) Digital Image Generation - DIG, and (4) Application Systems - AS. Topics discussed in Session (1) are: picture archiving and communication systems, teleradiology, hospital information systems and radiological information systems, technology assessment and implications, standards, and databases. Session (2) deals with computer vision, computer graphics, design and application, and man-computer interaction. Session (3) goes into the details of diagnostic examination methods such as digital radiography, MRI, CT, nuclear medicine, ultrasound, digital angiography, and multimodality imaging. Session (4) is devoted to computer-assisted techniques, such as computer-assisted radiological diagnosis, knowledge-based systems, computer-assisted radiation therapy, and computer-assisted surgical planning. (UWA). 266 figs [de]

  8. Fiscal 2000 achievement report. Welfare technosystem research and development (Ube); 2000 nendo walfare technosystem kenkyu kaihatsu (Ube) seika hokokusho

    NONE

    2001-03-01

    Efforts continue to develop welfare nursing equipment that helps persons in need of nursing care, or handicapped persons, live independent lives. Development efforts are exerted in four fields: (1) a motor-driven wheelchair in which the occupant feels as if driving a car, (2) an online cardiac function monitoring system using a cardiac sound sensor, (3) an automatic pressure belt for patients with postural hypotension, and (4) a seated-position aiding and stabilizing device for workers whose legs are not fully functional. In field (1), the steering wheel, brake, and driving system are contrived so that the occupant may easily drive the wheelchair as if in an automobile driver's seat. In field (2), a device is developed to monitor and examine the heart rate and respiration rate, for which a cardiac sensor has to be installed on a water bed or an air mattress. In field (3), a computer-controlled automatic pressure belt is developed for patients with myelopathy or cerebrovascular disease. In field (4), a work-assisting device is developed to help seated workers with leg troubles. (NEDO)

  9. Fiscal 1998 achievement report on welfare technosystem research and development. Kyoto; 1998 nendo walfare technosystem kenkyu kaihatsu (Kyoto) seika hokokusho

    NONE

    1999-03-01

    To be ready for the coming computing-everywhere age, it is necessary to create environments that ensure barrier-free utilization of the various apparatuses of the daily routine, such as bodily function substituting apparatuses capable of compensating for the degraded functions of disabled or aged individuals. Under the circumstances, a system for building man-machine interfaces in the home or the like is required, and fundamental technologies of architecture and information infrastructure have to be established on which the development of technologies in this field will proceed. Concerning the technologies already in existence in this field of research, a survey is conducted of the status of research and development of information interface techniques, primarily at Stanford University, and a report is made thereon. Also reported is the information obtained at the Technology and Persons with Disabilities Conference 1999. The results of a survey of research and development trends for the smart house under the TIDE (Technology Initiative for Disabled and Elderly People) project and of a survey of an information standardization project for equipment control in Europe are reported, and the results of a survey of the approach of Kyoto's welfare apparatus distributors to equipment development are made known. (NEDO)

  10. DCE. Future IHEP's computing environment

    Zheng Guorui; Liu Xiaoling

    1995-01-01

    IHEP's computing environment consists of several different computing environments established on IHEP computer networks, of which the BES environment supporting HEP computing is the main part. In connection with the procedure of improving and extending the BES environment, the authors outline the development of computing environments from the viewpoint of establishing a high energy physics (HEP) environment. The direction of development toward distributed computing for the IHEP computing environment, based on current trends in distributed computing, is presented

  11. Natural Computing in Computational Finance Volume 4

    O’Neill, Michael; Maringer, Dietmar

    2012-01-01

    This book follows on from Natural Computing in Computational Finance Volumes I, II and III. As in the previous volumes of this series, the book consists of a series of chapters, each of which was selected following a rigorous, peer-reviewed selection process. The chapters illustrate the application of a range of cutting-edge natural computing and agent-based methodologies in computational finance and economics. The applications explored include option model calibration, financial trend reversal detection, enhanced indexation, algorithmic trading, corporate payout determination, agent-based modeling of liquidity costs, and trade strategy adaptation. While describing cutting-edge applications, the chapters are written so that they are accessible to a wide audience. Hence, they should be of interest to academics, students and practitioners in the fields of computational finance and economics.

  12. Computational Biology and High Performance Computing 2000

    Simon, Horst D.; Zorn, Manfred D.; Spengler, Sylvia J.; Shoichet, Brian K.; Stewart, Craig; Dubchak, Inna L.; Arkin, Adam P.

    2000-10-19

    The pace of extraordinary advances in molecular biology has accelerated in the past decade, due in large part to discoveries coming from genome projects on human and model organisms. The advances in the genome project so far, happening well ahead of schedule and under budget, have exceeded any dreams by its protagonists, let alone formal expectations. Biologists expect the next phase of the genome project to be even more startling in terms of dramatic breakthroughs in our understanding of human biology, the biology of health and of disease. Only today can biologists begin to envision the experimental, computational and theoretical steps necessary to exploit genome sequence information for its medical impact, its contribution to biotechnology and economic competitiveness, and its ultimate contribution to environmental quality. High performance computing has become one of the critical enabling technologies which will help to translate this vision of future advances in biology into reality. Biologists are increasingly becoming aware of the potential of high performance computing. The goal of this tutorial is to introduce the exciting new developments in computational biology and genomics to the high performance computing community.

  13. COMPUTER-ASSISTED ACCOUNTING

    SORIN-CIPRIAN TEIUŞAN

    2009-01-01

    Full Text Available What is computer-assisted accounting? Where is the place and what is the role of the computer in the financial-accounting activity? What are the position and importance of the computer in the accountant's activity? All these are questions that require scientific research in order to find the answers. The paper approaches the issue of the support the computer grants to the accountant in organizing and managing the accounting activity. Starting from the notions of accounting and computer, the concept of computer-assisted accounting is introduced: a general concept referring to accounting performed with the help of the computer, or to using the computer to automate the procedures performed by the person doing the accounting activity; it is a concept used to define the computer applications of the accounting activity. The arguments regarding the use of the computer to assist accounting target the informatization of accounting, the automation of financial-accounting activities, and the endowment of contemporary accounting with modern technology.

  14. Quantum analogue computing.

    Kendon, Vivien M; Nemoto, Kae; Munro, William J

    2010-08-13

    We briefly review what a quantum computer is, what it promises to do for us and why it is so hard to build one. Among the first applications anticipated to bear fruit is the quantum simulation of quantum systems. While most quantum computation is an extension of classical digital computation, quantum simulation differs fundamentally in how the data are encoded in the quantum computer. To perform a quantum simulation, the Hilbert space of the system to be simulated is mapped directly onto the Hilbert space of the (logical) qubits in the quantum computer. This type of direct correspondence is how data are encoded in a classical analogue computer. There is no binary encoding, and increasing precision becomes exponentially costly: an extra bit of precision doubles the size of the computer. This has important consequences for both the precision and error-correction requirements of quantum simulation, and significant open questions remain about its practicality. It also means that the quantum version of analogue computers, continuous-variable quantum computers, becomes an equally efficient architecture for quantum simulation. Lessons from past use of classical analogue computers can help us to build better quantum simulators in future.
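The exponential cost of analogue precision described above can be shown with a toy calculation (an illustrative sketch, not taken from the paper): n bits of precision need n physical bits in a binary register, but 2^n distinguishable levels in an analogue register, so each extra bit of precision doubles the analogue "size".

```python
# Illustrative sketch (not from the paper): resource cost of n bits of
# precision in a binary register versus an analogue register.

def digital_cost(n_bits):
    # a binary encoding needs one physical bit per bit of precision
    return n_bits

def analogue_cost(n_bits):
    # an analogue register must resolve 2**n distinguishable levels
    return 2 ** n_bits

for n in (8, 16, 32):
    print(n, digital_cost(n), analogue_cost(n))

# each extra bit of precision doubles the analogue cost
assert analogue_cost(11) == 2 * analogue_cost(10)
```

The same counting argument applies to continuous-variable quantum computers, which is why the review treats them as analogue machines for precision and error-correction purposes.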

  15. Fiscal 1997 report on the R and D under consignment from NEDO on human sensory measurement application technology; 1997 nendo ningen kankaku keisoku oyo gijutsu no kenkyu kaihatsu itaku kenkyu seika hokokusho

    NONE

    1998-03-01

    The paper outlines the results of the fiscal 1997 R and D on `human sensory measurement application technology` (HSMAT), which entered its second stage. As to the R and D of technology for developing human sensory indices, examples of human sensory indices were set up for each of the following technologies to be developed, and measuring experiments were conducted: technology to assess effects on physiological senses such as fatigue and awakening, technology to assess human adaptability to various environmental conditions, and technology to assess the adaptability of products to humans from the viewpoint of affinity, etc. In relation to the R and D of technology for the practical application of human sensory indices, a study was conducted of examples of applying each index to the design of daily products and residential/working environments, and at the same time a prototype database of human sensory data was built on a trial basis using part of the experimental data. Moreover, to make clothes and working environments feel better, a manikin with movable hands and legs, able to sit on a chair, was designed and assembled, enabling rational estimation of human thermal sensation. 82 refs., 391 figs., 88 tabs.

  16. COMPUTATIONAL SCIENCE CENTER

    DAVENPORT, J.

    2006-01-01

    Computational Science is an integral component of Brookhaven's multi-science mission, and is a reflection of the increased role of computation across all of science. Brookhaven currently has major efforts in data storage and analysis for the Relativistic Heavy Ion Collider (RHIC) and the ATLAS detector at CERN, and in quantum chromodynamics. The Laboratory is host for the QCDOC machines (quantum chromodynamics on a chip), 10 teraflop/s computers which boast 12,288 processors each. There are two here, one for the Riken/BNL Research Center and the other supported by DOE for the US Lattice Gauge Community and other scientific users. A 100 teraflop/s supercomputer will be installed at Brookhaven in the coming year, managed jointly by Brookhaven and Stony Brook, and funded by a grant from New York State. This machine will be used for computational science across Brookhaven's entire research program, and also by researchers at Stony Brook and across New York State. With Stony Brook, Brookhaven has formed the New York Center for Computational Science (NYCCS) as a focal point for interdisciplinary computational science, which is closely linked to Brookhaven's Computational Science Center (CSC). The CSC has established a strong program in computational science, with an emphasis on nanoscale electronic structure and molecular dynamics, accelerator design, computational fluid dynamics, medical imaging, parallel computing and numerical algorithms. We have been an active participant in DOE's SciDAC program (Scientific Discovery through Advanced Computing). We are also planning a major expansion in computational biology in keeping with Laboratory initiatives. Additional laboratory initiatives with a dependence on a high level of computation include the development of hydrodynamics models for the interpretation of RHIC data, computational models for the atmospheric transport of aerosols, and models for combustion and for energy utilization. The CSC was formed to bring together

  17. COMPUTATIONAL SCIENCE CENTER

    DAVENPORT, J.

    2006-11-01

    Computational Science is an integral component of Brookhaven's multi-science mission, and is a reflection of the increased role of computation across all of science. Brookhaven currently has major efforts in data storage and analysis for the Relativistic Heavy Ion Collider (RHIC) and the ATLAS detector at CERN, and in quantum chromodynamics. The Laboratory is host for the QCDOC machines (quantum chromodynamics on a chip), 10 teraflop/s computers which boast 12,288 processors each. There are two here, one for the Riken/BNL Research Center and the other supported by DOE for the US Lattice Gauge Community and other scientific users. A 100 teraflop/s supercomputer will be installed at Brookhaven in the coming year, managed jointly by Brookhaven and Stony Brook, and funded by a grant from New York State. This machine will be used for computational science across Brookhaven's entire research program, and also by researchers at Stony Brook and across New York State. With Stony Brook, Brookhaven has formed the New York Center for Computational Science (NYCCS) as a focal point for interdisciplinary computational science, which is closely linked to Brookhaven's Computational Science Center (CSC). The CSC has established a strong program in computational science, with an emphasis on nanoscale electronic structure and molecular dynamics, accelerator design, computational fluid dynamics, medical imaging, parallel computing and numerical algorithms. We have been an active participant in DOE's SciDAC program (Scientific Discovery through Advanced Computing). We are also planning a major expansion in computational biology in keeping with Laboratory initiatives. Additional laboratory initiatives with a dependence on a high level of computation include the development of hydrodynamics models for the interpretation of RHIC data, computational models for the atmospheric transport of aerosols, and models for combustion and for energy utilization. The CSC was formed to

  18. FY 1998 Report on development of large-scale wind power generation systems. Feasibility study on development of new technologies for wind power generation (Study on the development of wind power generation); 1998 nendo ogata furyoku hatsuden system kaihatsu. Furyoku hatsuden shingijutsu kaihatsu kanosei chosa (furyoku hatsuden gijutsu ni kansuru kaihatsu doko chosa)

    NONE

    1999-03-01

    This survey analyzes the current status of large-scale wind power generation devices/system technologies and development trends worldwide, and makes predictions about future developments, in an effort to contribute to advancements in new technology for wind power generation systems in Japan. The international R and D cooperation programs promoted by the IEA and the EU have helped the participants produce a number of good results at lower costs. The European countries, supported by governmental subsidy policies, have developed wind power generation industries in each area and are leading the world. Systems are becoming larger, from an average unit capacity of around 250 kW at the beginning of the 1990s to 600 kW now, reducing costs through economies of scale. Improved computer capacity has made it possible to more easily analyze the complicated rotor aerodynamics, structural dynamics, wind characteristics and other factors related to wind power generation systems. Future R and D directions will include world standards for large-scale wind turbines, advancements in wind farm technologies, offshore wind power generation systems, advancements in design technologies, and new concepts for wind turbine designs, e.g., the floating wind turbine. (NEDO)

  19. Fiscal 1997 report on the results of the R and D of industrial scientific technology. R and D of synergistic ceramics (R and D of corrosion prevention technology for the petroleum production system); 1997 nendo sangyo kagaku gijutsu kenkyu kaihatsu seika hokokusho. Synergy ceramics no kenkyu kaihatsu (sekiyu seisan system fushoku boshi gijutsu kenkyu kaihatsu)

    NONE

    1998-03-01

    To heighten the durability and safety of materials/parts for undersea oil drilling, ceramic base materials were developed by developing function-harmonizing process technology, which harmonizes at a high grade contrary characteristics and various functions. The paper sums up the fiscal 1997 results. In the design of system formation, computational simulation technology was extended to the composite process and the diploid system. The development of materials simultaneously manifesting multiple functions was attempted by the higher nano-structure process. A study was made of the control of microstructures of porous materials and of matrix filling by gas phase precipitation control. Proposed were selective control of grain growth from seed crystals and a columnar-particle orientation laminated structure simultaneously manifesting strength and toughness. By composite precipitation reaction control, the simultaneous dispersion of whiskers and densification of matrices, and harmonization with long fibers, were studied. Silicon nitride with low lubrication/friction coefficients and high strength was made on a trial basis. A simulation method for evaluating crack progress behavior was developed using notched test specimens with heterogeneous microstructures. Analyses were made of brittle fracture mechanics and reliability evaluation. 273 refs., 344 figs., 29 tabs.

  20. Computation as Medium

    Jochum, Elizabeth Ann; Putnam, Lance

    2017-01-01

    Artists increasingly utilize computational tools to generate art works. Computational approaches to art making open up new ways of thinking about agency in interactive art because they invite participation and allow for unpredictable outcomes. Computational art is closely linked… to the participatory turn in visual art, wherein spectators physically participate in visual art works. Unlike purely physical methods of interaction, computer-assisted interactivity affords artists and spectators more nuanced control of artistic outcomes. Interactive art brings together human bodies, computer code…, and nonliving objects to create emergent art works. Computation is more than just a tool for artists, it is a medium for investigating new aesthetic possibilities for choreography and composition. We illustrate this potential through two artistic projects: an improvisational dance performance between a human...

  1. Introduction to morphogenetic computing

    Resconi, Germano; Xu, Guanglin

    2017-01-01

    This book offers a concise introduction to morphogenetic computing, showing that its use makes global and local relations, defects in crystal non-Euclidean geometry databases with source and sink, genetic algorithms, and neural networks more stable and efficient. It also presents applications to database, language, nanotechnology with defects, biological genetic structure, electrical circuit, and big data structure. In Turing machines, input and output states form a system – when the system is in one state, the input is transformed into output. This computation is always deterministic and without any possible contradiction or defects. In natural computation there are defects and contradictions that have to be solved to give a coherent and effective computation. The new computation generates the morphology of the system that assumes different forms in time. Genetic process is the prototype of the morphogenetic computing. At the Boolean logic truth value, we substitute a set of truth (active sets) values with...

  2. The CMS Computing Model

    Bonacorsi, D.

    2007-01-01

    The CMS experiment at LHC has developed a baseline Computing Model addressing the needs of a computing system capable of operating in the first years of LHC running. It is focused on a data model with heavy streaming at the raw data level based on trigger, and on the achievement of the maximum flexibility in the use of distributed computing resources. The CMS distributed Computing Model includes a Tier-0 centre at CERN, a CMS Analysis Facility at CERN, several Tier-1 centres located at large regional computing centres, and many Tier-2 centres worldwide. The workflows have been identified, along with a baseline architecture for the data management infrastructure. This model is also being tested in Grid Service Challenges of increasing complexity, coordinated with the Worldwide LHC Computing Grid community

  3. Introduction to reversible computing

    Perumalla, Kalyan S

    2013-01-01

    Few books comprehensively cover the software and programming aspects of reversible computing. Filling this gap, Introduction to Reversible Computing offers an expanded view of the field that includes the traditional energy-motivated hardware viewpoint as well as the emerging application-motivated software approach. Collecting scattered knowledge into one coherent account, the book provides a compendium of both classical and recently developed results on reversible computing. It explores up-and-coming theories, techniques, and tools for the application of rever

  4. Tracking and computing

    Niederer, J.

    1983-01-01

    This note outlines several ways in which large scale simulation computing and programming support may be provided to the SSC design community. One aspect of the problem is getting supercomputer power without the high cost and long lead times of large scale institutional computing. Another aspect is the blending of modern programming practices with more conventional accelerator design programs in ways that do not also swamp designers with the details of complicated computer technology

  5. Computing at Belle II

    Kuhr, Thomas

    2012-01-01

    Belle II, a next-generation B-factory experiment, will search for new physics effects in a data sample about 50 times larger than the one collected by its predecessor, the Belle experiment. To match the advances in accelerator and detector technology, the computing system and the software have to be upgraded as well. The Belle II computing model is presented and an overview of the distributed computing system and the offline software framework is given.

  6. Computing Conference at Bologna

    Anon.

    1980-01-01

    From 9-12 September a Europhysics Conference on Computing in High Energy and Nuclear Physics, organized by the Computational Physics Group of the European Physical Society, was held in Bologna, attracting some 150 participants. Its purpose was contact and exchange of information between experimental physicists (from both fields of research) and computer experts (on whom the successful outcome of the research has become increasingly dependent)

  7. Review on Computational Electromagnetics

    P. Sumithra

    2017-03-01

    Full Text Available Computational electromagnetics (CEM) is applied to model the interaction of electromagnetic fields with objects like antennas, waveguides and aircraft, and with their environment, using Maxwell's equations. In this paper the strengths and weaknesses of various computational electromagnetics techniques are discussed. The performance of various techniques in terms of accuracy, memory and computational time for application-specific tasks, such as modeling RCS (radar cross section), space applications, thin wires and antenna arrays, is presented in this paper.

  8. CAD on personal computers

    Lee, Seong U; Cho, Cheol Ho; Ko, Il Du

    1990-02-01

    This book contains four studies of CAD on personal computers. The first is computer graphics in computer-aided design, by Seong U Lee. The second is a graphics primer and programming with Fortran, by Seong U Lee. The third is applications of AutoCAD, by Il Du Ko. The last is the application of CAD in building construction design, by Cheol Ho Cho.

  9. Computational movement analysis

    Laube, Patrick

    2014-01-01

    This SpringerBrief discusses the characteristics of spatiotemporal movement data, including uncertainty and scale. It investigates three core aspects of Computational Movement Analysis: Conceptual modeling of movement and movement spaces, spatiotemporal analysis methods aiming at a better understanding of movement processes (with a focus on data mining for movement patterns), and using decentralized spatial computing methods in movement analysis. The author presents Computational Movement Analysis as an interdisciplinary umbrella for analyzing movement processes with methods from a range of fi

  10. Computational neurogenetic modeling

    Benuskova, Lubica

    2010-01-01

    Computational Neurogenetic Modeling is a student text, introducing the scope and problems of a new scientific discipline - Computational Neurogenetic Modeling (CNGM). CNGM is concerned with the study and development of dynamic neuronal models for modeling brain functions with respect to genes and dynamic interactions between genes. These include neural network models and their integration with gene network models. This new area brings together knowledge from various scientific disciplines, such as computer and information science, neuroscience and cognitive science, genetics and molecular biol

  11. Research in computer forensics

    Wai, Hor Cheong

    2002-01-01

    Approved for public release; distribution is unlimited Computer Forensics involves the preservation, identification, extraction and documentation of computer evidence stored in the form of magnetically encoded information. With the proliferation of E-commerce initiatives and the increasing criminal activities on the web, this area of study is catching on in the IT industry and among the law enforcement agencies. The objective of the study is to explore the techniques of computer forensics ...

  12. Research in computer science

    Ortega, J. M.

    1986-01-01

    Various graduate research activities in the field of computer science are reported. Among the topics discussed are: (1) failure probabilities in multi-version software; (2) Gaussian Elimination on parallel computers; (3) three dimensional Poisson solvers on parallel/vector computers; (4) automated task decomposition for multiple robot arms; (5) multi-color incomplete cholesky conjugate gradient methods on the Cyber 205; and (6) parallel implementation of iterative methods for solving linear equations.

  13. Computer information systems framework

    Shahabuddin, S.

    1989-01-01

    Management information systems (MIS) is a commonly used term in the computer profession. New information technology has caused management to expect more from computers. The process of supplying information follows a well defined procedure. An MIS should be capable of providing usable information to the various areas and levels of an organization. MIS is different from data processing. MIS and the business hierarchy provide a good framework for the many organizations which are using computers. (A.B.)

  14. Human Computer Music Performance

    Dannenberg, Roger B.

    2012-01-01

    Human Computer Music Performance (HCMP) is the study of music performance by live human performers and real-time computer-based performers. One goal of HCMP is to create a highly autonomous artificial performer that can fill the role of a human, especially in a popular music setting. This will require advances in automated music listening and understanding, new representations for music, techniques for music synchronization, real-time human-computer communication, music generation, sound synt...

  15. Intelligent distributed computing

    Thampi, Sabu

    2015-01-01

    This book contains a selection of refereed and revised papers of the Intelligent Distributed Computing Track originally presented at the third International Symposium on Intelligent Informatics (ISI-2014), September 24-27, 2014, Delhi, India.  The papers selected for this Track cover several Distributed Computing and related topics including Peer-to-Peer Networks, Cloud Computing, Mobile Clouds, Wireless Sensor Networks, and their applications.

  16. Genomics With Cloud Computing

    Sukhamrit Kaur; Sandeep Kaur

    2015-01-01

    Abstract Genomics is the study of the genome, which produces large amounts of data requiring large storage and computational power. These issues are solved by cloud computing, which provides various cloud platforms for genomics. These platforms provide many services to users, such as easy access to data, easy sharing and transfer, storage of hundreds of terabytes, and more computational power. Some cloud platforms are Google Genomics, DNAnexus and Globus Genomics. Various features of cloud computin...

  17. Parallel computing works!

    Fox, Geoffrey C; Messina, Guiseppe C

    2014-01-01

    A clear illustration of how parallel computers can be successfully applied to large-scale scientific computations. This book demonstrates how a variety of applications in physics, biology, mathematics and other sciences were implemented on real parallel computers to produce new scientific results. It investigates issues of fine-grained parallelism relevant for future supercomputers with particular emphasis on hypercube architecture. The authors describe how they used an experimental approach to configure different massively parallel machines, design and implement basic system software, and develop

  18. Computer science I essentials

    Raus, Randall

    2012-01-01

    REA's Essentials provide quick and easy access to critical information in a variety of different fields, ranging from the most basic to the most advanced. As its name implies, these concise, comprehensive study guides summarize the essentials of the field covered. Essentials are helpful when preparing for exams, doing homework and will remain a lasting reference source for students, teachers, and professionals. Computer Science I includes fundamental computer concepts, number representations, Boolean algebra, switching circuits, and computer architecture.

  19. Discrete computational structures

    Korfhage, Robert R

    1974-01-01

    Discrete Computational Structures describes discrete mathematical concepts that are important to computing, covering necessary mathematical fundamentals, computer representation of sets, graph theory, storage minimization, and bandwidth. The book also explains the conceptual framework (Gorn trees, searching, subroutines) and directed graphs (flowcharts, critical paths, information networks). The text discusses algebra, concentrating on semigroups, groups, lattices, and propositional calculus, including a new tabular method of Boolean function minimization. The text emphasize
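As a small illustration of the kind of tabular Boolean function minimization the book describes, here is a minimal Quine-McCluskey-style prime-implicant computation (an illustrative sketch only; the book's own tabular method may differ in detail, and this brute-force pairing skips the usual grouping by bit count):

```python
# Hedged sketch: tabular Boolean minimization via prime implicants.
# Implicants are strings over '0', '1', '-' ('-' = "don't care").

def combine(a, b):
    """Merge two implicants that differ in exactly one non-'-' position."""
    diff = [i for i in range(len(a)) if a[i] != b[i]]
    if len(diff) == 1 and '-' not in (a[diff[0]], b[diff[0]]):
        i = diff[0]
        return a[:i] + '-' + a[i + 1:]
    return None

def prime_implicants(minterms, nbits):
    """Repeatedly merge implicants; anything unmergeable is prime."""
    terms = {format(m, '0{}b'.format(nbits)) for m in minterms}
    primes = set()
    while terms:
        merged, used = set(), set()
        for a in terms:
            for b in terms:
                c = combine(a, b)
                if c:
                    merged.add(c)
                    used.update((a, b))
        primes |= terms - used   # unmerged implicants are prime
        terms = merged
    return primes

# f(x,y,z) with minterms 0,1,2,3,7 minimizes to x' + yz:
print(sorted(prime_implicants([0, 1, 2, 3, 7], 3)))  # → ['-11', '0--']
```

Here '0--' reads as x' (x=0, y and z free) and '-11' as yz, so the function reduces to x' + yz; a full minimizer would still need a prime-implicant chart to pick a minimal cover.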

  20. Cloud Computing: An Overview

    Qian, Ling; Luo, Zhiguo; Du, Yujian; Guo, Leitao

    In order to support the maximum number of users and elastic services with the minimum resources, Internet service providers invented cloud computing. Within a few years, emerging cloud computing has become the hottest technology. From the publication of core papers by Google since 2003, to the commercialization of Amazon EC2 in 2006, and to the service offering of AT&T Synaptic Hosting, cloud computing has evolved from internal IT systems to a public service, from a cost-saving tool to a revenue generator, and from ISP to telecom. This paper introduces the concept, history, pros and cons of cloud computing, as well as the value chain and standardization efforts.

  1. Computational mathematics in China

    Shi, Zhong-Ci

    1994-01-01

    This volume describes the most significant contributions made by Chinese mathematicians over the past decades in various areas of computational mathematics. Some of the results are quite important and complement Western developments in the field. The contributors to the volume range from noted senior mathematicians to promising young researchers. The topics include finite element methods, computational fluid mechanics, numerical solutions of differential equations, computational methods in dynamical systems, numerical algebra, approximation, and optimization. Containing a number of survey articles, the book provides an excellent way for Western readers to gain an understanding of the status and trends of computational mathematics in China.

  2. Multidisciplinary Computational Research

    Visbal, Miguel R

    2006-01-01

    The purpose of this work is to develop advanced multidisciplinary numerical simulation capabilities for aerospace vehicles with emphasis on highly accurate, massively parallel computational methods...

  3. Frontiers in Computer Education

    Zhu, Egui; 2011 International Conference on Frontiers in Computer Education (ICFCE 2011)

    2012-01-01

    This book is the proceedings of the 2011 International Conference on Frontiers in Computer Education (ICFCE 2011), held in Sanya, China, December 1-2, 2011. The contributions can be useful for researchers, software engineers, and programmers interested in promoting the development of computing and education. Topics covered are computing and communication technology, network management, wireless networks, telecommunication, signal and image processing, machine learning, educational management, educational psychology, educational systems, education engineering, education technology and training. The emphasis is on methods and calculi for computer science and education technology development, verification and verification tool support, experiences from development, and the associated theoretical problems.

  4. Computers appreciated by marketers

    Mantho, M.

    1993-01-01

    The computer has been worth its weight in gold to the fuel-oil man. In fact, with falling prices on both software and machines, its worth is greater than gold. Every so often, about every three years, we ask some questions about the utilization of computers. This time, we looked into the future to gauge the acceptance of other marvels such as the cellular phone and the handheld computer. At the moment, there isn't much penetration. Contact by two-way radio, as well as computing meters on trucks, still reigns supreme

  5. Genomics With Cloud Computing

    Sukhamrit Kaur

    2015-04-01

    Full Text Available Abstract: Genomics is the study of the genome, which produces large amounts of data and therefore requires large storage and computation power. These issues are addressed by cloud computing, which provides various cloud platforms for genomics. These platforms offer many services to the user, such as easy access to data, easy sharing and transfer, storage in hundreds of terabytes, and more computational power. Some cloud platforms are Google Genomics, DNAnexus and Globus Genomics. Features that cloud computing brings to genomics include easy access to and sharing of data, security of data, and lower cost for resources, but there are still some demerits, such as the large time needed to transfer data and limited network bandwidth.

  6. Computer Games and Art

    Anton Sukhov

    2015-10-01

    Full Text Available This article is devoted to the search for relevant sources (primary and secondary) and characteristics of computer games that allow them to be included in the field of art (such as the creation of artistic games, computer graphics, active interaction with other forms of art, signs of a spiritual aesthetic act, the temporality proper to computer games, “aesthetic illusion”, and interactivity). In general, modern computer games can be attributed both to commercial art and popular culture (blockbuster games) and to elite forms of contemporary media art (author’s games, visionary games).

  7. Computers in engineering. 1988

    Tipnis, V.A.; Patton, E.M.

    1988-01-01

    These proceedings discuss the following subjects: Knowledge base systems; Computers in designing; uses of artificial intelligence; engineering optimization and expert systems of accelerators; and parallel processing in designing

  8. Numbers and computers

    Kneusel, Ronald T

    2015-01-01

    This is a book about numbers and how those numbers are represented in and operated on by computers. It is crucial that developers understand this area because the numerical operations allowed by computers, and the limitations of those operations, especially in the area of floating point math, affect virtually everything people try to do with computers. This book aims to fill this gap by exploring, in sufficient but not overwhelming detail, just what it is that computers do with numbers. Divided into two parts, the first deals with standard representations of integers and floating point numb
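The floating-point limitations the abstract alludes to are easy to demonstrate; the following is a minimal illustration in Python, not an example taken from the book.

```python
# Binary floating point cannot represent 0.1 exactly, so naive equality
# tests on decimal values can fail; compare with a tolerance instead.

a = 0.1 + 0.2
print(a == 0.3)               # False: both sides carry tiny representation error
print(abs(a - 0.3) < 1e-9)    # True: tolerance-based comparison

# Above 2**53, consecutive integers are no longer distinguishable in a
# 64-bit float, so converting loses the +1.
print(float(2**53) == float(2**53 + 1))  # True
```

This is exactly the kind of behavior that makes understanding number representation essential for developers.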

  9. Octopus: LLL's computing utility

    Anon.

    1978-01-01

    The Laboratory's Octopus network constitutes one of the greatest concentrations of computing power in the world. This power derives from the network's organization as well as from the size and capability of its computers, storage media, input/output devices, and communication channels. Being in a network enables these facilities to work together to form a unified computing utility that is accessible on demand directly from the users' offices. This computing utility has made a major contribution to the pace of research and development at the Laboratory; an adequate rate of progress in research could not be achieved without it. 4 figures

  10. Theory and Computation

    Federal Laboratory Consortium — Flexible computational infrastructure, software tools and theoretical consultation are provided to support modeling and understanding of the structure and properties...

  11. Educational Computer Utilization and Computer Communications.

    Singh, Jai P.; Morgan, Robert P.

    As part of an analysis of educational needs and telecommunications requirements for future educational satellite systems, three studies were carried out. 1) The role of the computer in education was examined and both current status and future requirements were analyzed. Trade-offs between remote time sharing and remote batch process were explored…

  12. Computer Aided Mathematics

    Sinclair, Robert

    1998-01-01

    Course notes of a PhD course held in 1998. The central idea is to introduce students to computational mathematics using object oriented programming in C++.

  13. Computer Network Operations Methodology

    2004-03-01

    means of their computer information systems. Disrupt - This type of attack focuses on disrupting as “attackers might surreptitiously reprogram enemy...by reprogramming the computers that control distribution within the power grid. A disruption attack introduces disorder and inhibits the effective...between commanders. The use of methodologies is widespread and done subconsciously to assist individuals in decision making. The processes that

  14. Classroom Computer Network.

    Lent, John

    1984-01-01

    This article describes a computer network system that connects several microcomputers to a single disk drive and one copy of software. Many schools are switching to networks as a cheaper and more efficient means of computer instruction. Teachers may be faced with copywriting problems when reproducing programs. (DF)

  15. Hypercard Another Computer Tool.

    Geske, Joel

    1991-01-01

    Describes "Hypercard," a computer application package usable in all three modes of instructional computing: tutor, tool, and tutee. Suggests using Hypercard in scholastic journalism programs to teach such topics as news, headlines, design, photography, and advertising. Argues that the ability to access, organize, manipulate, and comprehend…

  16. Can Computers See?

    Can Computers See? Can Computers Understand Visual Data? Neelima Shrikhande. General Article. Resonance – Journal of Science Education, Volume 4, Issue 6, June 1999, pp. 45-56.

  17. Computational genomics of hyperthermophiles

    Werken, van de H.J.G.

    2008-01-01

    With the ever increasing number of completely sequenced prokaryotic genomes and the subsequent use of functional genomics tools, e.g. DNA microarray and proteomics, computational data analysis and the integration of microbial and molecular data is inevitable. This thesis describes the computational

  18. Computer Technology for Industry

    1979-01-01

    In this age of the computer, more and more business firms are automating their operations for increased efficiency in a great variety of jobs, from simple accounting to managing inventories, from precise machining to analyzing complex structures. In the interest of national productivity, NASA is providing assistance both to longtime computer users and newcomers to automated operations. Through a special technology utilization service, NASA saves industry time and money by making available already developed computer programs which have secondary utility. A computer program is essentially a set of instructions which tells the computer how to produce desired information or effects by drawing upon its stored input. Developing a new program from scratch can be costly and time-consuming. Very often, however, a program developed for one purpose can readily be adapted to a totally different application. To help industry take advantage of existing computer technology, NASA operates the Computer Software Management and Information Center (COSMIC) (registered trademark), located at the University of Georgia. COSMIC maintains a large library of computer programs developed for NASA, the Department of Defense, the Department of Energy and other technology-generating agencies of the government. The Center gets a continual flow of software packages, screens them for adaptability to private sector usage, stores them and informs potential customers of their availability.

  19. Computers in construction

    Howard, Rob

    The evolution of technology, particularly computing in building, learning from the past in order to anticipate what may happen in the future.

  20. Computer Use Exposed

    J.M. Richter (Janneke)

    2009-01-01

    Ever since the introduction of the personal computer, our daily lives are influenced more and more by computers. A day in the life of a PhD-student illustrates this: “At the breakfast table, I check my e-mail to see if the meeting later that day has been confirmed, and I check the time

  1. Can Computers Create?

    Hausman, Carl R.

    1985-01-01

    To be creative, an act must have as its outcome something new in the way it is intelligible and valuable. Computers have restricted contexts of information and have no ability to weigh bits of information. Computer optimists presuppose either determinism or indeterminism, either of which abandons creativity. (MT)

  2. Personalized Empathic Computing (PEC)

    van Beusekom, W.; van den Broek, Egon; van der Heijden, M.; Janssen, J.H.; Spaak, E.

    2006-01-01

    Until a decade ago, computers were only used by experts, for professional purposes solely. Nowadays, the personal computer (PC) is standard equipment in most western housekeepings and is used to gather information, play games, communicate, etc. In parallel, users' expectations increase and,

  3. Computers and Creativity.

    Ten Dyke, Richard P.

    1982-01-01

    A traditional question is whether or not computers shall ever think like humans. This question is redirected to a discussion of whether computers shall ever be truly creative. Creativity is defined and a program is described that is designed to complete creatively a series problem in mathematics. (MP)

  4. Petascale Computational Systems

    Bell, Gordon; Gray, Jim; Szalay, Alex

    2007-01-01

    Computational science is changing to be data intensive. Super-Computers must be balanced systems; not just CPU farms but also petascale IO and networking arrays. Anyone building CyberInfrastructure should allocate resources to support a balanced Tier-1 through Tier-3 design.

  5. Computer Software Reviews.

    Hawaii State Dept. of Education, Honolulu. Office of Instructional Services.

    Intended to provide guidance in the selection of the best computer software available to support instruction and to make optimal use of schools' financial resources, this publication provides a listing of computer software programs that have been evaluated according to their currency, relevance, and value to Hawaii's educational programs. The…

  6. Emission computed tomography

    Budinger, T.F.; Gullberg, G.T.; Huesman, R.H.

    1979-01-01

    This chapter is devoted to the methods of computer assisted tomography for determination of the three-dimensional distribution of gamma-emitting radionuclides in the human body. The major applications of emission computed tomography are in biological research and medical diagnostic procedures. The objectives of these procedures are to make quantitative measurements of in vivo biochemical and hemodynamic functions

  7. Computers in writing instruction

    Schwartz, Helen J.; van der Geest, Thea; Smit-Kreuzen, Marlies

    1992-01-01

    For computers to be useful in writing instruction, innovations should be valuable for students and feasible for teachers to implement. Research findings yield contradictory results in measuring the effects of different uses of computers in writing, in part because of the methodological complexity of

  8. Nature, computation and complexity

    Binder, P-M; Ellis, G F R

    2016-01-01

    The issue of whether the unfolding of events in the world can be considered a computation is explored in this paper. We come to different conclusions for inert and for living systems (‘no’ and ‘qualified yes’, respectively). We suggest that physical computation as we know it exists only as a tool of complex biological systems: us. (paper)

  9. Computational Sociolinguistics: A Survey

    Nguyen, Dong-Phuong; Doğruöz, A. Seza; Rosé, Carolyn P.; de Jong, Franciska M.G.

    Language is a social phenomenon and variation is inherent to its social nature. Recently, there has been a surge of interest within the computational linguistics (CL) community in the social dimension of language. In this article we present a survey of the emerging field of “computational

  10. Fault tolerant computing systems

    Randell, B.

    1981-01-01

    Fault tolerance involves the provision of strategies for error detection, damage assessment, fault treatment and error recovery. A survey is given of the different sorts of strategies used in highly reliable computing systems, together with an outline of recent research on the problems of providing fault tolerance in parallel and distributed computing systems. (orig.)
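Randell's name is closely associated with the recovery-block scheme for software fault tolerance, which combines the strategies listed in the abstract. Below is a hedged Python sketch of the idea; the routine names and the acceptance test are illustrative, not drawn from the survey.

```python
# Recovery-block sketch: run a primary routine, validate its result with
# an acceptance test, and fall back to an alternate routine on failure.

def recovery_block(alternates, acceptance_test, *args):
    """Try each alternate in turn until one passes the acceptance test."""
    for routine in alternates:
        try:
            result = routine(*args)
        except Exception:
            continue  # error detection: an exception counts as a detected fault
        if acceptance_test(result):
            return result  # error recovery succeeded with this alternate
    raise RuntimeError("all alternates failed the acceptance test")

# The primary is deliberately buggy; the simpler alternate is correct.
def fast_but_buggy_sqrt(x):
    return x / 2            # wrong for most inputs

def slow_reliable_sqrt(x):
    return x ** 0.5

result = recovery_block(
    [fast_but_buggy_sqrt, slow_reliable_sqrt],
    lambda r: abs(r * r - 9.0) < 1e-9,  # acceptance test for sqrt(9)
    9.0,
)
print(result)  # 3.0: the reliable alternate is accepted
```

The acceptance test plays the role of error detection, and falling through to the next alternate is the recovery strategy.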

  11. Theory and computational science

    Durham, P.

    1985-01-01

    The theoretical and computational science carried out at the Daresbury Laboratory in 1984/5 is detailed in the Appendix to the Daresbury Annual Report. The Theory, Computational Science and Applications Groups, provide support work for the experimental projects conducted at Daresbury. Use of the FPS-164 processor is also described. (U.K.)

  12. Selecting Personal Computers.

    Djang, Philipp A.

    1993-01-01

    Describes a Multiple Criteria Decision Analysis Approach for the selection of personal computers that combines the capabilities of Analytic Hierarchy Process and Integer Goal Programing. An example of how decision makers can use this approach to determine what kind of personal computers and how many of each type to purchase is given. (nine…

  13. Physicist or computer specialist?

    Clifton, J S [University College Hospital, London (United Kingdom)

    1966-06-15

    Since to most clinicians physical and computer science are two of the great mysteries of the world, the physicist in a hospital is expected by clinicians to be fully conversant with, and competent to make profound pronouncements on, all methods of computing, specific computing problems, and the suitability of computing machinery ranging from desk calculators to Atlas. This is not surprising, since the proportion of the syllabus devoted to physics and mathematics in an M.B. degree is indeed meagre, and the word 'computer' has been surrounded with an aura of mysticism which suggests that it is some fantastic piece of electronic gadgetry comprehensible only to a veritable genius. The clinician consequently turns to the only scientific colleague with whom he has direct contact - the medical physicist - and expects him to be an authority. The physicist is thus thrust, however unwillingly, into the forefront of the advance of computer assistance to scientific medicine. It is therefore essential for him to acquire sufficient knowledge of computing science to enable him to provide satisfactory answers to the clinicians' queries, to proffer more detailed advice on programming, and to convince clinicians that the computer is really a 'simpleton' which can only add and subtract, and even that only under instruction.

  14. Theory of computational complexity

    Du, Ding-Zhu

    2011-01-01

    DING-ZHU DU, PhD, is a professor in the Department of Computer Science at the University of Minnesota. KER-I KO, PhD, is a professor in the Department of Computer Science at the State University of New York at Stony Brook.

  15. Computer vision for sports

    Thomas, Graham; Gade, Rikke; Moeslund, Thomas B.

    2017-01-01

    fixed to players or equipment is generally not possible. This provides a rich set of opportunities for the application of computer vision techniques to help the competitors, coaches and audience. This paper discusses a selection of current commercial applications that use computer vision for sports...

  16. Basic principles of computers

    Royal, H.D.; Parker, J.A.; Holmen, B.L.

    1988-01-01

    This chapter presents preliminary concepts of computer operations. It describes the hardware used in a nuclear medicine computer system. It discusses the software necessary for acquisition and analysis of nuclear medicine studies. The chapter outlines the integrated package of hardware and software that is necessary to perform specific functions in nuclear medicine

  17. Teaching Using Computer Games

    Miller, Lee Dee; Shell, Duane; Khandaker, Nobel; Soh, Leen-Kiat

    2011-01-01

    Computer games have long been used for teaching. Current reviews lack categorization and analysis using learning models which would help instructors assess the usefulness of computer games. We divide the use of games into two classes: game playing and game development. We discuss the Input-Process-Outcome (IPO) model for the learning process when…

  18. Text understanding for computers

    Kenter, T.M.

    2017-01-01

    A long-standing challenge for computers communicating with humans is to pass the Turing test, i.e., to communicate in such a way that it is impossible for humans to determine whether they are talking to a computer or another human being. The field of natural language understanding — which studies

  19. Advances in Computer Entertainment.

    Nijholt, Antinus; Romão, T.; Reidsma, Dennis; Unknown, [Unknown

    2012-01-01

    These are the proceedings of the 9th International Conference on Advances in Computer Entertainment ACE 2012). ACE has become the leading scientific forum for dissemination of cutting-edge research results in the area of entertainment computing. Interactive entertainment is one of the most vibrant

  20. Computers and Classroom Culture.

    Schofield, Janet Ward

    This book explores the meaning of computer technology in schools. The book is based on data gathered from a two-year observation of more than 30 different classrooms in an urban high school: geometry classes in which students used artificially intelligent tutors; business classes in which students learned word processing; and computer science…

  1. Computer Literacy Education

    1989-01-01

    Cognitive Aspect ," AEDS Journal, 18, 3 (Spring 1985) 150. "°Geoffrey Akst, "Computer Literacy: An Interview with Dr. Michael Hoban." Journal of Develop- m...1984. Cheng, Tina T.; Plake, Barbara; and Stevens, Dorothy Jo. "A Validation Study of the Computer Literacy Examination: Cognitive Aspect ." AEDS

  2. Ubiquitous human computing.

    Zittrain, Jonathan

    2008-10-28

    Ubiquitous computing means network connectivity everywhere, linking devices and systems as small as a drawing pin and as large as a worldwide product distribution chain. What could happen when people are so readily networked? This paper explores issues arising from two possible emerging models of ubiquitous human computing: fungible networked brainpower and collective personal vital sign monitoring.

  3. Learning with Ubiquitous Computing

    Rosenheck, Louisa

    2008-01-01

    If ubiquitous computing becomes a reality and is widely adopted, it will inevitably have an impact on education. This article reviews the background of ubiquitous computing and current research projects done involving educational "ubicomp." Finally it explores how ubicomp may and may not change education in both formal and informal settings and…

  4. Quantum Analog Computing

    Zak, M.

    1998-01-01

    Quantum analog computing is based upon the similarity between the mathematical formalism of quantum mechanics and the phenomena to be computed. It exploits a dynamical convergence of several competing phenomena to an attractor which can represent an extremum of a function, an image, a solution to a system of ODEs, or a stochastic process.
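The convergence-to-attractor idea can be illustrated with a toy dissipative system; this is an illustrative sketch of the general principle, not the paper's formalism, and the function and step sizes are invented for the example.

```python
# A dynamical system dx/dt = -f'(x) relaxes to an attractor at the
# minimum of f(x) = (x - 3)**2; here simulated by Euler integration.

def relax(x, steps=200, dt=0.1):
    for _ in range(steps):
        x -= dt * 2 * (x - 3)   # -f'(x) = -2(x - 3)
    return x

x_star = relax(10.0)
print(round(x_star, 6))  # the attractor sits at the extremum x = 3
```

The attractor of the flow is exactly the extremum being computed, which is the sense in which "convergence of the dynamics" performs the computation.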

  5. Computing in Research.

    Ashenhurst, Robert L.

    The introduction and diffusion of automatic computing facilities during the 1960's is reviewed; it is described as a time when research strategies in a broad variety of disciplines changed to take advantage of the newfound power provided by the computer. Several types of typical problems encountered by researchers who adopted the new technologies,…

  6. Computational Cognitive Color Perception

    Ciftcioglu, O.; Bittermann, M.S.

    2016-01-01

    Comprehension of aesthetical color characteristics based on a computational model of visual perception and color cognition are presented. The computational comprehension is manifested by the machine’s capability of instantly assigning appropriate colors to the objects perceived. They form a scene

  7. Thinking about computational thinking

    Lu, J.J.; Fletcher, G.H.L.; Fitzgerald, S.; Guzdial, M.; Lewandowski, G.; Wolfman, S.A.

    2009-01-01

    Jeannette Wing's call for teaching Computational Thinking (CT) as a formative skill on par with reading, writing, and arithmetic places computer science in the category of basic knowledge. Just as proficiency in basic language arts helps us to effectively communicate and in basic math helps us to

  8. Computer Operating System Maintenance.

    1982-06-01

    The Computer Management Information Facility (CMIF) system was developed by Rapp Systems to fulfill the need at the CRF to record and report on...computer center resource usage and utilization. The foundation of the CMIF system is a System 2000 data base (CRFMGMT) which stores and permits access

  9. Computational Sociolinguistics: A Survey.

    de Jong, F.M.G.; Nguyen, Dong

    2016-01-01

    Language is a social phenomenon and variation is inherent to its social nature. Recently, there has been a surge of interest within the computational linguistics (CL) community in the social dimension of language. In this article we present a survey of the emerging field of “computational

  10. Simulation of quantum computers

    De Raedt, H; Michielsen, K; Hams, AH; Miyashita, S; Saito, K; Landau, DP; Lewis, SP; Schuttler, HB

    2001-01-01

    We describe a simulation approach to study the functioning of Quantum Computer hardware. The latter is modeled by a collection of interacting spin-1/2 objects. The time evolution of this spin system maps one-to-one to a quantum program carried out by the Quantum Computer. Our simulation software

  11. Simulation of quantum computers

    Raedt, H. De; Michielsen, K.; Hams, A.H.; Miyashita, S.; Saito, K.

    2000-01-01

    We describe a simulation approach to study the functioning of Quantum Computer hardware. The latter is modeled by a collection of interacting spin-1/2 objects. The time evolution of this spin system maps one-to-one to a quantum program carried out by the Quantum Computer. Our simulation software
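The one-to-one mapping between the time evolution of a spin-1/2 system and a quantum program can be sketched for a single qubit; this is an illustrative state-vector toy in plain Python, not the authors' simulation software.

```python
# One qubit as a 2-component complex state vector; a gate is a 2x2 unitary.
# The standard Hadamard gate is applied twice: H is its own inverse.

import math

H = [[1 / math.sqrt(2),  1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]

def apply(gate, state):
    """Multiply a 2x2 gate matrix into a 2-component state vector."""
    return [sum(gate[i][j] * state[j] for j in range(2)) for i in range(2)]

state = [1.0, 0.0]          # |0>
state = apply(H, state)     # equal superposition
probs = [abs(a) ** 2 for a in state]
print(probs)                # ~[0.5, 0.5]

state = apply(H, state)     # back to |0> (up to floating-point error)
```

A full simulator of n spin-1/2 objects works the same way, but on a state vector of length 2**n, which is why classical simulation of quantum hardware is so demanding.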

  12. Exercises in Computational Chemistry

    Spanget-Larsen, Jens

    2016-01-01

    A selection of HyperChem© PC-exercises in computational chemistry. Answers to most questions are appended (Roskilde University 2014-16).

  13. The Computational Materials Repository

    Landis, David D.; Hummelshøj, Jens S.; Nestorov, Svetlozar

    2012-01-01

    The possibilities for designing new materials based on quantum physics calculations are rapidly growing, but these design efforts lead to a significant increase in the amount of computational data created. The Computational Materials Repository (CMR) addresses this data challenge and provides...

  14. Programming in biomolecular computation

    Hartmann, Lars Røeboe; Jones, Neil; Simonsen, Jakob Grue

    2010-01-01

    ...by programs reminiscent of low-level computer machine code; and at the same time biologically plausible: its functioning is defined by a single and relatively small set of chemical-like reaction rules. Further properties: the model is stored-program: programs are the same as data, so programs are not only executable, but are also compilable and interpretable. It is universal: all computable functions can be computed (in natural ways and without arcane encodings of data and algorithm); it is also uniform: new “hardware” is not needed to solve new problems; and (last but not least) it is Turing complete in a strong sense: a universal algorithm exists, that is able to execute any program, and is not asymptotically inefficient. A prototype model has been implemented (for now in silico on a conventional computer). This work opens new perspectives on just how computation may be specified at the biological level.

  15. Place-Specific Computing

    Messeter, Jörn; Johansson, Michael

    An increased interest in the notion of place has evolved in interaction design. Proliferation of wireless infrastructure, developments in digital media, and a ‘spatial turn’ in computing provide the base for place-specific computing as a suggested new genre of interaction design. In the REcult project place-specific computing is explored through design-oriented research. This article reports six pilot studies where design students have designed concepts for place-specific computing in Berlin (Germany), Cape Town (South Africa), Rome (Italy) and Malmö (Sweden). Background and arguments for place-specific computing as a genre of interaction design are described. A total of 36 design concepts designed for 16 designated zones in the four cities are presented. An analysis of the design concepts is presented, indicating potentials, possibilities and problems as directions for future...

  16. Neuroscience, brains, and computers

    Giorno Maria Innocenti

    2013-07-01

    Full Text Available This paper addresses the role of the neurosciences in establishing what the brain is and how states of the brain relate to states of the mind. The brain is viewed as a computational device performing operations on symbols. However, the brain is a special-purpose computational device designed by evolution and development for survival and reproduction, in close interaction with the environment. The hardware of the brain (its structure) is very different from that of man-made computers. The computational style of the brain is also very different from traditional computers: the computational algorithms, instead of being sets of external instructions, are embedded in brain structure. Concerning the relationships between brain and mind a number of questions lie ahead. One of them is why and how only the human brain grasped the notion of God, probably only at the evolutionary stage attained by Homo sapiens.

  17. Parallelism in matrix computations

    Gallopoulos, Efstratios; Sameh, Ahmed H

    2016-01-01

    This book is primarily intended as a research monograph that could also be used in graduate courses for the design of parallel algorithms in matrix computations. It assumes general but not extensive knowledge of numerical linear algebra, parallel architectures, and parallel programming paradigms. The book consists of four parts: (I) Basics; (II) Dense and Special Matrix Computations; (III) Sparse Matrix Computations; and (IV) Matrix functions and characteristics. Part I deals with parallel programming paradigms and fundamental kernels, including reordering schemes for sparse matrices. Part II is devoted to dense matrix computations such as parallel algorithms for solving linear systems, linear least squares, the symmetric algebraic eigenvalue problem, and the singular-value decomposition. It also deals with the development of parallel algorithms for special linear systems such as banded, Vandermonde, Toeplitz, and block Toeplitz systems. Part III addresses sparse matrix computations: (a) the development of pa...

  18. Computation: A New Open Access Journal of Computational Chemistry, Computational Biology and Computational Engineering

    Karlheinz Schwarz

    2013-09-01

    Full Text Available Computation (ISSN 2079-3197; http://www.mdpi.com/journal/computation) is an international scientific open access journal focusing on fundamental work in the field of computational science and engineering. Computational science has become essential in many research areas by contributing to solving complex problems, from fundamental science all the way to engineering. The very broad range of application domains suggests structuring this journal into three sections, which are briefly characterized below. In each section, further focus will be provided by occasionally organizing special issues on topics of high interest, collecting papers on fundamental work in the field. More applied papers should be submitted to their corresponding specialist journals. To help us achieve our goal with this journal, we have an excellent editorial board to advise us on the exciting current and future trends in computation, from methodology to application. We very much look forward to hearing all about the research going on across the world. [...

  19. Computational models of neuromodulation.

    Fellous, J M; Linster, C

    1998-05-15

    Computational modeling of neural substrates provides an excellent theoretical framework for the understanding of the computational roles of neuromodulation. In this review, we illustrate, with a large number of modeling studies, the specific computations performed by neuromodulation in the context of various neural models of invertebrate and vertebrate preparations. We base our characterization of neuromodulations on their computational and functional roles rather than on anatomical or chemical criteria. We review the main framework in which neuromodulation has been studied theoretically (central pattern generation and oscillations, sensory processing, memory and information integration). Finally, we present a detailed mathematical overview of how neuromodulation has been implemented at the single cell and network levels in modeling studies. Overall, neuromodulation is found to increase and control computational complexity.

  20. Computational Ocean Acoustics

    Jensen, Finn B; Porter, Michael B; Schmidt, Henrik

    2011-01-01

    Since the mid-1970s, the computer has played an increasingly pivotal role in the field of ocean acoustics. Faster and less expensive than actual ocean experiments, and capable of accommodating the full complexity of the acoustic problem, numerical models are now standard research tools in ocean laboratories. The progress made in computational ocean acoustics over the last thirty years is summed up in this authoritative and innovatively illustrated new text. Written by some of the field's pioneers, all Fellows of the Acoustical Society of America, Computational Ocean Acoustics presents the latest numerical techniques for solving the wave equation in heterogeneous fluid–solid media. The authors discuss various computational schemes in detail, emphasizing the importance of theoretical foundations that lead directly to numerical implementations for real ocean environments. To further clarify the presentation, the fundamental propagation features of the techniques are illustrated in color. Computational Ocean A...