WorldWideScience

Sample records for consequence computer based

  1. Consequence Based Design. An approach for integrating computational collaborative models (Integrated Dynamic Models) in the building design phase

    DEFF Research Database (Denmark)

    Negendahl, Kristoffer

    … relies on various advancements in the area of integrated dynamic models. It also relies on the application and test of the approach in practice to evaluate Consequence based design and the use of integrated dynamic models. As a result, the Consequence based design approach has been applied in five … and define new ways to implement integrated dynamic models for the following project. In parallel, seven different developments of new methods, tools and algorithms have been performed to support the application of the approach. The developments concern: Decision diagrams – to clarify goals and the ability … affect the design process and collaboration between building designers and simulationists. Within the limits of applying the approach of Consequence based design to five case studies, followed by documentation based on interviews, surveys and project-related documentation derived from internal reports …

  2. Computational Modeling of Culture's Consequences

    NARCIS (Netherlands)

    Hofstede, G.J.; Jonker, C.M.; Verwaart, T.

    2010-01-01

    This paper presents an approach to formalize the influence of culture on the decision functions of agents in social simulations. The key components are (a) a definition of the domain of study in the form of a decision model, (b) knowledge acquisition based on a dimensional theory of culture,

  3. Genotoxic modification of nucleic acid bases and its biological consequences: review and prospects of experimental and computational investigations

    Science.gov (United States)

    Poltev, V. I.; Bruskov, V. I.; Shuliupina, N. V.; Rein, R.; Shibata, M.; Ornstein, R.; Miller, J.

    1993-01-01

    The review is presented of experimental and computational data on the influence of genotoxic modification of bases (deamination, alkylation, oxidation) on the structure and biological functioning of nucleic acids. Pathways are discussed for the influence of modification on coding properties of bases, on possible errors of nucleic acid biosynthesis, and on configurations of nucleotide mispairs. The atomic structure of nucleic acid fragments with modified bases and the role of base damages in mutagenesis and carcinogenesis are considered.

  4. A computational model of selection by consequences.

    OpenAIRE

    McDowell, J J

    2004-01-01

    Darwinian selection by consequences was instantiated in a computational model that consisted of a repertoire of behaviors undergoing selection, reproduction, and mutation over many generations. The model in effect created a digital organism that emitted behavior continuously. The behavior of this digital organism was studied in three series of computational experiments that arranged reinforcement according to random-interval (RI) schedules. The quantitative features of the model were varied …

  5. A Computational Model of Selection by Consequences

    Science.gov (United States)

    McDowell, J. J.

    2004-01-01

    Darwinian selection by consequences was instantiated in a computational model that consisted of a repertoire of behaviors undergoing selection, reproduction, and mutation over many generations. The model in effect created a digital organism that emitted behavior continuously. The behavior of this digital organism was studied in three series of…
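
    The two records above summarize the same model; as a loose illustration of its ingredients (a population of behaviors under selection, reproduction, and mutation, reinforced on a random-interval schedule), here is a minimal Python sketch. All parameter values and the selection rule are invented for demonstration and do not reproduce McDowell's actual parameterization.

    import random

    POP_SIZE, N_BITS, MEAN_RI = 100, 10, 30           # assumed toy parameters
    TARGET = range(0, 41)                             # reinforced response class

    population = [random.randrange(2 ** N_BITS) for _ in range(POP_SIZE)]
    next_available = random.expovariate(1 / MEAN_RI)  # RI schedule: reinforcement
    reinforcers = 0                                   # arms after an exponential wait

    for t in range(20000):
        emitted = random.choice(population)           # the organism emits a behavior
        if emitted in TARGET and t >= next_available:
            reinforcers += 1
            next_available = t + random.expovariate(1 / MEAN_RI)
            # selection: parents are drawn from behaviors near the reinforced one
            parents = sorted(population, key=lambda b: abs(b - emitted))[:20]
        else:
            parents = population                      # no selection pressure
        child = random.choice(parents) ^ (1 << random.randrange(N_BITS))  # mutation
        population[random.randrange(POP_SIZE)] = child

    print("reinforcers delivered:", reinforcers)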

  6. Positron Emission Tomography/Computed Tomography Imaging of Residual Skull Base Chordoma Before Radiotherapy Using Fluoromisonidazole and Fluorodeoxyglucose: Potential Consequences for Dose Painting

    Energy Technology Data Exchange (ETDEWEB)

    Mammar, Hamid, E-mail: hamid.mammar@unice.fr [Radiation Oncology Department, Antoine Lacassagne Center, Nice (France); CNRS-UMR 6543, Institute of Developmental Biology and Cancer, University of Nice Sophia Antipolis, Nice (France)]; Kerrou, Khaldoun; Nataf, Valerie [Department of Nuclear Medicine and Radiopharmacy, Tenon Hospital, and University Pierre et Marie Curie, Paris (France)]; Pontvert, Dominique [Proton Therapy Center of Orsay, Curie Institute, Paris (France)]; Clemenceau, Stephane [Department of Neurosurgery, Pitie-Salpetriere Hospital, Paris (France)]; Lot, Guillaume [Department of Neurosurgery, Adolph De Rothschild Foundation, Paris (France)]; George, Bernard [Department of Neurosurgery, Lariboisiere Hospital, Paris (France)]; Polivka, Marc [Department of Pathology, Lariboisiere Hospital, Paris (France)]; Mokhtari, Karima [Department of Pathology, Pitie-Salpetriere Hospital, Paris (France)]; Ferrand, Regis; Feuvret, Loïc; Habrand, Jean-Louis [Proton Therapy Center of Orsay, Curie Institute, Paris (France)]; Pouysségur, Jacques; Mazure, Nathalie [CNRS-UMR 6543, Institute of Developmental Biology and Cancer, University of Nice Sophia Antipolis, Nice (France)]; Talbot, Jean-Noël [Department of Nuclear Medicine and Radiopharmacy, Tenon Hospital, and University Pierre et Marie Curie, Paris (France)]

    2012-11-01

    Purpose: To detect the presence of hypoxic tissue, which is known to increase the radioresistant phenotype, by its uptake of fluoromisonidazole (18F) (FMISO) using hybrid positron emission tomography/computed tomography (PET/CT) imaging, and to compare it with the glucose-avid tumor tissue imaged with fluorodeoxyglucose (18F) (FDG), in residual postsurgical skull base chordoma scheduled for radiotherapy. Patients and Methods: Seven patients with incompletely resected skull base chordomas were planned for high-dose radiotherapy (dose ≥70 Gy). All 7 patients underwent FDG and FMISO PET/CT. Images were analyzed qualitatively by visual examination and semiquantitatively by computing the ratio of the maximal standardized uptake value (SUVmax) of the tumor and cerebellum (T/C R), with delineation of lesions on conventional imaging. Results: Of the eight lesion sites imaged with FDG PET/CT, only one was visible, whereas seven of nine lesions were visible on FMISO PET/CT. The median SUVmax in the tumor area was 2.8 g/mL (minimum 2.1; maximum 3.5) for FDG and 0.83 g/mL (minimum 0.3; maximum 1.2) for FMISO. The T/C R values ranged between 0.30 and 0.63 for FDG (median, 0.41) and between 0.75 and 2.20 for FMISO (median, 1.59). FMISO T/C R >1 in six lesions suggested the presence of hypoxic tissue. There was no correlation between FMISO and FDG uptake in individual chordomas (r = 0.18, p = 0.7). Conclusion: FMISO PET/CT enables imaging of the hypoxic component in residual chordomas. In the future, it could help to better define boosted volumes for irradiation and to overcome the radioresistance of these lesions. No relationship was found between hypoxia and glucose metabolism in these tumors after initial surgery.
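
    The key quantity in this record is the tumor-to-cerebellum ratio (T/C R) of SUVmax; a trivial worked example (with invented SUVmax values, not the patients' data) shows how a ratio above 1 flags hypoxic tissue:

    def tc_ratio(suvmax_tumor, suvmax_cerebellum):
        """Tumor-to-cerebellum ratio of maximal standardized uptake values."""
        return suvmax_tumor / suvmax_cerebellum

    fmiso = tc_ratio(1.2, 0.6)   # 2.0 > 1: suggests hypoxic tissue
    fdg = tc_ratio(2.8, 7.0)     # 0.4: low, as in the FDG ratios reported above
    print(fmiso, fdg)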

  7. Evidence for lower variability of coronary artery calcium mineral mass measurements by multi-detector computed tomography in a community-based cohort: Consequences for progression studies

    International Nuclear Information System (INIS)

    Hoffmann, Udo; Siebert, Uwe; Bull-Stewart, Arabella; Achenbach, Stephan; Ferencik, Maros; Moselewski, Fabian; Brady, Thomas J.; Massaro, Joseph M.; O'Donnell, Christopher J.

    2006-01-01

    Purpose: To compare the measurement variability for coronary artery calcium (CAC) measurements using mineral mass compared with a modified Agatston score (AS) or volume score (VS) with multi-detector CT (MDCT) scanning, and to estimate the potential impact of these methods on the design of CAC progression studies. Materials and methods: We studied 162 consecutive subjects (83 women, 79 men, mean age 51 ± 11 years) from a general Caucasian community-based cohort (Framingham Heart Study) with duplicate runs of prospective electrocardiographically-triggered MDCT scanning. Each scan was independently evaluated for the presence of CAC by four experienced observers who determined a 'modified' AS, VS and mineral mass. Results: Of the 162 subjects, CAC was detected in both scans in 69 (42%) and no CAC was detected in either scan in 72 (45%). Calcium scores were low in the 21/162 subjects (12%) for whom CAC was present in one but not the other scan (modified AS 0.96). However, the mean interscan variability was significantly different between mineral mass, modified AS, and VS (coefficient of variation 26 ± 19%, 41 ± 28% and 34 ± 25%, respectively; p < 0.04), with significantly smaller mean differences in pair-wise comparisons for mineral mass compared with modified AS (p < 0.002) or with VS (p < 0.03). The amount of CAC but not heart rate was an independent predictor of interscan variability (r = -0.638, -0.614 and -0.577 for AS, VS, and mineral mass, respectively; all p < 0.0001). The decreased interscan variability of mineral mass would allow a sample size reduction of 5.5% compared with modified AS for observational studies of CAC progression and for randomized clinical trials. Conclusion: There is significantly reduced interscan variability of CAC measurements with mineral mass compared with the modified AS or VS. However, the measurement variability of all quantification methods is predicted by the amount of CAC and is inversely correlated to the extent of partial …
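
    The claimed link between interscan variability and study design can be made concrete with a standard power-analysis identity: for a fixed detectable change and power, required sample size scales with the variance of the measurement. The sketch below uses the coefficients of variation quoted above; note that the paper's 5.5% figure follows from its own design assumptions, so this crude ratio (which ignores all other variance sources) is only indicative.

    cv_mineral_mass = 0.26   # interscan coefficient of variation, mineral mass
    cv_modified_as = 0.41    # interscan coefficient of variation, modified AS

    # n scales with measurement variance, i.e. with CV squared
    relative_n = (cv_mineral_mass / cv_modified_as) ** 2
    print(f"relative sample size: {relative_n:.2f}")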

  8. Restructuring of schools as a consequence of computer use?

    NARCIS (Netherlands)

    Plomp, T.; Pelgrum, W.J.

    1993-01-01

    The central question discussed is whether the use of computers leads to the restructuring of schools or classrooms. Several authors argue that intensive use of computers must lead to new classroom patterns or new forms of schooling. Data from the international comparative study of computers in …

  9. Computer Based Expert Systems.

    Science.gov (United States)

    Parry, James D.; Ferrara, Joseph M.

    1985-01-01

    Claims knowledge-based expert computer systems can meet needs of rural schools for affordable expert advice and support and will play an important role in the future of rural education. Describes potential applications in prediction, interpretation, diagnosis, remediation, planning, monitoring, and instruction. (NEC)

  10. Spintronics-based computing

    CERN Document Server

    Prenat, Guillaume

    2015-01-01

    This book provides a comprehensive introduction to spintronics-based computing for the next generation of ultra-low-power/highly reliable logic, which is widely considered a promising candidate to replace conventional, pure CMOS-based logic. It covers aspects from the device to the system level, including magnetic memory cells, device modeling, hybrid circuit structure, design methodology, CAD tools, and technological integration methods. This book is accessible to a variety of readers, and little or no background in magnetism and spin electronics is required to understand its content. The multidisciplinary team of expert authors from circuits, devices, computer architecture, CAD and system design reveal to readers the potential of spintronics nanodevices to reduce power consumption, improve reliability and enable new functionality.

  11. Subjective and Objective Work-Based Identity Consequences

    NARCIS (Netherlands)

    Botha, F.C.; Roodt, G.; van de Bunt-Kokhuis, S.G.M.; Jansen, P.G.W.; Roodt, G.

    2015-01-01

    The aim of this chapter is to provide a systematic literature review on the selected consequences of work-based identity (WI). The first section of the chapter includes the following subjective consequences: self-report measures on personal alienation, helping behaviour (H-OCB), burnout (consisting …

  12. Computation of integral bases

    NARCIS (Netherlands)

    Bauch, J.H.P.

    2015-01-01

    Let $A$ be a Dedekind domain, $K$ the fraction field of $A$, and $f \in A[x]$ a monic irreducible separable polynomial. For a given non-zero prime ideal $\mathfrak{p}$ of $A$ we present in this paper a new method to compute a $\mathfrak{p}$-integral basis of the extension of $K$ determined by $f$.
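
    For readers outside the field, the object being computed can be stated as follows (a standard definition consistent with the abstract's setting, not quoted from the paper). Writing $L = K[x]/(f)$ and letting $\mathcal{O}_{\mathfrak{p}}$ denote the integral closure of the localization $A_{\mathfrak{p}}$ in $L$, a $\mathfrak{p}$-integral basis is a family $\alpha_0, \dots, \alpha_{n-1} \in L$, with $n = \deg f$, forming an $A_{\mathfrak{p}}$-module basis of $\mathcal{O}_{\mathfrak{p}}$:
    \[
      \mathcal{O}_{\mathfrak{p}} = A_{\mathfrak{p}}\alpha_0 \oplus \dots \oplus A_{\mathfrak{p}}\alpha_{n-1}.
    \]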

  13. Advanced computer-based training

    Energy Technology Data Exchange (ETDEWEB)

    Fischer, H D; Martin, H D

    1987-05-01

    The paper presents new techniques of computer-based training for personnel of nuclear power plants. Training on full-scope simulators is further increased by use of dedicated computer-based equipment. An interactive communication system runs on a personal computer linked to a video disc; a part-task simulator runs on 32-bit process computers and shows two versions: as functional trainer or as on-line predictor with an interactive learning system (OPAL), which may be well-tailored to a specific nuclear power plant. The common goal of both developments is the optimization of the cost-benefit ratio for training and equipment.

  14. Advanced computer-based training

    International Nuclear Information System (INIS)

    Fischer, H.D.; Martin, H.D.

    1987-01-01

    The paper presents new techniques of computer-based training for personnel of nuclear power plants. Training on full-scope simulators is further increased by use of dedicated computer-based equipment. An interactive communication system runs on a personal computer linked to a video disc; a part-task simulator runs on 32-bit process computers and shows two versions: as functional trainer or as on-line predictor with an interactive learning system (OPAL), which may be well-tailored to a specific nuclear power plant. The common goal of both developments is the optimization of the cost-benefit ratio for training and equipment. (orig.)

  15. Capability-based computer systems

    CERN Document Server

    Levy, Henry M

    2014-01-01

    Capability-Based Computer Systems focuses on computer programs and their capabilities. The text first elaborates capability- and object-based system concepts, including capability-based systems, object-based approach, and summary. The book then describes early descriptor architectures and explains the Burroughs B5000, Rice University Computer, and Basic Language Machine. The text also focuses on early capability architectures. Dennis and Van Horn's Supervisor; CAL-TSS System; MIT PDP-1 Timesharing System; and Chicago Magic Number Machine are discussed. The book then describes Plessey System 250 …

  16. Calculations of reactor-accident consequences, Version 2. CRAC2: computer code user's guide

    International Nuclear Information System (INIS)

    Ritchie, L.T.; Johnson, J.D.; Blond, R.M.

    1983-02-01

    The CRAC2 computer code is a revision of the Calculation of Reactor Accident Consequences computer code, CRAC, developed for the Reactor Safety Study. The CRAC2 computer code incorporates significant modeling improvements in the areas of weather sequence sampling and emergency response, and refinements to the plume rise, atmospheric dispersion, and wet deposition models. New output capabilities have also been added. This guide is intended to facilitate the informed and intelligent use of CRAC2. It includes descriptions of the input data, the output results, the file structures, control information, and five sample problems.

  17. Computer Based Modelling and Simulation

    Indian Academy of Sciences (India)

    … universities, and later did system analysis … personal computers (PC) and low-cost software packages and tools. They can serve as a useful learning experience through student projects. Models are … Let us consider a numerical example: to calculate the velocity of a trainer aircraft …

  18. Paper-Based and Computer-Based Concept Mappings: The Effects on Computer Achievement, Computer Anxiety and Computer Attitude

    Science.gov (United States)

    Erdogan, Yavuz

    2009-01-01

    The purpose of this paper is to compare the effects of paper-based and computer-based concept mappings on computer hardware achievement, computer anxiety and computer attitude of the eight grade secondary school students. The students were randomly allocated to three groups and were given instruction on computer hardware. The teaching methods used…

  19. Guide for licensing evaluations using CRAC2: A computer program for calculating reactor accident consequences

    International Nuclear Information System (INIS)

    White, J.E.; Roussin, R.W.; Gilpin, H.

    1988-12-01

    A version of the CRAC2 computer code applicable for use in analyses of consequences and risks of reactor accidents in case work for environmental statements has been implemented for use on the Nuclear Regulatory Commission Data General MV/8000 computer system. Input preparation is facilitated through the use of an interactive computer program which operates on an IBM personal computer. The resulting CRAC2 input deck is transmitted to the MV/8000 by using an error-free file transfer mechanism. To facilitate the use of CRAC2 at NRC, relevant background material on input requirements and model descriptions has been extracted from four reports: "Calculations of Reactor Accident Consequences," Version 2, NUREG/CR-2326 (SAND81-1994); "CRAC2 Model Descriptions," NUREG/CR-2552 (SAND82-0342); "CRAC Calculations for Accident Sections of Environmental Statements," NUREG/CR-2901 (SAND82-1693); and "Sensitivity and Uncertainty Studies of the CRAC2 Computer Code," NUREG/CR-4038 (ORNL-6114). When this background information is combined with instructions on the input processor, this report provides a self-contained guide for preparing CRAC2 input data with a specific orientation toward applications on the MV/8000. 8 refs., 11 figs., 10 tabs.

  20. Computer-Based Career Interventions.

    Science.gov (United States)

    Mau, Wei-Cheng

    The possible utilities and limitations of computer-assisted career guidance systems (CACG) have been widely discussed although the effectiveness of CACG has not been systematically considered. This paper investigates the effectiveness of a theory-based CACG program, integrating Sequential Elimination and Expected Utility strategies. Three types of…

  1. Computer Based Modelling and Simulation

    Indian Academy of Sciences (India)

    N K Srinivasan, Computer Based Modelling and Simulation - Modelling Deterministic Systems. General Article, Resonance – Journal of Science Education, Volume 6, Issue 3, March 2001, pp. 46-54.

  2. Model description. NUDOS: A computer program for assessing the consequences of airborne releases of radionuclides

    International Nuclear Information System (INIS)

    Poley, A.D.

    1996-02-01

    NUDOS is a computer program that can be used to evaluate the consequences of airborne releases of radioactive material. The consequences which can be evaluated are individual dose and associated radiological risk, collective dose and the contamination of land. The code is capable of dealing with both continuous (routine) and accidental releases. For accidental releases both deterministic and probabilistic calculations can be performed, and the impact and effectiveness of emergency actions can be evaluated. This report contains a description of the models contained in NUDOS92 and the recommended values for the input parameters of these models. Additionally, a short overview is given of the model improvements planned for the next NUDOS version. (orig.)

  3. Ammonia-based quantum computer

    International Nuclear Information System (INIS)

    Ferguson, Andrew J.; Cain, Paul A.; Williams, David A.; Briggs, G. Andrew D.

    2002-01-01

    We propose a scheme for quantum computation using two eigenstates of ammonia or similar molecules. Individual ammonia molecules are confined inside fullerenes and used as two-level qubit systems. Interaction between these ammonia qubits takes place via the electric dipole moments, and in particular we show how a controlled-NOT gate could be implemented. After computation the qubit is measured with a single-electron electrometer sensitive enough to differentiate between the dipole moments of different states. We also discuss a possible implementation based on a quantum cellular automaton

  4. ZIVIS: A City Computing Platform Based on Volunteer Computing

    International Nuclear Information System (INIS)

    Antoli, B.; Castejon, F.; Giner, A.; Losilla, G.; Reynolds, J. M.; Rivero, A.; Sangiao, S.; Serrano, F.; Tarancon, A.; Valles, R.; Velasco, J. L.

    2007-01-01

    Volunteer computing has emerged as a new form of distributed computing. Unlike other computing paradigms like Grids, which tend to be based on complex architectures, volunteer computing has demonstrated a great ability to integrate dispersed, heterogeneous computing resources with ease. This article presents ZIVIS, a project which aims to deploy a city-wide computing platform in Zaragoza (Spain). ZIVIS is based on BOINC (Berkeley Open Infrastructure for Network Computing), a popular open source framework to deploy volunteer and desktop grid computing systems. A scientific code which simulates the trajectories of particles moving inside a stellarator fusion device has been chosen as the pilot application of the project. In this paper we describe the approach followed to port the code to the BOINC framework as well as some novel techniques, based on standard Grid protocols, we have used to access the output data present in the BOINC server from a remote visualizer. (Author)

  5. Inversion based on computational simulations

    International Nuclear Information System (INIS)

    Hanson, K.M.; Cunningham, G.S.; Saquib, S.S.

    1998-01-01

    A standard approach to solving inversion problems that involve many parameters uses gradient-based optimization to find the parameters that best match the data. The authors discuss enabling techniques that facilitate application of this approach to large-scale computational simulations, which are the only way to investigate many complex physical phenomena. Such simulations may not seem to lend themselves to calculation of the gradient with respect to numerous parameters. However, adjoint differentiation allows one to efficiently compute the gradient of an objective function with respect to all the variables of a simulation. When combined with advanced gradient-based optimization algorithms, adjoint differentiation permits one to solve very large problems of optimization or parameter estimation. These techniques will be illustrated through the simulation of the time-dependent diffusion of infrared light through tissue, which has been used to perform optical tomography. The techniques discussed have a wide range of applicability to modeling including the optimization of models to achieve a desired design goal
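
    The adjoint trick the abstract describes can be shown on a toy linear model (an assumed stand-in for a large simulation, not the authors' optical-tomography code): for J(p) = c^T u with A(p) u = b, a single adjoint solve A^T lam = c yields dJ/dp = -lam^T (dA/dp) u for every parameter at once.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 5
    A0 = np.eye(n) * 4 + 0.1 * rng.standard_normal((n, n))
    A1 = 0.1 * rng.standard_normal((n, n))   # dA/dp for a single scalar p
    b, c, p = rng.standard_normal(n), rng.standard_normal(n), 0.7

    A = A0 + p * A1
    u = np.linalg.solve(A, b)                # forward simulation
    lam = np.linalg.solve(A.T, c)            # one adjoint solve
    dJdp = -lam @ (A1 @ u)                   # gradient from the adjoint state

    # sanity check against a finite difference
    eps = 1e-6
    u_eps = np.linalg.solve(A0 + (p + eps) * A1, b)
    print(dJdp, (c @ u_eps - c @ u) / eps)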

  6. Computation of Difference Gröbner Bases

    Directory of Open Access Journals (Sweden)

    Vladimir P. Gerdt

    2012-07-01

    This paper is an updated and extended version of our note [GR'06] (cf. also [GR-ACAT]). To compute difference Gröbner bases of ideals generated by linear polynomials we adapt the involutive algorithm based on Janet-like division to difference polynomial rings. The algorithm has been implemented in Maple in the form of the package LDA (Linear Difference Algebra) and we describe the main features of the package. Its applications are illustrated by generation of finite difference approximations to linear partial differential equations and by reduction of Feynman integrals. We also present the algorithm for an ideal generated by a finite set of nonlinear difference polynomials. If the algorithm terminates, then it constructs a Gröbner basis of the ideal.
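
    LDA itself is a Maple package for difference ideals; as a commutative stand-in, SymPy's groebner() (example polynomials invented) illustrates the underlying operation of rewriting an ideal's generators into a canonical basis:

    from sympy import groebner, symbols

    x, y = symbols('x y')
    G = groebner([x**2 + y**2 - 1, x*y - 2], x, y, order='lex')
    print(G)   # a GroebnerBasis object listing the basis polynomials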

  7. A risk standard based on societal cost with bounded consequences

    International Nuclear Information System (INIS)

    Worledge, D.H.

    1982-01-01

    A risk standard is proposed that relates the frequency of occurrence of single events to the consequences of the events. Maximum consequences and risk aversion are used to give the cumulative risk curve a shape similar to the results of a risk assessment and to bound the expectation of deaths. Societal costs in terms of deaths are used to fix the parameters of the model together with an approximate comparison with individual risks. The proposed standard is compared with some practical applications of risk assessment to nuclear reactor systems
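
    A hypothetical instance of such a criterion (parameter values invented, not those of the proposed standard) makes the shape concrete: the allowable frequency falls faster than 1/N to express risk aversion, and drops to zero above a maximum consequence.

    def allowable_frequency(n_deaths, c=1e-2, alpha=1.5, n_max=1000):
        """Allowable annual frequency of a single event with n_deaths consequences."""
        if n_deaths > n_max:
            return 0.0                      # bounded consequences
        return c * n_deaths ** (-alpha)     # alpha > 1 encodes risk aversion

    for n in (1, 10, 100, 1000):
        print(n, allowable_frequency(n))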

  8. How do we communicate stereotypes? Linguistic bases and inferential consequences

    NARCIS (Netherlands)

    Wigboldus, DHJ; Semin, GR; Spears, R

    The linguistic expectancy bias is defined as the tendency to describe expectancy-consistent information at a higher level of abstraction than expectancy-inconsistent information. The communicative consequences of this bias were examined in 3 experiments. Analyses of judgments that recipients made on …

  9. User guide programmer's reference. NUDOS: A computer programme for assessing the consequences of airborne releases of radionuclides

    International Nuclear Information System (INIS)

    Grupa, J.

    1996-10-01

    NUDOS is a computer program that can be used to evaluate the consequences of airborne releases of radioactive materials. The consequences evaluated are individual dose and associated radiological risk, collective dose and the contamination of land. The code is capable of dealing with both routine and accidental releases. For accidental releases both deterministic and probabilistic calculations can be performed and the impact and effectiveness of emergency actions can be evaluated. (orig.)

  10. ARANO - a computer program for the assessment of radiological consequences of atmospheric radioactive releases

    International Nuclear Information System (INIS)

    Savolainen, I.; Vuori, S.

    1980-09-01

    A short description is given of the calculation possibilities, methods and structure of the computer code system ARANO, in addition to the input guide. The code can be employed in the calculation of environmental radiological consequences caused by radioactive materials released to the atmosphere. Results can be individual doses for different organs at given distances from the release point, collective doses, numbers of persons exceeding given dose limits, numbers of casualties, areas polluted by deposited activity and losses of investments or production due to radioactive contamination. Both a case with a single release and atmospheric dispersion situation and a group of radioactive releases and dispersions with discrete probability distributions can be considered. If the radioactive releases or the dispersion conditions are described by probability distributions, the program assesses the magnitudes of the specified effects in all combinations of the release and dispersion situations and then calculates the expectation values and the cumulative probability distributions of the effects. The vertical mixing in the atmosphere is described with a K_z model. In the lateral direction the plume is assumed to be Gaussian, and the release duration can be taken into account in the σ_y values. External gamma dose from the release plume is calculated on the basis of a data file which has been created by 3-dimensional integration. Dose due to inhalation and due to gamma radiation from the contaminated ground are calculated by using appropriate dose conversion factors, which are collected into two mutually alternative block data subprograms. (author)
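
    As a much-simplified illustration of the dispersion ingredients named above (ARANO's vertical K_z model is numerical; the sketch below is fully Gaussian with ground reflection, and all input values are invented):

    import numpy as np

    def ground_concentration(Q, u, sigma_y, sigma_z, H, y=0.0):
        """Ground-level air concentration (Bq/m^3) from a continuous point
        release of Q Bq/s at height H m, wind speed u m/s, crosswind offset
        y m, for given dispersion parameters sigma_y, sigma_z in m."""
        return (Q / (2 * np.pi * u * sigma_y * sigma_z)
                * np.exp(-y ** 2 / (2 * sigma_y ** 2))
                * 2 * np.exp(-H ** 2 / (2 * sigma_z ** 2)))  # 2x: ground reflection

    print(ground_concentration(Q=1e9, u=5.0, sigma_y=200.0, sigma_z=80.0, H=50.0))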

  11. Excessive computer game playing among Norwegian adults: self-reported consequences of playing and association with mental health problems.

    Science.gov (United States)

    Wenzel, H G; Bakken, I J; Johansson, A; Götestam, K G; Øren, Anita

    2009-12-01

    Computer games are the most advanced form of gaming. For most people, the playing is an uncomplicated leisure activity; however, for a minority the gaming becomes excessive and is associated with negative consequences. The aim of the present study was to investigate computer game-playing behaviour in the general adult Norwegian population, and to explore mental health problems and self-reported consequences of playing. The survey includes 3,405 adults 16 to 74 years old (Norway 2007, response rate 35.3%). Overall, 65.5% of the respondents reported having ever played computer games (16-29 years, 93.9%; 30-39 years, 85.0%; 40-59 years, 56.2%; 60-74 years, 25.7%). Among 2,170 players, 89.8% reported playing less than 1 hr. as a daily average over the last month, 5.0% played 1-2 hr. daily, 3.1% played 2-4 hr. daily, and 2.2% reported playing > 4 hr. daily. The strongest risk factor for playing > 4 hr. daily was being an online player, followed by male gender, and single marital status. Reported negative consequences of computer game playing increased strongly with average daily playing time. Furthermore, prevalence of self-reported sleeping problems, depression, suicide ideations, anxiety, obsessions/compulsions, and alcohol/substance abuse increased with increasing playing time. This study showed that adult populations should also be included in research on computer game-playing behaviour and its consequences.

  12. Computer-Based Linguistic Analysis.

    Science.gov (United States)

    Wright, James R.

    Noam Chomsky's transformational-generative grammar model may effectively be translated into an equivalent computer model. Phrase-structure rules and transformations are tested as to their validity and ordering by the computer via the process of random lexical substitution. Errors appearing in the grammar are detected and rectified, and formal…

  13. Computer-Based Learning in Chemistry Classes

    Science.gov (United States)

    Pietzner, Verena

    2014-01-01

    Currently not many people would doubt that computers play an essential role in both public and private life in many countries. However, somewhat surprisingly, evidence of computer use is difficult to find in German state schools although other countries have managed to implement computer-based teaching and learning in their schools. This paper…

  14. Computer Based Modelling and Simulation

    Indian Academy of Sciences (India)

    … where x increases from zero to N, the saturation value. … such as Laplace transforms and non-linear differential equations … atomic bomb project in the US in the early … his work on game theory and computers.

  15. Physical consequences of the mitochondrial targeting of single-walled carbon nanotubes probed computationally

    Science.gov (United States)

    Chistyakov, V. A.; Zolotukhin, P. V.; Prazdnova, E. V.; Alperovich, I.; Soldatov, A. V.

    2015-06-01

    Experiments by F. Zhou and coworkers (2010) [16] showed that mitochondria are the main target of the cellular accumulation of single-walled carbon nanotubes (SWCNTs). Our in silico experiments, based on geometrical optimization of the SWCNT+proton system within Density Functional Theory, revealed that protons can bind to the outer side of a SWCNT, generating a positive charge. The calculation results allow one to propose the following mechanism of SWCNT mitochondrial targeting. SWCNTs enter the space between the inner and outer membranes of mitochondria, where an excess of protons has been formed by diffusion. In this compartment SWCNTs are loaded with protons and acquire positive charges distributed over their surface. Protonation of hydrophobic SWCNTs can also occur within the mitochondrial membrane through interaction with protonated ubiquinone. Such "charge-loaded" particles can be transferred, like "Skulachev ions", through the inner membrane of the mitochondria due to the potential difference generated by the inner membrane. Physiological consequences of the described mechanism are discussed.

  16. Identifying controlling variables for math computation fluency through experimental analysis: the interaction of stimulus control and reinforcing consequences.

    Science.gov (United States)

    Hofstadter-Duke, Kristi L; Daly, Edward J

    2015-03-01

    This study investigated a method for conducting experimental analyses of academic responding. In the experimental analyses, academic responding (math computation), rather than problem behavior, was reinforced across conditions. Two separate experimental analyses (one with fluent math computation problems and one with non-fluent math computation problems) were conducted with three elementary school children using identical contingencies while math computation rate was measured. Results indicate that the experimental analysis with non-fluent problems produced undifferentiated responding across participants; however, differentiated responding was achieved for all participants in the experimental analysis with fluent problems. A subsequent comparison of the single-most effective condition from the experimental analyses replicated the findings with novel computation problems. Results are discussed in terms of the critical role of stimulus control in identifying controlling consequences for academic deficits, and recommendations for future research refining and extending experimental analysis to academic responding are made.

  17. Computer-Based Cognitive Training in Aging.

    Science.gov (United States)

    Klimova, Blanka

    2016-01-01

    At present there is a rapid growth of aging population groups worldwide, which brings about serious economic and social problems. Thus, there is considerable effort to prolong the active life of these older people and keep them independent. The purpose of this mini review is to explore available clinical studies implementing computer-based cognitive training programs as intervention tools in the prevention and delay of cognitive decline in aging, with a special focus on their effectiveness. This was done by conducting a literature search in the databases Web of Science, Scopus, MEDLINE and Springer, and consequently by evaluating the findings of the relevant studies. The findings show that computerized cognitive training can lead to the improvement of cognitive functions such as working memory and reasoning skills in particular. However, this training should be performed over a longer time span since a short-term cognitive training mainly has an impact on short-term memory with temporary effects. In addition, the training must be intense to become effective. Furthermore, the results indicate that it is important to pay close attention to the methodological standards in future clinical studies.

  18. ARGOS-NT: A computer based emergency management system

    International Nuclear Information System (INIS)

    Hoe, S.; Thykier-Nielsen, S.; Steffensen, L.B.

    2000-01-01

    In case of a nuclear accident or a threat of a release, the Danish Emergency Management Agency is responsible for actions to minimize the consequences in Danish territory. To provide an overview of the situation, a computer-based system called ARGOS-NT was developed in 1993/94. This paper gives an overview of the system with emphasis on its prognostic part. An example calculation shows the importance of correct landscape modeling. (author)

  19. Knowledge base about earthquakes as a tool to minimize strong events consequences

    Science.gov (United States)

    Frolova, Nina; Bonnin, Jean; Larionov, Valery; Ugarov, Alexander; Kijko, Andrzej

    2017-04-01

    The paper describes the structure and content of the knowledge base on the physical and socio-economic consequences of damaging earthquakes, which may be used for calibration of near-real-time loss assessment systems based on simulation models for shaking intensity, damage to buildings and casualty estimates. Such calibration makes it possible to compensate for factors that influence the reliability of expected damage and loss assessments in "emergency" mode. The knowledge base contains descriptions of past earthquakes' consequences for the area under study. It also includes the distribution of the built environment and population at the time of event occurrence. Computer simulation of the events recorded in the knowledge base allows determination of sets of regional calibration coefficients, including the rating of seismological surveys, peculiarities of shaking intensity attenuation and changes in building stock and population distribution, so as to minimize the error of loss estimations for damaging earthquakes in "emergency" mode. References: 1. Larionov, V., Frolova, N.: Peculiarities of seismic vulnerability estimations. In: Natural Hazards in Russia, volume 6: Natural Risks Assessment and Management, Publishing House "Kruk", Moscow, 120-131, 2003. 2. Frolova, N., Larionov, V., Bonnin, J.: Data Bases Used in Worldwide Systems for Earthquake Loss Estimation in Emergency Mode: Wenchuan Earthquake. In: Proc. TIEMS2010 Conference, Beijing, China, 2010. 3. Frolova, N.I., Larionov, V.I., Bonnin, J., Sushchev, S.P., Ugarov, A.N., Kozlov, M.A.: Loss Caused by Earthquakes: Rapid Estimates. Natural Hazards, vol. 84, ISSN 0921-030X, DOI 10.1007/s11069-016-2653.

  20. Identity-Based Authentication for Cloud Computing

    Science.gov (United States)

    Li, Hongwei; Dai, Yuanshun; Tian, Ling; Yang, Haomiao

    Cloud computing is a recently developed technology for complex systems with massive-scale service sharing among numerous users. Authentication of both users and services is therefore a significant issue for the trust and security of cloud computing. The SSL Authentication Protocol (SAP), once applied in cloud computing, becomes so complicated that it places a heavy load on users, both in computation and in communication. Based on the identity-based hierarchical model for cloud computing (IBHMCC) and its corresponding encryption and signature schemes, this paper presents a new identity-based authentication protocol for cloud computing and services. Simulation testing shows that the authentication protocol is more lightweight and efficient than SAP, especially on the user side. This merit, together with the model's scalability, makes it well suited to the massive-scale cloud.
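
    The cryptographic details of IBHMCC are beyond an abstract, but the core convenience of identity-based schemes (a key derived from the identity string itself, so no certificate exchange is needed) can be mimicked in a toy symmetric sketch; this is an illustrative stand-in, not the paper's protocol.

    import hashlib, hmac, os

    MASTER = os.urandom(32)                  # held only by the trusted authority

    def user_key(identity: str) -> bytes:
        """Key derived from the identity string by the authority."""
        return hmac.new(MASTER, identity.encode(), hashlib.sha256).digest()

    def respond(key: bytes, challenge: bytes) -> bytes:
        return hmac.new(key, challenge, hashlib.sha256).digest()

    alice_key = user_key("alice@cloud")      # provisioned to Alice once
    challenge = os.urandom(16)               # issued by the verifier
    response = respond(alice_key, challenge) # computed by Alice

    # the verifier re-derives Alice's key from her claimed identity alone
    assert hmac.compare_digest(response, respond(user_key("alice@cloud"), challenge))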

  1. Consequences of spatial autocorrelation for niche-based models

    DEFF Research Database (Denmark)

    Segurado, P.; Araújo, Miguel B.; Kunin, W. E.

    2006-01-01

    1. Spatial autocorrelation is an important source of bias in most spatial analyses. We explored the bias introduced by spatial autocorrelation on the explanatory and predictive power of species' distribution models, and make recommendations for dealing with the problem. 2. Analyses were based on …

  2. The Equity Consequences of School-Based Management

    Science.gov (United States)

    Nir, Adam E.; Miran, Meir

    2006-01-01

    Purpose: The purpose of this paper is to examine the extent to which the introduction of school-based management (SBM) affects schools' incomes and educational equity. Design/methodology/approach: An analysis of financial reports coming from 31 SBM schools during a period of four sequential years reveals that the overall inequity among schools has…

  3. Gender consequences of a national performance-based funding model

    DEFF Research Database (Denmark)

    Nielsen, Mathias Wullum

    2017-01-01

    -regarded’ and highly selective journals and book publishers, and 1 and 5 points for equivalent scientific contributions via ‘normal level’ channels. On the basis of bibliometric data, the study shows that the BRI considerably widens the existing gender gap in researcher performance, since men on average receive more......This article investigates the extent to which the Danish Bibliometric Research Indicator (BRI) reflects the performance of men and women differently. The model is based on a differentiated counting of peer-reviewed publications, awarding three and eight points for contributions to ‘well...... privileges collaborative research, which disadvantages women due to gender differences in collaborative network relations....

  4. Music Learning Based on Computer Software

    OpenAIRE

    Baihui Yan; Qiao Zhou

    2017-01-01

    In order to better develop and improve students' music learning, the authors proposed the method of music learning based on computer software. It is still a new field to use computer music software to assist teaching. Hereby, we conducted an in-depth analysis on the computer-enabled music learning and the music learning status in secondary schools, obtaining the specific analytical data. Survey data shows that students have many cognitive problems in the current music classroom, and yet teach…

  5. Maillard reaction in milk-based foods: nutritional consequences.

    Science.gov (United States)

    Pizzoferrato, L; Manzi, P; Vivanti, V; Nicoletti, I; Corradini, C; Cogliandro, E

    1998-02-01

    Chemical reactions occurring during industrial treatment or storage of foods can lead to the formation of epsilon-deoxyketosyl compounds, the Amadori products. Food protein value can be adversely affected by these reactions; in particular lysine, an essential amino acid having a free amino group on its side chain, can be converted to non-bioavailable N-substituted lysine, or blocked lysine. By acid hydrolysis of epsilon-deoxyketosyl compounds, furosine is formed. In this paper furosine prepared from milk-based commercial products has been evaluated by use of a recently developed HPLC method using a microbore column and phosphate buffer as the mobile phase at controlled temperature. Furosine levels have been used, together with protein, total amino acids, and lysine content, as an estimate of the protein quality of several different products such as cooked-cream dessert, yogurt mousse, white chocolate, milk chocolate, milk chocolate with a soft nougat and caramel center, milk chocolate with a whipped white center, chocolate spread, part-skim milk tablets, milk-based dietetic meals, and baby foods. The protein content of the analyzed products ranged from 34.3 g/kg (milk nougat) to 188.4 g/kg (milk tablets). The Maillard reaction caused a loss in available lysine that varied from 2.5% (cooked cream) to 36.2% (condensed milk). The contribution to the lysine average daily requirement is heavily affected by this reaction and varied from 13% (milk tablets and soft nougat) to 61% (dietetic meal). Variable results were also obtained for the other essential amino acids.

  6. Thoracoabdominal computed tomography in trauma patients: a cost-consequences analysis

    NARCIS (Netherlands)

    Vugt, R. van; Kool, D.R.; Brink, M.; Dekker, H.M.; Deunk, J.; Edwards, M.J.R.

    2014-01-01

    BACKGROUND: CT is increasingly used during the initial evaluation of blunt trauma patients. In this era of increasing cost-awareness, the pros and cons of CT have to be assessed. OBJECTIVES: This study was performed to evaluate the cost-consequences of different diagnostic algorithms that use …

  7. Game based learning for computer science education

    NARCIS (Netherlands)

    Schmitz, Birgit; Czauderna, André; Klemke, Roland; Specht, Marcus

    2011-01-01

    Schmitz, B., Czauderna, A., Klemke, R., & Specht, M. (2011). Game based learning for computer science education. In G. van der Veer, P. B. Sloep, & M. van Eekelen (Eds.), Computer Science Education Research Conference (CSERC '11) (pp. 81-86). Heerlen, The Netherlands: Open Universiteit.

  8. Computer-based feedback in formative assessment

    NARCIS (Netherlands)

    van der Kleij, Fabienne

    2013-01-01

    Formative assessment concerns any assessment that provides feedback that is intended to support learning and can be used by teachers and/or students. Computers could offer a solution to overcoming obstacles encountered in implementing formative assessment. For example, computer-based assessments …

  9. MTA Computer Based Evaluation System.

    Science.gov (United States)

    Brenner, Lisa P.; And Others

    The MTA PLATO-based evaluation system, which has been implemented by a consortium of schools of medical technology, is designed to be general-purpose, modular, data-driven, and interactive, and to accommodate other national and local item banks. The system provides a comprehensive interactive item-banking system in conjunction with online student…

  10. Computer-based multi-channel analyzer based on internet

    International Nuclear Information System (INIS)

    Zhou Xinzhi; Ning Jiaoxian

    2001-01-01

    Combining Internet technology with the computer-based multi-channel analyzer, a new kind of browser-based multi-channel analyzer system is presented. Its framework and principle, as well as its implementation, are discussed.

  11. Music Learning Based on Computer Software

    Directory of Open Access Journals (Sweden)

    Baihui Yan

    2017-12-01

    In order to better develop and improve students' music learning, the authors proposed the method of music learning based on computer software. It is still a new field to use computer music software to assist teaching. Hereby, we conducted an in-depth analysis of computer-enabled music learning and the music learning status in secondary schools, obtaining specific analytical data. Survey data show that students have many cognitive problems in the current music classroom, and yet teachers have not found a reasonable countermeasure to them. Against this background, the introduction of computer music software to music learning is a new attempt that can not only cultivate students' initiative in music learning, but also enhance their ability to learn music. Therefore, it is concluded that computer software based music learning is of great significance to improving current music learning modes and means.

  12. Knowledge-based computer security advisor

    International Nuclear Information System (INIS)

    Hunteman, W.J.; Squire, M.B.

    1991-01-01

    The rapid expansion of computer security information and technology has included little support to help the security officer identify the safeguards needed to comply with a policy and to secure a computing system. This paper reports that Los Alamos is developing a knowledge-based computer security system to provide expert knowledge to the security officer. This system includes a model for expressing the complex requirements in computer security policy statements. The model is part of an expert system that allows a security officer to describe a computer system and then determine compliance with the policy. The model includes a generic representation that encodes network relationships among the policy concepts to support inferencing based on information represented in the generic policy description.

  13. CAMAC based computer-computer communications via microprocessor data links

    International Nuclear Information System (INIS)

    Potter, J.M.; Machen, D.R.; Naivar, F.J.; Elkins, E.P.; Simmonds, D.D.

    1976-01-01

    Communications between the central control computer and remote, satellite data acquisition/control stations at The Clinton P. Anderson Meson Physics Facility (LAMPF) is presently accomplished through the use of CAMAC based Data Link Modules. With the advent of the microprocessor, a new philosophy for digital data communications has evolved. Data Link modules containing microprocessor controllers provide link management and communication network protocol through algorithms executed in the Data Link microprocessor

  14. Benchmarking gate-based quantum computers

    Science.gov (United States)

    Michielsen, Kristel; Nocon, Madita; Willsch, Dennis; Jin, Fengping; Lippert, Thomas; De Raedt, Hans

    2017-11-01

    With the advent of public access to small gate-based quantum processors, it becomes necessary to develop a benchmarking methodology such that independent researchers can validate the operation of these processors. We explore the usefulness of a number of simple quantum circuits as benchmarks for gate-based quantum computing devices and show that circuits performing identity operations are very simple, scalable and sensitive to gate errors and are therefore very well suited for this task. We illustrate the procedure by presenting benchmark results for the IBM Quantum Experience, a cloud-based platform for gate-based quantum computing.
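
    A one-qubit simulation (plain NumPy, with an assumed over-rotation error model; no particular hardware implied) shows why identity circuits are sensitive benchmarks: a gate followed by its inverse should restore the initial state, so any residual infidelity measures accumulated gate error.

    import numpy as np

    def rx(theta):
        """Single-qubit rotation about the x axis."""
        return np.array([[np.cos(theta / 2), -1j * np.sin(theta / 2)],
                         [-1j * np.sin(theta / 2), np.cos(theta / 2)]])

    state = np.array([1.0, 0.0], dtype=complex)   # |0>
    eps = 0.01                                    # per-gate over-rotation error
    for _ in range(50):                           # 50 identity pairs RX(pi/2) RX(-pi/2)
        state = rx(-np.pi / 2 + eps) @ (rx(np.pi / 2 + eps) @ state)

    print("fidelity:", abs(state[0]) ** 2)        # < 1 purely because of gate error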

  15. Computer-based diagnostic decisionmaking.

    Science.gov (United States)

    Miller, R A

    1987-12-01

    The three decisionmaking aids described by the authors attack the generic problem of "see no evil, hear no evil, speak no evil"--improving the detection, diagnosis, and therapy of psychiatric disorders in the primary care setting. The three systems represent interventions at different steps in the process of providing appropriate care to psychiatric patients. The DSPW system of Robins and Marcus offers the potential of increasing the recognition of psychiatric disease in the physician's office. Politser's IDS program is representative of the sort of sophisticated microcomputer-based decisionmaking support tools that will become available to physicians in the not-too-distant future. Erdman's study of the impact of explanation capabilities on the acceptability of therapy recommending systems points out the need for careful scientific evaluations of features added to diagnostic and therapeutic systems.

  16. Semantic computing and language knowledge bases

    Science.gov (United States)

    Wang, Lei; Wang, Houfeng; Yu, Shiwen

    2017-09-01

    With the semantic Web proposed as the next-generation Web, semantic computing has been drawing more and more attention in both academia and industry. A lot of research has been conducted on the theory and methodology of the subject, and potential applications have also been investigated and proposed in many fields. The progress made so far in semantic computing cannot be detached from its supporting resources, for instance, language knowledge bases. This paper proposes three perspectives on semantic computing from a macro view and describes the current state of affairs in the construction of language knowledge bases and the related research and applications that have been carried out on the basis of these resources, via a case study at the Institute of Computational Linguistics at Peking University.

  17. RISKIND: A computer program for calculating radiological consequences and health risks from transportation of spent nuclear fuel

    Energy Technology Data Exchange (ETDEWEB)

    Yuan, Y.C. [Square Y Consultants, Orchard Park, NY (US)]; Chen, S.Y.; Biwer, B.M.; LePoire, D.J. [Argonne National Lab., IL (US)]

    1995-11-01

    This report presents the technical details of RISKIND, a computer code designed to estimate potential radiological consequences and health risks to individuals and the collective population from exposures associated with the transportation of spent nuclear fuel. RISKIND is a user-friendly, interactive program that can be run on an IBM or equivalent personal computer under the Windows™ environment. Several models are included in RISKIND that have been tailored to calculate the exposure to individuals under various incident-free and accident conditions. The incident-free models assess exposures from both gamma and neutron radiation and can account for different cask designs. The accident models include accidental release, atmospheric transport, and the environmental pathways of radionuclides from spent fuels; these models also assess health risks to individuals and the collective population. The models are supported by databases that are specific to spent nuclear fuels and include a radionuclide inventory and dose conversion factors. In addition, the flexibility of the models allows them to be used for assessing any accidental release involving radioactive materials. The RISKIND code allows for user-specified accident scenarios as well as receptor locations under various exposure conditions, thereby facilitating the estimation of radiological consequences and health risks for individuals. Median (50% probability) and typical worst-case (less than 5% probability of being exceeded) doses and health consequences from potential accidental releases can be calculated by constructing a cumulative dose/probability distribution curve for a complete matrix of site joint-wind-frequency data. These consequence results, together with the estimated probability of the entire spectrum of potential accidents, form a comprehensive, probabilistic risk assessment of a spent nuclear fuel transportation accident.
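
    The dose/probability construction described in the last sentences can be sketched generically (scenario doses and probabilities below are invented): sort the outcomes, form the complementary cumulative curve P(dose >= d), and read off the median and the "typical worst-case" level exceeded with probability <= 5%.

    import numpy as np

    doses = np.array([0.1, 0.5, 2.0, 10.0, 50.0])     # per-scenario dose (rem)
    probs = np.array([0.50, 0.30, 0.15, 0.04, 0.01])  # scenario probabilities

    order = np.argsort(doses)
    d, p = doses[order], probs[order]
    exceed = p[::-1].cumsum()[::-1]                   # P(dose >= d), the CCDF

    print("median dose:", d[exceed >= 0.50].max())    # 50% probability level
    print("typical worst case:", d[exceed <= 0.05].min())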

  18. RISKIND: A computer program for calculating radiological consequences and health risks from transportation of spent nuclear fuel

    International Nuclear Information System (INIS)

    Yuan, Y.C.; Chen, S.Y.; Biwer, B.M.; LePoire, D.J.

    1995-11-01

    This report presents the technical details of RISKIND, a computer code designed to estimate potential radiological consequences and health risks to individuals and the collective population from exposures associated with the transportation of spent nuclear fuel. RISKIND is a user-friendly, interactive program that can be run on an IBM or equivalent personal computer under the Windows™ environment. Several models are included in RISKIND that have been tailored to calculate the exposure to individuals under various incident-free and accident conditions. The incident-free models assess exposures from both gamma and neutron radiation and can account for different cask designs. The accident models include accidental release, atmospheric transport, and the environmental pathways of radionuclides from spent fuels; these models also assess health risks to individuals and the collective population. The models are supported by databases that are specific to spent nuclear fuels and include a radionuclide inventory and dose conversion factors. In addition, the flexibility of the models allows them to be used for assessing any accidental release involving radioactive materials. The RISKIND code allows for user-specified accident scenarios as well as receptor locations under various exposure conditions, thereby facilitating the estimation of radiological consequences and health risks for individuals. Median (50% probability) and typical worst-case (less than 5% probability of being exceeded) doses and health consequences from potential accidental releases can be calculated by constructing a cumulative dose/probability distribution curve for a complete matrix of site joint-wind-frequency data. These consequence results, together with the estimated probability of the entire spectrum of potential accidents, form a comprehensive, probabilistic risk assessment of a spent nuclear fuel transportation accident.

  19. Computer vision based room interior design

    Science.gov (United States)

    Ahmad, Nasir; Hussain, Saddam; Ahmad, Kashif; Conci, Nicola

    2015-12-01

    This paper introduces a new application of computer vision. To the best of the authors' knowledge, it is the first attempt to incorporate computer vision techniques into room interior design. Computer vision based interior design is achieved in two steps: object identification and color assignment. An image segmentation approach is used to identify the objects in the room, and different color schemes are used for color assignment to these objects. The proposed approach is applied to simple as well as complex images from online sources. The proposed approach not only accelerates the process of interior design but also makes it very efficient by offering multiple alternatives.

  20. Agent-Based Computing: Promise and Perils

    OpenAIRE

    Jennings, N. R.

    1999-01-01

    Agent-based computing represents an exciting new synthesis both for Artificial Intelligence (AI) and, more generally, Computer Science. It has the potential to significantly improve the theory and practice of modelling, designing and implementing complex systems. Yet, to date, there has been little systematic analysis of what makes an agent such an appealing and powerful conceptual model. Moreover, even less effort has been devoted to exploring the inherent disadvantages that stem from adopting…

  1. RISKIND: A computer program for calculating radiological consequences and health risks from transportation of spent nuclear fuel

    International Nuclear Information System (INIS)

    Yuan, Y.C.; Chen, S.Y.; LePoire, D.J.

    1993-02-01

    This report presents the technical details of RISKIND, a computer code designed to estimate potential radiological consequences and health risks to individuals and the collective population from exposures associated with the transportation of spent nuclear fuel. RISKIND is a user-friendly, semi-interactive program that can be run on an IBM or equivalent personal computer. The program language is FORTRAN-77. Several models are included in RISKIND that have been tailored to calculate the exposure to individuals under various incident-free and accident conditions. The incident-free models assess exposures from both gamma and neutron radiation and can account for different cask designs. The accident models include accidental release, atmospheric transport, and the environmental pathways of radionuclides from spent fuels; these models also assess health risks to individuals and the collective population. The models are supported by databases that are specific to spent nuclear fuels and include a radionuclide inventory and dose conversion factors.

  2. RISKIND: A computer program for calculating radiological consequences and health risks from transportation of spent nuclear fuel

    Energy Technology Data Exchange (ETDEWEB)

    Yuan, Y.C. [Square Y, Orchard Park, NY (United States); Chen, S.Y.; LePoire, D.J. [Argonne National Lab., IL (United States). Environmental Assessment and Information Sciences Div.; Rothman, R. [USDOE Idaho Field Office, Idaho Falls, ID (United States)

    1993-02-01

    This report presents the technical details of RISKIND, a computer code designed to estimate potential radiological consequences and health risks to individuals and the collective population from exposures associated with the transportation of spent nuclear fuel. RISKIND is a user-friendly, semi-interactive program that can be run on an IBM or equivalent personal computer. The program language is FORTRAN-77. Several models are included in RISKIND that have been tailored to calculate the exposure to individuals under various incident-free and accident conditions. The incident-free models assess exposures from both gamma and neutron radiation and can account for different cask designs. The accident models include accidental release, atmospheric transport, and the environmental pathways of radionuclides from spent fuels; these models also assess health risks to individuals and the collective population. The models are supported by databases that are specific to spent nuclear fuels and include a radionuclide inventory and dose conversion factors.

  3. Computer Based Road Accident Reconstruction Experiences

    Directory of Open Access Journals (Sweden)

    Milan Batista

    2005-03-01

    Full Text Available Since road accident analyses and reconstructions are increasingly based on specific computer software for simulation of vehicle driving dynamics and collision dynamics, and for simulation of a set of trial runs from which the model that best describes a real event can be selected, the paper presents an overview of some computer software and methods available to accident reconstruction experts. Besides being time-saving, when properly used such computer software can provide more authentic and more trustworthy accident reconstruction; therefore, practical experiences with computer software tools for road accident reconstruction obtained in the Transport Safety Laboratory at the Faculty for Maritime Studies and Transport of the University of Ljubljana are presented and discussed. This paper also addresses software technology for extracting maximum information from the accident photo-documentation to support accident reconstruction based on the simulation software, as well as the field work of reconstruction experts or police on the road accident scene defined by this technology.

  4. Diagnostic significance and therapeutic consequences of computed tomography (patient outcome research). Pt. 1. Diagnosis in traumatology

    International Nuclear Information System (INIS)

    Schroeder, R.J.; Hidajat, N.; Vogl, T.; Haas, N.; Suedkamp, N.; Schedel, H.; Felix, R.

    1995-01-01

    During 1993, 201 primary traumatologic patients underwent 230 computed tomography examinations. 87% of the CTs were performed completely without contrast media, 2.6% exclusively supported by intravenously given contrast media, 9.1% in both ways, and 1.3% after intra-articular contrast media administration. 97.4% served for primary diagnostic purposes and 2.6% for the control of therapeutic results. In 47.8% of the CTs, the principal diagnosis was known before CT. In 52.2%, diagnosis was impossible by other methods without CT. The CT diagnoses were correctly positive in 58.7% and correctly negative in 41.3%. 60.9% of CTs demonstrated a missing indication for operation in the examined body region; in 39.1% the operation followed. (orig.) [de]

  5. Application of data base management systems for developing experimental data base using ES computers

    International Nuclear Information System (INIS)

    Vasil'ev, V.I.; Karpov, V.V.; Mikhajlyuk, D.N.; Ostroumov, Yu.A.; Rumyantsev, A.N.

    1987-01-01

    Modern data base management systems (DBMS) are widely used for the development and operation of different data bases by data processing systems in economics, planning and management. But up to now, the development and operation of data masses with experimental physical data on ES computers has been based mainly on the traditional technology of sequential or index-sequential files. The principal statements on the applicability of DBMS technology for compiling and operating data bases with data on physical experiments are formulated based on an analysis of DBMS capabilities. It is shown that application of a DBMS allows the overall computational costs of developing and operating data bases to be essentially reduced, and the amount of stored experimental data to be decreased when analyzing the information content of data.

  6. Consequences of Urban Stability Conditions for Computational Fluid Dynamics Simulations of Urban Dispersion

    Energy Technology Data Exchange (ETDEWEB)

    Lundquist, J K; Chan, S T

    2005-11-30

    The validity of omitting stability considerations when simulating transport and dispersion in the urban environment is explored using observations from the Joint URBAN 2003 field experiment and computational fluid dynamics simulations of that experiment. Four releases of sulfur hexafluoride, during two daytime and two nighttime intensive observing periods, are simulated using the building-resolving computational fluid dynamics model FEM3MP, solving the Reynolds Averaged Navier-Stokes equations with two options of turbulence parameterization. One option omits stability effects but has a superior turbulence parameterization using a non-linear eddy viscosity (NEV) approach, while the other considers buoyancy effects with a simple linear eddy viscosity (LEV) approach for turbulence parameterization. Model performance metrics are calculated by comparison with observed winds and tracer data in the downtown area, and with observed winds and turbulence kinetic energy (TKE) profiles at a location immediately downwind of the central business district (CBD) in the area we label as the urban shadow. Model predictions of winds, concentrations, profiles of wind speed, wind direction, and friction velocity are generally consistent with and compare reasonably well with the field observations. Simulations using the NEV turbulence parameterization generally exhibit better agreement with observations. To further explore the assumption of a neutrally stable atmosphere within the urban area, TKE budget profiles slightly downwind of the urban wake region in the urban shadow are examined. Dissipation and shear production are the largest terms, which may be calculated directly. The advection of TKE is calculated as a residual; as would be expected downwind of an urban area, the advection of TKE produced within the urban area is a very large term. Buoyancy effects may be neglected in favor of advection, shear production, and dissipation. For three of the IOPs, buoyancy
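
    The residual bookkeeping described above reduces to simple arithmetic on the budget terms; a sketch with invented profile values (the real terms would come from the FEM3MP fields):

```python
import numpy as np

# Hypothetical vertical profiles of TKE budget terms (m^2 s^-3) at heights z.
z           = np.array([10., 20., 40., 80.])
shear_prod  = np.array([4.0e-3, 2.5e-3, 1.2e-3, 0.5e-3])
buoyancy    = np.array([1.0e-4, 0.8e-4, 0.5e-4, 0.2e-4])  # small vs. other terms
dissipation = np.array([6.0e-3, 4.0e-3, 2.0e-3, 1.0e-3])

# Steady-state budget: shear + buoyancy + advection - dissipation = 0,
# so the (unmeasured) advection term is recovered as the residual.
advection = dissipation - shear_prod - buoyancy
for zi, adv in zip(z, advection):
    print(f"z={zi:5.0f} m  advected TKE: {adv:.2e} m^2/s^3")
```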

  7. Consequences of fiducial marker error on three-dimensional computer animation of the temporomandibular joint

    Science.gov (United States)

    Leader, J. Ken, III; Boston, J. Robert; Rudy, Thomas E.; Greco, Carol M.; Zaki, Hussein S.

    2001-05-01

    Jaw motion has been used to diagnose jaw pain patients, and we have developed a 3D computer animation technique to study jaw motion. A customized dental clutch was worn during motion, and its consistent and rigid placement was a concern. The experimental protocol involved mandibular movements (vertical opening) and MR imaging. The clutch contained three motion markers used to collect kinematic data and four MR markers used as fiducial markers in the MR images. Fiducial marker misplacement was mimicked by analytically perturbing the position of the MR markers +/- 2, +/- 4, and +/- 6 degrees in the three anatomical planes. The percent difference in the kinematic parameters between the original and perturbed MR marker positions was then calculated. The maximum differences across all perturbations for axial rotation, coronal rotation, sagittal rotation, axial translation, coronal translation, and sagittal translation were 176.85%, 191.84%, 0.64%, 9.76%, 80.75%, and 8.30%, respectively, for perturbing all MR markers, and 86.47%, 93.44%, 0.23%, 7.08%, 42.64%, and 13.64%, respectively, for perturbing one MR marker. The parameters representing movement in the sagittal plane, the dominant plane in vertical opening, were determined to be reasonably robust, while secondary movements in the axial and coronal planes were not considered robust.
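
    A sketch of the perturbation analysis under simplifying assumptions: hypothetical marker coordinates, a single rotation axis standing in for the three anatomical planes, and a stand-in kinematic parameter:

```python
import numpy as np

def rot_z(deg):
    """Rotation about one axis, mimicking an angular marker misplacement."""
    a = np.radians(deg)
    return np.array([[np.cos(a), -np.sin(a), 0],
                     [np.sin(a),  np.cos(a), 0],
                     [0,          0,         1]])

# Hypothetical fiducial (MR) marker positions on the dental clutch (mm).
markers = np.array([[10., 5., 0.], [0., 10., 0.],
                    [-10., -5., 0.], [0., -10., 0.]])

def kinematic_parameter(pts):
    """Stand-in for a derived kinematic quantity: the axial orientation
    of one marker pair."""
    v = pts[0] - pts[2]
    return np.degrees(np.arctan2(v[1], v[0]))

baseline = kinematic_parameter(markers)
for deg in (2, 4, 6):                    # the +/-2, +/-4, +/-6 deg perturbations
    for sign in (+1, -1):
        perturbed = markers @ rot_z(sign * deg).T
        value = kinematic_parameter(perturbed)
        pct = abs(value - baseline) / max(abs(baseline), 1e-9) * 100
        print(f"perturbation {sign*deg:+d} deg -> {pct:.1f}% difference")
```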

  8. Computer-Game-Based Tutoring of Mathematics

    Science.gov (United States)

    Ke, Fengfeng

    2013-01-01

    This in-situ, descriptive case study examined the potential of implementing computer mathematics games as an anchor for tutoring of mathematics. Data were collected from middle school students at a rural pueblo school and an urban Hispanic-serving school, through in-field observation, content analysis of game-based tutoring-learning interactions,…

  9. A CAMAC-based laboratory computer system

    International Nuclear Information System (INIS)

    Westphal, G.P.

    1975-01-01

    A CAMAC-based laboratory computer network is described; by sharing a common mass memory, it offers distinct advantages over slow and core-consuming single-processor installations. A fast compiler-BASIC, with extensions for CAMAC and real-time work, provides a convenient means for interactive experiment control.

  10. Computer-Based Testing: Test Site Security.

    Science.gov (United States)

    Rosen, Gerald A.

    Computer-based testing places great burdens on all involved parties to ensure test security. A task analysis of test site security might identify the areas of protecting the test, protecting the data, and protecting the environment as essential issues in test security. Protecting the test involves transmission of the examinations, identifying the…

  11. Computer based training: Technology and trends

    International Nuclear Information System (INIS)

    O'Neal, A.F.

    1986-01-01

    Computer Based Training (CBT) offers great potential for revolutionizing the training environment. Tremendous advances in computer cost performance, instructional design science, and authoring systems have combined to put CBT within the reach of all. The ability of today's CBT systems to implement powerful training strategies, simulate complex processes and systems, and individualize and control the training process make it certain that CBT will now, at long last, live up to its potential. This paper reviews the major technologies and trends involved and offers some suggestions for getting started in CBT

  12. Internet-based mental health services in Norway and Sweden: characteristics and consequences.

    Science.gov (United States)

    Andersen, Anders Johan W; Svensson, Tommy

    2013-03-01

    Internet-based mental health services are increasing rapidly. However, national surveys are incomplete, and the consequences of such services are poorly discussed. This study describes the characteristics of 60 Internet-based mental health services in Norway and Sweden and discusses their social consequences. More than half of the services were offered by voluntary organisations and targeted towards young people. Professionals answered service users' questions in 60% of the services. Eight major themes were identified. These characteristics may indicate a shift in the delivery of mental health services in both countries, and imply changes in the understanding of mental health.

  13. Computer-based and web-based radiation safety training

    Energy Technology Data Exchange (ETDEWEB)

    Owen, C., LLNL

    1998-03-01

    The traditional approach to delivering radiation safety training has been to provide a stand-up lecture on the topic, with the possible aid of video, and to repeat the same material periodically. New approaches to meeting training requirements are needed to address the advent of flexible work hours and telecommuting, and to better accommodate individuals learning at their own pace. Computer-based and web-based radiation safety training can provide this alternative. Computer-based and web-based training is an interactive form of learning that the student controls, resulting in enhanced and focused learning at a time most often chosen by the student.

  14. Computer-Based Wireless Advertising Communication System

    Directory of Open Access Journals (Sweden)

    Anwar Al-Mofleh

    2009-10-01

    Full Text Available In this paper we developed a computer based wireless advertising communication system (CBWACS) that enables the user to advertise whatever he wants from his own office to the screen in front of the customer via a wireless communication system. This system consists of two PIC microcontrollers, a transmitter, a receiver, an LCD, a serial cable and an antenna. The main advantages of the system are its wireless structure and that the system is less susceptible to noise and other interference because it uses digital communication techniques.

  15. Computer-based theory of strategies

    Energy Technology Data Exchange (ETDEWEB)

    Findler, N V

    1983-01-01

    Some of the objectives and working tools of a new area of study, tentatively called theory of strategies, are described. It is based on the methodology of artificial intelligence, decision theory, operations research and digital gaming. The latter refers to computing activity that incorporates model building, simulation and learning programs in conflict situations. Three long-term projects which aim at automatically analyzing and synthesizing strategies are discussed. 27 references.

  16. Evaluation of Current Computer Models Applied in the DOE Complex for SAR Analysis of Radiological Dispersion & Consequences

    Energy Technology Data Exchange (ETDEWEB)

    O'Kula, K. R. [Savannah River Site (SRS), Aiken, SC (United States); East, J. M. [Savannah River Site (SRS), Aiken, SC (United States); Weber, A. H. [Savannah River Site (SRS), Aiken, SC (United States); Savino, A. V. [Savannah River Site (SRS), Aiken, SC (United States); Mazzola, C. A. [Savannah River Site (SRS), Aiken, SC (United States)

    2003-01-01

    The evaluation of atmospheric dispersion/radiological dose analysis codes included fifteen models identified in authorization basis safety analyses at DOE facilities, or from regulatory and research agencies where past or current work warranted inclusion of a computer model. All computer codes examined were reviewed using general and specific evaluation criteria developed by the Working Group. The criteria were based on DOE Orders and other regulatory standards and guidance for performing bounding and conservative dose calculations. Three categories of criteria were included: (1) Software Quality/User Interface; (2) Technical Model Adequacy; and (3) Application/Source Term Environment. A consensus-based, limited quantitative ranking process was used to establish an order of model preference, both as an overall conclusion and under specific conditions.

  17. Confidential benchmarking based on multiparty computation

    DEFF Research Database (Denmark)

    Damgård, Ivan Bjerre; Damgård, Kasper Lyneborg; Nielsen, Kurt

    We report on the design and implementation of a system that uses multiparty computation to enable banks to benchmark their customers' confidential performance data against a large representative set of confidential performance data from a consultancy house. The system ensures that both the banks' and the consultancy house's data stays confidential; the banks as clients learn nothing but the computed benchmarking score. In the concrete business application, the developed prototype helps Danish banks to find the most efficient customers among a large and challenging group of agricultural customers with too much debt. We propose a model based on linear programming for doing the benchmarking and implement it using the SPDZ protocol by Damgård et al., which we modify using a new idea that allows clients to supply data and get output without having to participate in the preprocessing phase and without keeping
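
    The benchmarking model is linear programming; in the deployed system it is evaluated under the SPDZ multiparty protocol so no party sees the inputs. A plain-text (non-MPC) sketch of a DEA-style input-oriented efficiency LP with invented data; the paper's exact formulation may differ:

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical farm data: inputs (debt, costs) and an output (revenue).
X = np.array([[4., 2.], [6., 3.], [5., 5.], [8., 4.]])  # inputs, one row per unit
Y = np.array([[3.], [5.], [4.], [6.]])                   # outputs, one row per unit

def dea_efficiency(k):
    """Input-oriented CCR efficiency of unit k (1.0 = on the frontier).
    LP variables: [theta, lambda_1..lambda_n]."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.zeros(1 + n); c[0] = 1.0                     # minimize theta
    A_in  = np.hstack([-X[[k]].T, X.T])                 # sum(l*x) <= theta*x_k
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])         # sum(l*y) >= y_k
    A = np.vstack([A_in, A_out])
    b = np.concatenate([np.zeros(m), -Y[k]])
    res = linprog(c, A_ub=A, b_ub=b, bounds=[(0, None)] * (1 + n))
    return res.x[0]

for k in range(len(X)):
    print(f"unit {k}: efficiency {dea_efficiency(k):.2f}")
```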

  18. "Transit data"-based MST computation

    Directory of Open Access Journals (Sweden)

    Thodoris Karatasos

    2017-10-01

    Full Text Available In this work, we present an innovative image recognition technique which is based on the exploitation of transit data in images or simple photographs of sites of interest. Our objective is to automatically transform real-world images to graphs and, then, compute Minimum Spanning Trees (MST) in them. We apply this framework and present an application which automatically computes efficient construction plans (for escalators or low-emission hot spots) for connecting all points of interest in cultural sites, i.e., archaeological sites, museums, galleries, etc., aiming to facilitate global physical access to cultural heritage and artistic work and make it accessible to all groups of the population.
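
    Once the image has been turned into a weighted graph, the MST step is standard; a self-contained sketch using Prim's algorithm on a hypothetical points-of-interest graph:

```python
import heapq

def minimum_spanning_tree(n, edges):
    """Prim's algorithm: edges are (weight, u, v); returns MST edge list."""
    adj = {u: [] for u in range(n)}
    for w, u, v in edges:
        adj[u].append((w, u, v))
        adj[v].append((w, v, u))
    visited, mst = {0}, []
    frontier = list(adj[0])
    heapq.heapify(frontier)
    while frontier and len(visited) < n:
        w, u, v = heapq.heappop(frontier)
        if v in visited:
            continue                      # skip edges into the tree
        visited.add(v)
        mst.append((u, v, w))
        for e in adj[v]:
            if e[2] not in visited:
                heapq.heappush(frontier, e)
    return mst

# Hypothetical points of interest; weights could be walking distances
# extracted from the photograph ("transit data").
edges = [(4, 0, 1), (1, 0, 2), (3, 1, 2), (2, 1, 3), (5, 2, 3)]
print(minimum_spanning_tree(4, edges))
```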

  19. Evolutionary Based Solutions for Green Computing

    CERN Document Server

    Kołodziej, Joanna; Li, Juan; Zomaya, Albert

    2013-01-01

    Today’s highly parameterized large-scale distributed computing systems may be composed of a large number of various components (computers, databases, etc) and must provide a wide range of services. The users of such systems, located at different (geographical or managerial) network clusters, may have limited access to the system’s services and resources, and different, often conflicting, expectations and requirements. Moreover, the information and data processed in such dynamic environments may be incomplete, imprecise, fragmentary, and overloading. All of the above mentioned issues require some intelligent scalable methodologies for the management of the whole complex structure, which unfortunately may increase the energy consumption of such systems. This book, in its eight chapters, addresses the fundamental issues related to energy usage and optimal low-cost system design in high performance "green computing" systems. The recent evolutionary and general metaheuristic-based solutions ...

  20. Detecting Soft Errors in Stencil based Computations

    Energy Technology Data Exchange (ETDEWEB)

    Sharma, V. [Univ. of Utah, Salt Lake City, UT (United States); Gopalkrishnan, G. [Univ. of Utah, Salt Lake City, UT (United States); Bronevetsky, G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-05-06

    Given the growing emphasis on system resilience, it is important to develop software-level error detectors that help trap hardware-level faults with reasonable accuracy while minimizing false alarms as well as the performance overhead introduced. We present a technique that approaches this idea by taking stencil computations as our target, and synthesizing detectors based on machine learning. In particular, we employ linear regression to generate computationally inexpensive models which form the basis for error detection. Our technique has been incorporated into a new open-source library called SORREL. In addition to reporting encouraging experimental results, we demonstrate techniques that help reduce the size of training data. We also discuss the efficacy of various detectors synthesized, as well as our future plans.
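
    The detector idea can be sketched compactly: fit a cheap linear model that mimics the stencil operator, then flag values whose residual exceeds a tolerance. Everything below (the 1-D averaging stencil, the tolerance) is an illustrative assumption, not SORREL code:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1-D stencil data: each row holds a point's two neighbors;
# the target is the point's next value under a simple averaging stencil.
u = rng.random(1000)
X = np.column_stack([u[:-2], u[2:]])          # left and right neighbors
y = 0.5 * (u[:-2] + u[2:])                    # clean stencil output

# Train an inexpensive linear model that mimics the stencil operator.
coeffs, *_ = np.linalg.lstsq(
    np.column_stack([X, np.ones(len(X))]), y, rcond=None)

def detect(neighbors, observed, tol=1e-6):
    """Flag a potential soft error when the observed value strays from
    the model's prediction by more than the tolerance."""
    predicted = neighbors @ coeffs[:2] + coeffs[2]
    return abs(observed - predicted) > tol

clean = detect(np.array([0.4, 0.6]), 0.5)     # consistent with the stencil
flipped = detect(np.array([0.4, 0.6]), 0.75)  # bit-flip-like corruption
print(f"clean flagged: {clean}, corrupted flagged: {flipped}")
```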

  1. Computational chemistry and metal-based radiopharmaceuticals

    International Nuclear Information System (INIS)

    Neves, M.; Fausto, R.

    1998-01-01

    Computer-assisted techniques have found extensive use in the design of organic pharmaceuticals but have not been widely applied to metal complexes, particularly radiopharmaceuticals. Some examples of computer-generated structures of complexes of In, Ga and Tc with N, S, O and P donor ligands are described. Besides parameters directly related to molecular geometries, molecular properties of the predicted structures, such as ionic charges or dipole moments, are considered to be related to biodistribution studies. The structures of a series of neutral oxo Tc-biguanide complexes are predicted by molecular mechanics calculations, and their interactions with water molecules or peptide chains are correlated with experimental data on partition coefficients and percentage of human protein binding. The results stress the interest of using molecular modelling to predict molecular properties of metal-based radiopharmaceuticals, which can be successfully correlated with results of in vitro studies. (author)

  2. The HEP Software and Computing Knowledge Base

    Science.gov (United States)

    Wenaus, T.

    2017-10-01

    HEP software today is a rich and diverse domain in itself and exists within the mushrooming world of open source software. As HEP software developers and users we can be more productive and effective if our work and our choices are informed by a good knowledge of what others in our community have created or found useful. The HEP Software and Computing Knowledge Base, hepsoftware.org, was created to facilitate this by serving as a collection point and information exchange on software projects and products, services, training, computing facilities, and relating them to the projects, experiments, organizations and science domains that offer them or use them. It was created as a contribution to the HEP Software Foundation, for which a HEP S&C knowledge base was a much requested early deliverable. This contribution will motivate and describe the system, what it offers, its content and contributions both existing and needed, and its implementation (node.js based web service and javascript client app) which has emphasized ease of use for both users and contributors.

  3. Computational steering of GEM based detector simulations

    Science.gov (United States)

    Sheharyar, Ali; Bouhali, Othmane

    2017-10-01

    Gas based detector R&D relies heavily on full simulation of detectors and their optimization before final prototypes can be built and tested. These simulations, in particular those with complex scenarios such as high detector voltages or gases with larger gains, are computationally intensive and may take several days or weeks to complete. These long-running simulations usually run on high-performance computers in batch mode. If the results lead to unexpected behavior, then the simulation might be rerun with different parameters. However, the simulations (or jobs) may have to wait in a queue until they get a chance to run again, because the supercomputer is a shared resource that maintains a queue of other user programs as well and executes them as time and priorities permit. This may result in inefficient resource utilization and an increase in the turnaround time of the scientific experiment. To overcome this issue, monitoring the behavior of a simulation while it is running (live) is essential. In this work, we employ the computational steering technique by coupling the detector simulations with a visualization package named VisIt to enable the exploration of the live data as it is produced by the simulation.

  4. A computer-based purchase management system

    International Nuclear Information System (INIS)

    Kuriakose, K.K.; Subramani, M.G.

    1989-01-01

    The details are given of a computer-based purchase management system developed to meet the specific requirements of the Madras Regional Purchase Unit (MRPU). However, it can be easily modified to meet the requirements of any other purchase department. It covers various operations of MRPU starting from indent processing to preparation of purchase orders and reminders. In order to enable timely management action and control, facilities are provided to generate the necessary management information reports. The scope for further work is also discussed. The system is completely menu driven and user friendly. Appendices A and B contain the menus implemented and the sample outputs, respectively. (author)

  5. Secure information transfer based on computing reservoir

    Energy Technology Data Exchange (ETDEWEB)

    Szmoski, R.M.; Ferrari, F.A.S. [Department of Physics, Universidade Estadual de Ponta Grossa, 84030-900, Ponta Grossa (Brazil); Pinto, S.E. de S, E-mail: desouzapinto@pq.cnpq.br [Department of Physics, Universidade Estadual de Ponta Grossa, 84030-900, Ponta Grossa (Brazil); Baptista, M.S. [Institute for Complex Systems and Mathematical Biology, SUPA, University of Aberdeen, Aberdeen (United Kingdom); Viana, R.L. [Department of Physics, Universidade Federal do Parana, 81531-990, Curitiba, Parana (Brazil)

    2013-04-01

    There is a broad area of research to ensure that information is transmitted securely. Within this scope, chaos-based cryptography takes a prominent role due to its nonlinear properties. Using these properties, we propose a secure mechanism for transmitting data that relies on chaotic networks. We use a nonlinear on–off device to cipher the message, and the transfer entropy to retrieve it. We analyze the system capability for sending messages, and we obtain expressions for the operating time. We demonstrate the system efficiency for a wide range of parameters. We find similarities between our method and the reservoir computing.

  6. Computer Profiling Based Model for Investigation

    OpenAIRE

    Neeraj Choudhary; Nikhil Kumar Singh; Parmalik Singh

    2011-01-01

    Computer profiling is used for computer forensic analysis; this paper proposes and elaborates on a novel model for use in computer profiling, the computer profiling object model. The computer profiling object model is an information model which models a computer as objects with various attributes and inter-relationships. These together provide the information necessary for a human investigator or an automated reasoning engine to make judgments as to the probable usage and evidentiary value of a comp...

  7. GPU-based cone beam computed tomography.

    Science.gov (United States)

    Noël, Peter B; Walczak, Alan M; Xu, Jinhui; Corso, Jason J; Hoffmann, Kenneth R; Schafer, Sebastian

    2010-06-01

    The use of cone beam computed tomography (CBCT) is growing in the clinical arena due to its ability to provide 3D information during interventions, its high diagnostic quality (sub-millimeter resolution), and its short scanning times (60 s). In many situations, the short scanning time of CBCT is followed by a time-consuming 3D reconstruction. The standard reconstruction algorithm for CBCT data is filtered backprojection, which for a volume of size 256^3 takes up to 25 min on a standard system. Recent developments in the area of Graphics Processing Units (GPUs) make it possible to have access to high-performance computing solutions at a low cost, allowing their use in many scientific problems. We have implemented an algorithm for 3D reconstruction of CBCT data using the Compute Unified Device Architecture (CUDA) provided by NVIDIA (NVIDIA Corporation, Santa Clara, California), which was executed on an NVIDIA GeForce GTX 280. Our implementation reduces reconstruction times from minutes, and perhaps hours, to a matter of seconds, while also giving the clinician the ability to view 3D volumetric data at higher resolutions. We evaluated our implementation on ten clinical data sets and one phantom data set to observe whether differences occur between CPU- and GPU-based reconstructions. By using our approach, the computation time for 256^3 is reduced from 25 min on the CPU to 3.2 s on the GPU. The GPU reconstruction time for 512^3 volumes is 8.5 s. Copyright 2009 Elsevier Ireland Ltd. All rights reserved.
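
    A much-simplified CPU sketch of voxel-driven backprojection; the CUDA implementation essentially assigns the per-voxel work below to individual GPU threads (parallel-beam 2D geometry here, rather than true cone-beam, and random data):

```python
import numpy as np

def backproject(sinogram, angles, size):
    """Voxel-driven backprojection (CPU sketch). In a CUDA version each
    GPU thread evaluates this inner computation for one voxel, which is
    what turns minutes of CPU reconstruction into seconds."""
    recon = np.zeros((size, size))
    center = size // 2
    ys, xs = np.mgrid[:size, :size] - center
    for proj, theta in zip(sinogram, angles):
        # Detector coordinate of each voxel under parallel projection.
        t = (xs * np.cos(theta) + ys * np.sin(theta) + center).astype(int)
        valid = (t >= 0) & (t < size)
        recon[valid] += proj[t[valid]]
    return recon / len(angles)

# Hypothetical sinogram: 90 views of a 64-pixel detector.
angles = np.linspace(0, np.pi, 90, endpoint=False)
sino = np.random.rand(90, 64)
volume = backproject(sino, angles, size=64)
print(volume.shape)
```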

  8. A novel polar-based human face recognition computational model

    Directory of Open Access Journals (Sweden)

    Y. Zana

    2009-07-01

    Full Text Available Motivated by a recently proposed biologically inspired face recognition approach, we investigated the relation between human behavior and a computational model based on Fourier-Bessel (FB) spatial patterns. We measured human recognition performance on FB-filtered face images using an 8-alternative forced-choice method. Test stimuli were generated by converting the images from the spatial to the FB domain, filtering the resulting coefficients with a band-pass filter, and finally taking the inverse FB transformation of the filtered coefficients. The performance of the computational models was tested using a simulation of the psychophysical experiment. In the FB model, face images were first filtered by simulated V1-type neurons and later analyzed globally for their content of FB components. In general, there was higher human contrast sensitivity to radially than to angularly filtered images, but both functions peaked at the 11.3-16 frequency interval. The FB-based model presented similar behavior with regard to peak position and relative sensitivity, but had a wider frequency bandwidth and a narrower response range. The response patterns of two alternative models, based on local FB analysis and on raw luminance, strongly diverged from the human behavior patterns. These results suggest that human performance can be constrained by the type of information conveyed by polar patterns, and consequently that humans might use FB-like spatial patterns in face processing.

  9. Transforming bases to bytes: Molecular computing with DNA

    Indian Academy of Sciences (India)

    Despite the popular image of silicon-based computers for computation, an embryonic field of molecular computation is emerging, where molecules in solution perform computational ...

  10. An Overview of Computer-Based Natural Language Processing.

    Science.gov (United States)

    Gevarter, William B.

    Computer-based Natural Language Processing (NLP) is the key to enabling humans and their computer-based creations to interact with machines using natural languages (English, Japanese, German, etc.) rather than formal computer languages. NLP is a major research area in the fields of artificial intelligence and computational linguistics. Commercial…

  11. Computer Animation Based on Particle Methods

    Directory of Open Access Journals (Sweden)

    Rafal Wcislo

    1999-01-01

    Full Text Available The paper presents the main issues of a computer animation of a set of elastic macroscopic objects based on the particle method. The main assumption of the generated animations is to achieve very realistic movements in a scene observed on the computer display. The objects (solid bodies) interact mechanically with each other. The movements and deformations of solids are calculated using the particle method. Phenomena connected with the behaviour of solids in the gravitational field, their deformations caused by collisions and interactions with the optional liquid medium are simulated. The simulation of the liquid is performed using the cellular automata method. The paper presents both simulation schemes (particle method and cellular automata rules) and the method of combining them in a single animation program. In order to speed up the execution of the program, a parallel version based on a network of workstations was developed. The paper describes the methods of parallelization and considers problems of load-balancing, collision detection, process synchronization and distributed control of the animation.
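
    The elastic-solid core of the particle method can be sketched as particles coupled by springs and integrated under gravity; collisions and the cellular-automata liquid are omitted, and all constants are hypothetical:

```python
import numpy as np

# Minimal particle-method step for an elastic body: particles connected by
# springs, integrated under gravity with semi-implicit Euler.
pos = np.array([[0.0, 1.0], [0.1, 1.0]])      # two particles of a solid
vel = np.zeros_like(pos)
springs = [(0, 1, 0.1)]                        # (i, j, rest length)
k, mass, dt = 50.0, 1.0, 0.01
g = np.array([0.0, -9.81])

for step in range(3):
    force = np.tile(mass * g, (len(pos), 1))   # gravity on every particle
    for i, j, rest in springs:
        d = pos[j] - pos[i]
        length = np.linalg.norm(d)
        f = k * (length - rest) * d / length   # Hooke's law along the spring
        force[i] += f
        force[j] -= f
    vel += dt * force / mass                   # semi-implicit Euler update
    pos += dt * vel
    print(step, pos.round(4))
```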

  12. COMPUTER-BASED REASONING SYSTEMS: AN OVERVIEW

    Directory of Open Access Journals (Sweden)

    CIPRIAN CUCU

    2012-12-01

    Full Text Available Argumentation is nowadays seen both as a skill that people use in various aspects of their lives, and as an educational technique that can support the transfer or creation of knowledge, thus aiding in the development of other skills (e.g. communication, critical thinking) or attitudes. However, teaching argumentation and teaching with argumentation is still a rare practice, mostly due to the lack of available resources such as time or expert human tutors that are specialized in argumentation. Intelligent computer systems (i.e. systems that implement an inner representation of particular knowledge and try to emulate the behavior of humans) could allow more people to understand the purpose, techniques and benefits of argumentation. The proposed paper investigates the state-of-the-art concepts of computer-based argumentation used in education and tries to develop a conceptual map, showing benefits, limitations and relations between various concepts, focusing on the duality “learning to argue – arguing to learn”.

  13. Computer-based training at Sellafield

    International Nuclear Information System (INIS)

    Cartmell, A.; Evans, M.C.

    1986-01-01

    British Nuclear Fuels Limited (BNFL) operates the United Kingdom's spent-fuel receipt, storage, and reprocessing complex at Sellafield. Spent fuel from graphite-moderated CO2-cooled Magnox reactors has been reprocessed at Sellafield for 22 yr. Spent fuel from light water and advanced gas reactors is stored pending reprocessing in the Thermal Oxide Reprocessing Plant currently being constructed. The range of knowledge and skills needed for plant operation, construction, and commissioning represents a formidable training requirement. In addition, employees need to be acquainted with company practices and procedures. Computer-based training (CBT) is expected to play a significant role in this process. In this paper, current applications of CBT to the field of nuclear criticality safety are described and plans for the immediate future are outlined.

  14. Computer based training for oil spill management

    International Nuclear Information System (INIS)

    Goodman, R.

    1993-01-01

    Large oil spills are infrequent occurrences, which poses a particular problem for training oil spill response staff and for maintaining a high level of response readiness. Conventional training methods involve table-top simulations to develop tactical and strategic response skills and boom-deployment exercises to maintain operational readiness. Both forms of training are quite effective, but they are very time-consuming to organize, are expensive to conduct, and tend to become repetitious. To provide a variety of response experiences, a computer-based system of oil spill response training has been developed which can supplement a table-top training program. Using a graphic interface, a realistic and challenging computerized oil spill response simulation has been produced. Integral to the system is a program editing tool which allows the teacher to develop a custom training exercise for the area of interest to the student. 1 ref

  15. A High Performance COTS Based Computer Architecture

    Science.gov (United States)

    Patte, Mathieu; Grimoldi, Raoul; Trautner, Roland

    2014-08-01

    Using Commercial Off The Shelf (COTS) electronic components for space applications is a long standing idea. Indeed the difference in processing performance and energy efficiency between radiation hardened components and COTS components is so important that COTS components are very attractive for use in mass and power constrained systems. However using COTS components in space is not straightforward as one must account with the effects of the space environment on the COTS components behavior. In the frame of the ESA funded activity called High Performance COTS Based Computer, Airbus Defense and Space and its subcontractor OHB CGS have developed and prototyped a versatile COTS based architecture for high performance processing. The rest of the paper is organized as follows: in a first section we will start by recapitulating the interests and constraints of using COTS components for space applications; then we will briefly describe existing fault mitigation architectures and present our solution for fault mitigation based on a component called the SmartIO; in the last part of the paper we will describe the prototyping activities executed during the HiP CBC project.

  16. Computer Networks as a New Data Base.

    Science.gov (United States)

    Beals, Diane E.

    1992-01-01

    Discusses the use of communication on computer networks as a data source for psychological, social, and linguistic research. Differences between computer-mediated communication and face-to-face communication are described, the Beginning Teacher Computer Network is discussed, and examples of network conversations are appended. (28 references) (LRW)

  17. Quantum computing based on semiconductor nanowires

    NARCIS (Netherlands)

    Frolov, S.M.; Plissard, S.R.; Nadj-Perge, S.; Kouwenhoven, L.P.; Bakkers, E.P.A.M.

    2013-01-01

    A quantum computer will have computational power beyond that of conventional computers, which can be exploited for solving important and complex problems, such as predicting the conformations of large biological molecules. Materials play a major role in this emerging technology, as they can enable

  18. CSNS computing environment Based on OpenStack

    Science.gov (United States)

    Li, Yakang; Qi, Fazhi; Chen, Gang; Wang, Yanming; Hong, Jianshu

    2017-10-01

    Cloud computing allows for more flexible configuration of IT resources and optimized hardware utilization; it can also provide computing services according to actual need. We are applying this computing mode to the China Spallation Neutron Source (CSNS) computing environment. Firstly, the CSNS experiment and its computing scenarios and requirements are introduced in this paper. Secondly, the design and practice of a cloud computing platform based on OpenStack are demonstrated from the aspects of the cloud computing system framework, network, storage and so on. Thirdly, some improvements we made to OpenStack are discussed further. Finally, the current status of the CSNS cloud computing environment is summarized at the end of this paper.

  19. An Applet-based Anonymous Distributed Computing System.

    Science.gov (United States)

    Finkel, David; Wills, Craig E.; Ciaraldi, Michael J.; Amorin, Kevin; Covati, Adam; Lee, Michael

    2001-01-01

    Defines anonymous distributed computing systems and focuses on the specifics of a Java, applet-based approach for large-scale, anonymous, distributed computing on the Internet. Explains the possibility of a large number of computers participating in a single computation and describes a test of the functionality of the system. (Author/LRW)

  20. BASES OF STRESS AND ITS CONSEQUENCES THERAPY AND PROPHYLAXIS IN CHILDREN AND ADOLESCENTS

    Directory of Open Access Journals (Sweden)

    E. S. Akarachkova

    2013-01-01

    Full Text Available The physical health of a child is inseparable from his emotional state. It has been established that stress and negative life situations (for example, sudden changes in environment and daily routine, the beginning and end of the school year, examinations and preparation for them, parents' divorce or dismissal, parting with close friends, interviews, etc.) are responsible for an increase in distress symptoms in children and adolescents and decrease their capacity for self-control. Moreover, psychological processes interfere with life activity and can lead to negative consequences in the future. This article argues that emotional and other types of stress can prevent normal psychological and social development in many children. Such disturbances can lead to severe long-term consequences and increase the demand for medical resources, which requires measures aimed at improving resistance to stress in childhood and adolescence. During this millennium, therapeutic and preventive approaches based on the molecular and cellular magnesium-dependent pathogenetic mechanisms of stress formation and the consequences of magnesium insufficiency have gained more and more popularity. Recommendations on following a certain regimen of physical activity and on nutrition appropriate for the stress condition do not lose their urgency.

  1. A guide to the use of TIRION. A computer programme for the calculation of the consequences of releasing radioactive material to the atmosphere

    International Nuclear Information System (INIS)

    Kaiser, G.D.

    1976-11-01

    A brief description is given of the contents of TIRION, which is a computer program that has been written for use in calculations of the consequences of releasing radioactive material to the atmosphere. This is followed by a section devoted to an account of the control and data cards that make up the input to TIRION. (author)

  2. Novel computer-based endoscopic camera

    Science.gov (United States)

    Rabinovitz, R.; Hai, N.; Abraham, Martin D.; Adler, Doron; Nissani, M.; Fridental, Ron; Vitsnudel, Ilia

    1995-05-01

    We have introduced a computer-based endoscopic camera which includes (a) unique real-time digital image processing to optimize image visualization by reducing overexposed glared areas and brightening dark areas, and by accentuating sharpness and fine structures, and (b) patient data documentation and management. The image processing is based on i Sight's iSP1000™ digital video processor chip and Adaptive Sensitivity™ patented scheme for capturing and displaying images with a wide dynamic range of light, taking into account local neighborhood image conditions and global image statistics. It provides the medical user with the ability to view images under difficult lighting conditions, without losing details 'in the dark' or in completely saturated areas. The patient data documentation and management allows storage of images (approximately 1 MB per image for a full 24 bit color image) to any storage device installed in the camera, or to external host media via network. The patient data included with every image describe essential information on the patient and procedure. The operator can assign custom data descriptors, and can search for the stored image/data by typing any image descriptor. The camera optics has an extended zoom range of f = 20-45 mm, allowing control of the diameter of the field which is displayed on the monitor such that the complete field of view of the endoscope can be displayed on the whole area of the screen. All these features provide a versatile endoscopic camera with excellent image quality and documentation capabilities.

  3. Using a micro computer based test bank

    International Nuclear Information System (INIS)

    Hamel, R.T.

    1987-01-01

    Utilizing a micro computer based test bank offers a training department many advantages and can have a positive impact upon training procedures and examination standards. Prior to data entry, Training Department management must pre-review the examination questions and answers to ensure compliance with examination standards and to verify the validity of all questions. Management must adhere to the TSD format, since all questions require an enabling objective numbering scheme. Each question is entered under the enabling objective upon which it is based; the question is then selected via the enabling objective. This eliminates any instructor bias, because a random number generator chooses the test question. However, the instructor may load specific questions to create an emphasis theme for any test. The examination, answer and cover sheets are produced and printed within minutes. The test bank eliminates the large amount of time that is normally required for an instructor to formulate an examination. The need for clerical support is reduced by the elimination of typing examinations and also by the software's ability to maintain and generate student/course lists, attendance sheets, and grades. Software security measures limit access to the test bank, and the impromptu method used to generate and print an examination enhances its security.
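
    The selection mechanism described above (questions filed under enabling objectives, drawn by a random number generator) can be sketched in a few lines; bank contents and objective numbers are hypothetical:

```python
import random

# Hypothetical question bank keyed by enabling-objective number; random
# choice removes instructor bias, as described above.
bank = {
    "EO-1.1": ["Q1", "Q2", "Q3"],
    "EO-1.2": ["Q4", "Q5"],
    "EO-2.1": ["Q6", "Q7", "Q8"],
}

def build_exam(objectives, rng=random.Random()):
    """Pick one question per requested enabling objective."""
    return {eo: rng.choice(bank[eo]) for eo in objectives}

print(build_exam(["EO-1.1", "EO-2.1"], random.Random(42)))
```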

  4. Verification of computer system PROLOG - software tool for rapid assessments of consequences of short-term radioactive releases to the atmosphere

    Energy Technology Data Exchange (ETDEWEB)

    Kiselev, Alexey A.; Krylov, Alexey L.; Bogatov, Sergey A. [Nuclear Safety Institute (IBRAE), Bolshaya Tulskaya st. 52, 115191, Moscow (Russian Federation)

    2014-07-01

    In case of nuclear and radiation accidents, emergency response authorities require a tool for rapid assessment of possible consequences. One of the most significant problems is the lack of data on the initial state of an accident. This lack can be especially critical if the accident occurred in a location that was not thoroughly studied beforehand (during transportation of radioactive materials, for example). One possible solution is a hybrid method in which a model that enables rapid assessments with a reasonable minimum of input data is used conjointly with observed data that can be collected shortly after the accident. The model is used to estimate parameters of the source and uncertain meteorological parameters on the basis of some observed data. For example, the field of fallout density can be observed and measured within hours after an accident. After that, the same model with the estimated parameters is used to assess doses and the necessity of recommended and mandatory countermeasures. The computer system PROLOG was designed to solve this problem. It is based on the widely used Gaussian model. The standard Gaussian model is supplemented with several sub-models that take into account: polydisperse aerosols, aerodynamic shade from buildings in the vicinity of the place of accident, terrain orography, initial size of the radioactive cloud, effective height of the release, and the influence of countermeasures on the doses of radioactive exposure of humans. It uses modern GIS technologies and can use web map services. To verify the ability of PROLOG to solve the problem, it is necessary to test its ability to assess the necessary parameters of real accidents in the past. Verification of the computer system on the data of the Chazhma Bay accident (Russian Far East, 1985) was published previously. In this work, verification was carried out on the basis of observed contamination from the Kyshtym disaster (PA Mayak, 1957) and the Tomsk accident (1993). Observations of Sr-90
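
    The core of such rapid assessments is the standard Gaussian plume formula; a minimal sketch with hypothetical release and dispersion parameters (PROLOG's sub-models for aerosols, building wake, orography and the rest are omitted):

```python
import numpy as np

def gaussian_plume(q, u, y, z, sigma_y, sigma_z, h):
    """Air concentration from the standard Gaussian plume model: release
    rate q (Bq/s), wind speed u (m/s), crosswind offset y (m), height z (m),
    effective release height h (m). The dispersion parameters sigma_y and
    sigma_z depend on downwind distance and stability class."""
    lateral = np.exp(-y**2 / (2 * sigma_y**2))
    vertical = (np.exp(-(z - h)**2 / (2 * sigma_z**2))
                + np.exp(-(z + h)**2 / (2 * sigma_z**2)))  # ground reflection
    return q / (2 * np.pi * u * sigma_y * sigma_z) * lateral * vertical

# Hypothetical accident: 1e12 Bq/s release, 3 m/s wind, ground-level
# receptor on the plume axis roughly 1 km downwind.
c = gaussian_plume(q=1e12, u=3.0, y=0.0, z=0.0,
                   sigma_y=75.0, sigma_z=35.0, h=20.0)
print(f"air concentration: {c:.3e} Bq/m^3")
```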

  5. Pervasive Computing Support for Hospitals: An Overview of the Activity-Based Computing Project

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak; Bardram, Jakob E

    2007-01-01

    The activity-based computing project researched pervasive computing support for clinical hospital work. Such technologies have potential for supporting the mobile, collaborative, and disruptive use of heterogeneous embedded devices in a hospital.

  6. [Efficiency of computer-based documentation in long-term care--preliminary project].

    Science.gov (United States)

    Lüngen, Markus; Gerber, Andreas; Rupprecht, Christoph; Lauterbach, Karl W

    2008-06-01

    In Germany the documentation of processes in long-term care is mainly paper-based. Planning, realization and evaluation are not supported in an optimal way. In a preliminary study we evaluated the consequences of the introduction of a computer-based documentation system using handheld devices. We interviewed 16 persons before and after introducing the computer-based documentation and assessed costs for the documentation process and administration. The results show that reducing costs is likely. The job satisfaction of the personnel increased, more time could be spent for caring for the residents. We suggest further research to reach conclusive results.

  7. Blind topological measurement-based quantum computation.

    Science.gov (United States)

    Morimae, Tomoyuki; Fujii, Keisuke

    2012-01-01

    Blind quantum computation is a novel secure quantum-computing protocol that enables Alice, who does not have sufficient quantum technology at her disposal, to delegate her quantum computation to Bob, who has a fully fledged quantum computer, in such a way that Bob cannot learn anything about Alice's input, output and algorithm. A recent proof-of-principle experiment demonstrating blind quantum computation in an optical system has raised new challenges regarding the scalability of blind quantum computation in realistic noisy conditions. Here we show that fault-tolerant blind quantum computation is possible in a topologically protected manner using the Raussendorf-Harrington-Goyal scheme. The error threshold of our scheme is 4.3 × 10⁻³, which is comparable to that (7.5 × 10⁻³) of non-blind topological quantum computation. As an error per gate of the order of 10⁻³ has already been achieved in some experimental systems, our result implies that secure cloud quantum computation is within reach.

  8. Personal Computer Based Controller For Switched Reluctance Motor Drives

    Science.gov (United States)

    Mang, X.; Krishnan, R.; Adkar, S.; Chandramouli, G.

    1987-10-01

    The switched reluctance motor (SRM) has recently gained considerable attention in the variable speed drive market. Two important factors that have contributed to this are the simplicity of construction and the possibility of developing low cost controllers with a minimum number of switching devices in the drive circuits. This is mainly due to the state of the art of present digital circuits technology and the low cost of switching devices. The control of this motor drive is under research. Optimized performance of the SRM drive is very dependent on the integration of the controller, converter and the motor. This research on system integration involves considerable changes in the control algorithms and their implementation. A personal computer (PC) based controller is very appropriate for this purpose. Accordingly, the present paper is concerned with the design of a PC based controller for an SRM. The PC allows for real-time microprocessor control with the possibility of on-line system parameter modifications. Software reconfiguration of this controller is easier than for a hardware based controller. User friendliness is a natural consequence of such a system. Considering the low cost of PCs, this controller will offer an excellent cost-effective means of studying the control strategies for the SRM drive in greater detail than in the past.

  9. Computer Based Training Authors' and Designers' training

    Directory of Open Access Journals (Sweden)

    Frédéric GODET

    2016-03-01

    Full Text Available This communication, drawing on a couple of studies conducted over the past 10 years, tries to show how important the training of authors is in Computer Based Training (CBT). We submit here an approach to preparing designers to master interactive multimedia modules in this domain. Which institutions are really dedicating their efforts to training authors and designers in the area of CBT? Television devices and broadcast organisations have offered, since the 1960s, a first support for distance learning. New media and New Information and Communication Technologies (NICT) allowed several public and private organisations to start distance learning projects. As usual, some of them met their training objectives, while others failed. Did they really fail? Currently, nobody has the right answer. Today, we do not have enough efficient tools to evaluate trainees' acquisition in the short term. Training evaluation needs more than 10 to 20 years of elapsed time to yield reliable measures. Nevertheless, given the high investments already made in this area, we cannot wait for the final results of the pedagogical evaluation. Many analyses have revealed relevant issues which can be used as directions for training CBT authors and designers. Warning: our studies and the derived conclusions are mainly based on projects conducted in the field. We additionally bring our several years of experience in training film authors in the design of interactive multimedia products. Some of our examples are extracted from vocational training projects in which we were involved in all development phases, from the analysis of needs to the evaluation of acquisition within the trainee's job. Obviously, we cannot offer an exhaustive approach in this domain, where many parameters frame the training of authors and designers of CBT interactive multimedia modules.

  10. A quantum computer based on recombination processes in microelectronic devices

    International Nuclear Information System (INIS)

    Theodoropoulos, K; Ntalaperas, D; Petras, I; Konofaos, N

    2005-01-01

    In this paper a quantum computer based on the recombination processes happening in semiconductor devices is presented. A 'data element' and a 'computational element' are derived based on Shockley-Read-Hall statistics; they can later be used to manifest a simple and known quantum computing process. Such a paradigm is demonstrated by applying the proposed computer to a well known physical system involving traps in semiconductor devices.

  11. Sociopathic Knowledge Bases: Correct Knowledge Can Be Harmful Even Given Unlimited Computation

    Science.gov (United States)

    1989-08-01

    By David C. Wilkins and Yong ... Knowledge bases of probabilistic rules are shown to be sociopathic, and so this problem is very widespread. Sociopathicity has important consequences for rule induction

  12. Computer Assisted Instructional Design for Computer-Based Instruction. Final Report. Working Papers.

    Science.gov (United States)

    Russell, Daniel M.; Pirolli, Peter

    Recent advances in artificial intelligence and the cognitive sciences have made it possible to develop successful intelligent computer-aided instructional systems for technical and scientific training. In addition, computer-aided design (CAD) environments that support the rapid development of such computer-based instruction have also been recently…

  13. Comprehensive transportation risk assessment system based on unit-consequence factors

    International Nuclear Information System (INIS)

    Biwer, B.M.; Monette, F.A.; LePoire, D.J.; Chen, S.Y.

    1994-01-01

    The U.S. Department of Energy (DOE) Environmental Restoration and Waste Management Programmatic Environmental Impact Statement requires a comprehensive transportation risk analysis of radioactive waste shipments for large shipping campaigns. Thousands of unique shipments involving truck and rail transport must be analyzed; a comprehensive risk analysis is impossible with currently available methods. Argonne National Laboratory developed a modular transportation model that can handle the demands imposed by such an analysis. The modular design of the model facilitates the simple addition/updating of transportation routes and waste inventories, as required, and reduces the overhead associated with file maintenance and quality assurance. The model incorporates unit-consequence factors generated with the RADTRAN 4 transportation risk analysis code that are combined with an easy-to-use, menu-driven interface on IBM-compatible computers running under DOS. User selection of multiple origin/destination site pairs for the shipment of multiple radioactive waste inventories is permitted from pop-up lists. Over 800 predefined routes are available among more than 30 DOE sites, and waste inventories include high-level waste, spent nuclear fuel, transuranic waste, low-level waste, low-level mixed waste, and greater-than-Class C waste.
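
    Once unit-consequence factors exist, campaign risk is a simple roll-up of shipment counts times factors; a sketch with invented routes and factors (not RADTRAN 4 output):

```python
# Hypothetical campaign: each entry is (route, waste type, number of
# shipments, unit-consequence factor in person-Sv per shipment). The
# factors would be precomputed with RADTRAN 4 as described above.
campaign = [
    ("SiteA->SiteB", "LLW", 120, 2.0e-4),
    ("SiteA->SiteC", "TRU",  45, 7.5e-4),
    ("SiteB->SiteC", "SNF",  10, 3.1e-3),
]

# Total collective dose and a per-route breakdown.
total = sum(n * factor for _, _, n, factor in campaign)
by_route = {}
for route, _, n, factor in campaign:
    by_route[route] = by_route.get(route, 0.0) + n * factor

print(f"collective dose: {total:.3e} person-Sv")
for route, dose in sorted(by_route.items()):
    print(f"  {route}: {dose:.3e} person-Sv")
```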

  14. Transitions in the computational power of thermal states for measurement-based quantum computation

    International Nuclear Information System (INIS)

    Barrett, Sean D.; Bartlett, Stephen D.; Jennings, David; Doherty, Andrew C.; Rudolph, Terry

    2009-01-01

We show that the usefulness of the thermal state of a specific spin-lattice model for measurement-based quantum computing exhibits a transition between two distinct 'phases': one in which every state is a universal resource for quantum computation, and another in which any local measurement sequence can be simulated efficiently on a classical computer. Remarkably, this transition in computational power does not coincide with any phase transition, classical or quantum, in the underlying spin-lattice model.

  15. Property-Based Anonymous Attestation in Trusted Cloud Computing

    Directory of Open Access Journals (Sweden)

    Zhen-Hu Ning

    2014-01-01

In remote attestation on the trusted cloud computing platform (TCCP), the trusted computer (TC) bears an excessive burden, and the anonymity and platform configuration information security of computing nodes cannot be guaranteed. To overcome these defects, based on research on and analysis of current schemes, we propose an anonymous attestation protocol based on property certificates. The platform configuration information is converted by a matrix algorithm into a property certificate, and remote attestation is implemented by a trusted ring signature scheme based on the Strong RSA Assumption. Through the trusted ring signature scheme based on property certificates, we achieve the anonymity of computing nodes and prevent the leakage of platform configuration information. By simulation, we obtain the computational efficiency of the scheme. We also extend the protocol and obtain anonymous attestation based on ECC. A scenario comparison with the trusted ring signature scheme based on RSA shows advantages as the number of ring members grows.

  16. An Emotional Agent Model Based on Granular Computing

    Directory of Open Access Journals (Sweden)

    Jun Hu

    2012-01-01

Affective computing is of great significance for achieving intelligent information processing and harmonious communication between human beings and computers. A new model for an emotional agent is proposed in this paper to give agents the ability to handle emotions, based on granular computing theory and the traditional BDI agent model. Firstly, a new emotion knowledge base based on granular computing for emotion expression is presented in the model. Secondly, a new emotional reasoning algorithm based on granular computing is proposed. Thirdly, a new emotional agent model based on granular computing is presented. Finally, based on the model, an emotional agent for patient assistance in a hospital is realized; experimental results show that it handles simple emotions efficiently.

  17. Small Computer Applications for Base Supply.

    Science.gov (United States)

    1984-03-01

Research on small computer utilization at base-level organizations. This research effort studies whether small computers and commercial software can assist...

  18. 26 CFR 1.809-10 - Computation of equity base.

    Science.gov (United States)

    2010-04-01

26 CFR 1.809-10 (2010-04-01), Income Taxes, Gain and Loss from Operations: Computation of equity base. (a) In general. For purposes of section 809, the equity base of a life insurance company includes the amount of any...

  19. Analysis of Computer Network Information Based on "Big Data"

    Science.gov (United States)

    Li, Tianli

    2017-11-01

With the development of the current era, computer networks and big data have gradually become part of people's lives. People use computers to bring convenience to their own lives, but at the same time many network information security problems demand attention. This paper analyzes the information security of computer networks based on "big data" and puts forward some solutions.

  20. Gap models and their individual-based relatives in the assessment of the consequences of global change

    Science.gov (United States)

    Shugart, Herman H.; Wang, Bin; Fischer, Rico; Ma, Jianyong; Fang, Jing; Yan, Xiaodong; Huth, Andreas; Armstrong, Amanda H.

    2018-03-01

Individual-based models (IBMs) of complex systems emerged in the 1960s and early 1970s, across diverse disciplines from astronomy to zoology. Ecological IBMs arose with seemingly independent origins out of the tradition of understanding the dynamics of ecosystems from a ‘bottom-up’ accounting of the interactions of the parts. Individual trees are principal among the parts of forests. Because these models are computationally demanding, they have prospered as the power of digital computers has increased exponentially over the decades following the 1970s. This review focuses on a class of forest IBMs called gap models. Gap models simulate the changes in forests by simulating the birth, growth and death of each individual tree on a small plot of land. The summation of these plots comprises a forest (or a set of sample plots on a forested landscape or region). Other, more aggregated forest IBMs have been used in global applications, including cohort-based models, ecosystem demography models, etc.; gap models have been used to provide the parameters for these bulk models. Gap models have now grown from local-scale to continental-scale and even global-scale applications to assess the potential consequences of climate change on natural forests. Modifications to the models have enabled simulation of disturbances including fire, insect outbreak and harvest. Our objective in this review is to provide the reader with an overview of the history, motivation and applications, including theoretical applications, of these models. In a time of concern over global changes, gap models are essential tools for understanding forest responses to climate change, modified disturbance regimes and other change agents. Development of forest surveys to provide the starting points for simulations, and better estimates of the responses of the diversity of tree species to the environment, are continuing needs for improvement of these and other IBMs.
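A toy version of the birth-growth-death loop that gap models iterate on each small plot might look as follows; every parameter here is illustrative, and no specific published model (JABOWA, FORET, or their descendants) is reproduced.

```python
import random

random.seed(42)
plot = [random.uniform(1.0, 30.0) for _ in range(20)]  # tree diameters, cm

for year in range(100):
    # Growth: the annual diameter increment shrinks toward a maximum size.
    plot = [d + 0.5 * (1.0 - d / 80.0) for d in plot]
    # Mortality: each tree faces a small background probability of death.
    plot = [d for d in plot if random.random() > 0.02]
    # Recruitment: gaps left by dead trees admit new seedlings.
    while len(plot) < 20:
        plot.append(1.0)

print(f"mean diameter after 100 years: {sum(plot) / len(plot):.1f} cm")
```

Real gap models condition growth, mortality, and recruitment on species traits, light, temperature, and other environmental drivers, which is what makes them usable for climate-change assessment.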

  1. Evaluation of computer-based ultrasonic inservice inspection systems

    International Nuclear Information System (INIS)

    Harris, R.V. Jr.; Angel, L.J.; Doctor, S.R.; Park, W.R.; Schuster, G.J.; Taylor, T.T.

    1994-03-01

This report presents the principles, practices, terminology, and technology of computer-based ultrasonic testing for inservice inspection (UT/ISI) of nuclear power plants, with extensive use of drawings, diagrams, and UT images. The presentation is technical but assumes limited specific knowledge of ultrasonics or computers. The report is divided into 9 sections covering conventional UT, computer-based UT, and evaluation methodology. Conventional UT topics include coordinate axes, scanning, instrument operation, RF and video signals, and A-, B-, and C-scans. Computer-based topics include sampling, digitization, signal analysis, image presentation, SAFT, ultrasonic holography, transducer arrays, and data interpretation. An evaluation methodology for computer-based UT/ISI systems is presented, including questions, detailed procedures, and test block designs. Brief evaluations of several computer-based UT/ISI systems are given; supplementary volumes will provide detailed evaluations of selected systems

  2. Projection computation based on pixel in simultaneous algebraic reconstruction technique

    International Nuclear Information System (INIS)

    Wang Xu; Chen Zhiqiang; Xiong Hua; Zhang Li

    2005-01-01

SART is an important algorithm for image reconstruction, in which the projection computation takes over half of the reconstruction time. An efficient way to compute the projection coefficient matrix, together with memory optimization, is presented in this paper. Differently from the usual method, projection lines are located based on every pixel, and the subsequent projection coefficient computation can reuse the results. The correlation between projection lines and pixels can be used to optimize the computation. (authors)
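The pixel-based idea can be illustrated with a toy parallel-beam projector: instead of tracing each ray and asking which pixels it crosses, each pixel locates its own position on the detector and deposits its value there. The linear-interpolation weighting below is a common choice, not necessarily the authors' exact coefficient scheme.

```python
import math

def pixel_driven_projection(image, angle, n_bins):
    """Parallel-beam forward projection: each pixel center is projected onto
    the detector axis and its value is split between the two nearest bins."""
    n = len(image)                      # square image, n x n
    c, s = math.cos(angle), math.sin(angle)
    sino = [0.0] * n_bins
    for i in range(n):
        for j in range(n):
            # Pixel-center coordinates with the image centered at the origin.
            x, y = j - (n - 1) / 2.0, i - (n - 1) / 2.0
            t = x * c + y * s + (n_bins - 1) / 2.0  # detector coordinate
            k = int(math.floor(t))
            w = t - k                               # interpolation weight
            if 0 <= k < n_bins:
                sino[k] += (1 - w) * image[i][j]
            if 0 <= k + 1 < n_bins:
                sino[k + 1] += w * image[i][j]
    return sino

img = [[1.0] * 16 for _ in range(16)]
print(pixel_driven_projection(img, math.pi / 4, 24)[:6])
```

Because the detector coordinate t is computed per pixel, the coefficients for each projection angle can be cached and reused across SART iterations, which is the kind of reuse the abstract alludes to.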

  3. Total variation-based neutron computed tomography

    Science.gov (United States)

    Barnard, Richard C.; Bilheux, Hassina; Toops, Todd; Nafziger, Eric; Finney, Charles; Splitter, Derek; Archibald, Rick

    2018-05-01

We perform the neutron computed tomography reconstruction problem via an inverse problem formulation with a total variation penalty. In the case of highly under-resolved angular measurements, the total variation penalty suppresses the high-frequency artifacts which appear in filtered back projections. In order to compute solutions for this problem efficiently, we implement a variation of the split Bregman algorithm; due to the error-forgetting nature of the algorithm, the computational cost of the updates can be significantly reduced via very inexact approximate linear solvers. We demonstrate the effectiveness of the algorithm in the severely undersampled angular case using synthetic test problems as well as data obtained from a high-flux neutron source. The algorithm removes artifacts and can even roughly capture small features when an extremely low number of angles is used.
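The generic form of such a TV-penalized reconstruction (standard in the literature; the paper's exact discretization and weighting may differ) is

```latex
\min_{x}\; \tfrac{1}{2}\,\lVert A x - b \rVert_2^{2} + \lambda\,\mathrm{TV}(x),
\qquad
\mathrm{TV}(x) = \sum_{i} \lVert (\nabla x)_i \rVert_2 ,
```

where A is the projection operator, b the measured sinogram, and lambda the regularization weight. Split Bregman handles the nonsmooth TV term by introducing an auxiliary variable constrained to equal the image gradient, then alternating cheap subproblem solves with a Bregman (residual-feedback) update; this alternation is what tolerates the very inexact linear solves mentioned above.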

  4. Computational anatomy based on whole body imaging basic principles of computer-assisted diagnosis and therapy

    CERN Document Server

    Masutani, Yoshitaka

    2017-01-01

This book deals with computational anatomy, an emerging discipline recognized in medical science as a derivative of conventional anatomy. It is also a completely new research area on the boundaries of several sciences and technologies, such as medical imaging, computer vision, and applied mathematics. Computational Anatomy Based on Whole Body Imaging highlights the underlying principles, basic theories, and fundamental techniques in computational anatomy, which are derived from conventional anatomy, medical imaging, computer vision, and applied mathematics, in addition to various examples of applications in clinical data. The book will cover topics on the basics and applications of the new discipline. Drawing from areas in multidisciplinary fields, it provides comprehensive, integrated coverage of innovative approaches to computational anatomy. As well, Computational Anatomy Based on Whole Body Imaging serves as a valuable resource for researchers including graduate students in the field and a connection with ...

  5. A Compute Environment of ABC95 Array Computer Based on Multi-FPGA Chip

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

The ABC95 array computer is a multi-function-network computer based on FPGA technology. The multi-function network supports conflict-free processor access to data in memory and processor-to-processor data access based on an enhanced MESH network. The ABC95 instruction system includes control instructions, scalar instructions, and vector instructions; the network instructions are mainly introduced here. A programming environment for ABC95 array computer assembly language is designed, and a programming environment for the ABC95 array computer under VC++ is advanced. It includes functions to load ABC95 array computer programs and data, store them, run them, and so on. In particular, the data type for ABC95 conflict-free access is defined. The results show that these technologies support programming of the ABC95 array computer effectively.

  6. Computer-based visual communication in aphasia.

    Science.gov (United States)

    Steele, R D; Weinrich, M; Wertz, R T; Kleczewska, M K; Carlson, G S

    1989-01-01

    The authors describe their recently developed Computer-aided VIsual Communication (C-VIC) system, and report results of single-subject experimental designs probing its use with five chronic, severely impaired aphasic individuals. Studies replicate earlier results obtained with a non-computerized system, demonstrate patient competence with the computer implementation, extend the system's utility, and identify promising areas of application. Results of the single-subject experimental designs clarify patients' learning, generalization, and retention patterns, and highlight areas of performance difficulties. Future directions for the project are indicated.

  7. A brain computer interface-based explorer.

    Science.gov (United States)

    Bai, Lijuan; Yu, Tianyou; Li, Yuanqing

    2015-04-15

    In recent years, various applications of brain computer interfaces (BCIs) have been studied. In this paper, we present a hybrid BCI combining P300 and motor imagery to operate an explorer. Our system is mainly composed of a BCI mouse, a BCI speller and an explorer. Through this system, the user can access his computer and manipulate (open, close, copy, paste, and delete) files such as documents, pictures, music, movies and so on. The system has been tested with five subjects, and the experimental results show that the explorer can be successfully operated according to subjects' intentions. Copyright © 2014 Elsevier B.V. All rights reserved.

  8. Dosimetric consequences of the shift towards computed tomography guided target definition and planning for breast conserving radiotherapy

    Directory of Open Access Journals (Sweden)

    Korevaar Erik W

    2008-01-01

Abstract Background: The shift from conventional two-dimensional (2D) to three-dimensional (3D)-conformal target definition and dose-planning seems to have introduced volumetric as well as geometric changes. The purpose of this study was to compare coverage of computed tomography (CT)-based breast and boost planning target volumes (PTV), absolute volumes irradiated, and dose delivered to the organs at risk with conventional 2D and 3D-conformal breast conserving radiotherapy. Methods: Twenty-five patients with left-sided breast cancer were subjected to CT-guided target definition and 3D-conformal dose-planning, and conventionally defined target volumes and treatment plans were reconstructed on the planning CT. Accumulated dose-distributions were calculated for the conventional and 3D-conformal dose-plans, taking into account a prescribed dose of 50 Gy for the breast plans and 16 Gy for the boost plans. Results: With conventional treatment plans, CT-based breast and boost PTVs received the intended dose in 78% and 32% of the patients, respectively, and smaller volumes received the prescribed breast and boost doses compared with 3D-conformal dose-planning. The mean lung dose, the volume of the lungs receiving > 20 Gy, the mean heart dose, and the volume of the heart receiving > 30 Gy were significantly less with conventional treatment plans. Specific areas within the breast and boost PTVs systematically received a lower than intended dose with conventional treatment plans. Conclusion: The shift towards CT-guided target definition and planning as the gold standard for breast conserving radiotherapy has resulted in improved target coverage at the cost of larger irradiated volumes and an increased dose delivered to organs at risk. Tissue is now included in the breast and boost target volumes that was never explicitly defined or included with conventional treatment. Therefore, a coherent definition of the breast and boost target volumes is needed, based on

  9. Dosimetric consequences of the shift towards computed tomography guided target definition and planning for breast conserving radiotherapy

    International Nuclear Information System (INIS)

    Laan, Hans Paul van der; Dolsma, Wil V; Maduro, John H; Korevaar, Erik W; Langendijk, Johannes A

    2008-01-01

The shift from conventional two-dimensional (2D) to three-dimensional (3D)-conformal target definition and dose-planning seems to have introduced volumetric as well as geometric changes. The purpose of this study was to compare coverage of computed tomography (CT)-based breast and boost planning target volumes (PTV), absolute volumes irradiated, and dose delivered to the organs at risk with conventional 2D and 3D-conformal breast conserving radiotherapy. Twenty-five patients with left-sided breast cancer were subjected to CT-guided target definition and 3D-conformal dose-planning, and conventionally defined target volumes and treatment plans were reconstructed on the planning CT. Accumulated dose-distributions were calculated for the conventional and 3D-conformal dose-plans, taking into account a prescribed dose of 50 Gy for the breast plans and 16 Gy for the boost plans. With conventional treatment plans, CT-based breast and boost PTVs received the intended dose in 78% and 32% of the patients, respectively, and smaller volumes received the prescribed breast and boost doses compared with 3D-conformal dose-planning. The mean lung dose, the volume of the lungs receiving > 20 Gy, the mean heart dose, and the volume of the heart receiving > 30 Gy were significantly less with conventional treatment plans. Specific areas within the breast and boost PTVs systematically received a lower than intended dose with conventional treatment plans. The shift towards CT-guided target definition and planning as the gold standard for breast conserving radiotherapy has resulted in improved target coverage at the cost of larger irradiated volumes and an increased dose delivered to organs at risk. Tissue is now included in the breast and boost target volumes that was never explicitly defined or included with conventional treatment. Therefore, a coherent definition of the breast and boost target volumes is needed, based on clinical data confirming tumour control probability and normal

  10. A guide to TIRION 4 - a computer code for calculating the consequences of releasing radioactive material to the atmosphere

    International Nuclear Information System (INIS)

    Fryer, L.S.

    1978-12-01

    TIRION 4 is the most recent program in a series designed to calculate the consequences of releasing radioactive material to the atmosphere. A brief description of the models used in the program and full details of the various control cards necessary to run TIRION 4 are given. (author)

  11. Moment matrices, border bases and radical computation

    NARCIS (Netherlands)

    B. Mourrain; J.B. Lasserre; M. Laurent (Monique); P. Rostalski; P. Trebuchet (Philippe)

    2013-01-01

In this paper, we describe new methods to compute the radical (resp. real radical) of an ideal, assuming its complex (resp. real) variety is finite. The aim is to combine approaches for solving a system of polynomial equations with dual methods which involve moment matrices and

  12. Moment matrices, border bases and radical computation

    NARCIS (Netherlands)

    Lasserre, J.B.; Laurent, M.; Mourrain, B.; Rostalski, P.; Trébuchet, P.

    2013-01-01

    In this paper, we describe new methods to compute the radical (resp. real radical) of an ideal, assuming its complex (resp. real) variety is finite. The aim is to combine approaches for solving a system of polynomial equations with dual methods which involve moment matrices and semi-definite

  13. Moment matrices, border bases and radical computation

    NARCIS (Netherlands)

    B. Mourrain; J.B. Lasserre; M. Laurent (Monique); P. Rostalski; P. Trebuchet (Philippe)

    2011-01-01

In this paper, we describe new methods to compute the radical (resp. real radical) of an ideal, assuming its complex (resp. real) variety is finite. The aim is to combine approaches for solving a system of polynomial equations with dual methods which involve moment matrices and

  14. Cloud Computing Based E-Learning System

    Science.gov (United States)

    Al-Zoube, Mohammed; El-Seoud, Samir Abou; Wyne, Mudasser F.

    2010-01-01

Cloud computing technologies, although in their early stages, have managed to change the way applications are going to be developed and accessed. These technologies are aimed at running applications as services over the internet on a flexible infrastructure. Microsoft office applications, such as word processing, excel spreadsheet, access database…

  15. Efficient GPU-based skyline computation

    DEFF Research Database (Denmark)

    Bøgh, Kenneth Sejdenfaden; Assent, Ira; Magnani, Matteo

    2013-01-01

    The skyline operator for multi-criteria search returns the most interesting points of a data set with respect to any monotone preference function. Existing work has almost exclusively focused on efficiently computing skylines on one or more CPUs, ignoring the high parallelism possible in GPUs. In...
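For reference, the skyline operator itself has a simple, if quadratic, sequential formulation (here with a "smaller is better" preference on every attribute); the paper's contribution is the GPU parallelization, which this sketch does not attempt.

```python
def dominates(p, q):
    """p dominates q if p is no worse everywhere and strictly better somewhere."""
    return all(a <= b for a, b in zip(p, q)) and any(a < b for a, b in zip(p, q))

def skyline(points):
    """Return the points not dominated by any other point."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

# Hotels as (price, distance-to-beach); both attributes to be minimized.
hotels = [(50, 3.0), (80, 1.0), (60, 2.0), (90, 0.5), (70, 2.5)]
print(skyline(hotels))  # (70, 2.5) is dominated by (60, 2.0) and drops out
```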

  16. Risk-based input-output analysis of influenza epidemic consequences on interdependent workforce sectors.

    Science.gov (United States)

    Santos, Joost R; May, Larissa; Haimar, Amine El

    2013-09-01

    Outbreaks of contagious diseases underscore the ever-looming threat of new epidemics. Compared to other disasters that inflict physical damage to infrastructure systems, epidemics can have more devastating and prolonged impacts on the population. This article investigates the interdependent economic and productivity risks resulting from epidemic-induced workforce absenteeism. In particular, we develop a dynamic input-output model capable of generating sector-disaggregated economic losses based on different magnitudes of workforce disruptions. An ex post analysis of the 2009 H1N1 pandemic in the national capital region (NCR) reveals the distribution of consequences across different economic sectors. Consequences are categorized into two metrics: (i) economic loss, which measures the magnitude of monetary losses incurred in each sector, and (ii) inoperability, which measures the normalized monetary losses incurred in each sector relative to the total economic output of that sector. For a simulated mild pandemic scenario in NCR, two distinct rankings are generated using the economic loss and inoperability metrics. Results indicate that the majority of the critical sectors ranked according to the economic loss metric comprise of sectors that contribute the most to the NCR's gross domestic product (e.g., federal government enterprises). In contrast, the majority of the critical sectors generated by the inoperability metric include sectors that are involved with epidemic management (e.g., hospitals). Hence, prioritizing sectors for recovery necessitates consideration of the balance between economic loss, inoperability, and other objectives. Although applied specifically to the NCR, the proposed methodology can be customized for other regions. © 2012 Society for Risk Analysis.
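The inoperability metric has a compact formulation in the static inoperability input-output model from which such dynamic models are extended; the notation below is the generic one from that literature, not necessarily the article's.

```latex
q_i = \frac{\hat{x}_i - x_i}{\hat{x}_i} \in [0, 1],
\qquad
q = A^{*} q + c^{*}
\;\Longrightarrow\;
q = (I - A^{*})^{-1} c^{*},
```

where \hat{x}_i and x_i are a sector's planned and degraded outputs, c^{*} is the normalized demand perturbation (here driven by workforce absenteeism), and A^{*} is the interdependency matrix derived from input-output accounts. The monetary economic-loss metric is then recovered by rescaling each q_i by its sector's output and the outage duration, which is why the two metrics rank sectors so differently.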

  17. A Web-based Distributed Voluntary Computing Platform for Large Scale Hydrological Computations

    Science.gov (United States)

    Demir, I.; Agliamzanov, R.

    2014-12-01

Distributed volunteer computing can enable researchers and scientists to form large parallel computing environments that utilize the computing power of the millions of computers on the Internet, and use them to run large-scale environmental simulations and models to serve the common good of local communities and the world. Recent developments in web technologies and standards allow client-side scripting languages to run at speeds close to native applications and to utilize the power of Graphics Processing Units (GPUs). Using a client-side scripting language like JavaScript, we have developed an open distributed computing framework that makes it easy for researchers to write their own hydrologic models and run them on volunteer computers. Website owners can easily enable their sites so that visitors can volunteer their computing resources to run advanced hydrological models and simulations. Using a web-based system allows users to start volunteering their computational resources within seconds, without installing any software. The framework distributes the model simulation to thousands of nodes in small spatial and computational pieces. A relational database system is utilized for managing data connections and queue management for the distributed computing nodes. In this paper, we present a web-based distributed volunteer computing platform that enables large-scale hydrological simulations and model runs in an open and integrated environment.

  18. Workgroup report: base stations and wireless networks-radiofrequency (RF) exposures and health consequences.

    Science.gov (United States)

    Valberg, Peter A; van Deventer, T Emilie; Repacholi, Michael H

    2007-03-01

    Radiofrequency (RF) waves have long been used for different types of information exchange via the air waves--wireless Morse code, radio, television, and wireless telephone (i.e., construction and operation of telephones or telephone systems). Increasingly larger numbers of people rely on mobile telephone technology, and health concerns about the associated RF exposure have been raised, particularly because the mobile phone handset operates in close proximity to the human body, and also because large numbers of base station antennas are required to provide widespread availability of service to large populations. The World Health Organization convened an expert workshop to discuss the current state of cellular-telephone health issues, and this article brings together several of the key points that were addressed. The possibility of RF health effects has been investigated in epidemiology studies of cellular telephone users and workers in RF occupations, in experiments with animals exposed to cell-phone RF, and via biophysical consideration of cell-phone RF electric-field intensity and the effect of RF modulation schemes. As summarized here, these separate avenues of scientific investigation provide little support for adverse health effects arising from RF exposure at levels below current international standards. Moreover, radio and television broadcast waves have exposed populations to RF for > 50 years with little evidence of deleterious health consequences. Despite unavoidable uncertainty, current scientific data are consistent with the conclusion that public exposures to permissible RF levels from mobile telephone and base stations are not likely to adversely affect human health.

  19. Workgroup Report: Base Stations and Wireless Networks—Radiofrequency (RF) Exposures and Health Consequences

    Science.gov (United States)

    Valberg, Peter A.; van Deventer, T. Emilie; Repacholi, Michael H.

    2007-01-01

    Radiofrequency (RF) waves have long been used for different types of information exchange via the airwaves—wireless Morse code, radio, television, and wireless telephony (i.e., construction and operation of telephones or telephonic systems). Increasingly larger numbers of people rely on mobile telephone technology, and health concerns about the associated RF exposure have been raised, particularly because the mobile phone handset operates in close proximity to the human body, and also because large numbers of base station antennas are required to provide widespread availability of service to large populations. The World Health Organization convened an expert workshop to discuss the current state of cellular-telephone health issues, and this article brings together several of the key points that were addressed. The possibility of RF health effects has been investigated in epidemiology studies of cellular telephone users and workers in RF occupations, in experiments with animals exposed to cell-phone RF, and via biophysical consideration of cell-phone RF electric-field intensity and the effect of RF modulation schemes. As summarized here, these separate avenues of scientific investigation provide little support for adverse health effects arising from RF exposure at levels below current international standards. Moreover, radio and television broadcast waves have exposed populations to RF for > 50 years with little evidence of deleterious health consequences. Despite unavoidable uncertainty, current scientific data are consistent with the conclusion that public exposures to permissible RF levels from mobile telephony and base stations are not likely to adversely affect human health. PMID:17431492

  20. Computer-based control systems of nuclear power plants

    International Nuclear Information System (INIS)

    Kalashnikov, V.K.; Shugam, R.A.; Ol'shevsky, Yu.N.

    1975-01-01

Computer-based control systems of nuclear power plants may be classified into those using computers for data acquisition only, those using computers for data acquisition and data processing, and those using computers for process control. In the present paper a brief review is given of the functions the above-mentioned systems perform, their applications in different nuclear power plants, and some of their characteristics. The trend towards hierarchical systems using redundant control computers already becomes clear from the control systems applied in the Canadian nuclear power plants, which were among the first equipped with process computers. The control system now under development for the large Soviet reactors of the WWER type will also be based on the use of control computers. The part of the system concerned with controlling the reactor assembly is described in detail

  1. Basicities of Strong Bases in Water: A Computational Study

    OpenAIRE

    Kaupmees, Karl; Trummal, Aleksander; Leito, Ivo

    2014-01-01

    Aqueous pKa values of strong organic bases – DBU, TBD, MTBD, different phosphazene bases, etc – were computed with CPCM, SMD and COSMO-RS approaches. Explicit solvent molecules were not used. Direct computations and computations with reference pKa values were used. The latter were of two types: (1) reliable experimental aqueous pKa value of a reference base with structure similar to the investigated base or (2) reliable experimental pKa value in acetonitrile of the investigated base itself. ...
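The reference-based computations rest on the standard isodesmic (proton-exchange) scheme; in generic notation (not necessarily the authors'), for an investigated base B measured against a reference base Ref:

```latex
\mathrm{BH^{+}} + \mathrm{Ref} \rightleftharpoons \mathrm{B} + \mathrm{RefH^{+}},
\qquad
\mathrm{p}K_a(\mathrm{BH^{+}}) = \mathrm{p}K_a(\mathrm{RefH^{+}})
+ \frac{\Delta G^{*}_{\mathrm{aq}}}{RT \ln 10},
```

where \Delta G^{*}_{\mathrm{aq}} is the computed aqueous free-energy change of the exchange reaction. Systematic errors of the continuum solvation model largely cancel between the two structurally similar cations, which is why a well-chosen reference improves accuracy over direct computation.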

  2. Computer-based learning for the enhancement of breastfeeding ...

    African Journals Online (AJOL)

    In this study, computer-based learning (CBL) was explored in the context of breastfeeding training for undergraduate Dietetic students. Aim: To adapt and validate an Indian computer-based undergraduate breastfeeding training module for use by South African undergraduate Dietetic students. Methods and materials: The ...

  3. Women and Computer Based Technologies: A Feminist Perspective.

    Science.gov (United States)

    Morritt, Hope

    The use of computer based technologies by professional women in education is examined through a feminist standpoint theory in this paper. The theory is grounded in eight claims which form the basis of the conceptual framework for the study. The experiences of nine women participants with computer based technologies were categorized using three…

  4. Greenhouse-gas Consequences of US Corn-based Ethanol in a Flat World

    Science.gov (United States)

    Davidson, E. A.; Coe, M. T.; Nepstad, D. C.; Donner, S. D.; Bustamante, M. M.; Neill, C.

    2008-12-01

    Competition for arable land is now occurring among food, fiber, and fuel production sectors. In the USA, increased corn production for ethanol has come primarily at the expense of reduced soybean production. Only a few countries, mainly Brazil, have appropriate soils, climate, and infrastructure needed for large absolute increases in cropped area in the next decade that could make up the lost US soybean production. Our objective is to improve estimates of the potential net greenhouse gas (GHG) consequences, both domestically and in Brazil, of meeting the new goals established by the US Congress for expansion of corn- based ethanol in the USA. To meet this goal of 57 billion liters per year of corn-based ethanol production, an additional 1-7 million hectares will need to be planted in corn, depending upon assumptions regarding future increases in corn yield. Net GHG emissions saved in the USA by substituting ethanol for gasoline are estimated at 14 Tg CO2-equivalents once the production goal of 57 million L/yr is reached. If reduced US soybean production caused by this increase in US corn planting results in a compensatory increase in Brazilian production of soybeans in the Cerrado and Amazon regions, we estimate a potential net release of 1800 to 9100 Tg CO2-equivalents of GHG emissions due to land-use change. Many opportunities exist for agricultural intensification that would minimize new land clearing and its environmental impacts, but if Brazilian deforestation is held to only 15% of the area estimated here to compensate lost US soybean production, the GHG mitigation of US corn-based ethanol production during the next 15 years would be more than offset by emissions from Brazilian land-use change. Other motivations for advancing corn-based ethanol production in the USA, such as reduced reliance on foreign oil and increased prosperity for farming communities, must be considered separately, but the greenhouse-gas-mitigation rationale is clearly unsupportable.

  5. AI tools in computer based problem solving

    Science.gov (United States)

    Beane, Arthur J.

    1988-01-01

The use of computers to solve value-oriented, deterministic, algorithmic problems has evolved a structured life-cycle model of the software process. The symbolic processing techniques used, primarily in research, for solving nondeterministic problems, and those for which an algorithmic solution is unknown, have evolved a different, much less structured model. Traditionally, the two approaches have been used completely independently. With the advent of low-cost, high-performance 32-bit workstations executing the same software as large minicomputers and mainframes, it became possible to begin to merge both models into a single extended model of computer problem solving. The implementation of such an extended model on a VAX family of micro/mini/mainframe systems is described. Examples of both the development and deployment of applications involving a blending of AI and traditional techniques are given.

  6. An ergonomic study on the biomechanical consequences in children, generated by the use of computers at school.

    Science.gov (United States)

    Paraizo, Claudia; de Moraes, Anamaria

    2012-01-01

This research deals with the influence of computer use in schools on children's posture, from an ergonomic point of view. It tries to identify probable causes of children's early postural constraints, relating them to sedentary behavior and the lack of an ergonomic project in schools. The survey involved 186 children between 8 and 12 years old, students at a private school in Rio de Janeiro, Brazil. A historical and theoretical review of school furniture was conducted, as well as a survey of students and teachers, a postural evaluation at the computer, an ergonomic evaluation (RULA method), and observations in the computer classroom. The research addressed the students' perception of the furniture they use in the classroom when working at the computer, their bodily complaints, the time spent working on the school computer, and the possibility of sedentariness. It also addressed the teachers' perception and knowledge of ergonomics with reference to schoolroom furniture and its Regulatory Norms (RN). The purpose of the research is to highlight the importance of this knowledge, in view of the possibility of the teachers' collaboration in the ergonomic adaptation of the classroom environment and their informed opinion during the purchasing of this furniture. A questionnaire was utilized, and its results showed some discontent on the part of the teachers with the schoolroom furniture, as well as the teachers' scant knowledge of ergonomics. We conclude that, although the children showed constraints in the postural assessments and the school furniture needs major ergonomic action, the time children use the computer at school is small compared with the time of use at home and is therefore insufficient to be the main cause of the quantified impairments; the study of computer use at home is thus suggested as a continuation of this research.

  7. Personal computer based home automation system

    OpenAIRE

    Hellmuth, George F.

    1993-01-01

The systems engineering process is applied in the development of the preliminary design of a home automation communication protocol. The objective of the communication protocol is to provide a means for a personal computer to communicate with adapted appliances in the home. A needs analysis is used to ascertain that a need exists for a home automation system. Numerous design alternatives are suggested and evaluated to determine the best possible protocol design. Coaxial cable...

  8. Computer-based literature search in medical institutions in India

    Directory of Open Access Journals (Sweden)

    Kalita Jayantee

    2007-01-01

Aim: To study the use of computer-based literature search and its application in clinical training and patient care as a surrogate marker of evidence-based medicine. Materials and Methods: A questionnaire comprising questions on purpose (presentation, patient management, research), realm (site accessed, nature and frequency of search), effect, infrastructure, formal training in computer-based literature search, and suggestions for further improvement was sent to residents and faculty of a Postgraduate Medical Institute (PGI) and a Medical College. The responses were compared among different subgroups of respondents. Results: Out of 300 subjects approached, 194 responded, of whom 103 were from the PGI and 91 from the Medical College. There were 97 specialty residents, 58 super-specialty residents and 39 faculty members. Computer-based literature search was done at least once a month by 89%, though there was marked variability in frequency and extent. The motivation for computer-based literature search was presentation in 90%, research in 65% and patient management in 60.3%. The benefit of the search was acknowledged for learning and teaching by 80%, research by 65% and patient care by 64.4% of respondents. Formal training in computer-based literature search had been received by 41%, of whom 80% were residents. Residents from the PGI did more frequent and more extensive computer-based literature searches, which was attributed to better infrastructure and training. Conclusion: Training and infrastructure are both crucial for computer-based literature search, which may translate into evidence-based medicine.

  9. A prevalence-based approach to societal costs occurring in consequence of child abuse and neglect

    Science.gov (United States)

    2012-01-01

Background Traumatization in childhood can result in lifelong health impairment and may have a negative impact on other areas of life such as education, social contacts and employment as well. Despite the frequent occurrence of traumatization, which is reflected in a 14.5 percent prevalence rate of severe child abuse and neglect, the economic burden of the consequences is hardly known. The objective of this prevalence-based cost-of-illness study is to show how impairment of the individual is reflected in economic trauma follow-up costs borne by society as a whole in Germany and to compare the results with other countries’ costs. Methods From a societal perspective trauma follow-up costs were estimated using a bottom-up approach. The literature-based prevalence rate includes emotional, physical and sexual abuse as well as physical and emotional neglect in Germany. Costs are derived from individual case scenarios of child endangerment presented in a German cost-benefit analysis. A comparison with trauma follow-up costs in Australia, Canada and the USA is based on purchasing power parity. Results The annual trauma follow-up costs range from EUR 11.1 billion (lower bound) to EUR 29.8 billion (upper bound). This equals EUR 134.84 and EUR 363.58, respectively, per capita for the German population. These results conform to the ones obtained from cost studies conducted in Australia (lower bound) and Canada (upper bound), whereas the result for the United States is much lower. Conclusion Child abuse and neglect result in trauma follow-up costs of economically relevant magnitude for the German society. Although the result is well in line with other countries’ costs, the general lack of data should be addressed in order to enable more detailed future studies. Creating a reliable cost data basis in the first place can pave the way for long-term cost savings. PMID:23158382

  10. A Computer-Based Visual Analog Scale,

    Science.gov (United States)

    1992-06-01

Keys on the computer keyboard or other input device move the arrow. The initial position of the arrow is always in the center of the scale to prevent biasing the...

  11. Personal Decision Factors Considered by Information Technology Executives: Their Impacts on Business Intentions and Consequent Cloud Computing Services Adoption Rates

    Science.gov (United States)

    Smith, Marcus L., Jr.

    2016-01-01

    During its infancy, the cloud computing industry was the province largely of small and medium-sized business customers. Despite their size, these companies required a professionally run, yet economical information technology (IT) operation. These customers used a total value strategy whereby they avoided paying for essential, yet underutilized,…

  12. Interactive Computer-Assisted Instruction in Acid-Base Physiology for Mobile Computer Platforms

    Science.gov (United States)

    Longmuir, Kenneth J.

    2014-01-01

    In this project, the traditional lecture hall presentation of acid-base physiology in the first-year medical school curriculum was replaced by interactive, computer-assisted instruction designed primarily for the iPad and other mobile computer platforms. Three learning modules were developed, each with ~20 screens of information, on the subjects…

  13. Computer Assisted Project-Based Instruction: The Effects on Science Achievement, Computer Achievement and Portfolio Assessment

    Science.gov (United States)

    Erdogan, Yavuz; Dede, Dinçer

    2015-01-01

    The purpose of this study is to compare the effects of computer assisted project-based instruction on learners' achievement in a science and technology course, in a computer course and in portfolio development. With this aim in mind, a quasi-experimental design was used and a sample of 70 seventh grade secondary school students from Org. Esref…

  14. A Computer-Based Simulation of an Acid-Base Titration

    Science.gov (United States)

    Boblick, John M.

    1971-01-01

    Reviews the advantages of computer simulated environments for experiments, referring in particular to acid-base titrations. Includes pre-lab instructions and a sample computer printout of a student's use of an acid-base simulation. Ten references. (PR)
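A minimal sketch of the kind of calculation behind such a simulation, restricted to a strong acid titrated with a strong base (values illustrative; water autoionization is neglected except exactly at the equivalence point, where pH = 7 is assumed):

```python
import math

def ph_strong_acid_base(c_acid, v_acid_ml, c_base, v_base_ml):
    """pH after adding v_base_ml of strong base to v_acid_ml of strong acid."""
    mol_h = c_acid * v_acid_ml / 1000.0        # moles of H+ initially
    mol_oh = c_base * v_base_ml / 1000.0       # moles of OH- added
    v_total = (v_acid_ml + v_base_ml) / 1000.0 # total volume, L
    excess = (mol_h - mol_oh) / v_total        # excess acid (+) or base (-)
    if abs(excess) < 1e-9:
        return 7.0                             # equivalence point (25 C)
    return -math.log10(excess) if excess > 0 else 14.0 + math.log10(-excess)

# Titrate 25 mL of 0.1 M strong acid with 0.1 M strong base.
for v in (0.0, 24.0, 25.0, 26.0):
    print(f"{v:5.1f} mL base -> pH {ph_strong_acid_base(0.1, 25.0, 0.1, v):.2f}")
```

The steep pH jump between 24 and 26 mL is exactly the feature a simulated titration lets students probe without bench time.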

  15. Spin-based quantum computation in multielectron quantum dots

    OpenAIRE

    Hu, Xuedong; Sarma, S. Das

    2001-01-01

    In a quantum computer the hardware and software are intrinsically connected because the quantum Hamiltonian (or more precisely its time development) is the code that runs the computer. We demonstrate this subtle and crucial relationship by considering the example of electron-spin-based solid state quantum computer in semiconductor quantum dots. We show that multielectron quantum dots with one valence electron in the outermost shell do not behave simply as an effective single spin system unles...

  16. High Available COTS Based Computer for Space

    Science.gov (United States)

    Hartmann, J.; Magistrati, Giorgio

    2015-09-01

The availability and reliability factors of a system are central requirements of a target application. From a simple fuel injection system used in cars up to the flight control system of an autonomously navigating spacecraft, each application defines its specific availability factor under the target application's boundary conditions. Increasing quality requirements on data processing systems used in space flight applications call for new architectures to fulfill the availability and reliability requirements as well as the increase in the required data processing power. In contrast to the increased quality requirements, simplification and the use of COTS components to decrease costs, while keeping interface compatibility with currently used system standards, are clear customer needs. Data processing system design is mostly dominated by strict fulfillment of the customer requirements, and reuse of available computer systems was not always possible because of the obsolescence of EEE parts, insufficient I/O capabilities, or the fact that available data processing systems did not provide the required scalability and performance.

  17. Consequences of population topology for studying gene flow using link-based landscape genetic methods.

    Science.gov (United States)

    van Strien, Maarten J

    2017-07-01

    Many landscape genetic studies aim to determine the effect of landscape on gene flow between populations. These studies frequently employ link-based methods that relate pairwise measures of historical gene flow to measures of the landscape and the geographical distance between populations. However, apart from landscape and distance, there is a third important factor that can influence historical gene flow, that is, population topology (i.e., the arrangement of populations throughout a landscape). As the population topology is determined in part by the landscape configuration, I argue that it should play a more prominent role in landscape genetics. Making use of existing literature and theoretical examples, I discuss how population topology can influence results in landscape genetic studies and how it can be taken into account to improve the accuracy of these results. In support of my arguments, I have performed a literature review of landscape genetic studies published during the first half of 2015 as well as several computer simulations of gene flow between populations. First, I argue why one should carefully consider which population pairs should be included in link-based analyses. Second, I discuss several ways in which the population topology can be incorporated in response and explanatory variables. Third, I outline why it is important to sample populations in such a way that a good representation of the population topology is obtained. Fourth, I discuss how statistical testing for link-based approaches could be influenced by the population topology. I conclude the article with six recommendations geared toward better incorporating population topology in link-based landscape genetic studies.

  18. An Evaluation of the Relative Effectiveness of Function-Based Consequent and Antecedent Interventions in a Preschool Setting

    Science.gov (United States)

    von Schulz, Jonna H.; Dufrene, Brad A.; LaBrot, Zachary C.; Tingstrom, Daniel H.; Olmi, D. Joe; Radley, Keith; Mitchell, Rachel; Maldonado, Aimee

    2018-01-01

    Although there is substantial functional behavioral assessment (FBA) literature suggesting that function-based interventions are effective for improving problem behavior, only a limited number of studies have examined the effectiveness of function-based antecedent versus consequent interventions. Additionally, although there has been a recent…

  19. An introduction to statistical computing a simulation-based approach

    CERN Document Server

    Voss, Jochen

    2014-01-01

    A comprehensive introduction to sampling-based methods in statistical computing The use of computers in mathematics and statistics has opened up a wide range of techniques for studying otherwise intractable problems.  Sampling-based simulation techniques are now an invaluable tool for exploring statistical models.  This book gives a comprehensive introduction to the exciting area of sampling-based methods. An Introduction to Statistical Computing introduces the classical topics of random number generation and Monte Carlo methods.  It also includes some advanced met
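As a flavor of the sampling-based methods the book introduces, plain Monte Carlo integration of a function with no elementary antiderivative takes only a few lines (a generic example, not one of the book's exercises):

```python
import math
import random

random.seed(1)

def monte_carlo(g, n):
    """Estimate the integral of g over [0, 1] by averaging n uniform samples."""
    return sum(g(random.random()) for _ in range(n)) / n

est = monte_carlo(lambda x: math.exp(-x * x), 100_000)
print(f"estimate: {est:.4f}  (true value is about 0.7468)")
```

The estimator's standard error shrinks as 1/sqrt(n), which is the basic trade-off all of these simulation techniques manage.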

  20. Safety distance assessment of industrial toxic releases based on frequency and consequence: A case study in Shanghai, China

    International Nuclear Information System (INIS)

    Yu, Q.; Zhang, Y.; Wang, X.; Ma, W.C.; Chen, L.M.

    2009-01-01

A case study on the safety distance assessment of a chemical industry park in Shanghai, China, is presented in this paper. Toxic releases were taken into consideration. A safety criterion based on the frequency and consequence of major hazard accidents was set up for consequence analysis. The exposure limits for accidents with frequencies of more than 10^-4, 10^-5 to 10^-4, and 10^-6 to 10^-5 per year were mortalities of 1% (or SLOT), 50% (SLOD) and 75% (twice SLOD), respectively. Accidents with a frequency of less than 10^-6 per year were considered incredible and ignored in the consequence analysis. Taking the safety distances of all the hazard installations in a chemical plant into consideration, the results based on the new criterion were almost all smaller than those based on LC50 or SLOD. The combination of the consequence- and risk-based results indicated that the hazard installations in two of the chemical plants may be dangerous to the protection targets, and measures had to be taken to reduce the risk. The case study showed that taking account of the frequency of occurrence in the consequence analysis gives more feasible safety distances for major hazard accidents, and the results are more comparable to those calculated by risk assessment.
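Read as a lookup, the frequency-banded criterion maps each accident-frequency band to a consequence endpoint; the sketch below is an illustrative reading of the text above, not the authors' code.

```python
def exposure_limit(freq_per_year):
    """Return the mortality endpoint used for consequence analysis."""
    if freq_per_year >= 1e-4:
        return "1% mortality (SLOT)"
    if freq_per_year >= 1e-5:
        return "50% mortality (SLOD)"
    if freq_per_year >= 1e-6:
        return "75% mortality (twice SLOD)"
    return None  # below 1e-6 per year: deemed incredible, ignored

for f in (3e-4, 4e-5, 2e-6, 5e-7):
    print(f"{f:.0e}/yr -> {exposure_limit(f)}")
```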

  1. Design Guidance for Computer-Based Procedures for Field Workers

    Energy Technology Data Exchange (ETDEWEB)

    Oxstrand, Johanna [Idaho National Lab. (INL), Idaho Falls, ID (United States); Le Blanc, Katya [Idaho National Lab. (INL), Idaho Falls, ID (United States); Bly, Aaron [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-09-01

Nearly all activities that involve human interaction with nuclear power plant systems are guided by procedures, instructions, or checklists. Paper-based procedures (PBPs) currently used by most utilities have a demonstrated history of ensuring safety; however, improving procedure use could yield significant savings in increased efficiency, as well as improved safety through human performance gains. The nuclear industry is constantly trying to find ways to decrease human error rates, especially those associated with procedure use. As a step toward the goal of improving field workers' procedure use and adherence, and hence improving human performance and overall system reliability, U.S. Department of Energy Light Water Reactor Sustainability (LWRS) Program researchers, together with the nuclear industry, have been investigating the possibility and feasibility of replacing current paper-based procedures with computer-based procedures (CBPs). PBPs have ensured safe operation of plants for decades, but limitations of paper-based systems do not allow them to reach the full potential for procedures to prevent human errors. The environment in a nuclear power plant is constantly changing, depending on current plant status and operating mode. PBPs, which are static by nature, are being applied to a constantly changing context. This constraint often results in PBPs that are written to cover many potential operating scenarios. Hence, the procedure layout forces the operator to search through a large amount of irrelevant information to locate the pieces of information relevant to the task and situation at hand, which can take up valuable time when operators must respond to the situation and can potentially lead operators down an incorrect response path. Other challenges related to the use of PBPs are management of multiple procedures, place-keeping, finding the correct procedure for a task, and relying

  2. Real-time computing of the environmental consequences of an atmospheric accidental release of radioactive material: user's point of view

    International Nuclear Information System (INIS)

    Boeri, G.; Caracciolo, R.; Dickerson, M.

    1985-07-01

All calculations of the consequences of an atmospheric release must start with atmospheric dispersion calculations. Time factors make external and inhalation dose estimates of immediate concern, closely followed by ground contamination of land, pastures and agricultural crops. In general, the difficulties in modeling the source term and atmospheric transport and diffusion account for most of the error in calculating the dose to man. Thus, sophisticated treatment of the dose part of the calculation is not usually justified, though the relative distribution of dose in individual organs may be needed for correct decision making. This paper emphasizes the atmospheric transport and diffusion part of the dose estimate and relates how this calculation can be used to estimate dose. 12 refs
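The dispersion step referred to is classically a Gaussian plume calculation; the textbook ground-level centerline form for a continuous release (with ground reflection) is sketched below, with illustrative numbers rather than the model of any particular emergency-response code.

```python
import math

def plume_concentration(q, u, sigma_y, sigma_z, h):
    """Ground-level centerline air concentration (Bq/m^3).

    q: release rate (Bq/s); u: wind speed (m/s); h: effective release
    height (m); sigma_y, sigma_z: dispersion parameters (m) evaluated at
    the downwind distance of interest.
    """
    return (q / (math.pi * u * sigma_y * sigma_z)
            * math.exp(-h * h / (2.0 * sigma_z * sigma_z)))

# Illustrative numbers: 1e9 Bq/s release, 5 m/s wind, 30 m effective height,
# sigmas roughly typical of ~1 km downwind in neutral stability.
c = plume_concentration(1e9, 5.0, 80.0, 50.0, 30.0)
print(f"air concentration ~ {c:.2e} Bq/m^3")
# The inhalation dose rate then follows as concentration x breathing rate
# x dose coefficient, the 'dose part' the paper argues needs less rigor.
```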

  3. Computer-Based Self-Instructional Modules. Final Technical Report.

    Science.gov (United States)

    Weinstock, Harold

    Reported is a project involving seven chemists, six mathematicians, and six physicists in the production of computer-based, self-study modules for use in introductory college courses in chemistry, physics, and mathematics. These modules were designed to be used by students and instructors with little or no computer backgrounds, in institutions…

  4. Touch-based Brain Computer Interfaces: State of the art

    NARCIS (Netherlands)

    Erp, J.B.F. van; Brouwer, A.M.

    2014-01-01

    Brain Computer Interfaces (BCIs) rely on the user's brain activity to control equipment or computer devices. Many BCIs are based on imagined movement (called active BCIs) or the fact that brain patterns differ in reaction to relevant or attended stimuli in comparison to irrelevant or unattended

  5. Strategic Planning for Computer-Based Educational Technology.

    Science.gov (United States)

    Bozeman, William C.

    1984-01-01

    Offers educational practitioners direction for the development of a master plan for the implementation and application of computer-based educational technology by briefly examining computers in education, discussing organizational change from a theoretical perspective, and presenting an overview of the planning strategy known as the planning and…

  6. Developing Educational Computer Animation Based on Human Personality Types

    Science.gov (United States)

    Musa, Sajid; Ziatdinov, Rushan; Sozcu, Omer Faruk; Griffiths, Carol

    2015-01-01

    Computer animation in the past decade has become one of the most noticeable features of technology-based learning environments. By its definition, it refers to simulated motion pictures showing movement of drawn objects, and is often defined as the art in movement. Its educational application known as educational computer animation is considered…

  7. Computer-Based Interaction Analysis with DEGREE Revisited

    Science.gov (United States)

    Barros, B.; Verdejo, M. F.

    2016-01-01

    We review our research with "DEGREE" and analyse how our work has impacted the collaborative learning community since 2000. Our research is framed within the context of computer-based interaction analysis and the development of computer-supported collaborative learning (CSCL) tools. We identify some aspects of our work which have been…

  8. Development of Computer-Based Resources for Textile Education.

    Science.gov (United States)

    Hopkins, Teresa; Thomas, Andrew; Bailey, Mike

    1998-01-01

    Describes the production of computer-based resources for students of textiles and engineering in the United Kingdom. Highlights include funding by the Teaching and Learning Technology Programme (TLTP), courseware author/subject expert interaction, usage test and evaluation, authoring software, graphics, computer-aided design simulation, self-test…

  9. A prevalence-based approach to societal costs occurring in consequence of child abuse and neglect

    Directory of Open Access Journals (Sweden)

    Habetha Susanne

    2012-11-01

    Background: Traumatization in childhood can result in lifelong health impairment and may have a negative impact on other areas of life as well, such as education, social contacts and employment. Despite the frequent occurrence of traumatization, reflected in a 14.5 percent prevalence rate of severe child abuse and neglect, the economic burden of the consequences is hardly known. The objective of this prevalence-based cost-of-illness study is to show how impairment of the individual is reflected in economic trauma follow-up costs borne by society as a whole in Germany, and to compare the results with other countries’ costs. Methods: Trauma follow-up costs were estimated from a societal perspective using a bottom-up approach. The literature-based prevalence rate includes emotional, physical and sexual abuse as well as physical and emotional neglect in Germany. Costs are derived from individual case scenarios of child endangerment presented in a German cost-benefit analysis. A comparison with trauma follow-up costs in Australia, Canada and the USA is based on purchasing power parity. Results: The annual trauma follow-up costs range from EUR 11.1 billion (lower bound) to EUR 29.8 billion (upper bound), equal to EUR 134.84 and EUR 363.58 per capita, respectively, for the German population. These results conform to the ones obtained from cost studies conducted in Australia (lower bound) and Canada (upper bound), whereas the result for the United States is much lower. Conclusion: Child abuse and neglect result in trauma follow-up costs of economically relevant magnitude for German society. Although the result is well in line with other countries’ costs, the general lack of data should be remedied in order to enable more detailed future studies. Creating a reliable cost data basis in the first place can pave the way for long-term cost savings.

  10. Achalasia following reflux disease: coincidence, consequence, or accommodation? An experience-based literature review

    Directory of Open Access Journals (Sweden)

    Vereczkei A

    2017-12-01

    András Vereczkei, Laura Bognár, András Papp, Örs Péter Horváth Department of Surgery, University of Pécs, Pécs, Hungary Abstract: Achalasia is a motility disorder of the esophagus characterized by the defective peristaltic activity of the esophageal body and impaired relaxation of the lower esophageal sphincter due to the degeneration of the inhibitory neurons in the myenteric plexus of the esophageal wall. The histopathological and pathophysiological changes in achalasia have been well described. However, the exact etiological factors leading to the disease still remain unclear. Currently, achalasia is believed to be a multifactorial disease, involving both extrinsic and intrinsic factors. Based on our experience and the review of literature, we believe that gastroesophageal reflux disease (GERD) might be one of the triggering factors leading to the development of achalasia. However, it is also stated that the two diseases can simultaneously appear independently from each other. Considering the large number and routine treatment of patients with GERD and achalasia, the rare combination of the two may even remain unnoticed; thus, the analysis of larger patient groups with this entity is not feasible. In this context, we report four cases where long-standing reflux symptoms preceded the development of achalasia. A literature review of the available data is also given. We hypothesize that achalasia following the chronic acid exposure of the esophagus is not accidental but either a consequence of a chronic inflammation or a protective reaction of the organism in order to prevent aspiration and lessen reflux-related symptoms. This hypothesis awaits further clinical confirmation. Keywords: achalasia, gastroesophageal reflux disease, Barrett’s esophagus, Nissen fundoplication

  11. Milestones Toward Majorana-Based Quantum Computing

    Directory of Open Access Journals (Sweden)

    David Aasen

    2016-08-01

    We introduce a scheme for preparation, manipulation, and read out of Majorana zero modes in semiconducting wires with mesoscopic superconducting islands. Our approach synthesizes recent advances in materials growth with tools commonly used in quantum-dot experiments, including gate control of tunnel barriers and Coulomb effects, charge sensing, and charge pumping. We outline a sequence of milestones interpolating between zero-mode detection and quantum computing that includes (1) detection of fusion rules for non-Abelian anyons using either proximal charge sensors or pumped current, (2) validation of a prototype topological qubit, and (3) demonstration of non-Abelian statistics by braiding in a branched geometry. The first two milestones require only a single wire with two islands, and additionally enable sensitive measurements of the system’s excitation gap, quasiparticle poisoning rates, residual Majorana zero-mode splittings, and topological-qubit coherence times. These pre-braiding experiments can be adapted to other manipulation and read out schemes as well.

  12. Computer Aided Design Parameters for Forward Basing

    Science.gov (United States)

    1988-12-01

    This is a professional drawing package capable of the manipulation required for this project, used with the AutoLISP programming language (a variation on …). Data conversion: GWN System’s Digital Terrain Modeling (DTM) package was used. This AutoLISP-based third-party software is … the Base Module of GWN System’s GWN-DTM software. A simple AutoLISP conversion program (TA2DXF, TA2DXB) within the software converts the TA2 format into an …

  13. Self-guaranteed measurement-based quantum computation

    Science.gov (United States)

    Hayashi, Masahito; Hajdušek, Michal

    2018-05-01

    In order to guarantee the output of a quantum computation, we usually assume that the component devices are trusted. However, when the total computation process is large, it is not easy to guarantee the whole system when we have scaling effects, unexpected noise, or unaccounted-for correlations between several subsystems. If we do not trust the measurement basis or the prepared entangled state, we need to worry about such uncertainties. To this end, we propose a self-guaranteed protocol for verification of quantum computation under the scheme of measurement-based quantum computation, where no prior-trusted devices (measurement basis or entangled state) are needed. The approach we present enables the implementation of verifiable quantum computation using the measurement-based model in the context of a particular instance of delegated quantum computation where the server prepares the initial computational resource and sends it to the client, who drives the computation by single-qubit measurements. Applying self-testing procedures, we are able to verify the initial resource as well as the operation of the quantum devices and hence the computation itself. The overhead of our protocol scales with the size of the initial resource state to the power of 4 times the natural logarithm of the initial state's size.

  14. All-optical reservoir computer based on saturation of absorption.

    Science.gov (United States)

    Dejonckheere, Antoine; Duport, François; Smerieri, Anteo; Fang, Li; Oudar, Jean-Louis; Haelterman, Marc; Massar, Serge

    2014-05-05

    Reservoir computing is a new bio-inspired computation paradigm. It exploits a dynamical system driven by a time-dependent input to carry out computation. For efficient information processing, only a few parameters of the reservoir need to be tuned, which makes it a promising framework for hardware implementation. Recently, electronic, opto-electronic and all-optical experimental reservoir computers were reported. In those implementations, the nonlinear response of the reservoir is provided by active devices such as optoelectronic modulators or optical amplifiers. By contrast, we propose here the first reservoir computer based on a fully passive nonlinearity, namely the saturable absorption of a semiconductor mirror. Our experimental setup constitutes an important step towards the development of ultrafast low-consumption analog computers.
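
    The optical hardware above cannot be reproduced in print, but the reservoir computing paradigm it implements can be illustrated in software. Below is a minimal echo-state-network sketch, assuming an arbitrary reservoir size, spectral radius, ridge constant, and a toy delay-recall task; only the linear readout is trained, which is the "few parameters to tune" property mentioned in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)
N, T = 100, 1000                      # reservoir size, number of time steps

# Fixed random reservoir: the recurrent weights are never trained.
W_in = rng.uniform(-0.5, 0.5, (N, 1))
W = rng.normal(0, 1, (N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # scale spectral radius below 1

u = rng.uniform(-1, 1, T)             # input signal
target = np.roll(u, 3)                # toy task: recall the input from 3 steps ago

X = np.zeros((T, N))
x = np.zeros(N)
for t in range(T):
    x = np.tanh(W @ x + W_in[:, 0] * u[t])   # driven nonlinear dynamics
    X[t] = x

# Ridge-regression readout; the regularization constant is an assumption.
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(N), X.T @ target)
print("train NMSE:", np.mean((X @ W_out - target) ** 2) / np.var(target))
```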

  15. Evaluation of Computer-Based Procedure System Prototype

    Energy Technology Data Exchange (ETDEWEB)

    Johanna Oxstrand; Katya Le Blanc; Seth Hays

    2012-09-01

    This research effort is a part of the Light-Water Reactor Sustainability (LWRS) Program, which is a research and development (R&D) program sponsored by Department of Energy (DOE), performed in close collaboration with industry R&D programs, to provide the technical foundations for licensing and managing the long-term, safe, and economical operation of current nuclear power plants. The LWRS program serves to help the U.S. nuclear industry adopt new technologies and engineering solutions that facilitate the continued safe operation of the plants and extension of the current operating licenses. The introduction of advanced technology in existing nuclear power plants may help to manage the effects of aging systems, structures, and components. In addition, the incorporation of advanced technology in the existing LWR fleet may entice the future workforce, who will be familiar with advanced technology, to work for these utilities rather than more newly built nuclear power plants. Advantages are being sought by developing and deploying technologies that will increase safety and efficiency. One significant opportunity for existing plants to increase efficiency is to phase out the paper-based procedures (PBPs) currently used at most nuclear power plants and replace them, where feasible, with computer-based procedures (CBPs). PBPs have ensured safe operation of plants for decades, but limitations in paper-based systems do not allow them to reach the full potential for procedures to prevent human errors. The environment in a nuclear power plant is constantly changing depending on current plant status and operating mode. PBPs, which are static by nature, are being applied to a constantly changing context. This constraint often results in PBPs that are written in a manner that is intended to cover many potential operating scenarios. Hence, the procedure layout forces the operator to search through a large amount of irrelevant information to locate the pieces of information

  16. Field microcomputerized multichannel γ ray spectrometer based on notebook computer

    International Nuclear Information System (INIS)

    Jia Wenyi; Wei Biao; Zhou Rongsheng; Li Guodong; Tang Hong

    1996-01-01

    At present, field γ ray spectrometry cannot rapidly measure the full γ ray spectrum. A field microcomputerized multichannel γ ray spectrometer based on a notebook computer is therefore described, with which the full γ ray spectrum can be rapidly measured in the field

  17. Simple physics-based models of compensatory plant water uptake: concepts and eco-hydrological consequences

    Directory of Open Access Journals (Sweden)

    N. J. Jarvis

    2011-11-01

    Many land surface schemes and simulation models of plant growth designed for practical use employ simple empirical sub-models of root water uptake that cannot adequately reflect the critical role water uptake from sparsely rooted deep subsoil plays in meeting atmospheric transpiration demand in water-limited environments, especially in the presence of shallow groundwater. A failure to account for this so-called "compensatory" water uptake may have serious consequences for both local and global modeling of water and energy fluxes, carbon balances and climate. Some purely empirical compensatory root water uptake models have been proposed, but they are of limited use in global modeling exercises since their parameters cannot be related to measurable soil and vegetation properties. A parsimonious physics-based model of uptake compensation has been developed that requires no more parameters than empirical approaches. This model is described and some aspects of its behavior are illustrated with the help of example simulations. These analyses demonstrate that hydraulic lift can be considered as an extreme form of compensation and that the degree of compensation is principally a function of soil capillarity and the ratio of total effective root length to potential transpiration. Thus, uptake compensation increases as root to leaf area ratios increase, since potential transpiration depends on leaf area. Results of "scenario" simulations for two case studies, one at the local scale (riparian vegetation growing above shallow water tables in seasonally dry or arid climates) and one at a global scale (water balances across an aridity gradient in the continental USA), are presented to illustrate biases in model predictions that arise when water uptake compensation is neglected. In the first case, it is shown that only a compensated model can match the strong relationships between water table depth and leaf area and transpiration observed in riparian forest

  18. Big data mining analysis method based on cloud computing

    Science.gov (United States)

    Cai, Qing Qiu; Cui, Hong Gang; Tang, Hao

    2017-08-01

    In the era of information explosion, the super-large scale, discrete, and un-/semi-structured nature of big data has gone far beyond what traditional data management methods can handle. With the arrival of the cloud computing era, cloud computing provides a new technical way to analyze massive data, effectively solving the problem that traditional data mining methods cannot adapt to massive data. This paper introduces the meaning and characteristics of cloud computing, analyzes the advantages of using cloud computing technology for data mining, designs a mining algorithm for association rules based on the MapReduce parallel processing architecture, and carries out experimental verification. The parallel association rule mining algorithm based on a cloud computing platform can greatly improve the execution speed of data mining.
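
    As a minimal single-machine illustration of the MapReduce-style association rule mining described above, the sketch below counts frequent item pairs (the support-counting step of association-rule mining) with explicit map and reduce phases. The transactions and the support threshold are invented; a real deployment would shard the map phase across cluster nodes.

```python
from collections import Counter
from itertools import combinations

# Toy transaction database; in the paper's setting these would be
# partitioned across cloud nodes.
transactions = [
    {"milk", "bread", "butter"},
    {"milk", "bread"},
    {"bread", "butter"},
    {"milk", "butter"},
]

def map_phase(transaction):
    """Emit (item_pair, 1) for every pair in one transaction."""
    return [(pair, 1) for pair in combinations(sorted(transaction), 2)]

def reduce_phase(emitted):
    """Sum counts per key; MapReduce runs this per key after the shuffle."""
    counts = Counter()
    for key, value in emitted:
        counts[key] += value
    return counts

emitted = [kv for t in transactions for kv in map_phase(t)]
support = reduce_phase(emitted)
min_support = 2
frequent = {pair: n for pair, n in support.items() if n >= min_support}
print(frequent)   # pairs occurring in at least min_support transactions
```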

  19. An overview of computer-based natural language processing

    Science.gov (United States)

    Gevarter, W. B.

    1983-01-01

    Computer-based Natural Language Processing (NLP) is the key to enabling humans and their computer-based creations to interact with machines in natural language (like English, Japanese, German, etc., in contrast to formal computer languages). The doors that such an achievement can open have made this a major research area in Artificial Intelligence and Computational Linguistics. Commercial natural language interfaces to computers have recently entered the market and the future looks bright for other applications as well. This report reviews the basic approaches to such systems, the techniques utilized, applications, the state of the art of the technology, issues and research requirements, the major participants and, finally, future trends and expectations. It is anticipated that this report will prove useful to engineering and research managers, potential users, and others who will be affected by this field as it unfolds.

  20. Graphics processing unit based computation for NDE applications

    Science.gov (United States)

    Nahas, C. A.; Rajagopal, Prabhu; Balasubramaniam, Krishnan; Krishnamurthy, C. V.

    2012-05-01

    Advances in parallel processing in recent years are helping to reduce the cost of numerical simulation. Breakthroughs in Graphical Processing Unit (GPU) based computation now offer the prospect of further drastic improvements. The introduction of 'compute unified device architecture' (CUDA) by NVIDIA (the global technology company based in Santa Clara, California, USA) has made programming GPUs for general purpose computing accessible to the average programmer. Here we use CUDA to develop parallel finite difference schemes as applicable to two problems of interest to the NDE community, namely heat diffusion and elastic wave propagation. The implementations are two-dimensional. Performance improvement of the GPU implementation against serial CPU implementation is then discussed.
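
    A NumPy stand-in for the finite-difference schemes described above: one explicit 2-D heat-diffusion stencil update. The grid size, diffusivity, and time step are assumptions chosen to satisfy the explicit stability limit; on a GPU, each interior point of this stencil would map to one CUDA thread.

```python
import numpy as np

nx = ny = 256
alpha, dx, dt = 1.0, 1.0, 0.2          # diffusivity, grid spacing, time step
assert alpha * dt / dx**2 <= 0.25      # explicit-scheme stability limit in 2-D

u = np.zeros((ny, nx))
u[ny // 2, nx // 2] = 100.0            # hot spot as a toy initial condition

def step(u):
    """One explicit finite-difference update of the 2-D heat equation.

    On a GPU this stencil would be evaluated by one CUDA thread per
    interior grid point; here NumPy slicing plays that role.
    """
    un = u.copy()
    un[1:-1, 1:-1] += alpha * dt / dx**2 * (
        u[2:, 1:-1] + u[:-2, 1:-1] + u[1:-1, 2:] + u[1:-1, :-2]
        - 4.0 * u[1:-1, 1:-1]
    )
    return un

for _ in range(100):
    u = step(u)
print("peak temperature after 100 steps:", u.max())
```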

  1. Modeling soft factors in computer-based wargames

    Science.gov (United States)

    Alexander, Steven M.; Ross, David O.; Vinarskai, Jonathan S.; Farr, Steven D.

    2002-07-01

    Computer-based wargames have seen much improvement in recent years due to rapid increases in computing power. Because these games have been developed for the entertainment industry, most of these advances have centered on the graphics, sound, and user interfaces integrated into these wargames with less attention paid to the game's fidelity. However, for a wargame to be useful to the military, it must closely approximate as many of the elements of war as possible. Among the elements that are typically not modeled or are poorly modeled in nearly all military computer-based wargames are systematic effects, command and control, intelligence, morale, training, and other human and political factors. These aspects of war, with the possible exception of systematic effects, are individually modeled quite well in many board-based commercial wargames. The work described in this paper focuses on incorporating these elements from the board-based games into a computer-based wargame. This paper will also address the modeling and simulation of the systemic paralysis of an adversary that is implied by the concept of Effects Based Operations (EBO). Combining the fidelity of current commercial board wargames with the speed, ease of use, and advanced visualization of the computer can significantly improve the effectiveness of military decision making and education. Once in place, the process of converting board wargames concepts to computer wargames will allow the infusion of soft factors into military training and planning.

  2. An integrated impact assessment and weighting methodology: evaluation of the environmental consequences of computer display technology substitution.

    Science.gov (United States)

    Zhou, Xiaoying; Schoenung, Julie M

    2007-04-01

    Computer display technology is currently in a state of transition, as the traditional technology of cathode ray tubes is being replaced by liquid crystal display flat-panel technology. Technology substitution and process innovation require the evaluation of the trade-offs among environmental impact, cost, and engineering performance attributes. General impact assessment methodologies, decision analysis and management tools, and optimization methods commonly used in engineering cannot efficiently address the issues needed for such evaluation. The conventional Life Cycle Assessment (LCA) process often generates results that can be subject to multiple interpretations, although the advantages of the LCA concept and framework are widely recognized. In the present work, the LCA concept is integrated with Quality Function Deployment (QFD), a popular industrial quality management tool, which is used as the framework for the development of our integrated model. The problem of weighting is addressed by using pairwise comparison of stakeholder preferences. Thus, this paper presents a new integrated analytical approach, Integrated Industrial Ecology Function Deployment (I2-EFD), to assess the environmental behavior of alternative technologies in correlation with their performance and economic characteristics. Computer display technology is used as the case study to further develop our methodology through the modification and integration of various quality management tools (e.g., process mapping, prioritization matrix) and statistical methods (e.g., multi-attribute analysis, cluster analysis). Life cycle thinking provides the foundation for our methodology, as we utilize a published LCA report, which stopped at the characterization step, as our starting point. Further, we evaluate the validity and feasibility of our methodology by considering uncertainty and conducting sensitivity analysis.
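
    The abstract notes that weighting is handled by pairwise comparison of stakeholder preferences. A standard way to turn a pairwise comparison matrix into criterion weights is the principal-eigenvector method used in the Analytic Hierarchy Process; the sketch below assumes an invented three-criterion matrix and is not taken from the paper.

```python
import numpy as np

# Hypothetical pairwise comparison matrix for three criteria
# (environment, cost, performance): A[i, j] = how much more important
# criterion i is than criterion j; reciprocal by construction.
A = np.array([
    [1.0, 3.0, 2.0],
    [1/3, 1.0, 1/2],
    [1/2, 2.0, 1.0],
])

# Principal-eigenvector weights, AHP-style: normalize the eigenvector
# belonging to the largest eigenvalue.
eigvals, eigvecs = np.linalg.eig(A)
principal = eigvecs[:, np.argmax(eigvals.real)].real
weights = principal / principal.sum()
print("criterion weights:", np.round(weights, 3))
```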

  3. Safeguards instrumentation: a computer-based catalog

    International Nuclear Information System (INIS)

    Fishbone, L.G.; Keisch, B.

    1981-08-01

    The information contained in this catalog is needed to provide a data base for safeguards studies and to help establish criteria and procedures for international safeguards for nuclear materials and facilities. The catalog primarily presents information on new safeguards equipment. It also describes entire safeguards systems for certain facilities, but it does not describe the inspection procedures. Because IAEA safeguards do not include physical security, devices for physical protection (as opposed to containment and surveillance) are not included. An attempt has been made to list capital costs, annual maintenance costs, replacement costs, and useful lifetime for the equipment. For equipment which is commercially available, representative sources have been listed whenever available

  4. Safeguards instrumentation: a computer-based catalog

    Energy Technology Data Exchange (ETDEWEB)

    Fishbone, L.G.; Keisch, B.

    1981-08-01

    The information contained in this catalog is needed to provide a data base for safeguards studies and to help establish criteria and procedures for international safeguards for nuclear materials and facilities. The catalog primarily presents information on new safeguards equipment. It also describes entire safeguards systems for certain facilities, but it does not describe the inspection procedures. Because IAEA safeguards do not include physical security, devices for physical protection (as opposed to containment and surveillance) are not included. An attempt has been made to list capital costs, annual maintenance costs, replacement costs, and useful lifetime for the equipment. For equipment which is commercially available, representative sources have been listed whenever available.

  5. Gender Differences in Consequences of ADHD Symptoms in a Community-Based Organization for Youth

    Science.gov (United States)

    Vitulano, Michael L.; Fite, Paula J.; Wimsatt, Amber R.; Rathert, Jamie L.; Hatmaker, Rebecca S.

    2012-01-01

    Attention-Deficit/Hyperactivity Disorder (ADHD) has been linked to disruptive behavior and disciplinary consequences; however, the variables involved in this process are largely unknown. The current study examined rule-breaking behavior as a mediator of the relation between ADHD symptoms and disciplinary actions 1 year later during after-school…

  6. Environmental decision support system on base of geoinformational technologies for the analysis of nuclear accident consequences

    International Nuclear Information System (INIS)

    Haas, T.C.; Maigan, M.; Arutyunyan, R.V.; Bolshov, L.A.; Demianov, V.V.

    1996-01-01

    The report describes the concept and prototype of an environmental decision support system (EDSS) for the analysis of late off-site consequences of severe nuclear accidents and for the analysis, processing and presentation of spatially distributed radioecological data. A general description of the available software and the use of modern achievements of geostatistics and stochastic simulation for the analysis of spatial data are presented and discussed

  7. Novel Schemes for Measurement-Based Quantum Computation

    International Nuclear Information System (INIS)

    Gross, D.; Eisert, J.

    2007-01-01

    We establish a framework which allows one to construct novel schemes for measurement-based quantum computation. The technique develops tools from many-body physics--based on finitely correlated or projected entangled pair states--to go beyond the cluster-state based one-way computer. We identify resource states radically different from the cluster state, in that they exhibit nonvanishing correlations, can be prepared using nonmaximally entangling gates, or have very different local entanglement properties. In the computational models, randomness is compensated in a different manner. It is shown that there exist resource states which are locally arbitrarily close to a pure state. We comment on the possibility of tailoring computational models to specific physical systems

  8. Novel schemes for measurement-based quantum computation.

    Science.gov (United States)

    Gross, D; Eisert, J

    2007-06-01

    We establish a framework which allows one to construct novel schemes for measurement-based quantum computation. The technique develops tools from many-body physics, based on finitely correlated or projected entangled pair states, to go beyond the cluster-state based one-way computer. We identify resource states radically different from the cluster state, in that they exhibit nonvanishing correlations, can be prepared using nonmaximally entangling gates, or have very different local entanglement properties. In the computational models, randomness is compensated in a different manner. It is shown that there exist resource states which are locally arbitrarily close to a pure state. We comment on the possibility of tailoring computational models to specific physical systems.

  9. Towards a fullerene-based quantum computer

    International Nuclear Information System (INIS)

    Benjamin, Simon C; Ardavan, Arzhang; Briggs, G Andrew D; Britz, David A; Gunlycke, Daniel; Jefferson, John; Jones, Mark A G; Leigh, David F; Lovett, Brendon W; Khlobystov, Andrei N; Lyon, S A; Morton, John J L; Porfyrakis, Kyriakos; Sambrook, Mark R; Tyryshkin, Alexei M

    2006-01-01

    Molecular structures appear to be natural candidates for a quantum technology: individual atoms can support quantum superpositions for long periods, and such atoms can in principle be embedded in a permanent molecular scaffolding to form an array. This would be true nanotechnology, with dimensions of the order of a nanometre. However, the challenges of realizing such a vision are immense. One must identify a suitable elementary unit and demonstrate its merits for qubit storage and manipulation, including input/output. These units must then be formed into large arrays corresponding to a functional quantum architecture, including a mechanism for gate operations. Here we report our efforts, both experimental and theoretical, to create such a technology based on endohedral fullerenes or 'buckyballs'. We describe our successes with respect to these criteria, along with the obstacles we are currently facing and the questions that remain to be addressed

  10. Differences in physical-fitness test scores between actively and passively recruited older adults : Consequences for norm-based classification

    NARCIS (Netherlands)

    van Heuvelen, M.J.G.; Stevens, M.; Kempen, G.I.J.M.

    This study investigated differences in physical-fitness test scores between actively and passively recruited older adults and the consequences thereof for norm-based classification of individuals. Walking endurance, grip strength, hip flexibility, balance, manual dexterity, and reaction time were

  11. Alcohol-Related Consequences among First-Year University Students: Effectiveness of a Web-Based Personalized Feedback Program

    Science.gov (United States)

    Doumas, Diana M.; Nelson, Kinsey; DeYoung, Amanda; Renteria, Camryn Conrad

    2014-01-01

    This study evaluated the effectiveness of a web-based personalized feedback program using an objective measure of alcohol-related consequences. Participants were assigned to either the intervention group or an assessment-only control group during university orientation. Sanctions received for campus alcohol policy violations were tracked over the…

  12. The fiscal consequences of ADHD in Germany : a quantitative analysis based on differences in educational attainment and lifetime earnings

    NARCIS (Netherlands)

    Kotsopoulos, Nikolaos; Connolly, Mark P.; Sobanski, Esther; Postma, Maarten J.

    Objective: To estimate the long-term fiscal consequences of attention deficit hyperactivity disorder (ADHD) on the German government and social insurance system based on differences in educational attainment and the resulting differences in lifetime earnings compared with non-ADHD cohorts. Methods:

  13. Reheating breakfast: Age and multitasking on a computer-based and a non-computer-based task

    OpenAIRE

    Feinkohl, I.; Cress, U.; Kimmerle, J.

    2016-01-01

    Computer-based assessments are popular means to measure individual differences, including age differences, in cognitive ability, but are rarely tested for the extent to which they correspond to more realistic behavior. In the present study, we explored the extent to which performance on an existing computer-based task of multitasking ('cooking breakfast') may be generalizable by comparing it with a newly developed version of the same task that required interaction with physical objects. Twent...

  14. Soil Erosion Estimation Using Grid-based Computation

    Directory of Open Access Journals (Sweden)

    Josef Vlasák

    2005-06-01

    Soil erosion estimation is an important part of a land consolidation process. The universal soil loss equation (USLE) was presented by Wischmeier and Smith. USLE computation uses several factors, namely R – rainfall factor, K – soil erodibility, L – slope length factor, S – slope gradient factor, C – cropping management factor, and P – erosion control management factor. The L and S factors are usually combined into one LS factor, the topographic factor. The individual factors are determined from several sources, such as the DTM (Digital Terrain Model), the BPEJ soil type map, aerial and satellite images, etc. A conventional approach to the USLE computation, which is widely used in the Czech Republic, is based on the selection of characteristic profiles for which all above-mentioned factors must be determined. The result (G – annual soil loss) of such a computation is then applied to a whole area (slope) of interest. Another approach to the USLE computation uses grids as the main data structure. A prerequisite for a grid-based USLE computation is that each of the above-mentioned factors exists as a separate grid layer. The crucial step in this computation is the selection of an appropriate grid resolution (grid cell size). A large cell size can cause undesirable precision degradation; too small a cell size can noticeably slow down the whole computation. Provided that the cell size is derived from the source's precision, the appropriate cell size for the Czech Republic varies from 30 m to 50 m. In some cases, especially when new surveying has been done, grid computations can be performed with higher accuracy, i.e. with a smaller grid cell size. For such cases, we have proposed a new method using a two-step computation. The first step uses a bigger cell size and is designed to identify higher erosion spots. The second step then uses a smaller cell size but performs the computation only for the area identified in the previous step. This decomposition allows a …
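
    The grid-based computation described above reduces to a cell-wise product of factor rasters, G = R · K · LS · C · P. A minimal sketch follows, with random placeholder grids standing in for the real factor layers, including the two-step refinement of recomputing only flagged hotspots:

```python
import numpy as np

rng = np.random.default_rng(1)
shape = (200, 200)                     # e.g. a 200 x 200 grid of 30 m cells

# Placeholder factor grids; in practice each layer is derived from a DTM,
# soil map, land-use map, etc.
R  = np.full(shape, 40.0)              # rainfall erosivity
K  = rng.uniform(0.2, 0.5, shape)      # soil erodibility
LS = rng.uniform(0.1, 3.0, shape)      # topographic (slope length x gradient)
C  = rng.uniform(0.05, 0.5, shape)     # cropping management
P  = np.full(shape, 1.0)               # erosion control practice

G = R * K * LS * C * P                 # cell-wise annual soil loss
print("mean annual soil loss:", G.mean())

# The two-step refinement described above: recompute at finer resolution
# only where the coarse pass flags high erosion.
hotspots = G > np.percentile(G, 95)
print("cells flagged for fine-resolution recomputation:", hotspots.sum())
```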

  15. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and reinforcing shift and operational procedures for data production and transfer, MC production, and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity to discuss the impact of, and to address issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  16. Computational aeroelasticity using a pressure-based solver

    Science.gov (United States)

    Kamakoti, Ramji

    A computational methodology for performing fluid-structure interaction computations for three-dimensional elastic wing geometries is presented. The flow solver used is based on an unsteady Reynolds-Averaged Navier-Stokes (RANS) model. A well-validated k-ε turbulence model with wall function treatment for the near-wall region was used to perform turbulent flow calculations. Relative merits of alternative flow solvers were investigated. The predictor-corrector-based Pressure Implicit Splitting of Operators (PISO) algorithm was found to be computationally economical for unsteady flow computations. The wing structure was modeled using Bernoulli-Euler beam theory. A fully implicit time-marching scheme (using the Newmark integration method) was used to integrate the equations of motion for the structure. Bilinear interpolation and linear extrapolation techniques were used to transfer necessary information between the fluid and structure solvers. Geometry deformation was accounted for by using a moving boundary module. The moving grid capability was based on a master/slave concept and transfinite interpolation techniques. Since computations were performed on a moving mesh system, the geometric conservation law must be preserved. This is achieved by appropriately evaluating the Jacobian values associated with each cell. Accurate computation of contravariant velocities for unsteady flows using the momentum interpolation method on collocated, curvilinear grids was also addressed. Flutter computations were performed for the AGARD 445.6 wing at subsonic, transonic and supersonic Mach numbers. Unsteady computations were performed at various dynamic pressures to predict the flutter boundary. Results showed favorable agreement with experiment and previous numerical results. The computational methodology exhibited capabilities to predict both qualitative and quantitative features of aeroelasticity.
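
    The structural side of the abstract mentions a fully implicit Newmark time-marching scheme. A minimal single-degree-of-freedom Newmark-beta sketch (average-acceleration parameters β = 1/4, γ = 1/2) follows; the mass, damping, stiffness, and load are arbitrary assumptions, not the AGARD 445.6 model.

```python
import numpy as np

# Single-DOF structure: m*a + c*v + k*u = f(t). All values are assumptions.
m, c, k = 1.0, 0.1, 10.0
dt, nsteps = 0.01, 1000
beta, gamma = 0.25, 0.5                # average-acceleration Newmark parameters

f = lambda t: np.sin(2.0 * t)          # toy external load

u, v = 0.0, 0.0
a = (f(0.0) - c * v - k * u) / m       # consistent initial acceleration

# Effective stiffness of the implicit update.
k_eff = k + gamma / (beta * dt) * c + m / (beta * dt**2)
for n in range(nsteps):
    t1 = (n + 1) * dt
    rhs = (f(t1)
           + m * (u / (beta * dt**2) + v / (beta * dt) + (1 / (2 * beta) - 1) * a)
           + c * (gamma / (beta * dt) * u + (gamma / beta - 1) * v
                  + dt * (gamma / (2 * beta) - 1) * a))
    u_new = rhs / k_eff                # solve the implicit equation for u(t+dt)
    a_new = (u_new - u) / (beta * dt**2) - v / (beta * dt) - (1 / (2 * beta) - 1) * a
    v_new = v + dt * ((1 - gamma) * a + gamma * a_new)
    u, v, a = u_new, v_new, a_new
print("displacement at final step:", u)
```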

  17. Fog computing job scheduling optimization based on bees swarm

    Science.gov (United States)

    Bitam, Salim; Zeadally, Sherali; Mellouk, Abdelhamid

    2018-04-01

    Fog computing is a new computing architecture, composed of a set of near-user edge devices called fog nodes, which collaborate in order to perform computational services such as running applications, storing large amounts of data, and transmitting messages. Fog computing extends cloud computing by deploying digital resources at the premises of mobile users. In this new paradigm, management and operating functions, such as job scheduling, aim at providing high-performance, cost-effective services requested by mobile users and executed by fog nodes. We propose a new bio-inspired optimization approach called the Bees Life Algorithm (BLA) to address the job scheduling problem in the fog computing environment. Our proposed approach is based on the optimized distribution of a set of tasks among all the fog computing nodes. The objective is to find an optimal tradeoff between the CPU execution time and the allocated memory required by fog computing services established by mobile users. Our empirical performance evaluation results demonstrate that the proposal outperforms the traditional particle swarm optimization and genetic algorithm in terms of CPU execution time and allocated memory.
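
    The Bees Life Algorithm itself is not reproduced here; as a rough stand-in for the kind of search it performs, the sketch below mutates random task-to-node assignments and keeps the best under a weighted CPU-time/memory cost. The fitness form, weights, and cost tables are all invented for illustration.

```python
import random

random.seed(0)
n_tasks, n_nodes = 20, 5

# Invented per-(task, node) costs; a real scheduler would measure these.
cpu_cost = [[random.uniform(1, 10) for _ in range(n_nodes)] for _ in range(n_tasks)]
mem_cost = [[random.uniform(1, 4) for _ in range(n_nodes)] for _ in range(n_tasks)]

def fitness(assign, w=0.5):
    """Weighted tradeoff between total CPU execution time and allocated memory."""
    cpu = sum(cpu_cost[t][n] for t, n in enumerate(assign))
    mem = sum(mem_cost[t][n] for t, n in enumerate(assign))
    return w * cpu + (1 - w) * mem

# Population-free stand-in for the swarm: repeatedly explore a neighbouring
# assignment (one task moved to another node) and keep improvements.
best = [random.randrange(n_nodes) for _ in range(n_tasks)]
for _ in range(2000):
    cand = list(best)
    cand[random.randrange(n_tasks)] = random.randrange(n_nodes)
    if fitness(cand) < fitness(best):
        best = cand
print("best weighted cost:", round(fitness(best), 2))
```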

  18. Spintronic Circuits: The Building Blocks of Spin-Based Computation

    Directory of Open Access Journals (Sweden)

    Roshan Warman

    2016-10-01

    In the most general situation, binary computation is implemented by means of microscopic logical gates known as transistors. According to Moore’s Law, the size of transistors will halve every two years, and as these transistors reach their fundamental size limit, the quantum effects of the electrons passing through the transistors will be observed. Due to the inherent randomness of these quantum fluctuations, the basic binary logic will become uncontrollable. This project describes the basic principle governing quantum spin-based computing devices, which may provide an alternative to conventional solid-state computing devices and circumvent the technological limitations of the current implementation of binary logic.

  19. Nanophotonic quantum computer based on atomic quantum transistor

    International Nuclear Information System (INIS)

    Andrianov, S N; Moiseev, S A

    2015-01-01

    We propose a scheme of a quantum computer based on nanophotonic elements: two buses in the form of nanowaveguide resonators, two nanosized units of multiatom multiqubit quantum memory and a set of nanoprocessors in the form of photonic quantum transistors, each containing a pair of nanowaveguide ring resonators coupled via a quantum dot. The operation modes of nanoprocessor photonic quantum transistors are theoretically studied and the execution of main logical operations by means of them is demonstrated. We also discuss the prospects of the proposed nanophotonic quantum computer for operating in high-speed optical fibre networks. (quantum computations)

  20. Morphing-Based Shape Optimization in Computational Fluid Dynamics

    Science.gov (United States)

    Rousseau, Yannick; Men'Shov, Igor; Nakamura, Yoshiaki

    In this paper, a Morphing-based Shape Optimization (MbSO) technique is presented for solving Optimum-Shape Design (OSD) problems in Computational Fluid Dynamics (CFD). The proposed method couples Free-Form Deformation (FFD) and Evolutionary Computation, and, as its name suggests, relies on the morphing of shape and computational domain, rather than direct shape parameterization. Advantages of the FFD approach compared to traditional parameterization are first discussed. Then, examples of shape and grid deformations by FFD are presented. Finally, the MbSO approach is illustrated and applied through an example: the design of an airfoil for a future Mars exploration airplane.
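
    A minimal 2-D free-form deformation sketch of the kind the abstract describes: points are mapped through a Bernstein-polynomial tensor product over a control lattice, so displacing a control point smoothly morphs every embedded shape point. The lattice size and displacement are illustrative assumptions.

```python
import numpy as np
from math import comb

def bernstein(i, n, t):
    """Bernstein basis polynomial B_{i,n}(t)."""
    return comb(n, i) * t**i * (1 - t)**(n - i)

def ffd(points, lattice):
    """Map points in [0,1]^2 through an (l+1) x (m+1) control lattice."""
    l, m = lattice.shape[0] - 1, lattice.shape[1] - 1
    out = np.zeros_like(points)
    for p, (s, t) in enumerate(points):
        for i in range(l + 1):
            for j in range(m + 1):
                out[p] += bernstein(i, l, s) * bernstein(j, m, t) * lattice[i, j]
    return out

# 3 x 3 control lattice, initially the identity mapping over the unit square.
l, m = 2, 2
lattice = np.array([[(i / l, j / m) for j in range(m + 1)] for i in range(l + 1)])
lattice[1, 2] += (0.0, 0.15)           # pull one control point: a smooth bump

shape_points = np.array([[x, 0.5] for x in np.linspace(0, 1, 5)])
print(ffd(shape_points, lattice))      # deformed point coordinates
```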

  1. Nanophotonic quantum computer based on atomic quantum transistor

    Energy Technology Data Exchange (ETDEWEB)

    Andrianov, S N [Institute of Advanced Research, Academy of Sciences of the Republic of Tatarstan, Kazan (Russian Federation); Moiseev, S A [Kazan E. K. Zavoisky Physical-Technical Institute, Kazan Scientific Center, Russian Academy of Sciences, Kazan (Russian Federation)

    2015-10-31

    We propose a scheme of a quantum computer based on nanophotonic elements: two buses in the form of nanowaveguide resonators, two nanosized units of multiatom multiqubit quantum memory and a set of nanoprocessors in the form of photonic quantum transistors, each containing a pair of nanowaveguide ring resonators coupled via a quantum dot. The operation modes of nanoprocessor photonic quantum transistors are theoretically studied and the execution of main logical operations by means of them is demonstrated. We also discuss the prospects of the proposed nanophotonic quantum computer for operating in high-speed optical fibre networks. (quantum computations)

  2. ISAT promises fail-safe computer-based reactor protection

    International Nuclear Information System (INIS)

    Anon.

    1989-01-01

    AEA Technology's ISAT system is a multiplexed microprocessor-based reactor protection system which has very extensive self-monitoring capabilities and is inherently fail safe. It provides a way of addressing software reliability problems that have tended to hamper widespread introduction of computer-based reactor protection. (author)

  3. Computer-Aided Test Flow in Core-Based Design

    NARCIS (Netherlands)

    Zivkovic, V.; Tangelder, R.J.W.T.; Kerkhoff, Hans G.

    2000-01-01

    This paper copes with the efficient test-pattern generation in a core-based design. A consistent Computer-Aided Test (CAT) flow is proposed based on the required core-test strategy. It generates a test-pattern set for the embedded cores with high fault coverage and low DfT area overhead. The CAT

  4. Computer-Aided Test Flow in Core-Based Design

    NARCIS (Netherlands)

    Zivkovic, V.; Tangelder, R.J.W.T.; Kerkhoff, Hans G.

    2000-01-01

    This paper copes with the test-pattern generation and fault coverage determination in the core based design. The basic core-test strategy that one has to apply in the core-based design is stated in this work. A Computer-Aided Test (CAT) flow is proposed resulting in accurate fault coverage of

  5. Evolution of a Computer-Based Testing Laboratory

    Science.gov (United States)

    Moskal, Patrick; Caldwell, Richard; Ellis, Taylor

    2009-01-01

    In 2003, faced with increasing growth in technology-based and large-enrollment courses, the College of Business Administration at the University of Central Florida opened a computer-based testing lab to facilitate administration of course examinations. Patrick Moskal, Richard Caldwell, and Taylor Ellis describe the development and evolution of the…

  6. On computation of Groebner bases for linear difference systems

    Energy Technology Data Exchange (ETDEWEB)

    Gerdt, Vladimir P. [Laboratory of Information Technologies, Joint Institute for Nuclear Research, 141980 Dubna (Russian Federation)]. E-mail: gerdt@jinr.ru

    2006-04-01

    In this paper, we present an algorithm for computing Groebner bases of linear ideals in a difference polynomial ring over a ground difference field. The input difference polynomials generating the ideal are also assumed to be linear. The algorithm is an adaptation to difference ideals of our polynomial algorithm based on Janet-like reductions.

  7. On computation of Groebner bases for linear difference systems

    International Nuclear Information System (INIS)

    Gerdt, Vladimir P.

    2006-01-01

    In this paper, we present an algorithm for computing Groebner bases of linear ideals in a difference polynomial ring over a ground difference field. The input difference polynomials generating the ideal are also assumed to be linear. The algorithm is an adaptation to difference ideals of our polynomial algorithm based on Janet-like reductions
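
    The algorithm in these two records targets linear difference ideals, for which off-the-shelf tools are scarce; as a hedged stand-in, the sketch below computes an ordinary polynomial Groebner basis with sympy, the object the difference-ideal algorithm generalizes. For linear generators the reduced basis amounts to a triangular, Gauss-eliminated form.

```python
from sympy import groebner, symbols

x, y = symbols('x y')

# Two linear generators; their Groebner basis under lex order is a
# triangular system, the analogue of Gaussian elimination.
G = groebner([x + 2*y - 3, 2*x - y - 1], x, y, order='lex')
print(G)   # a GroebnerBasis whose elements vanish exactly where both inputs do
```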

  8. Issues in Text Design and Layout for Computer Based Communications.

    Science.gov (United States)

    Andresen, Lee W.

    1991-01-01

    Discussion of computer-based communications (CBC) focuses on issues involved with screen design and layout for electronic text, based on experiences with electronic messaging, conferencing, and publishing within the Australian Open Learning Information Network (AOLIN). Recommendations for research on design and layout for printed text are also…

  9. Data Mining Based on Cloud-Computing Technology

    Directory of Open Access Journals (Sweden)

    Ren Ying

    2016-01-01

    There are performance bottlenecks and scalability problems when a traditional data-mining system is used in cloud computing. In this paper, we present a data-mining platform based on cloud computing. Compared with a traditional data-mining system, this platform is highly scalable, has massive data-processing capacity, is service-oriented, and has low hardware cost. This platform can support the design and application of a wide range of distributed data-mining systems.

  10. Computer-Based Simulation Games in Public Administration Education

    OpenAIRE

    Kutergina Evgeniia

    2017-01-01

    Computer simulation, an active learning technique, is now one of the advanced pedagogical technologies. The use of simulation games in the educational process allows students to gain a firsthand understanding of the processes of real life. Public-administration, public-policy and political-science courses increasingly adopt simulation games in universities worldwide. Besides person-to-person simulation games, there are computer-based simulations in public-administration education. Currently...

  11. Development of a computer writing system based on EOG

    OpenAIRE

    López, A.; Ferrero, F.; Yangüela, D.; Álvarez, C.; Postolache, O.

    2017-01-01

    The development of a novel computer writing system based on eye movements is introduced herein. A system of these characteristics requires the consideration of three subsystems: (1) A hardware device for the acquisition and transmission of the signals generated by eye movement to the computer; (2) A software application that allows, among other functions, data processing in order to minimize noise and classify signals; and (3) A graphical i...

  12. Solid-State Quantum Computer Based on Scanning Tunneling Microscopy

    Energy Technology Data Exchange (ETDEWEB)

    Berman, G. P.; Brown, G. W.; Hawley, M. E.; Tsifrinovich, V. I.

    2001-08-27

    We propose a solid-state nuclear-spin quantum computer based on application of scanning tunneling microscopy (STM) and well-developed silicon technology. It requires the measurement of tunneling-current modulation caused by the Larmor precession of a single electron spin. Our envisioned STM quantum computer would operate at the high magnetic field (~10 T) and at low temperature (~1 K).

  13. Solid-State Quantum Computer Based on Scanning Tunneling Microscopy

    International Nuclear Information System (INIS)

    Berman, G. P.; Brown, G. W.; Hawley, M. E.; Tsifrinovich, V. I.

    2001-01-01

    We propose a solid-state nuclear-spin quantum computer based on application of scanning tunneling microscopy (STM) and well-developed silicon technology. It requires the measurement of tunneling-current modulation caused by the Larmor precession of a single electron spin. Our envisioned STM quantum computer would operate at the high magnetic field (∼10 T) and at low temperature ∼1 K

  14. Cluster-based localization and tracking in ubiquitous computing systems

    CERN Document Server

    Martínez-de Dios, José Ramiro; Torres-González, Arturo; Ollero, Anibal

    2017-01-01

    Localization and tracking are key functionalities in ubiquitous computing systems and techniques. In recent years a very high variety of approaches, sensors and techniques for indoor and GPS-denied environments have been developed. This book briefly summarizes the current state of the art in localization and tracking in ubiquitous computing systems focusing on cluster-based schemes. Additionally, existing techniques for measurement integration, node inclusion/exclusion and cluster head selection are also described in this book.

  15. Silicon CMOS architecture for a spin-based quantum computer.

    Science.gov (United States)

    Veldhorst, M; Eenink, H G J; Yang, C H; Dzurak, A S

    2017-12-15

    Recent advances in quantum error correction codes for fault-tolerant quantum computing and physical realizations of high-fidelity qubits in multiple platforms give promise for the construction of a quantum computer based on millions of interacting qubits. However, the classical-quantum interface remains a nascent field of exploration. Here, we propose an architecture for a silicon-based quantum computer processor based on complementary metal-oxide-semiconductor (CMOS) technology. We show how a transistor-based control circuit together with charge-storage electrodes can be used to operate a dense and scalable two-dimensional qubit system. The qubits are defined by the spin state of a single electron confined in quantum dots, coupled via exchange interactions, controlled using a microwave cavity, and measured via gate-based dispersive readout. We implement a spin qubit surface code, showing the prospects for universal quantum computation. We discuss the challenges and focus areas that need to be addressed, providing a path for large-scale quantum computing.

  16. Understanding and treating lateral ankle sprains and their consequences: a constraints-based approach.

    Science.gov (United States)

    Wikstrom, Erik A; Hubbard-Turner, Tricia; McKeon, Patrick O

    2013-06-01

    Lateral ankle sprains are a common consequence of physical activity. If not managed appropriately, a cascade of negative alterations to both the joint structure and a person's movement patterns continue to stress the injured ligaments. These alterations result in an individual entering a continuum of disability as evidenced by the ~30 % of ankle sprains that develop into chronic ankle instability (CAI) and up to 78 % of CAI cases that develop into post-traumatic ankle osteoarthritis (OA). Despite this knowledge, no significant improvements in treatment efficacy have been made using traditional treatment paradigms. Therefore, the purpose of this review is to (1) provide an overview of the consequences associated with acute lateral ankle sprains, CAI and post-traumatic ankle OA; (2) introduce the patient-, clinician-, and laboratory (PCL)-oriented model that addresses lateral ankle sprains and their consequences from a constraints perspective; and (3) introduce the dynamic systems theory as the framework to illustrate how multiple post-injury adaptations create a singular pathology that predisposes individuals with lateral ankle sprains to fall into a continuum of disability. The consequences associated with lateral ankle sprains, CAI and ankle OA are similar and encompass alterations to the structure of the ankle joint (e.g. ligament laxity, positional faults, etc.) and the sensorimotor function responsible for proper ankle joint function (e.g. postural control, gait, etc.). Further, the impairments have been quantified across a range of patient-oriented (e.g. self-report questionnaires), clinician-oriented (e.g. bedside measures of range of motion and postural control), and laboratory-oriented (e.g. arthrometry, gait analysis) outcome measures. The interaction of PCL-oriented outcomes is critically important for understanding the phenomenon of CAI across the continuum of disability. Through the integration of all three sources of evidence, we can clearly see that

  17. Long-term economic consequences of child maltreatment: a population-based study.

    Science.gov (United States)

    Thielen, Frederick W; Ten Have, Margreet; de Graaf, Ron; Cuijpers, Pim; Beekman, Aartjan; Evers, Silvia; Smit, Filip

    2016-12-01

    Child maltreatment is prognostically associated with long-term detrimental consequences for mental health. These consequences are reflected in higher costs due to health service utilization and productivity losses in adulthood. An above-average sense of mastery can have protective effects in the pathogenesis of mental disorders and thus potentially cushion adverse impacts of maltreatment. This should be reflected in lower costs in individuals with a history of child maltreatment and a high sense of mastery. The aims of the study were to prognostically estimate the excess costs of health service uptake and productivity losses in adults with a history of child maltreatment and to evaluate how mastery may act as an effect modifier. Data were used on 5618 individuals participating in the Netherlands Mental Health Survey and Incidence Study (NEMESIS). We focussed on measures of child maltreatment (emotional neglect, physical, psychological and sexual abuse) and economic costs owing to health-care uptake and productivity losses when people with a history of abuse have grown into adulthood. We evaluated how mastery acted as an effect modifier. Estimates were adjusted for demographics and parental psychopathology. Post-stratification weights were used to account for initial non-response and dropout. Due to the non-normal distribution of the costs data, sample errors, 95 % confidence intervals, and p values were calculated using non-parametric bootstrapping (1000 replications). Exposure to child maltreatment occurs frequently (6.9-24.8 %) and is associated with substantial excess costs in adulthood. To illustrate, adjusted annual excess costs attributable to emotional neglect are €1,360 (95 % CI: 615-215) per adult. Mastery showed a significant effect on these figures: annual costs were €1,608 in those with a low sense of mastery, but only €474 in those with a firmer sense of mastery. Child maltreatment has profound mental health consequences and is associated with

  18. Standardized computer-based organized reporting of EEG:SCORE

    DEFF Research Database (Denmark)

    Beniczky, Sandor; H, Aurlien,; JC, Brøgger,

    2013-01-01

    … process, organized by the European Chapter of the International Federation of Clinical Neurophysiology. The Standardised Computer-based Organised Reporting of EEG (SCORE) software was constructed based on the terms and features of the consensus statement and it was tested in the clinical practice … in free-text format. The purpose of our endeavor was to create a computer-based system for EEG assessment and reporting, where the physicians would construct the reports by choosing from predefined elements for each relevant EEG feature, as well as the clinical phenomena (for video-EEG recordings) … SCORE can potentially improve the quality of EEG assessment and reporting; it will help incorporate the results of computer-assisted analysis into the report, it will make possible the build-up of a multinational database, and it will help in training young neurophysiologists.

  19. An expert fitness diagnosis system based on elastic cloud computing.

    Science.gov (United States)

    Tseng, Kevin C; Wu, Chia-Chuan

    2014-01-01

    This paper presents an expert diagnosis system based on cloud computing. It classifies a user's fitness level based on supervised machine learning techniques. This system is able to learn and make customized diagnoses according to the user's physiological data, such as age, gender, and body mass index (BMI). In addition, an elastic algorithm based on Poisson distribution is presented to allocate computation resources dynamically. It predicts the required resources in the future according to the exponential moving average of past observations. The experimental results show that Naïve Bayes is the best classifier with the highest accuracy (90.8%) and that the elastic algorithm is able to capture tightly the trend of requests generated from the Internet and thus assign corresponding computation resources to ensure the quality of service.

  20. An Expert Fitness Diagnosis System Based on Elastic Cloud Computing

    Directory of Open Access Journals (Sweden)

    Kevin C. Tseng

    2014-01-01

    This paper presents an expert diagnosis system based on cloud computing. It classifies a user’s fitness level based on supervised machine learning techniques. This system is able to learn and make customized diagnoses according to the user’s physiological data, such as age, gender, and body mass index (BMI). In addition, an elastic algorithm based on Poisson distribution is presented to allocate computation resources dynamically. It predicts the required resources in the future according to the exponential moving average of past observations. The experimental results show that Naïve Bayes is the best classifier with the highest accuracy (90.8%) and that the elastic algorithm is able to capture tightly the trend of requests generated from the Internet and thus assign corresponding computation resources to ensure the quality of service.
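
    Both records describe the elastic algorithm as predicting future load from an exponential moving average (EMA) of past observations. A minimal sketch, in which the smoothing factor, request counts, and per-instance capacity are assumptions:

```python
import math

# Hypothetical request counts observed in successive time windows.
requests = [120, 135, 150, 170, 160, 180, 210, 205]

alpha = 0.3               # EMA smoothing factor (assumption)
capacity_per_vm = 50      # requests one compute instance can serve (assumption)

ema = requests[0]
for r in requests[1:]:
    ema = alpha * r + (1 - alpha) * ema   # exponential moving average update

vms_needed = math.ceil(ema / capacity_per_vm)
print(f"predicted load: {ema:.1f} requests -> allocate {vms_needed} instances")
```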

  1. Computer-based quantitative computed tomography image analysis in idiopathic pulmonary fibrosis: A mini review.

    Science.gov (United States)

    Ohkubo, Hirotsugu; Nakagawa, Hiroaki; Niimi, Akio

    2018-01-01

    Idiopathic pulmonary fibrosis (IPF) is the most common type of progressive idiopathic interstitial pneumonia in adults. Computer-based image analysis methods for chest computed tomography (CT) in patients with IPF include the mean CT value of the whole lungs, density histogram analysis, the density mask technique, and texture classification methods. Most of these methods offer good assessment of pulmonary function, disease progression, and mortality. Each method has merits that can be used in clinical practice. One of the texture classification methods is reported to be superior to visual CT scoring by radiologists for correlation with pulmonary function and prediction of mortality. In this mini review, we summarize the current literature on computer-based CT image analysis of IPF and discuss its limitations and several future directions. Copyright © 2017 The Japanese Respiratory Society. Published by Elsevier B.V. All rights reserved.
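
    Of the methods listed above, the mean lung CT value, the density histogram, and the density mask are straightforward to state in code. The sketch below applies them to a toy array of Hounsfield units; the high-attenuation window is an assumption (fibrosis studies often use roughly -600 to -250 HU), not a threshold from the review.

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy stand-in for segmented lung voxels in Hounsfield units (HU).
lung_hu = rng.normal(-800, 120, size=100_000).clip(-1024, 100)

mean_ct = lung_hu.mean()                         # mean CT value of the lungs
hist, edges = np.histogram(lung_hu, bins=50)     # density histogram

# Density mask: fraction of voxels inside a high-attenuation window
# (cutoffs are assumptions, see the lead-in above).
haa = ((lung_hu >= -600) & (lung_hu <= -250)).mean() * 100
print(f"mean CT value: {mean_ct:.0f} HU, high-attenuation area: {haa:.1f}%")
```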

  2. The Activity-Based Computing Project - A Software Architecture for Pervasive Computing Final Report

    DEFF Research Database (Denmark)

    Bardram, Jakob Eyvind

    . Special attention should be drawn to publication [25], which gives an overview of the ABC project to the IEEE Pervasive Computing community; the ACM CHI 2006 [19] paper that documents the implementation of the ABC technology; and the ACM ToCHI paper [12], which is the main publication of the project......, documenting all of the project’s four objectives. All of these publication venues are top-tier journals and conferences within computer science. From a business perspective, the project had the objective of incorporating relevant parts of the ABC technology into the products of Medical Insight, which has been...... done. Moreover, partly based on the research done in the ABC project, the company Cetrea A/S has been founded, which incorporate ABC concepts and technologies in its products. The concepts of activity-based computing have also been researched in cooperation with IBM Research, and the ABC project has...

  3. FitzPatrick Lecture: King George III and the porphyria myth - causes, consequences and re-evaluation of his mental illness with computer diagnostics.

    Science.gov (United States)

    Peters, Timothy

    2015-04-01

    Recent studies have shown that the claim that King George III suffered from acute porphyria is seriously at fault. This article explores some of the causes of this misdiagnosis and the consequences of the misleading claims, also reporting on the nature of the king's recurrent mental illness according to computer diagnostics. In addition, techniques of cognitive archaeology are used to investigate the nature of the king's final decade of mental illness, which resulted in the appointment of the Prince of Wales as Prince Regent. The results of this analysis confirm that the king suffered from bipolar disorder type I, with a final decade of dementia, due, in part, to the neurotoxicity of his recurrent episodes of acute mania. © 2015 Royal College of Physicians.

  4. A personal computer-based nuclear magnetic resonance spectrometer

    Science.gov (United States)

    Job, Constantin; Pearson, Robert M.; Brown, Michael F.

    1994-11-01

    Nuclear magnetic resonance (NMR) spectroscopy using personal computer-based hardware has the potential of enabling the application of NMR methods to fields where conventional state of the art equipment is either impractical or too costly. With such a strategy for data acquisition and processing, disciplines including civil engineering, agriculture, geology, archaeology, and others have the possibility of utilizing magnetic resonance techniques within the laboratory or conducting applications directly in the field. Another aspect is the possibility of utilizing existing NMR magnets which may be in good condition but unused because of outdated or nonrepairable electronics. Moreover, NMR applications based on personal computer technology may open up teaching possibilities at the college or even secondary school level. The goal of developing such a personal computer (PC)-based NMR standard is facilitated by existing technologies including logic cell arrays, direct digital frequency synthesis, use of PC-based electrical engineering software tools to fabricate electronic circuits, and the use of permanent magnets based on neodymium-iron-boron alloy. Utilizing such an approach, we have been able to place essentially an entire NMR spectrometer console on two printed circuit boards, with the exception of the receiver and radio frequency power amplifier. Future upgrades to include the deuterium lock and the decoupler unit are readily envisioned. The continued development of such PC-based NMR spectrometers is expected to benefit from the fast growing, practical, and low cost personal computer market.

  5. Computer-Based Simulation Games in Public Administration Education

    Directory of Open Access Journals (Sweden)

    Kutergina Evgeniia

    2017-12-01

    Full Text Available Computer simulation, an active learning technique, is now one of the advanced pedagogical technologies. The use of simulation games in the educational process allows students to gain a firsthand understanding of the processes of real life. Public-administration, public-policy and political-science courses increasingly adopt simulation games in universities worldwide. Besides person-to-person simulation games, there are computer-based simulations in public-administration education. Currently in Russia the use of computer-based simulation games in Master of Public Administration (MPA) curricula is quite limited. This paper focuses on computer-based simulation games for students of MPA programmes. Our aim was to analyze outcomes of implementing such games in MPA curricula. We have done so by (1) developing three computer-based simulation games about allocating public finances, (2) testing the games in the learning process, and (3) conducting a posttest examination to evaluate the effect of simulation games on students' knowledge of municipal finances. This study was conducted in the National Research University Higher School of Economics (HSE) and in the Russian Presidential Academy of National Economy and Public Administration (RANEPA) during the period of September to December 2015, in Saint Petersburg, Russia. Two groups of students were randomly selected in each university and then randomly allocated either to the experimental or the control group. In control groups (n=12 in HSE, n=13 in RANEPA) students had traditional lectures. In experimental groups (n=12 in HSE, n=13 in RANEPA) students played three simulation games apart from traditional lectures. This exploratory research shows that the use of computer-based simulation games in MPA curricula can improve students' outcomes by 38%. In general, the experimental groups had better performances on the post-test examination (Figure 2). Students in the HSE experimental group had 27.5% better

  6. Remote media vision-based computer input device

    Science.gov (United States)

    Arabnia, Hamid R.; Chen, Ching-Yi

    1991-11-01

    In this paper, we introduce a vision-based computer input device which has been built at the University of Georgia. The user of this system gives commands to the computer without touching any physical device. The system receives input through a CCD camera; it is PC-based and is built on top of the DOS operating system. The major components of the input device are: a monitor, an image capturing board, a CCD camera, and some software (developed by us). These are interfaced with a standard PC running under the DOS operating system.

  7. Development of web-based Off-Site Consequence Analysis Program (OSCAP) for extending ILRT intervals and its application

    International Nuclear Information System (INIS)

    Jeon, Ho-Jun; Hwang, Seok-Won; Oh, Ji-Yong

    2012-01-01

    Highlights: ► We develop a web-based offsite consequence analysis program based on the MACCS II code. ► The program has an automatic processing module to generate the main input data. ► It is effective for risk assessments associated with extending ILRT intervals. ► Even a beginner can perform an offsite consequence analysis with the program. - Abstract: For offsite consequence analysis, the MELCOR Accident Consequence Code System (MACCS) II code is widely used as a tool. In this study, the algorithm of a web-based Off-Site Consequence Analysis Program (OSCAP) using the MACCS II code was developed for integrated leak rate test (ILRT) interval extension and Level 3 probabilistic safety assessment (PSA), and verification and validation (V and V) of the program were performed. The main input data of the MACCS II code are meteorological data, population distribution data and source term data. However, it requires a great deal of time and effort to generate the main input data for an offsite consequence analysis using the MACCS II code. For example, the meteorological data are collected from each nuclear power site in real time, but the formats of the raw data collected differ from site to site. To reduce the effort and time required for risk assessments, the web-based OSCAP has an automatic processing module which converts the format of the raw data collected from each site in Korea to the input data format of the MACCS II code. The program also provides an automatic function for converting the latest population data from Statistics Korea, the National Statistical Office, to the population distribution input data format of the MACCS II code. In the case of the source term data, the program includes the release fraction of each source term category resulting from Modular Accident Analysis Program (MAAP) code analysis and the core inventory data from ORIGEN code analysis. These analysis results for each plant in Korea are stored in a database module of the web-based OSCAP, so a

  8. A security mechanism based on evolutionary game in fog computing.

    Science.gov (United States)

    Sun, Yan; Lin, Fuhong; Zhang, Nan

    2018-02-01

    Fog computing is a distributed computing paradigm at the edge of the network and requires the cooperation of users and the sharing of resources. When users in fog computing open their resources, their devices are easily intercepted and attacked because they are accessed through wireless networks and present an extensive geographical distribution. In this study, a credible third party was introduced to supervise the behavior of users and protect the security of user cooperation. A fog computing security mechanism based on the human nervous system is proposed, and the strategy for a stable system evolution is calculated. The MATLAB simulation results show that the proposed mechanism can effectively reduce the number of attack behaviors and stimulate users to cooperate positively in application tasks.
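
    The abstract does not spell out the game; as a rough illustration of how a penalty imposed by the credible third party can shift the stable strategy, the following toy replicator-dynamics sketch uses entirely invented payoffs, not the paper's model:

        def replicator_step(x, f_coop, f_attack, dt=0.01):
            """One Euler step of the replicator equation dx/dt = x(1-x)(f_coop - f_attack)."""
            return x + dt * x * (1 - x) * (f_coop - f_attack)

        def evolve(x0, penalty, steps=10000):
            """Share x of cooperating users under an attack penalty set by the third party."""
            x = x0
            for _ in range(steps):
                f_coop = 3.0 * x               # cooperating pays off when others cooperate
                f_attack = 4.0 * x - penalty   # attacking pays slightly more, minus the penalty
                x = replicator_step(x, f_coop, f_attack)
            return x

        print(round(evolve(0.5, penalty=0.0), 3))  # no penalty: cooperation dies out (toward 0)
        print(round(evolve(0.5, penalty=2.0), 3))  # penalty enforced: cooperation dominates (toward 1)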

  9. Machine learning based Intelligent cognitive network using fog computing

    Science.gov (United States)

    Lu, Jingyang; Li, Lun; Chen, Genshe; Shen, Dan; Pham, Khanh; Blasch, Erik

    2017-05-01

    In this paper, a Cognitive Radio Network (CRN) based on artificial intelligence is proposed to distribute the limited radio spectrum resources more efficiently. The CRN framework can analyze the time-sensitive signal data close to the signal source using fog computing with different types of machine learning techniques. Depending on the computational capabilities of the fog nodes, different features and machine learning techniques are chosen to optimize spectrum allocation. Also, the computing nodes send the periodic signal summary which is much smaller than the original signal to the cloud so that the overall system spectrum source allocation strategies are dynamically updated. Applying fog computing, the system is more adaptive to the local environment and robust to spectrum changes. As most of the signal data is processed at the fog level, it further strengthens the system security by reducing the communication burden of the communications network.

  10. A security mechanism based on evolutionary game in fog computing

    Directory of Open Access Journals (Sweden)

    Yan Sun

    2018-02-01

    Full Text Available Fog computing is a distributed computing paradigm at the edge of the network and requires the cooperation of users and the sharing of resources. When users in fog computing open their resources, their devices are easily intercepted and attacked because they are accessed through wireless networks and present an extensive geographical distribution. In this study, a credible third party was introduced to supervise the behavior of users and protect the security of user cooperation. A fog computing security mechanism based on the human nervous system is proposed, and the strategy for a stable system evolution is calculated. The MATLAB simulation results show that the proposed mechanism can effectively reduce the number of attack behaviors and stimulate users to cooperate positively in application tasks.

  11. Development of a Computer Writing System Based on EOG.

    Science.gov (United States)

    López, Alberto; Ferrero, Francisco; Yangüela, David; Álvarez, Constantina; Postolache, Octavian

    2017-06-26

    The development of a novel computer writing system based on eye movements is introduced herein. A system of these characteristics requires the consideration of three subsystems: (1) A hardware device for the acquisition and transmission of the signals generated by eye movement to the computer; (2) A software application that allows, among other functions, data processing in order to minimize noise and classify signals; and (3) A graphical interface that allows the user to write text easily on the computer screen using eye movements only. This work analyzes these three subsystems and proposes innovative and low cost solutions for each one of them. This computer writing system was tested with 20 users and its efficiency was compared to a traditional virtual keyboard. The results have shown an important reduction in the time spent on writing, which can be very useful, especially for people with severe motor disorders.
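
    The classification step of the second subsystem is only outlined; one common approach, sketched below with hypothetical thresholds and synthetic samples rather than the authors' actual pipeline, is to smooth the horizontal EOG channel and map threshold crossings to commands:

        def moving_average(signal, window=5):
            """Simple smoothing to reduce noise in raw EOG samples."""
            out = []
            for i in range(len(signal)):
                chunk = signal[max(0, i - window + 1):i + 1]
                out.append(sum(chunk) / len(chunk))
            return out

        def classify(sample, threshold=100.0):
            """Map a smoothed horizontal-channel sample (microvolts) to a command."""
            if sample > threshold:
                return "RIGHT"
            if sample < -threshold:
                return "LEFT"
            return "REST"

        raw = [0, 5, 180, 190, 185, 10, -170, -175, -5, 0]   # synthetic EOG samples
        print([classify(s) for s in moving_average(raw)])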

  12. Development of a Computer Writing System Based on EOG

    Directory of Open Access Journals (Sweden)

    Alberto López

    2017-06-01

    Full Text Available The development of a novel computer writing system based on eye movements is introduced herein. A system of these characteristics requires the consideration of three subsystems: (1) A hardware device for the acquisition and transmission of the signals generated by eye movement to the computer; (2) A software application that allows, among other functions, data processing in order to minimize noise and classify signals; and (3) A graphical interface that allows the user to write text easily on the computer screen using eye movements only. This work analyzes these three subsystems and proposes innovative and low cost solutions for each one of them. This computer writing system was tested with 20 users and its efficiency was compared to a traditional virtual keyboard. The results have shown an important reduction in the time spent on writing, which can be very useful, especially for people with severe motor disorders.

  13. GPU-based high-performance computing for radiation therapy

    International Nuclear Information System (INIS)

    Jia, Xun; Jiang, Steve B; Ziegenhein, Peter

    2014-01-01

    Recent developments in radiation therapy demand high computational power to solve challenging problems in a timely fashion in a clinical environment. The graphics processing unit (GPU), as an emerging high-performance computing platform, has been introduced to radiotherapy. It is particularly attractive due to its high computational power, small size, and low cost for facility deployment and maintenance. Over the past few years, GPU-based high-performance computing in radiotherapy has experienced rapid development. A tremendous amount of study has been conducted, in which large acceleration factors compared with the conventional CPU platform have been observed. In this paper, we will first give a brief introduction to the GPU hardware structure and programming model. We will then review the current applications of GPU in major imaging-related and therapy-related problems encountered in radiotherapy. A comparison of GPU with other platforms will also be presented. (topical review)

  14. Essential Means for Urban Computing : Specification of Web-Based Computing Platforms for Urban Planning, a Hitchhiker’s Guide

    NARCIS (Netherlands)

    Nourian, P.; Martinez-Ortiz, Carlos; Arroyo Ohori, G.A.K.

    2018-01-01

    This article provides an overview of the specifications of web-based computing platforms for urban data analytics and computational urban planning practice. There are currently a variety of tools and platforms that can be used in urban computing practices, including scientific computing languages,

  15. Research Note: The consequences of different methods for handling missing network data in Stochastic Actor Based Models.

    Science.gov (United States)

    Hipp, John R; Wang, Cheng; Butts, Carter T; Jose, Rupa; Lakon, Cynthia M

    2015-05-01

    Although stochastic actor based models (e.g., as implemented in the SIENA software program) are growing in popularity as a technique for estimating longitudinal network data, a relatively understudied issue is the consequence of missing network data for longitudinal analysis. We explore this issue in our research note by utilizing data from four schools in an existing dataset (the AddHealth dataset) over three time points, assessing the substantive consequences of using four different strategies for addressing missing network data. The results indicate that whereas some measures in such models are estimated relatively robustly regardless of the strategy chosen for addressing missing network data, some of the substantive conclusions will differ based on the missing data strategy chosen. These results have important implications for this burgeoning applied research area, implying that researchers should more carefully consider how they address missing data when estimating such models.

  16. A Research Roadmap for Computation-Based Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mandelli, Diego [Idaho National Lab. (INL), Idaho Falls, ID (United States); Joe, Jeffrey [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis [Idaho National Lab. (INL), Idaho Falls, ID (United States); Groth, Katrina [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-08-01

    The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermalhydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.

  17. A Research Roadmap for Computation-Based Human Reliability Analysis

    International Nuclear Information System (INIS)

    Boring, Ronald; Mandelli, Diego; Joe, Jeffrey; Smith, Curtis; Groth, Katrina

    2015-01-01

    The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermalhydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.

  18. Internet messenger based smart virtual class learning using ubiquitous computing

    Science.gov (United States)

    Umam, K.; Mardi, S. N. S.; Hariadi, M.

    2017-06-01

    Internet messenger (IM) has become an important educational technology component in college education; IM makes it possible for students to engage in learning and collaboration in smart virtual class learning (SVCL) using ubiquitous computing. However, models of IM-based SVCL using ubiquitous computing, and empirical evidence that would favor their broad application to improve engagement and behavior, are still limited. In addition, the expectation that IM-based SVCL using ubiquitous computing could improve engagement and behavior in smart classes cannot be confirmed, because the majority of the reviewed studies followed instructional paradigms. This article aims to present a model of IM-based SVCL using ubiquitous computing and to show learners' experiences of improved engagement and behavior in learner-learner and learner-lecturer interactions. The method applied in this paper includes a design process and quantitative analysis techniques, with the purpose of identifying ubiquitous computing scenarios and capturing the impressions of learners and lecturers about engagement and behavior and their contribution to learning.

  19. Computer-Based Training Programs for Older People with Mild Cognitive Impairment and/or Dementia

    Directory of Open Access Journals (Sweden)

    Blanka Klimova

    2017-05-01

    Full Text Available Currently, due to demographic trends, the number of aging population groups is rising dramatically, especially in developed countries. This trend causes serious economic and social issues, but also an increase in aging disorders such as mild cognitive impairment (MCI) or dementia in older population groups. MCI and dementia are connected with deterioration of cognitive functions. The aim of this mini-review article is therefore to explore whether computer-based training programs might be an effective intervention tool for older people with MCI and/or dementia. The methods include a literature search in the world's acknowledged databases: Web of Science, Scopus, Science Direct, MEDLINE and Springer, and, subsequently, an evaluation of the findings of the relevant studies. The findings from the selected studies are quite neutral with respect to the efficacy of computer-assisted intervention programs in improving basic cognitive functions. On the one hand, they suggest that computer-based training interventions might generate some positive effects on patients with MCI and/or dementia, such as the improvement of learning and short-term memory, as well as behavioral symptoms. On the other hand, these training interventions tend to be short-term, with small sample sizes, and their efficacy was demonstrated in only half of the studies identified. Therefore more longitudinal randomized controlled trials (RCTs) are needed to prove the efficacy of computer-based training programs among older individuals with MCI and/or dementia.

  20. Evaluation of Computer Based Testing in lieu of Regular Examinations in Computer Literacy

    Science.gov (United States)

    Murayama, Koichi

    Because computer based testing (CBT) has many advantages compared with the conventional paper and pencil testing (PPT) examination method, CBT has begun to be used in various situations in Japan, such as in qualifying examinations and in the TOEFL. This paper describes the usefulness and the problems of CBT applied to a regular college examination. The regular computer literacy examinations for first year students were held using CBT, and the results were analyzed. Responses to a questionnaire indicated many students accepted CBT with no unpleasantness and considered CBT a positive factor, improving their motivation to study. CBT also decreased the work of faculty in terms of marking tests and reducing data.

  1. An E-learning System based on Affective Computing

    Science.gov (United States)

    Duo, Sun; Song, Lu Xue

    In recent years, e-learning as a learning system has become very popular. However, current e-learning systems cannot instruct students effectively, since they do not consider the student's emotional state in the context of instruction. The emerging theory of "affective computing" can address this problem: it means the computer's intelligence need no longer be purely cognitive. In this paper, we construct an emotionally intelligent e-learning system based on affective computing. A dimensional model is put forward to recognize and analyze the student's emotional state, and a virtual teacher's avatar is offered to regulate the student's learning psychology, with a teaching style chosen according to the student's personality traits. A "man-to-man" learning environment is built to simulate the traditional classroom's pedagogy in the system.

  2. Centralized computer-based controls of the Nova Laser Facility

    International Nuclear Information System (INIS)

    Krammen, J.

    1985-01-01

    This article introduces the overall architecture of the computer-based Nova Laser Control System and describes its basic components. Use of standard hardware and software components ensures that the system, while specialized and distributed throughout the facility, is adaptable. 9 references, 6 figures

  3. An Intelligent Computer-Based System for Sign Language Tutoring

    Science.gov (United States)

    Ritchings, Tim; Khadragi, Ahmed; Saeb, Magdy

    2012-01-01

    A computer-based system for sign language tutoring has been developed using a low-cost data glove and a software application that processes the movement signals for signs in real-time and uses Pattern Matching techniques to decide if a trainee has closely replicated a teacher's recorded movements. The data glove provides 17 movement signals from…

  4. Activity-based computing for medical work in hospitals

    DEFF Research Database (Denmark)

    Bardram, Jakob Eyvind

    2009-01-01

    principles, the Java-based implementation of the ABC Framework, and an experimental evaluation together with a group of hospital clinicians. The article contributes to the growing research on support for human activities, mobility, collaboration, and context-aware computing. The ABC Framework presents...

  5. Students' Motivation toward Computer-Based Language Learning

    Science.gov (United States)

    Genc, Gulten; Aydin, Selami

    2011-01-01

    The present article examined some factors affecting the motivation level of the preparatory school students in using a web-based computer-assisted language-learning course. The sample group of the study consisted of 126 English-as-a-foreign-language learners at a preparatory school of a state university. After performing statistical analyses…

  6. Simulation of quantum computation : A deterministic event-based approach

    NARCIS (Netherlands)

    Michielsen, K; De Raedt, K; De Raedt, H

    We demonstrate that locally connected networks of machines that have primitive learning capabilities can be used to perform a deterministic, event-based simulation of quantum computation. We present simulation results for basic quantum operations such as the Hadamard and the controlled-NOT gate, and

  7. Simulation of Quantum Computation : A Deterministic Event-Based Approach

    NARCIS (Netherlands)

    Michielsen, K.; Raedt, K. De; Raedt, H. De

    2005-01-01

    We demonstrate that locally connected networks of machines that have primitive learning capabilities can be used to perform a deterministic, event-based simulation of quantum computation. We present simulation results for basic quantum operations such as the Hadamard and the controlled-NOT gate, and

  8. Content Analysis of a Computer-Based Faculty Activity Repository

    Science.gov (United States)

    Baker-Eveleth, Lori; Stone, Robert W.

    2013-01-01

    The research presents an analysis of faculty opinions regarding the introduction of a new computer-based faculty activity repository (FAR) in a university setting. The qualitative study employs content analysis to better understand the phenomenon underlying these faculty opinions and to augment the findings from a quantitative study. A web-based…

  9. Computer Game-based Learning: Applied Game Development Made Simpler

    NARCIS (Netherlands)

    Nyamsuren, Enkhbold

    2018-01-01

    The RAGE project (Realising an Applied Gaming Ecosystem, http://rageproject.eu/) is an ongoing initiative that aims to offer an ecosystem to support serious games’ development and use. Its two main objectives are to provide technologies for computer game-based pedagogy and learning and to establish

  10. The Use of Audio and Animation in Computer Based Instruction.

    Science.gov (United States)

    Koroghlanian, Carol; Klein, James D.

    This study investigated the effects of audio, animation, and spatial ability in a computer-based instructional program for biology. The program presented instructional material via text, or via audio with lean text, and included eight instructional sequences presented either via static illustrations or animations. High school students enrolled in a…

  11. Novel Ethernet Based Optical Local Area Networks for Computer Interconnection

    NARCIS (Netherlands)

    Radovanovic, Igor; van Etten, Wim; Taniman, R.O.; Kleinkiskamp, Ronny

    2003-01-01

    In this paper we present new optical local area networks for fiber-to-the-desk application. Presented networks are expected to bring a solution for having optical fibers all the way to computers. To bring the overall implementation costs down we have based our networks on short-wavelength optical

  12. The Accuracy of Cognitive Monitoring during Computer-Based Instruction.

    Science.gov (United States)

    Garhart, Casey; Hannafin, Michael J.

    This study was conducted to determine the accuracy of learners' comprehension monitoring during computer-based instruction and to assess the relationship between enroute monitoring and different levels of learning. Participants were 50 university undergraduate students enrolled in an introductory educational psychology class. All students received…

  13. Optimal Sequential Rules for Computer-Based Instruction.

    Science.gov (United States)

    Vos, Hans J.

    1998-01-01

    Formulates sequential rules for adapting the appropriate amount of instruction to learning needs in the context of computer-based instruction. Topics include Bayesian decision theory, threshold and linear-utility structure, psychometric model, optimal sequential number of test questions, and an empirical example of sequential instructional…

  14. The use of computer based instructions to enhance Rwandan ...

    African Journals Online (AJOL)

    Annestar

    (2) To what extent do the newly acquired ICT skills impact teachers' competency? (3) How suitable is computer-based instruction for enhancing teachers' continuous professional development? Literature review. ICT competency for teachers. Regardless of the quantity and quality of technology available in classrooms, the key ...

  15. ORGANIZATION OF CLOUD COMPUTING INFRASTRUCTURE BASED ON SDN NETWORK

    Directory of Open Access Journals (Sweden)

    Alexey A. Efimenko

    2013-01-01

    Full Text Available The article presents the main approaches to building cloud computing infrastructure based on SDN networks in modern data processing centers (DPCs). The main indicators of the management effectiveness of DPC network infrastructure are determined. Examples of solutions for the creation of virtual network devices are provided.

  16. Discovery of technical methanation catalysts based on computational screening

    DEFF Research Database (Denmark)

    Sehested, Jens; Larsen, Kasper Emil; Kustov, Arkadii

    2007-01-01

    Methanation is a classical reaction in heterogeneous catalysis and significant effort has been put into improving the industrially preferred nickel-based catalysts. Recently, a computational screening study showed that nickel-iron alloys should be more active than the pure nickel catalyst and at ...

  17. A computer-based teaching programme (CBTP) developed for ...

    African Journals Online (AJOL)

    The nursing profession, like other professions, is focused on preparing students for practice, and particular attention must be paid to the ability of student nurses to extend their knowledge and to solve nursing care problems effectively. A computer-based teaching programme (CBTP) for clinical practice to achieve these ...

  18. Evaluation of computer-based library services at Kenneth Dike ...

    African Journals Online (AJOL)

    This study evaluated computer-based library services/routines at Kenneth Dike Library, University of Ibadan. Four research questions were developed and answered. A survey research design was adopted; using questionnaire as the instrument for data collection. A total of 200 respondents randomly selected from 10 ...

  19. A Computer-Based Instrument That Identifies Common Science Misconceptions

    Science.gov (United States)

    Larrabee, Timothy G.; Stein, Mary; Barman, Charles

    2006-01-01

    This article describes the rationale for and development of a computer-based instrument that helps identify commonly held science misconceptions. The instrument, known as the Science Beliefs Test, is a 47-item instrument that targets topics in chemistry, physics, biology, earth science, and astronomy. The use of an online data collection system…

  20. Computer-Based Technologies in Dentistry: Types and Applications

    Directory of Open Access Journals (Sweden)

    Rajaa Mahdi Musawi

    2016-10-01

    Full Text Available During dental education, dental students learn how to examine patients, make diagnoses, plan treatment and perform dental procedures perfectly and efficiently. However, progress in computer-based technologies including virtual reality (VR) simulators, augmented reality (AR) and computer-aided design/computer-aided manufacturing (CAD/CAM) systems has resulted in new modalities for the instruction and practice of dentistry. Virtual reality dental simulators enable repeated, objective and assessable practice in various controlled situations. Superimposition of three-dimensional (3D) virtual images on actual images in AR allows surgeons to simultaneously visualize the surgical site and superimpose informative 3D images of invisible regions on the surgical site to serve as a guide. The use of CAD/CAM systems for designing and manufacturing dental appliances and prostheses has been well established. This article reviews computer-based technologies, their application in dentistry and their potentials and limitations in promoting dental education, training and practice. Practitioners will be able to choose from a broader spectrum of options in their field of practice by becoming familiar with new modalities of training and practice. Keywords: Virtual Reality Exposure Therapy; Immersion; Computer-Aided Design; Dentistry; Education

  1. Computer Based Asset Management System For Commercial Banks

    Directory of Open Access Journals (Sweden)

    Amanze

    2015-08-01

    Full Text Available ABSTRACT The Computer-based Asset Management System is a web-based system. It allows commercial banks to keep track of their assets. The main advantages of this system are the effective management of assets through record keeping and information retrieval. In this research, information was gathered to define the requirements of the new application and to examine how commercial banks manage their assets.

  2. Computer-Aided Test Flow in Core-Based Design

    OpenAIRE

    Zivkovic, V.; Tangelder, R.J.W.T.; Kerkhoff, Hans G.

    2000-01-01

    This paper deals with test-pattern generation and fault coverage determination in core-based design. The basic core-test strategy that one has to apply in core-based design is stated in this work. A Computer-Aided Test (CAT) flow is proposed, resulting in accurate fault coverage of embedded cores. The CAT flow is applied to a few cores within the Philips Core Test Pilot IC project

  3. Essential Means for Urban Computing: Specification of Web-Based Computing Platforms for Urban Planning, a Hitchhiker’s Guide

    OpenAIRE

    Pirouz Nourian; Carlos Martinez-Ortiz; Ken Arroyo Ohori

    2018-01-01

    This article provides an overview of the specifications of web-based computing platforms for urban data analytics and computational urban planning practice. There are currently a variety of tools and platforms that can be used in urban computing practices, including scientific computing languages, interactive web languages, data sharing platforms and still many desktop computing environments, e.g., GIS software applications. We have reviewed a list of technologies considering their potential ...

  4. Do modified audit opinions have economic consequences? Empirical evidence based on financial constraints

    Directory of Open Access Journals (Sweden)

    Zhiwei Lin

    2011-09-01

    Full Text Available We present a framework and empirical evidence to explain why, on average, 11% of listed firms in China received modified audit opinions (MAOs between 1992 and 2009. We argue that there are two reasons for this phenomenon: strong earnings management incentives lower firms’ financial reporting quality and soft budget constraints weaken the information and governance roles of audit opinions. We find that firms’ financial constraints eased after receiving MAOs, which suggests that MAOs have limited economic consequences. Further analysis shows that this phenomenon predominantly exists in government-controlled firms and firms that receive MAOs for the first time. We also find that MAOs have not influenced financial constraints after 2006. Finally, we find that MAOs did not affect borrowing cash flows from banks until 2005, suggesting that MAOs did not start affecting bank financing until that year. We also find that firms receive more related-party financing after receiving MAOs. Our results indicate that a limited effect on bank financing and increased related-party financing reduce the effect of MAOs on financial constraints.

  5. Quantum computing based on space states without charge transfer

    International Nuclear Information System (INIS)

    Vyurkov, V.; Filippov, S.; Gorelik, L.

    2010-01-01

    An implementation of a quantum computer based on space states in double quantum dots is discussed. There is no charge transfer in qubits during a calculation, therefore, uncontrolled entanglement between qubits due to long-range Coulomb interaction is suppressed. Encoding and processing of quantum information is merely performed on symmetric and antisymmetric states of the electron in double quantum dots. Other plausible sources of decoherence caused by interaction with phonons and gates could be substantially suppressed in the structure as well. We also demonstrate how all necessary quantum logic operations, initialization, writing, and read-out could be carried out in the computer.

  6. A review of computer-based simulators for ultrasound training.

    Science.gov (United States)

    Blum, Tobias; Rieger, Andreas; Navab, Nassir; Friess, Helmut; Martignoni, Marc

    2013-04-01

    Computer-based simulators for ultrasound training are a topic of recent interest. During the last 15 years, many different systems and methods have been proposed. This article provides an overview and classification of systems in this domain and a discussion of their advantages. Systems are classified and discussed according to the image simulation method, user interactions and medical applications. Computer simulation of ultrasound has one key advantage over traditional training. It enables novel training concepts, for example, through advanced visualization, case databases, and automatically generated feedback. Qualitative evaluations have mainly shown positive learning effects. However, few quantitative evaluations have been performed and long-term effects have to be examined.

  7. Environmental sciences and computations: a modular data based systems approach

    International Nuclear Information System (INIS)

    Crawford, T.V.; Bailey, C.E.

    1975-07-01

    A major computer code for environmental calculations is under development at the Savannah River Laboratory. The primary aim is to develop a flexible, efficient capability to calculate, for all significant pathways, the dose to man resulting from releases of radionuclides from the Savannah River Plant and from other existing and potential radioactive sources in the southeastern United States. The environmental sciences programs at SRP are described, with emphasis on the development of the calculational system. It is being developed as a modular data-based system within the framework of the larger JOSHUA Computer System, which provides data management, terminal, and job execution facilities. (U.S.)

  8. An Interactive Computer-Based Circulation System: Design and Development

    Directory of Open Access Journals (Sweden)

    James S. Aagaard

    1972-03-01

    Full Text Available An on-line computer-based circulation control system has been installed at the Northwestern University library. Features of the system include self-service book charge, remote terminal inquiry and update, and automatic production of notices for call-ins and books available. Fine notices are also prepared daily and overdue notices weekly. Important considerations in the design of the system were to minimize costs of operation and to include technical services functions eventually. The system operates on a relatively small computer in a multiprogrammed mode.

  9. Connection machine: a computer architecture based on cellular automata

    Energy Technology Data Exchange (ETDEWEB)

    Hillis, W D

    1984-01-01

    This paper describes the connection machine, a programmable computer based on cellular automata. The essential idea behind the connection machine is that a regular locally-connected cellular array can be made to behave as if the processing cells are connected into any desired topology. When the topology of the machine is chosen to match the topology of the application program, the result is a fast, powerful computing engine. The connection machine was originally designed to implement knowledge retrieval operations in artificial intelligence programs, but the hardware and the programming techniques are apparently applicable to a much larger class of problems. A machine with 100000 processing cells is currently being constructed. 27 references.

  10. Component-based software for high-performance scientific computing

    Energy Technology Data Exchange (ETDEWEB)

    Alexeev, Yuri; Allan, Benjamin A; Armstrong, Robert C; Bernholdt, David E; Dahlgren, Tamara L; Gannon, Dennis; Janssen, Curtis L; Kenny, Joseph P; Krishnan, Manojkumar; Kohl, James A; Kumfert, Gary; McInnes, Lois Curfman; Nieplocha, Jarek; Parker, Steven G; Rasmussen, Craig; Windus, Theresa L

    2005-01-01

    Recent advances in both computational hardware and multidisciplinary science have given rise to an unprecedented level of complexity in scientific simulation software. This paper describes an ongoing grass roots effort aimed at addressing complexity in high-performance computing through the use of Component-Based Software Engineering (CBSE). Highlights of the benefits and accomplishments of the Common Component Architecture (CCA) Forum and SciDAC ISIC are given, followed by an illustrative example of how the CCA has been applied to drive scientific discovery in quantum chemistry. Thrusts for future research are also described briefly.

  11. Component-based software for high-performance scientific computing

    International Nuclear Information System (INIS)

    Alexeev, Yuri; Allan, Benjamin A; Armstrong, Robert C; Bernholdt, David E; Dahlgren, Tamara L; Gannon, Dennis; Janssen, Curtis L; Kenny, Joseph P; Krishnan, Manojkumar; Kohl, James A; Kumfert, Gary; McInnes, Lois Curfman; Nieplocha, Jarek; Parker, Steven G; Rasmussen, Craig; Windus, Theresa L

    2005-01-01

    Recent advances in both computational hardware and multidisciplinary science have given rise to an unprecedented level of complexity in scientific simulation software. This paper describes an ongoing grass roots effort aimed at addressing complexity in high-performance computing through the use of Component-Based Software Engineering (CBSE). Highlights of the benefits and accomplishments of the Common Component Architecture (CCA) Forum and SciDAC ISIC are given, followed by an illustrative example of how the CCA has been applied to drive scientific discovery in quantum chemistry. Thrusts for future research are also described briefly

  12. Problem based learning in Higher Education and new approaches to assessment as a consequence of new formal regulations

    DEFF Research Database (Denmark)

    Jensen, Annie Aarup; Krogh, Lone

    on the basis of the project report in order to ensure alignment between goals, learning activities and assessment form. However, in 2006 the government announced that group examinations would no longer be permitted. As a result, students are now allowed to do study work and write reports in groups......, but they are to be examined and assessed individually, i.e. without the presence of other students from the group. In this paper, we will investigate some of the consequences of these new regulations for assessment. The research questions will address the question of alignment between group study and individual examination...... of group exams. Based on selected theoretical approaches to teaching, learning and assessment we wish to discuss the result of our research, the consequences of the changes in assessment forms, as well as the measures taken at the university in order to obtain new valid assessment forms. Finally...

  13. The cause-consequence data base: a retrieval system for records pertaining to accident management

    International Nuclear Information System (INIS)

    Kumamoto, H.; Inoue, K.; Sawaragi, Y.

    1981-01-01

    This paper describes a proposal to store in a data base important paragraphs from reports of investigations into many types of accidents. The data base is to handle not only reports on TMI, but also reports on other events at nuclear reactors, chemical plant explosions, earthquakes, hurricanes, fires, and so forth. (author)

  14. Beyond Promotion-Based Store Switching : Antecedents and Consequences of Systematic Multiple-Store Shopping

    NARCIS (Netherlands)

    Gijsbrechts, E.; Campo, K.; Nisol, P.

    2005-01-01

    In this paper, we demonstrate that single-purpose multiple store shopping is not only driven by opportunistic, promotion-based motivations, but may also be part of a longer term shopping planning process based on stable store characteristics. Starting from a utility-maximizing shopping behavior

  15. Sensitivity Analysis of an Agent-Based Model of Culture's Consequences for Trade

    NARCIS (Netherlands)

    Burgers, S.L.G.E.; Jonker, C.M.; Hofstede, G.J.; Verwaart, D.

    2010-01-01

    This paper describes the analysis of an agent-based model’s sensitivity to changes in parameters that describe the agents’ cultural background, relational parameters, and parameters of the decision functions. As agent-based models may be very sensitive to small changes in parameter values, it is of

  16. Security Framework for Agent-Based Cloud Computing

    Directory of Open Access Journals (Sweden)

    K Venkateshwaran

    2015-06-01

    Full Text Available Agents can play a key role in bringing suitable cloud services to the customer based on their requirements. In agent-based cloud computing, the agent does negotiation, coordination, cooperation and collaboration on behalf of the customer to make decisions in an efficient manner. However, agent-based cloud computing has some security issues, such as (a) the addition of a malicious agent to the cloud environment, which could demolish the process by attacking other agents; (b) denial of service caused by flooding attacks on other involved agents; and (c) misuse of exceptions in the agent interaction protocol, such as the Not-Understood and Cancel_Meta protocols, which may lead to terminating the connection of all the other agents participating in the negotiating services. This paper proposes algorithms to solve these issues and to ensure that there will be no intervention of malicious activities during agent interaction.

  17. Developing a personal computer based expert system for radionuclide identification

    International Nuclear Information System (INIS)

    Aarnio, P.A.; Hakulinen, T.T.

    1990-01-01

    Several expert system development tools are available for personal computers today. We have used one of the LISP-based high-end tools for nearly two years in developing an expert system for the identification of gamma sources. The system contains a radionuclide database of 2055 nuclides and 48000 gamma transitions, with a knowledge base of about sixty rules. This application combines a LISP-based inference engine with database management and relatively heavy numerical calculations performed in the C language. The most important feature needed has been the possibility of using LISP and C together with the more advanced object-oriented features of the development tool. The main difficulties have been long response times and the large amount (10-16 MB) of computer memory required

  18. Design Of Computer Based Test Using The Unified Modeling Language

    Science.gov (United States)

    Tedyyana, Agus; Danuri; Lidyawati

    2017-12-01

    The admission selection for Politeknik Negeri Bengkalis through the interest and talent search (PMDK), the joint admission test for state polytechnics (SB-UMPN) and the independent route (UM-Polbeng) was conducted using paper-based testing (PBT). The paper-based test model has some weaknesses: it wastes too much paper, the questions can leak to the public, and test-result data can be manipulated. This research aimed to create a Computer-Based Test (CBT) model using the Unified Modeling Language (UML), consisting of use case diagrams, activity diagrams and sequence diagrams. During the design of the application, particular attention was paid to protecting the test questions: before they are shown, they pass through an encryption and decryption process, for which the RSA cryptographic algorithm was used. The questions drawn from the question bank are randomized using the Fisher-Yates Shuffle method. The network architecture used for the Computer-Based Test application is a client-server network model on a Local Area Network (LAN). The result of the design was the Computer-Based Test application for the admission selection of Politeknik Negeri Bengkalis.
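
    The randomization step is named but not shown; the Fisher-Yates shuffle it refers to fits in a few lines (the question IDs are illustrative, and Python's random.shuffle implements the same algorithm):

        import random

        def fisher_yates_shuffle(items):
            """Shuffle items in place; every permutation is equally likely."""
            for i in range(len(items) - 1, 0, -1):
                j = random.randint(0, i)           # pick from the not-yet-fixed prefix
                items[i], items[j] = items[j], items[i]
            return items

        question_ids = list(range(1, 11))           # a ten-question bank
        print(fisher_yates_shuffle(question_ids))   # e.g. [3, 7, 1, ...]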

  19. Towards minimal resources of measurement-based quantum computation

    International Nuclear Information System (INIS)

    Perdrix, Simon

    2007-01-01

    We improve the upper bound on the minimal resources required for measurement-only quantum computation (M A Nielsen 2003 Phys. Rev. A 308 96-100; D W Leung 2004 Int. J. Quantum Inform. 2 33; S Perdrix 2005 Int. J. Quantum Inform. 3 219-23). Minimizing the resources required for this model is a key issue for experimental realization of a quantum computer based on projective measurements. This new upper bound also allows one to reply in the negative to the open question presented by Perdrix (2004 Proc. Quantum Communication Measurement and Computing) about the existence of a trade-off between observable and ancillary qubits in measurement-only QC

  20. Template based parallel checkpointing in a massively parallel computer system

    Science.gov (United States)

    Archer, Charles Jens [Rochester, MN; Inglett, Todd Alan [Rochester, MN

    2009-01-13

    A method and apparatus for a template based parallel checkpoint save for a massively parallel super computer system using a parallel variation of the rsync protocol, and network broadcast. In preferred embodiments, the checkpoint data for each node is compared to a template checkpoint file that resides in the storage and that was previously produced. Embodiments herein greatly decrease the amount of data that must be transmitted and stored for faster checkpointing and increased efficiency of the computer system. Embodiments are directed to a parallel computer system with nodes arranged in a cluster with a high speed interconnect that can perform broadcast communication. The checkpoint contains a set of actual small data blocks with their corresponding checksums from all nodes in the system. The data blocks may be compressed using conventional non-lossy data compression algorithms to further reduce the overall checkpoint size.
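
    In the spirit of the rsync-style block matching described above, a much-simplified sketch of the template comparison, with fixed-size blocks and invented data, omitting the compression and network-broadcast steps of the actual system:

        import hashlib

        BLOCK_SIZE = 4096

        def block_checksums(data):
            """Checksum each fixed-size block of a previously saved template."""
            return [hashlib.md5(data[i:i + BLOCK_SIZE]).hexdigest()
                    for i in range(0, len(data), BLOCK_SIZE)]

        def delta_checkpoint(node_state, template_sums):
            """Return only the (index, block) pairs that differ from the template."""
            delta = []
            for idx in range(0, len(node_state), BLOCK_SIZE):
                block = node_state[idx:idx + BLOCK_SIZE]
                k = idx // BLOCK_SIZE
                if k >= len(template_sums) or hashlib.md5(block).hexdigest() != template_sums[k]:
                    delta.append((k, block))
            return delta

        template = b"A" * 16384                # previously produced template checkpoint
        state = b"A" * 8192 + b"B" * 8192      # current node state: second half changed
        changed = delta_checkpoint(state, block_checksums(template))
        print(len(changed), "of", len(state) // BLOCK_SIZE, "blocks need saving")  # 2 of 4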

  1. Computer Based Test for the Entrance Selection of Politeknik Negeri Bengkalis

    Directory of Open Access Journals (Sweden)

    Agus Tedyyana

    2017-11-01

    Full Text Available The selection of new student candidates can be carried out with a Computer-Based Test (CBT) application. The methods used include data collection techniques, system analysis, design modeling, implementation and testing. This research produced a CBT application in which the questions drawn from the question bank are randomized using the Fisher-Yates Shuffle method, so that the same question is never presented twice. To secure the question data while connected to the network, a message-encoding technique is required so that each question passes through an encryption and decryption process before being displayed; the RSA cryptographic algorithm is used for this purpose. The software design method uses the waterfall model, the database design uses an entity-relationship diagram, and the interface design uses Hypertext Markup Language (HTML), Cascading Style Sheets (CSS) and jQuery; the system is implemented as a web application using the PHP programming language and a MySQL database. The network architecture used by the Computer-Based Test application is a client-server network model on a Local Area Network (LAN). Keywords: Computer Based Test, Fisher-Yates Shuffle, Cryptography, Local Area Network

  2. Environmental Consequences of an Industry Based on Harvesting the Wild Desert Shrub Jojoba.

    Science.gov (United States)

    Foster, Kennith E.

    1980-01-01

    Described are the economic and agricultural issues surrounding the cultivation of desert plants, principally the jojoba, as a source of fuel. The article examines the environmental impacts of an industry based on arid-region cultivation of such plants. (RE)

  3. Beyond Promotion-Based Store Switching: Antecedents and Consequences of Systematic Multiple-Store Shopping

    OpenAIRE

    Gijsbrechts, E.; Campo, K.; Nisol, P.

    2005-01-01

    In this paper, we demonstrate that single-purpose multiple store shopping is not only driven by opportunistic, promotion-based motivations, but may also be part of a longer term shopping planning process based on stable store characteristics. Starting from a utility-maximizing shopping behavior model, we find that consumers systematically visit multiple stores to take advantage of two types of store complementarity. With 'fixed cost complementarity', consumers alternate visits to highly preferr...

  4. Parallel processing using an optical delay-based reservoir computer

    Science.gov (United States)

    Van der Sande, Guy; Nguimdo, Romain Modeste; Verschaffelt, Guy

    2016-04-01

    Delay systems subject to delayed optical feedback have recently shown great potential in solving computationally hard tasks. By implementing a neuro-inspired computational scheme relying on the transient response to optical data injection, high processing speeds have been demonstrated. However, the reservoir computing systems based on delay dynamics discussed in the literature are designed by coupling many stand-alone components, which leads to bulky, non-monolithic systems that lack long-term stability. Here we numerically investigate the possibility of implementing reservoir computing schemes based on semiconductor ring lasers (SRLs), semiconductor lasers whose cavity consists of a ring-shaped waveguide. SRLs are highly integrable and scalable, making them ideal candidates for key components in photonic integrated circuits. SRLs can generate light in two counterpropagating directions, between which bistability has been demonstrated. We demonstrate that two independent machine learning tasks, even with input data signals of a different nature, can be computed simultaneously using a single photonic nonlinear node, relying on the parallelism offered by photonics. We illustrate the performance on simultaneous chaotic time-series prediction and nonlinear channel equalization. We take advantage of the two directional modes to process the individual tasks: each directional mode processes one task, mitigating possible crosstalk between the tasks. Our results indicate that prediction/classification with errors comparable to state-of-the-art performance can be obtained, even in the presence of noise, despite the two tasks being computed simultaneously. We also find that good performance is obtained for both tasks over a broad range of parameters. The results are discussed in detail in [Nguimdo et al., IEEE Trans. Neural Netw. Learn. Syst. 26, pp. 3301-3307, 2015].
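
    As an illustration of the computational scheme (not of the optical hardware), here is a minimal software echo-state reservoir trained with a ridge-regression readout on one-step-ahead prediction. This is a standard discrete-time stand-in for the delay-based photonic reservoir; all sizes and constants are chosen arbitrarily.

        import numpy as np

        rng = np.random.default_rng(0)

        class EchoStateReservoir:
            """Discrete-time software stand-in for the optical reservoir."""
            def __init__(self, n_inputs, n_nodes=200, leak=0.3, rho=0.9):
                self.w_in = rng.uniform(-1.0, 1.0, (n_nodes, n_inputs))
                w = rng.uniform(-0.5, 0.5, (n_nodes, n_nodes))
                # scale the recurrent weights to spectral radius rho
                self.w = w * (rho / np.max(np.abs(np.linalg.eigvals(w))))
                self.leak = leak

            def run(self, u):
                """Collect the transient responses to the injected data."""
                x = np.zeros(self.w.shape[0])
                states = []
                for ut in u:
                    pre = np.tanh(self.w_in @ np.atleast_1d(ut) + self.w @ x)
                    x = (1 - self.leak) * x + self.leak * pre
                    states.append(x.copy())
                return np.array(states)

        # Linear (ridge-regression) readout for one-step-ahead prediction:
        u = np.sin(np.linspace(0.0, 60.0, 600))
        res = EchoStateReservoir(n_inputs=1)
        X, y = res.run(u[:-1]), u[1:]
        w_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(X.shape[1]), X.T @ y)
        print("training MSE:", float(np.mean((X @ w_out - y) ** 2)))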

  5. Trend of computer-based console for nuclear power plants

    International Nuclear Information System (INIS)

    Wajima, Tsunetaka; Serizawa, Michiya

    1975-01-01

    The amount of information to be monitored by operators in the central control room increased with the capacity of nuclear power plants, and the need for computer-based consoles, in which information is consolidated and the interface between operators and plants is rationalized by introducing CRT displays and process computers, came to be recognized. The integrated monitoring and control system is explained briefly, taking Dungeness B Nuclear Power Station in Britain as a typical example. This power station comprises two AGRs, and the two plants can be controlled from one central control room, each by one operator. Three computers, including a stand-by, are installed. Each computer has a core memory of 16 K words (24 bits/word), and 4 magnetic drums of 256 K words serve as external memory. The peripheral equipment for each plant comprises 12 CRT displays, 6 typewriters, and a high-speed tape reader and tape punch. The display and recording of plant data, the analysis, display and recording of alarms, the control of the plants including the reactors, and post-incident recording are assigned to the computers. At Hitachi Ltd. in Japan, the introduction of color CRTs and the development of operating consoles, a new data-accessing method, and consoles for maintenance management are in progress. (Kako, I.)

  6. Biological consequences of potential repair intermediates of clustered base damage site in Escherichia coli

    Energy Technology Data Exchange (ETDEWEB)

    Shikazono, Naoya, E-mail: shikazono.naoya@jaea.go.jp [Japan Atomic Energy Agency, Advanced Research Science Center, 2-4 Shirakata-Shirane, Tokai-mura, Naka-gun, Ibaraki 319-1195 (Japan); O'Neill, Peter [Gray Institute for Radiation Oncology and Biology, University of Oxford, Roosevelt Drive, Oxford OX3 7DQ (United Kingdom)

    2009-10-02

    Clustered DNA damage induced by a single radiation track is a unique feature of ionizing radiation. Using a plasmid-based assay in Escherichia coli, we previously found significantly higher mutation frequencies for bistranded clusters containing 7,8-dihydro-8-oxoguanine (8-oxoG) and 5,6-dihydrothymine (DHT) than for either a single 8-oxoG or a single DHT in wild type and in glycosylase-deficient strains of E. coli. This indicates that the removal of an 8-oxoG from a clustered damage site is most likely retarded compared to the removal of a single 8-oxoG. To gain further insights into the processing of bistranded base lesions, several potential repair intermediates following 8-oxoG removal were assessed. Clusters, such as DHT + apurinic/apyrimidinic (AP) and DHT + GAP have relatively low mutation frequencies, whereas clusters, such as AP + AP or GAP + AP, significantly reduce the number of transformed colonies, most probably through formation of a lethal double strand break (DSB). Bistranded AP sites placed 3' to each other with various interlesion distances also blocked replication. These results suggest that bistranded base lesions, i.e., single base lesions on each strand, but not clusters containing only AP sites and strand breaks, are repaired in a coordinated manner so that the formation of DSBs is avoided. We propose that, when either base lesion is initially excised from a bistranded base damage site, the remaining base lesion will only rarely be converted into an AP site or a single strand break in vivo.

  7. Biological consequences of potential repair intermediates of clustered base damage site in Escherichia coli

    International Nuclear Information System (INIS)

    Shikazono, Naoya; O'Neill, Peter

    2009-01-01

    Clustered DNA damage induced by a single radiation track is a unique feature of ionizing radiation. Using a plasmid-based assay in Escherichia coli, we previously found significantly higher mutation frequencies for bistranded clusters containing 7,8-dihydro-8-oxoguanine (8-oxoG) and 5,6-dihydrothymine (DHT) than for either a single 8-oxoG or a single DHT in wild type and in glycosylase-deficient strains of E. coli. This indicates that the removal of an 8-oxoG from a clustered damage site is most likely retarded compared to the removal of a single 8-oxoG. To gain further insights into the processing of bistranded base lesions, several potential repair intermediates following 8-oxoG removal were assessed. Clusters, such as DHT + apurinic/apyrimidinic (AP) and DHT + GAP have relatively low mutation frequencies, whereas clusters, such as AP + AP or GAP + AP, significantly reduce the number of transformed colonies, most probably through formation of a lethal double strand break (DSB). Bistranded AP sites placed 3' to each other with various interlesion distances also blocked replication. These results suggest that bistranded base lesions, i.e., single base lesions on each strand, but not clusters containing only AP sites and strand breaks, are repaired in a coordinated manner so that the formation of DSBs is avoided. We propose that, when either base lesion is initially excised from a bistranded base damage site, the remaining base lesion will only rarely be converted into an AP site or a single strand break in vivo.

  8. Parental intimate partner homicide and its consequences for children : protocol for a population-based study

    NARCIS (Netherlands)

    Alisic, Eva; Groot, Arend; Snetselaar, Hanneke; Stroeken, Tielke; van de Putte, Elise

    2015-01-01

    Background: The loss of a parent due to intimate partner homicide has a major impact on children. Professionals involved have to make far-reaching decisions regarding placement, guardianship, mental health care and contact with the perpetrating parent, without an evidence base to guide these

  9. The antecedents and consequences of restrictive age-based ratings in the global motion picture industry

    NARCIS (Netherlands)

    Leenders, M.A.A.M.; Eliashberg, J.

    2011-01-01

    This article analyzes one key characteristic shared by a growing number of industries. Specifically, their products and services are continuously monitored and evaluated by local third-party ratings systems. In this study, we focus on understanding the local drivers of restrictive age-based ratings

  10. Gender Consequences of a National Performance-Based Funding Model: New Pieces in an Old Puzzle

    Science.gov (United States)

    Nielsen, Mathias Wullum

    2017-01-01

    This article investigates the extent to which the Danish "Bibliometric Research Indicator" (BRI) reflects the performance of men and women differently. The model is based on a differentiated counting of peer-reviewed publications, awarding three and eight points for contributions to "well-regarded" and highly selective journals…

  11. Toward a Culture of Consequences: Performance-Based Accountability Systems for Public Services. Monograph

    Science.gov (United States)

    Stecher, Brian M.; Camm, Frank; Damberg, Cheryl L.; Hamilton, Laura S.; Mullen, Kathleen J.; Nelson, Christopher; Sorensen, Paul; Wachs, Martin; Yoh, Allison; Zellman, Gail L.

    2010-01-01

    Performance-based accountability systems (PBASs), which link incentives to measured performance as a means of improving services to the public, have gained popularity. While PBASs can vary widely across sectors, they share three main components: goals, incentives, and measures. Research suggests that PBASs influence provider behaviors, but little…

  12. Evaluation of Project Chrysalis: A School-based Intervention To Reduce Negative Consequences of Abuse.

    Science.gov (United States)

    Brown, Kelly J.; Block, Audrey J.

    2001-01-01

    Evaluated a school-based program that served female adolescents with histories of physical, sexual, or emotional abuse. Found that participation produced healthier beliefs and attitudes about alcohol and other drug use and reduced initiation of tobacco and marijuana use. Findings support enrolling younger girls before they develop negative…

  13. Industrial application of a graphics computer-based training system

    International Nuclear Information System (INIS)

    Klemm, R.W.

    1985-01-01

    Graphics Computer Based Training (GCBT) roles include drilling, tutoring, simulation, and problem solving. Of these, Commonwealth Edison uses mainly tutoring, simulation, and problem solving. These roles are not separated in any particular program; they are integrated to provide tutoring and part-task simulation, part-task simulation and problem solving, or problem-solving tutoring. Commonwealth's Graphics Computer Based Training program was the result of over a year's worth of research and planning. The keys to the program are its flexibility and control. Flexibility is maintained through stand-alone units capable of program authoring and modification for plant/site-specific users. Yet the system has the capability to support up to 31 terminals with a 40 MB hard disk drive. Control of the GCBT program is accomplished through the establishment of development priorities and a central development facility (Commonwealth Edison's Production Training Center).

  14. A Reputation-Based Identity Management Model for Cloud Computing

    Directory of Open Access Journals (Sweden)

    Lifa Wu

    2015-01-01

    Full Text Available In the field of cloud computing, most research on identity management has concentrated on protecting user data. However, users typically leave a trail when they access cloud services, and the resulting user traceability can potentially lead to the leakage of sensitive user information. Meanwhile, malicious users can do harm to cloud providers through the use of pseudonyms. To solve these problems, we introduce a reputation mechanism and design a reputation-based identity management model for cloud computing. In the model, pseudonyms are generated based on a reputation signature so as to guarantee the untraceability of pseudonyms, and a mechanism that calculates user reputation is proposed, which helps cloud service providers to identify malicious users. Analysis verifies that the model can ensure that users access cloud services anonymously and that cloud providers assess the credibility of users effectively without violating user privacy.
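
    The abstract specifies pseudonyms generated from a reputation signature but gives no construction. As a loose, hypothetical illustration of the underlying idea (pseudonyms that outsiders cannot link to identities, while the provider can still attach reputation to them), here is an HMAC-based sketch; it does not implement the paper's actual signature scheme.

        import hashlib
        import hmac
        import os

        PROVIDER_KEY = os.urandom(32)      # held only by the identity provider

        def issue_pseudonym(user_id: str, epoch: int) -> str:
            """Derive a per-epoch pseudonym outsiders cannot link to the user."""
            msg = f"{user_id}|{epoch}".encode()
            return hmac.new(PROVIDER_KEY, msg, hashlib.sha256).hexdigest()[:16]

        reputation = {}                    # pseudonym -> score, kept by provider

        def report_behavior(pseudonym: str, delta: int) -> None:
            """Services adjust reputation without learning the real identity."""
            reputation[pseudonym] = reputation.get(pseudonym, 0) + delta

        alias = issue_pseudonym("alice", epoch=42)
        report_behavior(alias, -5)         # flag suspicious activity
        print(alias, reputation[alias])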

  15. Security personnel training using a computer-based game

    International Nuclear Information System (INIS)

    Ralph, J.; Bickner, L.

    1987-01-01

    Security personnel training is an integral part of a total physical security program, and is essential in enabling security personnel to perform their function effectively. Several training tools are currently available for use by security supervisors, including: textbook study, classroom instruction, and live simulations. However, due to shortcomings inherent in each of these tools, a need exists for the development of low-cost alternative training methods. This paper discusses one such alternative: a computer-based, game-type security training system. This system would be based on a personal computer with high-resolution graphics. Key features of this system include: a high degree of realism; flexibility in use and maintenance; high trainee motivation; and low cost

  16. Could one make a diamond-based quantum computer?

    International Nuclear Information System (INIS)

    Stoneham, A Marshall; Harker, A H; Morley, Gavin W

    2009-01-01

    We assess routes to a diamond-based quantum computer, where we specifically look towards scalable devices with at least 10 linked quantum gates. Such a computer should satisfy the DiVincenzo criteria and might be used at convenient temperatures. The specific examples that we examine are based on the optical control of electron spins. For some such devices, nuclear spins give additional advantages. Since there have already been demonstrations of basic initialization and readout, our emphasis is on routes to two-qubit quantum gate operations and the linking of perhaps 10-20 such gates. We analyse the dopant properties necessary, especially centres containing N and P, and give results using simple scoping calculations for the key interactions determining gate performance. Our conclusions are cautiously optimistic: it may be possible to develop a useful quantum information processor that works above cryogenic temperatures.

  17. MCPLOTS: a particle physics resource based on volunteer computing

    CERN Document Server

    Karneyeu, A; Prestel, S; Skands, P Z

    2014-01-01

    The mcplots.cern.ch web site (MCPLOTS) provides a simple online repository of plots made with high-energy-physics event generators, comparing them to a wide variety of experimental data. The repository is based on the HEPDATA online database of experimental results and on the RIVET Monte Carlo analysis tool. The repository is continually updated and relies on computing power donated by volunteers, via the LHC@HOME platform.

  18. Computer based approach to fatigue analysis and design

    International Nuclear Information System (INIS)

    Comstock, T.R.; Bernard, T.; Nieb, J.

    1979-01-01

    An approach is presented which uses a mini-computer-based system for data acquisition, analysis, and graphic display relative to fatigue life estimation and design. Procedures are developed for identifying and eliminating damaging events due to the overall duty cycle, forced vibration, and structural dynamic characteristics. Two case histories, weld failures in heavy vehicles and low-cycle fan blade failures, are discussed to illustrate the overall approach. (orig.)

  19. A scalable PC-based parallel computer for lattice QCD

    International Nuclear Information System (INIS)

    Fodor, Z.; Katz, S.D.; Papp, G.

    2003-01-01

    A PC-based parallel computer for medium/large-scale lattice QCD simulations is suggested. The Eoetvoes Univ., Inst. Theor. Phys. cluster consists of 137 Intel P4-1.7GHz nodes. Gigabit Ethernet cards are used for nearest-neighbor communication in a two-dimensional mesh. The sustained performance for dynamical staggered (Wilson) quarks on large lattices is around 70 (110) GFlops. The exceptional price/performance ratio is below $1/Mflop.
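
    As a side note on the interconnect topology, the sketch below computes nearest-neighbor ranks on a two-dimensional mesh, assuming torus wrap-around; the actual grid dimensions of the 137-node cluster are not given in the abstract, so the 12 x 11 layout is hypothetical.

        def mesh_neighbors(rank: int, nx: int, ny: int) -> dict:
            """Neighbor ranks on an nx-by-ny grid with torus wrap-around."""
            x, y = rank % nx, rank // nx
            return {
                "left":  (x - 1) % nx + y * nx,
                "right": (x + 1) % nx + y * nx,
                "down":  x + ((y - 1) % ny) * nx,
                "up":    x + ((y + 1) % ny) * nx,
            }

        # Hypothetical 12 x 11 decomposition; node 0 exchanges lattice
        # boundaries with exactly four neighbors over Gigabit Ethernet:
        print(mesh_neighbors(0, nx=12, ny=11))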

  20. A scalable PC-based parallel computer for lattice QCD

    International Nuclear Information System (INIS)

    Fodor, Z.; Papp, G.

    2002-09-01

    A PC-based parallel computer for medium/large-scale lattice QCD simulations is suggested. The Eoetvoes Univ., Inst. Theor. Phys. cluster consists of 137 Intel P4-1.7 GHz nodes. Gigabit Ethernet cards are used for nearest-neighbor communication in a two-dimensional mesh. The sustained performance for dynamical staggered (Wilson) quarks on large lattices is around 70 (110) GFlops. The exceptional price/performance ratio is below $1/Mflop. (orig.)

  1. Arcade: A Web-Java Based Framework for Distributed Computing

    Science.gov (United States)

    Chen, Zhikai; Maly, Kurt; Mehrotra, Piyush; Zubair, Mohammad; Bushnell, Dennis M. (Technical Monitor)

    2000-01-01

    Distributed heterogeneous environments are being increasingly used to execute a variety of large size simulations and computational problems. We are developing Arcade, a web-based environment to design, execute, monitor, and control distributed applications. These targeted applications consist of independent heterogeneous modules which can be executed on a distributed heterogeneous environment. In this paper we describe the overall design of the system and discuss the prototype implementation of the core functionalities required to support such a framework.

  2. +Cloud: An Agent-Based Cloud Computing Platform

    OpenAIRE

    González, Roberto; Hernández de la Iglesia, Daniel; de la Prieta Pintado, Fernando; Gil González, Ana Belén

    2017-01-01

    Cloud computing is revolutionizing the services provided through the Internet, and is continually adapting itself in order to maintain the quality of its services. This study presents the platform +Cloud, which proposes a cloud environment for storing information and files by following the cloud paradigm. This study also presents Warehouse 3.0, a cloud-based application that has been developed to validate the services provided by +Cloud.

  3. MCPLOTS. A particle physics resource based on volunteer computing

    Energy Technology Data Exchange (ETDEWEB)

    Karneyeu, A. [Joint Inst. for Nuclear Research, Moscow (Russian Federation); Mijovic, L. [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); Irfu/SPP, CEA-Saclay, Gif-sur-Yvette (France); Prestel, S. [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); Lund Univ. (Sweden). Dept. of Astronomy and Theoretical Physics; Skands, P.Z. [European Organization for Nuclear Research (CERN), Geneva (Switzerland)

    2013-07-15

    The mcplots.cern.ch web site (MCPLOTS) provides a simple online repository of plots made with high-energy-physics event generators, comparing them to a wide variety of experimental data. The repository is based on the HEPDATA online database of experimental results and on the RIVET Monte Carlo analysis tool. The repository is continually updated and relies on computing power donated by volunteers, via the LHC@HOME 2.0 platform.

  4. MCPLOTS: a particle physics resource based on volunteer computing

    International Nuclear Information System (INIS)

    Karneyeu, A.; Mijovic, L.; Prestel, S.; Skands, P.Z.

    2014-01-01

    The mcplots.cern.ch web site (MCPLOTS) provides a simple online repository of plots made with high-energy-physics event generators, comparing them to a wide variety of experimental data. The repository is based on the HEPDATA online database of experimental results and on the RIVET Monte Carlo analysis tool. The repository is continually updated and relies on computing power donated by volunteers, via the LHC@HOME 2.0 platform. (orig.)

  5. Gamma spectrometric system based on the personal computer Pravetz-83

    International Nuclear Information System (INIS)

    Yanakiev, K; Grigorov, T.; Vuchkov, M.

    1985-01-01

    A gamma spectrometric system based on a personal microcomputer Pravets-85 is described. The analog modules are NIM standard. ADC data are stored in the memory of the computer via a DMA channel, and real-time data processing is possible. The results from a series of tests indicate that the performance of the system is comparable with that of commercially available computerized spectrometers from Ortec and Canberra.

  6. MCPLOTS. A particle physics resource based on volunteer computing

    International Nuclear Information System (INIS)

    Karneyeu, A.; Mijovic, L.; Prestel, S.

    2013-07-01

    The mcplots.cern.ch web site (MCPLOTS) provides a simple online repository of plots made with high-energy-physics event generators, comparing them to a wide variety of experimental data. The repository is based on the HEPDATA online database of experimental results and on the RIVET Monte Carlo analysis tool. The repository is continually updated and relies on computing power donated by volunteers, via the LHC@HOME 2.0 platform.

  7. USE OF ONTOLOGIES FOR KNOWLEDGE BASES CREATION TUTORING COMPUTER SYSTEMS

    OpenAIRE

    Cheremisina Lyubov

    2014-01-01

    This paper deals with the use of ontologies in the design and development of intelligent tutoring systems. We consider the shortcomings of existing educational software and distance-learning systems, and the advantages of using ontologies in their design; the relevance lies in building educational computer systems on systematic knowledge. We consider the classification, properties, uses, and benefits of ontologies, and characterize approaches to the problem of ontology mapping, the first of which – manual mapping, the s...

  8. Why advanced computing? The key to space-based operations

    Science.gov (United States)

    Phister, Paul W., Jr.; Plonisch, Igor; Mineo, Jack

    2000-11-01

    The 'what is the requirement?' aspect of advanced computing and how it relates to and supports Air Force space-based operations is a key issue. In support of the Air Force Space Command's five major mission areas (space control, force enhancement, force applications, space support and mission support), two-fifths of the requirements have associated stringent computing/size implications. The Air Force Research Laboratory's 'migration to space' concept will eventually shift Science and Technology (S&T) dollars from predominantly airborne systems to airborne-and-space related S&T areas. One challenging 'space' area is in the development of sophisticated on-board computing processes for the next generation smaller, cheaper satellite systems. These new space systems (called microsats or nanosats) could be as small as a softball, yet perform functions that are currently being done by large, vulnerable ground-based assets. The Joint Battlespace Infosphere (JBI) concept will be used to manage the overall process of space applications coupled with advancements in computing. The JBI can be defined as a globally interoperable information 'space' which aggregates, integrates, fuses, and intelligently disseminates all relevant battlespace knowledge to support effective decision-making at all echelons of a Joint Task Force (JTF). This paper explores a single theme -- on-board processing is the best avenue to take advantage of advancements in high-performance computing, high-density memories, communications, and re-programmable architecture technologies. The goal is to break away from 'no changes after launch' design to a more flexible design environment that can take advantage of changing space requirements and needs while the space vehicle is 'on orbit.'

  9. An endohedral fullerene-based nuclear spin quantum computer

    International Nuclear Information System (INIS)

    Ju Chenyong; Suter, Dieter; Du Jiangfeng

    2011-01-01

    We propose a new scalable quantum computer architecture based on endohedral fullerene molecules. Qubits are encoded in the nuclear spins of the endohedral atoms, which possess even longer coherence times than the electron spins used as qubits in previous proposals. To address the individual qubits, we use the hyperfine interaction, which distinguishes two modes (active and passive) of the nuclear spin. Two-qubit quantum gates are effectively implemented by employing the electronic dipolar interaction between adjacent molecules. The electron spins also assist in qubit initialization and readout. Our architecture should be significantly easier to implement than earlier proposals for spin-based quantum computers, such as the concept of Kane [B.E. Kane, Nature 393 (1998) 133]. - Research highlights: → We propose an endohedral fullerene-based scalable quantum computer architecture. → Qubits are encoded in nuclear spins, while electron spins serve as auxiliaries. → Nuclear spins are individually addressed using the hyperfine interaction. → Two-qubit gates are implemented through the medium of electron spins.

  10. Standardized computer-based organized reporting of EEG

    DEFF Research Database (Denmark)

    Beniczky, Sándor; Aurlien, Harald; Brøgger, Jan C.

    2017-01-01

    Standardized terminology for computer-based assessment and reporting of EEG has been previously developed in Europe. The International Federation of Clinical Neurophysiology established a taskforce in 2013 to develop this further, and to reach international consensus. This work resulted in the second, revised version of SCORE (Standardized Computer-based Organized Reporting of EEG), which is presented in this paper. The revised terminology was implemented in a software package (SCORE EEG), which was tested in clinical practice on 12,160 EEG recordings. Standardized terms implemented in SCORE are used to report the features of clinical relevance, extracted while assessing the EEGs. Selection of the terms is context sensitive: initial choices determine the subsequently presented sets of additional choices. This process automatically generates a report and feeds these features into a database...
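
    A minimal sketch of such context-sensitive term selection follows; the term tree here is hypothetical, not the actual SCORE vocabulary.

        # Hypothetical term tree: the chosen feature determines which
        # follow-up choice sets are offered (not the real SCORE vocabulary).
        TERMS = {
            "background activity": {
                "frequency": ["alpha", "theta", "delta"],
                "symmetry": ["symmetric", "asymmetric"],
            },
            "epileptiform": {
                "morphology": ["spike", "sharp wave", "spike-and-wave"],
                "location": ["frontal", "temporal", "generalized"],
            },
        }

        def next_choices(feature: str) -> dict:
            """Initial choice determines the next sets of presented choices."""
            return TERMS.get(feature, {})

        def report_line(feature: str, picks: dict) -> str:
            """Assemble one report line; the same features feed the database."""
            details = ", ".join(f"{k}: {v}" for k, v in picks.items())
            return f"{feature} ({details})"

        print(next_choices("epileptiform"))
        print(report_line("epileptiform",
                          {"morphology": "spike", "location": "temporal"}))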

  11. Standardized computer-based organized reporting of EEG

    DEFF Research Database (Denmark)

    Beniczky, Sándor; Aurlien, Harald; Brøgger, Jan C.

    2017-01-01

    Standardized terminology for computer-based assessment and reporting of EEG has been previously developed in Europe. The International Federation of Clinical Neurophysiology established a taskforce in 2013 to develop this further, and to reach international consensus. This work resulted in the second, revised version of SCORE (Standardized Computer-based Organized Reporting of EEG), which is presented in this paper. The revised terminology was implemented in a software package (SCORE EEG), which was tested in clinical practice on 12,160 EEG recordings. In the end, the diagnostic significance is scored, using a standardized list of terms. SCORE has specific modules for scoring seizures (including seizure semiology and ictal EEG patterns), neonatal recordings (including features specific for this age group), and for Critical Care EEG Terminology. SCORE...

  12. Knowledge-based computer systems for radiotherapy planning.

    Science.gov (United States)

    Kalet, I J; Paluszynski, W

    1990-08-01

    Radiation therapy is one of the first areas of clinical medicine to utilize computers in support of routine clinical decision making. The role of the computer has evolved from simple dose calculations to elaborate interactive graphic three-dimensional simulations. These simulations can combine external irradiation from megavoltage photons, electrons, and particle beams with interstitial and intracavitary sources. With the flexibility and power of modern radiotherapy equipment and the ability of computer programs that simulate anything the machinery can do, we now face a challenge to utilize this capability to design more effective radiation treatments. How can we manage the increased complexity of sophisticated treatment planning? A promising approach will be to use artificial intelligence techniques to systematize our present knowledge about design of treatment plans, and to provide a framework for developing new treatment strategies. Far from replacing the physician, physicist, or dosimetrist, artificial intelligence-based software tools can assist the treatment planning team in producing more powerful and effective treatment plans. Research in progress using knowledge-based (AI) programming in treatment planning already has indicated the usefulness of such concepts as rule-based reasoning, hierarchical organization of knowledge, and reasoning from prototypes. Problems to be solved include how to handle continuously varying parameters and how to evaluate plans in order to direct improvements.

  13. Using computer-based training to facilitate radiation protection review

    International Nuclear Information System (INIS)

    Abercrombie, J.S.; Copenhaver, E.D.

    1989-01-01

    In a national laboratory setting, it is necessary to provide radiation protection overview and training to diverse parts of the laboratory population. This includes employees at research reactors, accelerators, waste facilities, radiochemical isotope processing, and analytical laboratories, among others. In addition, our own radiation protection and monitoring staffs must be trained. To assist in the implementation of this full range of training, ORNL has purchased prepackaged computer-based training in health physics and technical mathematics with training modules that can be selected from many topics. By selection of specific modules, appropriate radiation protection review packages can be determined to meet many individual program needs. Because our radiation protection personnel must have some previous radiation protection experience or the equivalent of an associate's degree in radiation protection for entry level, the computer-based training will serve primarily as review of major principles. Others may need very specific prior training to make the computer-based training effective in their work situations. 4 refs

  14. Computer vision based nacre thickness measurement of Tahitian pearls

    Science.gov (United States)

    Loesdau, Martin; Chabrier, Sébastien; Gabillon, Alban

    2017-03-01

    The Tahitian pearl is the most valuable export product of French Polynesia, contributing over 61 million euros, more than 50% of total export income. To maintain its excellent reputation on the international market, an obligatory quality control for every pearl destined for export has been established by the local government. One of the controlled quality parameters is the pearl's nacre thickness. The evaluation is currently done manually by experts who visually analyze X-ray images of the pearls. In this article, a computer vision based approach to automate this procedure is presented. Even though computer vision based approaches for pearl nacre thickness measurement exist in the literature, the very specific features of the Tahitian pearl, namely the large variety of shapes and the occurrence of cavities, have so far not been considered. The presented work closes this gap. Our method consists of segmenting the pearl from X-ray images with a model-based approach, segmenting the pearl's nucleus with a custom heuristic circle detection, and segmenting possible cavities with region growing. From the obtained boundaries, the 2-dimensional nacre thickness profile can be calculated. A certainty measurement to account for imaging and segmentation imprecision is included in the procedure. The proposed algorithms are tested on 298 manually evaluated Tahitian pearls, showing that it is generally possible to automatically evaluate the nacre thickness of Tahitian pearls with computer vision. Furthermore, the results show that the automatic measurement is more precise and faster than the manual one.
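
    The paper's own heuristic circle detection is not described in the abstract; as a stand-in, the sketch below uses OpenCV's Hough-transform circle detector to locate the nucleus, with all parameters chosen arbitrarily for illustration.

        import cv2
        import numpy as np

        def detect_nucleus(xray_path: str):
            """Locate the pearl nucleus as the strongest circle in the X-ray."""
            img = cv2.imread(xray_path, cv2.IMREAD_GRAYSCALE)
            blurred = cv2.medianBlur(img, 5)        # suppress X-ray noise
            circles = cv2.HoughCircles(
                blurred, cv2.HOUGH_GRADIENT, dp=1.2, minDist=100,
                param1=80, param2=40, minRadius=20, maxRadius=200)
            if circles is None:
                return None                         # no circular structure found
            x, y, r = np.round(circles[0, 0]).astype(int)
            return x, y, r

        # With the nucleus centre/radius and the segmented outer boundary, the
        # nacre thickness along a direction is the gap between the two contours.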

  15. Plancton: an opportunistic distributed computing project based on Docker containers

    Science.gov (United States)

    Concas, Matteo; Berzano, Dario; Bagnasco, Stefano; Lusso, Stefano; Masera, Massimo; Puccio, Maximiliano; Vallero, Sara

    2017-10-01

    The computing power of most modern commodity computers is far from fully exploited by standard usage patterns. In this work we describe the development and setup of a virtual computing cluster based on Docker containers used as worker nodes. The facility is based on Plancton: a lightweight fire-and-forget background service. Plancton spawns and controls a local pool of Docker containers on a host with free resources by constantly monitoring its CPU utilisation. It is designed to release the allocated resources opportunistically, whenever another demanding task is run by the host user, according to configurable policies; this is attained by killing a number of running containers. One of the advantages of a thin virtualization layer such as Linux containers is that they can be started almost instantly upon request. We will show how the fast start-up and disposal of containers eventually enables us to implement an opportunistic cluster based on Plancton daemons without a central control node, where the spawned Docker containers behave as job pilots. Finally, we will show how Plancton was configured to run up to 10,000 concurrent opportunistic jobs on the ALICE High-Level Trigger facility, giving a considerable advantage in terms of management compared to virtual machines.
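
    A minimal sketch of such a fire-and-forget daemon follows, using the psutil and docker-py libraries; the thresholds, container image, and policy are hypothetical, not Plancton's actual configuration.

        import time

        import docker   # docker-py SDK
        import psutil

        SPAWN_BELOW, KILL_ABOVE = 40.0, 80.0    # hypothetical CPU% thresholds
        IMAGE, MAX_POOL = "busybox:latest", 10  # hypothetical image and pool cap

        client = docker.from_env()
        pool = []

        while True:
            cpu = psutil.cpu_percent(interval=5)   # monitor host utilisation
            if cpu < SPAWN_BELOW and len(pool) < MAX_POOL:
                # free resources: opportunistically start one more container
                pool.append(client.containers.run(IMAGE, "sleep 3600",
                                                  detach=True))
            elif cpu > KILL_ABOVE and pool:
                # host user needs the machine back: release by killing a worker
                pool.pop().kill()
            time.sleep(1)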

  16. Computational simulation in architectural and environmental acoustics methods and applications of wave-based computation

    CERN Document Server

    Sakamoto, Shinichi; Otsuru, Toru

    2014-01-01

    This book reviews a variety of methods for wave-based acoustic simulation and recent applications to architectural and environmental acoustic problems. Following an introduction providing an overview of computational simulation of sound environment, the book is in two parts: four chapters on methods and four chapters on applications. The first part explains the fundamentals and advanced techniques for three popular methods, namely, the finite-difference time-domain method, the finite element method, and the boundary element method, as well as alternative time-domain methods. The second part demonstrates various applications to room acoustics simulation, noise propagation simulation, acoustic property simulation for building components, and auralization. This book is a valuable reference that covers the state of the art in computational simulation for architectural and environmental acoustics.  
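
    To make the first of these methods concrete, here is a minimal one-dimensional FDTD scheme for the acoustic wave equation with pressure-release boundaries; the grid size and constants are arbitrary illustration values, not taken from the book.

        import numpy as np

        # Minimal 1-D finite-difference time-domain (FDTD) scheme for the
        # acoustic wave equation p_tt = c^2 p_xx.
        c, dx = 343.0, 0.01              # speed of sound (m/s), grid spacing (m)
        dt = 0.5 * dx / c                # time step satisfying the CFL condition
        n, steps = 400, 600
        coef = (c * dt / dx) ** 2        # Courant number squared

        p_prev = np.zeros(n)
        p = np.zeros(n)
        p[n // 2] = 1.0                  # initial pressure pulse at the centre

        for _ in range(steps):
            p_next = np.zeros(n)         # endpoints stay 0: pressure-release
            p_next[1:-1] = (2 * p[1:-1] - p_prev[1:-1]
                            + coef * (p[2:] - 2 * p[1:-1] + p[:-2]))
            p_prev, p = p, p_next

        print("peak pressure after propagation:", float(p.max()))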

  17. [Card-based age control mechanisms at tobacco vending machines. Effect and consequences].

    Science.gov (United States)

    Schneider, S; Meyer, C; Löber, S; Röhrig, S; Solle, D

    2010-02-01

    Until recently, 700,000 tobacco vending machines provided uncontrolled access to cigarettes for children and adolescents in Germany. On January 1, 2007, a card-based electronic locking device was attached to all tobacco vending machines to prevent the purchase of cigarettes by children and adolescents under 16. Starting in 2009, only persons older than 18 are able to buy cigarettes from tobacco vending machines. The aim of the present investigation (SToP Study: "Sources of Tobacco for Pupils" Study) was to assess changes in the number of tobacco vending machines after the introduction of these new technical devices (supplier's reaction). In addition, the ways smoking adolescents make purchases were assessed (consumer's reaction). We registered and mapped the total number of tobacco points of sale (tobacco POS) before and after the introduction of the card-based electronic locking device in two selected districts of the city of Cologne. Furthermore, pupils from local schools (response rate: 83%) were asked about their tobacco consumption and ways of purchase using a questionnaire. Results indicated that in the area investigated the total number of tobacco POSs decreased from 315 in 2005 to 277 in 2007. The rates of decrease were 48% for outdoor vending machines and 8% for indoor vending machines. Adolescents reported circumventing the card-based electronic locking devices (e.g., by using cards from older friends) and using other tobacco POSs (especially newspaper kiosks) or relying on their social network (mainly friends). The decreasing number of tobacco vending machines has not had a significant impact on cigarette acquisition by adolescent smokers as they tend to circumvent the newly introduced security measures.

  18. Hypoxia and Its Acid-Base Consequences: From Mountains to Malignancy.

    Science.gov (United States)

    Swenson, Erik R

    Hypoxia, depending upon its magnitude and circumstances, evokes a spectrum of mild to severe acid-base changes ranging from alkalosis to acidosis, which can alter many responses to hypoxia at both non-genomic and genomic levels, in part via altered hypoxia-inducible factor (HIF) metabolism. Healthy people at high altitude and persons hyperventilating to non-hypoxic stimuli can become alkalotic and alkalemic with arterial pH acutely rising as high as 7.7. Hypoxia-mediated respiratory alkalosis reduces sympathetic tone, blunts hypoxic pulmonary vasoconstriction and hypoxic cerebral vasodilation, and increases hemoglobin oxygen affinity. These effects and others can be salutary or counterproductive to tissue oxygen delivery and utilization, based upon magnitude of each effect and summation. With severe hypoxia either in the setting of profound arterial hemoglobin desaturation and reduced O2 content or poor perfusion (ischemia) at the global or local level, metabolic and hypercapnic acidosis develop along with considerable lactate formation and pH falling to below 6.8. Although conventionally considered to be injurious and deleterious to cell function and survival, both acidoses may be cytoprotective by various anti-inflammatory, antioxidant, and anti-apoptotic mechanisms which limit total hypoxic or ischemic-reperfusion injury. Attempts to correct acidosis by giving bicarbonate or other alkaline agents under these circumstances ahead of or concurrent with reoxygenation efforts may be ill advised. Better understanding of this so-called "pH paradox" or permissive acidosis may offer therapeutic possibilities. Rapidly growing cancers often outstrip their vascular supply compromising both oxygen and nutrient delivery and metabolic waste disposal, thus limiting their growth and metastatic potential. However, their excessive glycolysis and lactate formation may not necessarily represent oxygen insufficiency, but rather the Warburg effect-an attempt to provide a large amount

  19. Organization of the secure distributed computing based on multi-agent system

    Science.gov (United States)

    Khovanskov, Sergey; Rumyantsev, Konstantin; Khovanskova, Vera

    2018-04-01

    Nowadays, the development of methods for distributed computing receives much attention, and one such method is the use of multi-agent systems. Distributed computing organized over conventional networked computers can face security threats aimed at the computational processes. The authors have developed a unified agent algorithm for controlling the operation of computing network nodes, with networked PCs used as computing nodes. The proposed multi-agent control system makes it possible, in a short time, to harness the processing power of the computers of any existing network to solve large tasks by creating a distributed computation. Agents deployed on a computer network can configure the distributed computing system, distribute the computational load among the computers they operate, and optimize the distributed computing system according to the computing power of the computers on the network. The number of computers connected to the network can be increased by connecting new computers to the system, which increases the overall processing power. Adding a central agent to the multi-agent system increases the security of the distributed computation. This organization of the distributed computing system reduces problem-solving time and increases the fault tolerance (vitality) of computing processes in a changing computing environment (dynamic changes in the number of computers on the network). The developed multi-agent system detects cases of falsification of the results in the distributed system, which could otherwise lead to wrong decisions. In addition, the system checks and corrects wrong results.
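
    The abstract does not state how falsified results are detected. A common mechanism, assumed here purely for illustration, is to replicate each task on several independent nodes and accept only majority-consistent answers, as in the sketch below.

        import random
        from collections import Counter

        def run_on_node(task: int, node_id: int) -> int:
            """Stand-in computation; node 3 simulates a falsifying node."""
            result = task * task
            if node_id == 3:
                result += random.choice([0, 1])
            return result

        def verified(task: int, nodes: list, k: int = 3):
            """Run the task on k random nodes and accept the majority answer."""
            votes = Counter(run_on_node(task, n)
                            for n in random.sample(nodes, k))
            answer, count = votes.most_common(1)[0]
            return answer, count > k // 2   # majority reached => accepted

        for task in (2, 5, 7):
            print(task, verified(task, nodes=[1, 2, 3, 4, 5]))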

  20. The computer-based control system of the NAC accelerator

    International Nuclear Information System (INIS)

    Burdzik, G.F.; Bouckaert, R.F.A.; Cloete, I.; Du Toit, J.S.; Kohler, I.H.; Truter, J.N.J.; Visser, K.

    1982-01-01

    The National Accelerator Centre (NAC) of the CSIR is building a two-stage accelerator which will provide charged-particle beams for use in medical and research applications. The control system for this accelerator is based on three mini-computers and a CAMAC interfacing network. Closed-loop control is being relegated to the various subsystems of the accelerator, and the computers and CAMAC network will be used in the first instance for data transfer, monitoring, and servicing of the control consoles. The processing power of the computers will be utilized for automating start-up and beam-change procedures, for providing flexible and convenient information at the control consoles, for fault diagnosis, and for beam-optimizing procedures. Tasks of a localized or dedicated nature are being off-loaded onto microcomputers, which are used either in front-end devices or as slaves to the mini-computers. On the control consoles only a few instruments for setting and monitoring variables are provided, but these instruments are universally linkable to any appropriate machine variable.

  1. Industrial Personal Computer based Display for Nuclear Safety System

    International Nuclear Information System (INIS)

    Kim, Ji Hyeon; Kim, Aram; Jo, Jung Hee; Kim, Ki Beom; Cheon, Sung Hyun; Cho, Joo Hyun; Sohn, Se Do; Baek, Seung Min

    2014-01-01

    The safety display of a nuclear system has been classified as important to safety (SIL: Safety Integrity Level 3). These days, regulatory agencies are imposing stricter safety requirements for digital safety display systems. To satisfy these requirements, it is necessary to develop a safety-critical (SIL 4) grade safety display system. This paper proposes an industrial personal computer based safety display system with a safety-grade operating system and safety-grade display methods. The description consists of three parts: the background, the safety requirements, and the proposed safety display system design. The hardware platform is designed using a commercially available off-the-shelf processor board with a backplane bus. The operating system is customized for the nuclear safety display application. The display unit design adopts two improvements: one is to provide two separate processors for the main computer and the display device, connected by serial communication, and the other is to use a Digital Visual Interface between the main computer and the display device. In this case the main computer uses minimized graphic functions for the safety display. The display design is at the conceptual phase, and several open areas remain to be settled before a solid system is reached. The main purpose of this paper is to describe and suggest a methodology for developing a safety-critical display system, and the descriptions are focused on the safety-requirement point of view

  2. Industrial Personal Computer based Display for Nuclear Safety System

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Ji Hyeon; Kim, Aram; Jo, Jung Hee; Kim, Ki Beom; Cheon, Sung Hyun; Cho, Joo Hyun; Sohn, Se Do; Baek, Seung Min [KEPCO, Youngin (Korea, Republic of)

    2014-08-15

    The safety display of a nuclear system has been classified as important to safety (SIL: Safety Integrity Level 3). These days, regulatory agencies are imposing stricter safety requirements for digital safety display systems. To satisfy these requirements, it is necessary to develop a safety-critical (SIL 4) grade safety display system. This paper proposes an industrial personal computer based safety display system with a safety-grade operating system and safety-grade display methods. The description consists of three parts: the background, the safety requirements, and the proposed safety display system design. The hardware platform is designed using a commercially available off-the-shelf processor board with a backplane bus. The operating system is customized for the nuclear safety display application. The display unit design adopts two improvements: one is to provide two separate processors for the main computer and the display device, connected by serial communication, and the other is to use a Digital Visual Interface between the main computer and the display device. In this case the main computer uses minimized graphic functions for the safety display. The display design is at the conceptual phase, and several open areas remain to be settled before a solid system is reached. The main purpose of this paper is to describe and suggest a methodology for developing a safety-critical display system, and the descriptions are focused on the safety-requirement point of view.

  3. Intelligent Aggregation Based on Content Routing Scheme for Cloud Computing

    Directory of Open Access Journals (Sweden)

    Jiachen Xu

    2017-10-01

    Full Text Available Cloud computing has emerged as today's most exciting computing paradigm for providing services using a shared framework, opening a new door for solving the problems posed by the explosive growth of digital resource demands and the corresponding demand for convenience. With the exponential growth in the number of data types and in data size in so-called big data work, the backbone network is under great pressure, because its transmission capacity grows more slowly than the data size; without an effective solution, this would seriously hinder the development of the network. In this paper, an Intelligent Aggregation based on Content Routing (IACR) scheme for cloud computing, which can effectively reduce the amount of data in the network and play a basic supporting role in the development of cloud computing, is first put forward. The main innovations in this paper are: (1) a framework for intelligent aggregation based on content routing is proposed, which can support aggregation-based content routing; (2) the proposed IACR scheme can effectively route data with a high aggregation ratio to the data center through the same routing path, so as to effectively reduce the amount of data that the network transmits. Theoretical analyses and experimental results show that, compared with the previous original routing scheme, the IACR scheme can balance the load of the whole network, reduce the amount of data transmitted in the network by 41.8%, and reduce the transmission time by 31.6% in the same network with a more balanced network load.
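
    As a rough illustration of content-based aggregation (the IACR internals are not given in the abstract), the sketch below groups payloads by a content hash and reports the fraction of traffic the aggregation removes.

        import hashlib

        def aggregate(payloads: list):
            """Group payloads by content hash; forward one copy per group."""
            seen = {}
            for p in payloads:
                seen.setdefault(hashlib.sha1(p).hexdigest(), p)
            ratio = 1 - len(seen) / len(payloads)   # fraction of traffic saved
            return list(seen.values()), ratio

        data = [b"sensor-A", b"sensor-A", b"sensor-B", b"sensor-A"]
        unique, saved = aggregate(data)
        print(f"aggregation removed {saved:.0%} of the payloads")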

  4. Consequences of the electronic tuning of latent ruthenium-based olefin metathesis catalysts on their reactivity

    KAUST Repository

    Żukowska, Karolina

    2015-08-20

    Two ruthenium olefin metathesis initiators featuring electronically modified quinoline-based chelating carbene ligands are introduced. Their reactivity in RCM and ROMP reactions was tested and the results were compared to those obtained with the parent unsubstituted compound. The studied complexes are very stable at high temperatures up to 140 °C. The placement of an electron-withdrawing functionality translates into an enhanced activity in RCM. While electronically modified precatalysts, which exist predominantly in the trans-dichloro configuration, gave mostly the RCM and a minor amount of the cycloisomerization product, the unmodified congener, which preferentially exists as its cis-dichloro isomer, shows a switched reactivity. The position of the equilibrium between the cis- and the trans-dichloro species was found to be the crucial factor governing the reactivity of the complexes.

  5. Math anxiety: A review of its cognitive consequences, psychophysiological correlates, and brain bases.

    Science.gov (United States)

    Suárez-Pellicioni, Macarena; Núñez-Peña, María Isabel; Colomé, Àngels

    2016-02-01

    A decade has passed since the last published review of math anxiety, which was carried out by Ashcraft and Ridley (2005). Given the considerable interest aroused by this topic in recent years and the growing number of publications related to it, the present article aims to provide a full and updated review of the field, ranging from the initial studies of the impact of math anxiety on numerical cognition, to the latest research exploring its electrophysiological correlates and brain bases from a cognitive neuroscience perspective. Finally, this review describes the factors and mechanisms that have been claimed to play a role in the origins and/or maintenance of math anxiety, and it examines in detail the main explanations proposed to account for the negative effects of math anxiety on performance: competition for working memory resources, a deficit in a low-level numerical representation, and inhibition/attentional control deficit.

  6. Consequences of the electronic tuning of latent ruthenium-based olefin metathesis catalysts on their reactivity

    KAUST Repository

    Żukowska, Karolina; Pump, Eva; Pazio, Aleksandra E; Woźniak, Krzysztof; Cavallo, Luigi; Slugovc, Christian

    2015-01-01

    Two ruthenium olefin metathesis initiators featuring electronically modified quinoline-based chelating carbene ligands are introduced. Their reactivity in RCM and ROMP reactions was tested and the results were compared to those obtained with the parent unsubstituted compound. The studied complexes are very stable at high temperatures up to 140 °C. The placement of an electron-withdrawing functionality translates into an enhanced activity in RCM. While electronically modified precatalysts, which exist predominantly in the trans-dichloro configuration, gave mostly the RCM and a minor amount of the cycloisomerization product, the unmodified congener, which preferentially exists as its cis-dichloro isomer, shows a switched reactivity. The position of the equilibrium between the cis- and the trans-dichloro species was found to be the crucial factor governing the reactivity of the complexes.

  7. Structural consequences of diffuse traumatic brain injury: A large deformation tensor-based morphometry study

    Science.gov (United States)

    Kim, Junghoon; Avants, Brian; Patel, Sunil; Whyte, John; Coslett, H. Branch; Pluta, John; Detre, John A.; Gee, James C.

    2008-01-01

    Traumatic brain injury (TBI) is one of the most common causes of long-term disability. Despite the importance of identifying neuropathology in individuals with chronic TBI, methodological challenges posed at the stage of inter-subject image registration have hampered previous voxel-based MRI studies from providing a clear pattern of structural atrophy after TBI. We used a novel symmetric diffeomorphic image normalization method to conduct a tensor-based morphometry (TBM) study of TBI. The key advantage of this method is that it simultaneously estimates an optimal template brain and topology-preserving deformations between this template and individual subject brains. Detailed patterns of atrophies are then revealed by statistically contrasting control and subject deformations to the template space. Participants were 29 survivors of TBI and 20 control subjects who were matched in terms of age, gender, education, and ethnicity. Localized volume losses were found most prominently in white matter regions and the subcortical nuclei including the thalamus, the midbrain, the corpus callosum, the mid- and posterior cingulate cortices, and the caudate. Significant voxel-wise volume loss clusters were also detected in the cerebellum and the frontal/temporal neocortices. Volume enlargements were identified largely in ventricular regions. A similar pattern of results was observed in a subgroup analysis where we restricted our analysis to the 17 TBI participants who had no macroscopic focal lesions (total lesion volume > 1.5 cm³). The current study confirms, extends, and partly challenges previous structural MRI studies in chronic TBI. By demonstrating that a large deformation image registration technique can be successfully combined with TBM to identify TBI-induced diffuse structural changes with greater precision, our approach is expected to increase the sensitivity of future studies examining brain-behavior relationships in the TBI population. PMID:17999940

  8. Learners’ views about cloud computing-based group activities

    Directory of Open Access Journals (Sweden)

    Yildirim Serkan

    2017-01-01

    Full Text Available Because they can be used independently of time and place during software development, and because mobile technologies make information easier to access, cloud-based environments have attracted the attention of the education world, and this technology has started to be used in various activities. In this study, the effects of extracurricular group assignments in cloud-based environments on learners of programming were evaluated in terms of group-work satisfaction, ease of use, and user satisfaction. Within the scope of a computer programming course lasting eight weeks, a total of 100 students participated in the study, including 34 men and 66 women. Participants were divided into groups of at least three people, in view of the advantages of cooperative learning in programming education. In this study, carried out in both conventional and cloud-based environments, a between-groups factorial design was used as the research design. The data, collected by questionnaires on opinions of group work, were examined with quantitative analysis methods. According to the results, extracurricular learning activities as group activities created satisfaction; however, perceptions of the ease of use of the environment and user satisfaction were only partly positive. Despite similar overall views, male participants found cloud computing based environments easier to use. Variables such as class level, satisfaction, and computer and internet usage time had no effect on satisfaction or on perceptions of ease of use. Evening-class students stated that they found cloud-based learning environments easier to use, were more satisfied with using these environments, and were happier with group work than daytime students.

  9. Toward a Culture of Consequences: Performance-Based Accountability Systems for Public Services.

    Science.gov (United States)

    Stecher, Brian M; Camm, Frank; Damberg, Cheryl L; Hamilton, Laura S; Mullen, Kathleen J; Nelson, Christopher; Sorensen, Paul; Wachs, Martin; Yoh, Allison; Zellman, Gail L; Leuschner, Kristin J; Camm, Frank; Stecher, Brian M

    2012-01-01

    Performance-based accountability systems (PBASs), which link incentives to measured performance as a means of improving services to the public, have gained popularity. While PBASs can vary widely across sectors, they share three main components: goals, incentives, and measures. Research suggests that PBASs influence provider behaviors, but little is known about PBAS effectiveness at achieving performance goals or about government and agency experiences. This study examines nine PBASs that are drawn from five sectors: child care, education, health care, public health emergency preparedness, and transportation. In the right circumstances, a PBAS can be an effective strategy for improving service delivery. Optimum circumstances include having a widely shared goal, unambiguous observable measures, meaningful incentives for those with control over the relevant inputs and processes, few competing interests, and adequate resources to design, implement, and operate the PBAS. However, these conditions are rarely fully realized, so it is difficult to design and implement PBASs that are uniformly effective. PBASs represent a promising policy option for improving the quality of service-delivery activities in many contexts. The evidence supports continued experimentation with and adoption of this approach in appropriate circumstances. Even so, PBAS design and its prospects for success depend on the context in which it will operate. Also, ongoing system evaluation and monitoring are integral components of a PBAS; they inform refinements that improve system functioning over time. Empirical evidence of the effects of performance-based public management is scarce. This article also describes a framework used to evaluate a PBAS. Such a system identifies individuals or organizations that must change their behavior for the performance of an activity to improve, chooses an implicit or explicit incentive structure to motivate these organizations or individuals to change, and then

  10. Some Consequences of an Analysis of the Kelvin-Clausius Entropy Formulation Based on Traditional Axiomatics

    Science.gov (United States)

    Jesudason, Christopher G.

    2003-09-01

    Recently, there have appeared interesting correctives or challenges [Entropy 1999, 1, 111-147] to the Second law formulations, especially in the interpretation of the Clausius equivalent transformations, closely related in area to extensions of the Clausius principle to irreversible processes [Chem. Phys. Lett. 1988, 143(1), 65-70]. Since the traditional formulations are central to science, a brief analysis of some of these newer theories along traditional lines is attempted, based on well-attested axioms which have formed the basis of equilibrium thermodynamics. It is deduced that the Clausius analysis leading to the law of increasing entropy does not follow from the given axioms, but it can be proved that for irreversible transitions, the total entropy change of the system and thermal reservoirs (the "Universe") is not negative, even in the case when the reservoirs are not at the same temperature as the system during heat transfer. On the basis of two new simple theorems and three corollaries derived for the correlation between irreversible and reversible pathways and the traditional axiomatics, it is shown that a sequence of reversible states can never be used to describe a corresponding sequence of irreversible states, at least for closed systems, thereby restricting the principle of local equilibrium. It is further shown that some of the newer irreversible entropy forms given exhibit paradoxical properties relative to the standard axiomatics. It is deduced that any reconciliation between the traditional approach and novel theories lies in creating a well-defined set of axioms on which all theoretical developments should attempt to be based, unless proven not to be useful, in which case there should be consensus on removing such axioms from the theory. Clausius' theory of equivalent transformations does not contradict the traditional understanding of heat-work efficiency. It is concluded that the intuitively derived assumptions over the last two centuries seem to
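
    The non-negativity result stated above can be written compactly in standard notation; the following is a minimal restatement under textbook assumptions (a closed system exchanging heats Q_i with reservoirs at temperatures T_i), not the author's own axiomatics:

        % Clausius inequality for the system over the process:
        \Delta S_{\mathrm{sys}} \;\ge\; \sum_i \frac{Q_i}{T_i}
        % Each reservoir i changes by -Q_i/T_i, so for the "Universe":
        \Delta S_{\mathrm{univ}}
          \;=\; \Delta S_{\mathrm{sys}} + \sum_i \Delta S_{\mathrm{res},i}
          \;=\; \Delta S_{\mathrm{sys}} - \sum_i \frac{Q_i}{T_i}
          \;\ge\; 0

    with equality only for fully reversible transfer; the inequality holds even when the reservoir temperatures T_i differ from the system temperature during heat transfer, matching the claim above.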

  11. Mixed effects modeling of proliferation rates in cell-based models: consequence for pharmacogenomics and cancer.

    Directory of Open Access Journals (Sweden)

    Hae Kyung Im

    2012-02-01

    Full Text Available The International HapMap project has made publicly available extensive genotypic data on a number of lymphoblastoid cell lines (LCLs). Building on this resource, many research groups have generated a large amount of phenotypic data on these cell lines to facilitate genetic studies of disease risk or drug response. However, one problem that may reduce the usefulness of these resources is the biological noise inherent to cellular phenotypes. We developed a novel method, termed Mixed Effects Model Averaging (MEM), which pools data from multiple sources and generates an intrinsic cellular growth rate phenotype. This intrinsic growth rate was estimated for each of over 500 HapMap cell lines. We then examined the association of this intrinsic growth rate with gene expression levels and found that almost 30% (2,967 out of 10,748) of the genes tested were significant, with an FDR of less than 10%. We probed further to demonstrate evidence of a genetic effect on intrinsic growth rate by determining a significant enrichment in growth-associated genes among genes targeted by top growth-associated SNPs (as eQTLs). The estimated intrinsic growth rate, as well as the strength of the association with genetic variants and gene expression traits, is made publicly available through a cell-based pharmacogenomics database, PACdb. This resource should enable researchers to explore the mediating effects of proliferation rate on other phenotypes.
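
    As a rough sketch of how replicate measurements can be pooled into a per-line random-effects estimate (the paper's MEM method averages over multiple mixed models and is more elaborate), using statsmodels; the data frame, column names, and values are all invented:

        import pandas as pd
        import statsmodels.formula.api as smf

        # Hypothetical long-format data: repeated growth-rate measurements per
        # cell line, coming from two different sources/labs.
        df = pd.DataFrame({
            "cell_line": ["L1"]*3 + ["L2"]*3 + ["L3"]*3 + ["L4"]*3,
            "source":    ["A", "B", "A"] * 4,
            "growth":    [0.031, 0.028, 0.030, 0.044, 0.047, 0.046,
                          0.025, 0.022, 0.024, 0.039, 0.041, 0.040],
        })

        # A random intercept per cell line pools information across sources;
        # the shrunken per-line intercepts act as "intrinsic" growth rates.
        fit = smf.mixedlm("growth ~ source", df, groups=df["cell_line"]).fit()
        intrinsic = {line: fit.fe_params["Intercept"] + re.iloc[0]
                     for line, re in fit.random_effects.items()}
        print(intrinsic)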

  12. Customizable Computer-Based Interaction Analysis for Coaching and Self-Regulation in Synchronous CSCL Systems

    Science.gov (United States)

    Lonchamp, Jacques

    2010-01-01

    Computer-based interaction analysis (IA) is an automatic process that aims at understanding a computer-mediated activity. In a CSCL system, computer-based IA can provide information directly to learners for self-assessment and regulation and to tutors for coaching support. This article proposes a customizable computer-based IA approach for a…

  13. Towards Modeling False Memory With Computational Knowledge Bases.

    Science.gov (United States)

    Li, Justin; Kohanyi, Emma

    2017-01-01

    One challenge to creating realistic cognitive models of memory is the inability to account for the vast common-sense knowledge of human participants. Large computational knowledge bases such as WordNet and DBpedia may offer a solution to this problem but may pose other challenges. This paper explores some of these difficulties through a semantic network spreading activation model of the Deese-Roediger-McDermott false memory task. In three experiments, we show that these knowledge bases only capture a subset of human associations, while irrelevant information introduces noise and makes efficient modeling difficult. We conclude that the contents of these knowledge bases must be augmented and, more important, that the algorithms must be refined and optimized, before large knowledge bases can be widely used for cognitive modeling. Copyright © 2016 Cognitive Science Society, Inc.
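
    A toy spreading-activation sketch over a hand-built semantic network (standing in for WordNet/DBpedia associations; all words and weights are illustrative, not taken from either knowledge base):

        # Toy DRM false-memory simulation: studied words spread activation to
        # associated concepts; the unstudied critical lure accumulates activation.
        network = {   # illustrative association strengths, not WordNet data
            "bed":   {"sleep": 0.8, "rest": 0.4},
            "dream": {"sleep": 0.7, "night": 0.5},
            "doze":  {"sleep": 0.9, "nap": 0.6},
            "nap":   {"sleep": 0.6, "doze": 0.6},
        }

        def spread(studied, decay=0.5, rounds=2):
            activation = {w: 1.0 for w in studied}
            for _ in range(rounds):
                updates = {}
                for word, act in activation.items():
                    for neighbor, weight in network.get(word, {}).items():
                        updates[neighbor] = updates.get(neighbor, 0.0) + act * weight * decay
                for w, a in updates.items():
                    activation[w] = activation.get(w, 0.0) + a
            return activation

        act = spread(["bed", "dream", "doze", "nap"])
        # "sleep" was never studied, yet ends up among the most active nodes,
        # which is the signature of DRM false memory.
        print(sorted(act.items(), key=lambda kv: -kv[1])[:3])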

  14. The Computer Student Worksheet Based Mathematical Literacy for Statistics

    Science.gov (United States)

    Manoy, J. T.; Indarasati, N. A.

    2018-01-01

    The student worksheet is a teaching medium that can improve teaching activities in the classroom. Mathematical literacy indicators included in a student worksheet can help students apply concepts in daily life, and the use of computers in learning supports environmentally friendly learning. This research used the Thiagarajan Four-D development design, which has four stages: define, design, develop, and disseminate; the research was carried out only through the third (develop) stage. The subjects were grade-eleven students of the 5th Mathematics and Natural Sciences class at The 1st State Senior High School of Driyorejo, Gresik. The computer-based student worksheet on mathematical literacy for statistics achieved good quality, which requires meeting three aspects: validity, practicality, and effectiveness. The worksheet met the validity aspect with an average of 3.79 (94.72%) and the practicality aspect with an average of 2.85 (71.43%), and it met the effectiveness aspect with 94.74% of students achieving classical completeness and a positive student response rate of 75%.

  15. Demonstration of optical computing logics based on binary decision diagram.

    Science.gov (United States)

    Lin, Shiyun; Ishikawa, Yasuhiko; Wada, Kazumi

    2012-01-16

    Optical circuits are low-power, high-speed alternatives to current information processing based on transistor circuits. However, because no transistor function is available in optics, an architecture suited to optics must be chosen for optical computing. One such architecture is the Binary Decision Diagram (BDD), in which a signal is processed by sending an optical signal from the root through a series of switching nodes to the leaf (terminal). The speed of optical computing is limited either by the transmission time of optical signals from the root to the leaf or by the switching time of a node. We have designed and experimentally demonstrated 1-bit and 2-bit adders based on the BDD architecture. The switching nodes are silicon ring resonators with a modulation depth of 10 dB, and their states are changed by the plasma dispersion effect. The quality factor Q of the rings as designed is 1500, which allows fast transmission of the signal, e.g., 1.3 ps as calculated from the photon escape time. The total processing time is thus analyzed to be ~9 ps for a 2-bit adder and would scale linearly with the number of bits. This is two orders of magnitude faster than conventional CMOS circuitry, with its ~ns-scale delays. The presented results show the potential of fast optical computing circuits.
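
    To make the signal-routing idea concrete, here is a small software model of a BDD-based 1-bit half adder (not the optical hardware): each node tests one input bit and routes the signal to its 0- or 1-branch, the way a ring-resonator switch would; the node layout is illustrative.

        # BDD nodes: (variable, low_target, high_target); leaves are 0 or 1.
        # The "signal" enters at the root and is switched at each node by one
        # input bit, mimicking an optical switching node.
        SUM = {                      # s = a XOR b
            "root": ("a", "a0", "a1"),
            "a0":   ("b", 0, 1),
            "a1":   ("b", 1, 0),
        }
        CARRY = {                    # c = a AND b
            "root": ("a", 0, "a1"),
            "a1":   ("b", 0, 1),
        }

        def evaluate(bdd, inputs):
            node = "root"
            while node not in (0, 1):            # follow switches to a leaf
                var, low, high = bdd[node]
                node = high if inputs[var] else low
            return node

        for a in (0, 1):
            for b in (0, 1):
                bits = {"a": a, "b": b}
                print(a, "+", b, "-> carry", evaluate(CARRY, bits),
                      "sum", evaluate(SUM, bits))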

  16. Some computer simulations based on the linear relative risk model

    International Nuclear Information System (INIS)

    Gilbert, E.S.

    1991-10-01

    This report presents the results of computer simulations designed to evaluate and compare the performance of the likelihood ratio statistic and the score statistic for making inferences about the linear relative risk model. The work was motivated by data on workers exposed to low doses of radiation, and the report includes illustrations of several procedures for obtaining confidence limits for the excess relative risk coefficient based on data from three studies of nuclear workers. The computer simulations indicate that with small sample sizes and highly skewed dose distributions, asymptotic approximations to the score statistic or to the likelihood ratio statistic may not be adequate. For testing the null hypothesis that the excess relative risk is equal to zero, the asymptotic approximation to the likelihood ratio statistic was adequate, but use of the asymptotic approximation to the score statistic rejected the null hypothesis too often. Frequently the likelihood was maximized at the lower constraint, and when this occurred, the asymptotic approximations for the likelihood ratio and score statistics did not perform well in obtaining upper confidence limits. The score statistic and likelihood ratio statistic were found to perform comparably in terms of power and width of the confidence limits. It is recommended that with modest sample sizes, confidence limits be obtained using computer simulations based on the score statistic. Although nuclear worker studies are emphasized in this report, its results are relevant for any study investigating linear dose-response functions with highly skewed exposure distributions. 22 refs., 14 tabs
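
    A hedged sketch of the kind of simulation the report describes, for Poisson counts under a linear excess relative risk model lambda_i = lambda_0 (1 + beta d_i); the sample size, dose distribution, and the simplification that lambda_0 is known are all assumptions:

        import numpy as np
        from scipy.optimize import minimize_scalar

        rng = np.random.default_rng(1)
        dose = rng.gamma(shape=0.5, scale=0.1, size=200)   # skewed dose distribution
        lam0, beta_true = 2.0, 0.5
        y = rng.poisson(lam0 * (1 + beta_true * dose))

        def negloglik(beta):
            # Poisson log-likelihood up to a constant; lam0 treated as known
            # here for simplicity (a real analysis estimates it jointly).
            lam = lam0 * (1 + beta * dose)
            return -np.sum(y * np.log(lam) - lam)

        # Maximize over the admissible range (all rates must stay positive),
        # which is the lower constraint mentioned in the report.
        lower = -1.0 / dose.max() + 1e-6
        res = minimize_scalar(negloglik, bounds=(lower, 10.0), method="bounded")
        lr_stat = 2.0 * (negloglik(0.0) - res.fun)   # LR test of beta = 0
        print(f"beta_hat = {res.x:.3f}, LR statistic = {lr_stat:.2f}")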

  17. IMPACT OF COMPUTER BASED ONLINE ENTREPRENEURSHIP DISTANCE EDUCATION IN INDIA

    Directory of Open Access Journals (Sweden)

    Bhagwan SHREE RAM

    2012-07-01

    Full Text Available The success of Indian enterprises and professionals in the computer and information technology (CIT) domain during the past twenty years has been spectacular. Entrepreneurs, bureaucrats and technocrats are now advancing views about how India can ride the CIT bandwagon and leapfrog into a knowledge-based economy in the area of online entrepreneurship distance education. Isolated instances of remotely located villagers sending and receiving email messages, effective application of mobile communications, and surfing the Internet are being promoted as examples of how the nation can achieve this transformation, while vanquishing socio-economic challenges such as illiteracy, high population growth, poverty, and the digital divide along the way. Likewise, even though only a small fraction of the urban population in India has access to computers and the Internet, e-governance is being projected as the way of the future. There is no dearth of fascinating stories about CIT-enabled changes, yet there is little discussion about whether such changes are effective and sustainable in the absence of the basic infrastructure that is accessible to the citizens of more advanced economies. When used appropriately, different CITs are said to help expand access to entrepreneurship distance education, strengthen the relevance of education to the increasingly digital workplace, and raise technical and managerial educational quality by, among other things, helping make teaching and learning an engaging, active process connected to real life. This research paper investigates the impact of computer-based online entrepreneurship distance education in India.

  18. Dataflow-Based Mapping of Computer Vision Algorithms onto FPGAs

    Directory of Open Access Journals (Sweden)

    Ivan Corretjer

    2007-01-01

    Full Text Available We develop a design methodology for mapping computer vision algorithms onto an FPGA through the use of coarse-grain reconfigurable dataflow graphs as a representation to guide the designer. We first describe a new dataflow modeling technique called homogeneous parameterized dataflow (HPDF, which effectively captures the structure of an important class of computer vision applications. This form of dynamic dataflow takes advantage of the property that in a large number of image processing applications, data production and consumption rates can vary, but are equal across dataflow graph edges for any particular application iteration. After motivating and defining the HPDF model of computation, we develop an HPDF-based design methodology that offers useful properties in terms of verifying correctness and exposing performance-enhancing transformations; we discuss and address various challenges in efficiently mapping an HPDF-based application representation into target-specific HDL code; and we present experimental results pertaining to the mapping of a gesture recognition application onto the Xilinx Virtex II FPGA.
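
    A minimal sketch of the HPDF balance property under invented rates: token production and consumption may change from one graph iteration to the next, but must match on every edge within a single iteration.

        # Edges of a toy vision pipeline as (producer, consumer) pairs.
        edges = [("capture", "filter"), ("filter", "classify")]

        def hpdf_balanced(prod_rate, cons_rate):
            """True when each edge produces exactly what its consumer removes."""
            return all(prod_rate[e] == cons_rate[e] for e in edges)

        # Iteration 1 processes a 64-tile frame, iteration 2 a 16-tile frame:
        # rates vary across iterations but stay balanced within each one.
        for tiles in (64, 16):
            assert hpdf_balanced({e: tiles for e in edges},
                                 {e: tiles for e in edges})

        # A mismatched edge (producer emits 64 tokens, consumer expects 32)
        # violates the HPDF property and would be flagged at design time.
        assert not hpdf_balanced({e: 64 for e in edges},
                                 {("capture", "filter"): 64,
                                  ("filter", "classify"): 32})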

  19. Motivation and engagement in computer-based learning tasks: investigating key contributing factors

    Directory of Open Access Journals (Sweden)

    Michela Ott, Mauro Tavella

    2010-04-01

    Full Text Available This paper, drawing on a research project concerning the educational use of digital mind games with primary school students, aims to contribute to the understanding of the main factors influencing student motivation during computer-based learning activities. It puts forward some ideas and experience-based reflections, starting from digital games, which are widely recognized as the most promising ICT tools for enhancing student motivation. The project results suggest that genuine student engagement in learning activities is mainly related to actual possession of the skills and cognitive capacities needed to perform the task. In this perspective, cognitive overload should be regarded as one of the main factors hindering student motivation and, consequently, should be avoided. Other elements, such as game attractiveness and experimental setting constraints, were found to have a smaller effect on student motivation.

  20. Fail-safe computer-based plant protection systems

    International Nuclear Information System (INIS)

    Keats, A.B.

    1983-01-01

    A fail-safe mode of operation for computers used in nuclear reactor protection systems was first evolved in the UK for application to a sodium cooled fast reactor. The fail-safe properties of both the hardware and the software were achieved by permanently connecting test signals to some of the multiplexed inputs. This results in an unambiguous data pattern, each time the inputs are sequentially scanned by the multiplexer. The ''test inputs'' simulate transient excursions beyond defined safe limits. The alternating response of the trip algorithms to the ''out-of-limits'' test signals and the normal plant measurements is recognised by hardwired pattern recognition logic external to the computer system. For more general application to plant protection systems, a ''Test Signal Generator'' (TSG) is used to compute and generate test signals derived from prevailing operational conditions. The TSG, from its knowledge of the sensitivity of the trip algorithm to each of the input variables, generates a ''test disturbance'' which is superimposed upon each variable in turn, to simulate a transient excursion beyond the safe limits. The ''tripped'' status yielded by the trip algorithm when using data from a ''disturbed'' input forms part of a pattern determined by the order in which the disturbances are applied to the multiplexer inputs. The data pattern formed by the interleaved test disturbances is again recognised by logic external to the protection system's computers. This fail-safe mode of operation of computer-based protection systems provides a powerful defence against common-mode failure. It also reduces the importance of software verification in the licensing procedure. (author)
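
    A schematic software sketch of the pattern idea (thresholds, channel layout, and function names all invented): out-of-limits test signals interleaved with plant measurements must yield a known alternating trip pattern, which an external checker verifies.

        TRIP_LIMIT = 100.0

        def trip_algorithm(value):
            return value > TRIP_LIMIT            # True = "tripped"

        def scan(plant_inputs):
            """Interleave out-of-limits test signals with plant measurements,
            as the multiplexer would when scanning its inputs in sequence."""
            pattern = []
            for measurement in plant_inputs:
                pattern.append(trip_algorithm(150.0))        # test input: must trip
                pattern.append(trip_algorithm(measurement))  # plant input
            return pattern

        def pattern_checker(pattern):
            """Stands in for the hardwired logic external to the computer:
            every test slot must read 'tripped'; a stuck computer or channel
            breaks the expected pattern and the system fails safe."""
            test_slots = pattern[0::2]
            return all(test_slots)

        healthy = scan([42.0, 87.0, 63.0])
        assert pattern_checker(healthy)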

  1. Computer based C and I systems in Indian PHWRs

    International Nuclear Information System (INIS)

    Govindarajan, G.; Sharma, M.P.

    1997-01-01

    Benefits of programmable digital technology have been well recognized and employment of computer based systems in Indian PHWRs has evolved in a phased manner, keeping in view the regulatory requirements for their use. In the initial phase some operator information functions and control of on-power fuel handling system were implemented and then some systems performing control and safety functions have been employed. The availability of powerful microcomputer hardware at reasonable cost and indigenous capability in design and execution has encouraged wider use of digital technology in the nuclear power programme. To achieve the desired level of quality and reliability, the hardware modules for the implementation of these systems in the plants under construction, have been standardized and methodology for software verification and validation has been evolved. A large number of C and I functions including those for equipment diagnostics are being implemented. The paper describes the various applications of computers in Indian NPPs and their current status of implementation. (author)

  2. Nanotube devices based crossbar architecture: toward neuromorphic computing

    International Nuclear Information System (INIS)

    Zhao, W S; Gamrat, C; Agnus, G; Derycke, V; Filoramo, A; Bourgoin, J-P

    2010-01-01

    Nanoscale devices such as carbon nanotube and nanowire based transistors, memristors and molecular devices are expected to play an important role in the development of new computing architectures. While their size represents a decisive advantage in terms of integration density, it also raises the critical question of how to efficiently address large numbers of densely integrated nanodevices without the need for complex multi-layer interconnection topologies similar to those used in CMOS technology. Two-terminal programmable devices in crossbar geometry seem particularly attractive, but suffer from severe addressing difficulties due to cross-talk, which implies complex programming procedures. Three-terminal devices can be easily addressed individually, but with limited gain in terms of interconnect integration. We show how optically gated carbon nanotube devices enable efficient individual addressing when arranged in a crossbar geometry with shared gate electrodes. This topology is particularly well suited for parallel programming or learning in the context of neuromorphic computing architectures.

  3. A Computer- Based Digital Signal Processing for Nuclear Scintillator Detectors

    International Nuclear Information System (INIS)

    Ashour, M.A.; Abo Shosha, A.M.

    2000-01-01

    In this paper, a computer-based Digital Signal Processing (DSP) system for nuclear scintillation signals with exponential decay is presented. The main objective of this work is to identify the characteristics of the acquired signals smoothly; this is done by transferring the signals from the random signal domain to the deterministic domain using digital manipulation techniques. The proposed system consists of two major parts. The first part is a high-performance data acquisition system (DAQ) built around a multi-channel logic scope, which is interfaced with the host computer through the General Purpose Interface Bus (GPIB), Ver. IEEE 488.2. A Graphical User Interface (GUI) has been designed for this purpose using graphical programming facilities. The second part of the system is the DSP software algorithm, which analyzes, displays, and monitors the data to obtain the main characteristics of the acquired signals: the amplitude, the pulse count, the pulse width, the decay factor, and the arrival time
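
    A minimal curve-fitting sketch for extracting the amplitude, arrival time, and decay factor from a digitized exponential-decay pulse (synthetic data; this stands in for, and is not, the paper's DSP algorithm):

        import numpy as np
        from scipy.optimize import curve_fit

        def pulse(t, amplitude, t0, tau):
            """Exponential scintillation pulse starting at arrival time t0."""
            return np.where(t >= t0, amplitude * np.exp(-(t - t0) / tau), 0.0)

        t = np.linspace(0, 10, 500)                  # time axis (arbitrary units)
        rng = np.random.default_rng(7)
        signal = pulse(t, 1.8, 2.0, 1.2) + rng.normal(0, 0.02, t.size)

        # Fit the model to the noisy trace; p0 is a rough initial guess.
        popt, _ = curve_fit(pulse, t, signal, p0=(1.0, 1.5, 1.0))
        amplitude, t0, tau = popt
        print(f"amplitude={amplitude:.2f}, arrival={t0:.2f}, decay tau={tau:.2f}")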

  4. Monitoring system of multiple fire fighting based on computer vision

    Science.gov (United States)

    Li, Jinlong; Wang, Li; Gao, Xiaorong; Wang, Zeyong; Zhao, Quanke

    2010-10-01

    With the high demand for fire control in spacious buildings, computer vision is playing a more and more important role. This paper presents a new monitoring system for multiple-fire fighting based on computer vision and color detection. The system can adjust to the fire position and then extinguish the fire by itself. In this paper, the system structure, working principle, fire orientation, hydrant angle adjustment, and system calibration are described in detail; the design of the relevant hardware and software is also introduced. The principle and process of color detection and image processing are given as well. The system ran well in tests, with high reliability, low cost, and easy node expansion, giving it a bright prospect for application and popularization.
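
    A common color-detection recipe of the kind described, using OpenCV HSV thresholding; the threshold values and area cutoff are typical illustrative choices, not the authors' calibrated parameters:

        import cv2
        import numpy as np

        def find_fire_regions(frame_bgr):
            """Return centroids of fire-colored blobs in a BGR video frame."""
            hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
            # Flame-like colors: red-orange hues, high saturation and brightness.
            mask = cv2.inRange(hsv, np.array([0, 120, 180]),
                                    np.array([35, 255, 255]))
            mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN,
                                    np.ones((5, 5), np.uint8))
            contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                           cv2.CHAIN_APPROX_SIMPLE)
            centers = []
            for c in contours:
                if cv2.contourArea(c) > 200:          # ignore small speckles
                    m = cv2.moments(c)
                    centers.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
            return centers   # these would drive the hydrant angle controller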

  5. Mechatronic Model Based Computed Torque Control of a Parallel Manipulator

    Directory of Open Access Journals (Sweden)

    Zhiyong Yang

    2008-11-01

    Full Text Available With their high speed and accuracy, parallel manipulators have wide application in industry, but many difficulties remain in the actual control process because of time-varying and coupled dynamics. Unfortunately, present-day commercial controllers cannot provide satisfactory performance, as they offer single-axis linear control only. Therefore, for a novel 2-DOF (Degree of Freedom) parallel manipulator called Diamond 600, a motor-mechanism coupling dynamic model based control scheme employing the computed torque control algorithm is presented in this paper. First, the integrated dynamic coupling model is deduced according to the equivalent torques between the mechanical structure and the PM (Permanent Magnet) servomotor. Second, the computed torque controller is described in detail for the proposed model. Finally, a series of numerical simulations and experiments are carried out to test the effectiveness of the system, and the results verify its favourable tracking ability and robustness.
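
    The computed torque law itself is standard; a generic sketch follows (with invented gains and a toy decoupled model, not the Diamond 600's coupling dynamics):

        import numpy as np

        def computed_torque(q, qd, q_ref, qd_ref, qdd_ref, M, C, G, Kp, Kv):
            """tau = M(q)(qdd_ref + Kv*ev + Kp*ep) + C(q,qd)*qd + G(q).
            Feedback-linearizes the dynamics so tracking errors obey a
            linear second-order ODE."""
            ep = q_ref - q
            ev = qd_ref - qd
            v = qdd_ref + Kv @ ev + Kp @ ep       # outer-loop PD acceleration
            return M(q) @ v + C(q, qd) @ qd + G(q)

        # Toy 2-DOF example with constant, decoupled dynamics for illustration.
        M = lambda q: np.diag([1.5, 0.8])
        C = lambda q, qd: np.zeros((2, 2))
        G = lambda q: np.zeros(2)
        Kp, Kv = np.diag([100.0, 100.0]), np.diag([20.0, 20.0])
        tau = computed_torque(np.zeros(2), np.zeros(2),
                              np.array([0.1, -0.2]), np.zeros(2), np.zeros(2),
                              M, C, G, Kp, Kv)
        print(tau)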

  7. Man-machine interfaces analysis system based on computer simulation

    International Nuclear Information System (INIS)

    Chen Xiaoming; Gao Zuying; Zhou Zhiwei; Zhao Bingquan

    2004-01-01

    The paper describes a software assessment system, Dynamic Interaction Analysis Support (DIAS), based on computer simulation technology for man-machine interfaces (MMI) of a control room. It employs a computer to simulate operation procedures performed on the man-machine interfaces of a control room, provides quantified assessments, and at the same time analyzes operator error rates by means of human error rate prediction techniques. Problems in the placement of man-machine interfaces in a control room and in the arrangement of instruments can be detected from the simulation results. The DIAS system can provide good technical support for the design and improvement of the man-machine interfaces of the main control room of a nuclear power plant

  8. A Nuclear Safety System based on Industrial Computer

    International Nuclear Information System (INIS)

    Kim, Ji Hyeon; Oh, Do Young; Lee, Nam Hoon; Kim, Chang Ho; Kim, Jae Hack

    2011-01-01

    The Plant Protection System (PPS), a nuclear safety Instrumentation and Control (I and C) system for Nuclear Power Plants (NPPs), generates a reactor trip on abnormal reactor conditions. The Core Protection Calculator System (CPCS) is a safety system that generates and transmits the channel trip signal to the PPS on an abnormal condition. Currently, these systems are designed on Programmable Logic Controller (PLC) based platforms, and it is necessary to consider a new system platform to achieve a simpler system configuration and an improved software development process. The CPCS was the first implementation using a microcomputer in a nuclear power plant safety protection system, in 1980; it has been deployed in Ulchin units 3,4,5,6 and Younggwang units 3,4,5,6. The CPCS software was developed on the Concurrent Micro5 minicomputer using assembly language and embedded into the Concurrent 3205 computer. Following the microcomputer-based CPCS, the PLC-based Common-Q platform has been used for the ShinKori/ShinWolsong units 1,2 PPS and CPCS, and the POSAFE-Q PLC platform is used for the ShinUlchin units 1,2 PPS and CPCS. In developing the next-generation safety system platform, several factors (e.g., hardware/software reliability, flexibility, licensability and industrial support) must be considered. This paper suggests an Industrial Computer (IC) based protection system that can be developed with improved flexibility without losing system reliability. The IC-based system has the advantage of a simple system configuration with optimized processor boards, owing to improved processor performance, and unlimited interoperability between the target system and development systems that use commercial CASE tools. This paper presents the background to selecting the IC-based system, with a case study design of the CPCS. Eventually, this kind of platform can be used for nuclear power plant safety systems like the PPS, CPCS, Qualified Indication and Alarm System - PAMI (QIAS-P), and Engineering Safety

  10. Analysis of the effect of mobile phone base station antenna loading on localized SAR and its consequences for measurements.

    Science.gov (United States)

    Hansson, Björn; Thors, Björn; Törnevik, Christer

    2011-12-01

    In this work, the effect of antenna element loading on the localized specific absorption rate (SAR) has been analyzed for base station antennas. The analysis was conducted in order to determine whether localized SAR measurements of large multi-element base station antennas can be conducted using standardized procedures and commercially available equipment. More specifically, it was investigated if the antenna shifting measurement procedure, specified in the European base station exposure assessment standard EN 50383, will produce accurate localized SAR results for base station antennas larger than the specified measurement phantom. The obtained results show that SAR accuracy is affected by the presence of lossy material within distances of one wavelength from the tested antennas as a consequence of coupling and redistribution of transmitted power among the antenna elements. It was also found that the existing standardized phantom is not optimal for SAR measurements of large base station antennas. A new methodology is instead proposed based on a larger, box-shaped, whole-body phantom. Copyright © 2011 Wiley Periodicals, Inc.

  11. Interactive computer-assisted instruction in acid-base physiology for mobile computer platforms.

    Science.gov (United States)

    Longmuir, Kenneth J

    2014-03-01

    In this project, the traditional lecture hall presentation of acid-base physiology in the first-year medical school curriculum was replaced by interactive, computer-assisted instruction designed primarily for the iPad and other mobile computer platforms. Three learning modules were developed, each with ∼20 screens of information, on the subjects of the CO2-bicarbonate buffer system, other body buffer systems, and acid-base disorders. Five clinical case modules were also developed. For the learning modules, the interactive, active learning activities were primarily step-by-step learner control of explanations of complex physiological concepts, usually presented graphically. For the clinical cases, the active learning activities were primarily question-and-answer exercises that related clinical findings to the relevant basic science concepts. The student response was remarkably positive, with the interactive, active learning aspect of the instruction cited as the most important feature. Also, students cited the self-paced instruction, extensive use of interactive graphics, and side-by-side presentation of text and graphics as positive features. Most students reported that it took less time to study the subject matter with this online instruction compared with subject matter presented in the lecture hall. However, the approach to learning was highly examination driven, with most students delaying the study of the subject matter until a few days before the scheduled examination. Wider implementation of active learning computer-assisted instruction will require that instructors present subject matter interactively, that students fully embrace the responsibilities of independent learning, and that institutional administrations measure instructional effort by criteria other than scheduled hours of instruction.

  12. Security Considerations and Recommendations in Computer-Based Testing

    Directory of Open Access Journals (Sweden)

    Saleh M. Al-Saleem

    2014-01-01

    Full Text Available Many organizations and institutions around the globe are moving or planning to move their paper-and-pencil based testing to computer-based testing (CBT. However, this conversion will not be the best option for all kinds of exams and it will require significant resources. These resources may include the preparation of item banks, methods for test delivery, procedures for test administration, and last but not least test security. Security aspects may include but are not limited to the identification and authentication of examinee, the risks that are associated with cheating on the exam, and the procedures related to test delivery to the examinee. This paper will mainly investigate the security considerations associated with CBT and will provide some recommendations for the security of these kinds of tests. We will also propose a palm-based biometric authentication system incorporated with basic authentication system (username/password in order to check the identity and authenticity of the examinee.

  14. Resistive content addressable memory based in-memory computation architecture

    KAUST Repository

    Salama, Khaled N.; Zidan, Mohammed A.; Kurdahi, Fadi; Eltawil, Ahmed M.

    2016-01-01

    Various examples are provided related to resistive content addressable memory (RCAM) based in-memory computation architectures. In one example, a system includes a content addressable memory (CAM) including an array of cells having a memristor based crossbar and an interconnection switch matrix having a gateless memristor array, which is coupled to an output of the CAM. In another example, a method includes comparing activated bit values stored in a key register with corresponding bit values in a row of a CAM, setting a tag bit value to indicate that the activated bit values match the corresponding bit values, and writing masked key bit values to corresponding bit locations in the row of the CAM based on the tag bit value.
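
    A bit-level software sketch of the compare/tag/masked-write cycle described above; the array sizes, bit patterns, and write-mask convention are assumptions:

        # Software model of a tiny CAM: rows of bits, a key register, an
        # activation mask selecting which key bits participate, and tag bits.
        cam = [[1, 0, 1, 1],
               [0, 0, 1, 0]]
        key      = [1, 0, 1, 0]
        activate = [1, 1, 1, 0]     # only the first three bit positions compared

        def compare(row):
            """Tag is set when every activated key bit matches the row."""
            return all(k == b for k, b, a in zip(key, row, activate) if a)

        tags = [compare(row) for row in cam]
        print("tags:", tags)        # row 0 matches on the activated bits

        # Masked write: where the tag is set, write the masked key bits back.
        write_mask = [0, 0, 0, 1]
        for row, tag in zip(cam, tags):
            if tag:
                for i, m in enumerate(write_mask):
                    if m:
                        row[i] = key[i]
        print("cam after write:", cam)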

  16. Preliminary Study on Hybrid Computational Phantom for Radiation Dosimetry Based on Subdivision Surface

    International Nuclear Information System (INIS)

    Jeong, Jong Hwi; Choi, Sang Hyoun; Cho, Sung Koo; Kim, Chan Hyeong

    2007-01-01

    The anthropomorphic computational phantoms are classified into two groups. One group is the stylized phantoms, or MIRD phantoms, which are based on mathematical representations of the anatomical structures. The shapes and positions of the organs and tissues in these phantoms can be adjusted by changing the coefficients of the equations in use. The other group is the voxel phantoms, which are based on tomographic images of a real person such as CT, MR and serially sectioned color slice images from a cadaver. Obviously, the voxel phantoms represent the anatomical structures of a human body much more realistically than the stylized phantoms. A realistic representation of anatomical structure is very important for an accurate calculation of radiation dose in the human body. Consequently, the ICRP recently has decided to use the voxel phantoms for the forthcoming update of the dose conversion coefficients. However, the voxel phantoms also have some limitations: (1) The topology and dimensions of the organs and tissues in a voxel model are extremely difficult to change, and (2) The thin organs, such as oral mucosa and skin, cannot be realistically modeled unless the voxel resolution is prohibitively high. Recently, a new approach has been implemented by several investigators. The investigators converted their voxel phantoms to hybrid computational phantoms based on NURBS (Non-Uniform Rational B-Splines) surface, which is smooth and deformable. It is claimed that these new phantoms have the flexibility of the stylized phantom along with the realistic representations of the anatomical structures. The topology and dimensions of the anatomical structures can be easily changed as necessary. Thin organs can be modeled without affecting computational speed or memory requirement. The hybrid phantoms can be also used for 4-D Monte Carlo simulations. In this preliminary study, the external shape of a voxel phantom (i.e., skin), HDRK-Man, was converted to a hybrid computational

  17. Image based Monte Carlo modeling for computational phantom

    International Nuclear Information System (INIS)

    Cheng, M.; Wang, W.; Zhao, K.; Fan, Y.; Long, P.; Wu, Y.

    2013-01-01

    Full text of the publication follows. The evaluation of the effects of ionizing radiation and the risk of radiation exposure to the human body has become one of the most important issues in the radiation protection and radiotherapy fields; such evaluation helps avoid unnecessary radiation and decrease harm to the human body. In order to accurately evaluate the dose to the human body, it is necessary to construct a more realistic computational phantom. However, manual description and verification of models for Monte Carlo (MC) simulation are very tedious, error-prone and time-consuming. In addition, it is difficult to locate and fix geometry errors, and difficult to describe material information and assign it to cells. MCAM (CAD/Image-based Automatic Modeling Program for Neutronics and Radiation Transport Simulation) was developed as an interface program to achieve both CAD- and image-based automatic modeling. The advanced version (Version 6) of MCAM can automatically convert CT/segmented sectioned images into computational phantoms such as MCNP models. The image-based automatic modeling program (MCAM 6.0) has been tested with several medical images and sectioned images, and it has been applied in the construction of Rad-HUMAN. Following manual segmentation and 3D reconstruction, a whole-body computational phantom of a Chinese adult female, called Rad-HUMAN, was created using MCAM 6.0 from sectioned images of a Chinese visible human dataset. Rad-HUMAN contains 46 organs/tissues, faithfully representing the average anatomical characteristics of the Chinese female. The dose conversion coefficients (Dt/Ka) from kerma free-in-air to absorbed dose of Rad-HUMAN were calculated. Rad-HUMAN can be applied to predict and evaluate dose distributions in the Treatment Plan System (TPS), as well as radiation exposure of the human body in radiation protection. (authors)

  18. Standardized Computer-based Organized Reporting of EEG: SCORE

    Science.gov (United States)

    Beniczky, Sándor; Aurlien, Harald; Brøgger, Jan C; Fuglsang-Frederiksen, Anders; Martins-da-Silva, António; Trinka, Eugen; Visser, Gerhard; Rubboli, Guido; Hjalgrim, Helle; Stefan, Hermann; Rosén, Ingmar; Zarubova, Jana; Dobesberger, Judith; Alving, Jørgen; Andersen, Kjeld V; Fabricius, Martin; Atkins, Mary D; Neufeld, Miri; Plouin, Perrine; Marusic, Petr; Pressler, Ronit; Mameniskiene, Ruta; Hopfengärtner, Rüdiger; Emde Boas, Walter; Wolf, Peter

    2013-01-01

    The electroencephalography (EEG) signal has a high complexity, and the process of extracting clinically relevant features is achieved by visual analysis of the recordings. The interobserver agreement in EEG interpretation is only moderate. This is partly due to the method of reporting the findings in free-text format. The purpose of our endeavor was to create a computer-based system for EEG assessment and reporting, where the physicians would construct the reports by choosing from predefined elements for each relevant EEG feature, as well as the clinical phenomena (for video-EEG recordings). A working group of EEG experts took part in consensus workshops in Dianalund, Denmark, in 2010 and 2011. The faculty was approved by the Commission on European Affairs of the International League Against Epilepsy (ILAE). The working group produced a consensus proposal that went through a pan-European review process, organized by the European Chapter of the International Federation of Clinical Neurophysiology. The Standardised Computer-based Organised Reporting of EEG (SCORE) software was constructed based on the terms and features of the consensus statement and it was tested in the clinical practice. The main elements of SCORE are the following: personal data of the patient, referral data, recording conditions, modulators, background activity, drowsiness and sleep, interictal findings, “episodes” (clinical or subclinical events), physiologic patterns, patterns of uncertain significance, artifacts, polygraphic channels, and diagnostic significance. The following specific aspects of the neonatal EEGs are scored: alertness, temporal organization, and spatial organization. For each EEG finding, relevant features are scored using predefined terms. Definitions are provided for all EEG terms and features. SCORE can potentially improve the quality of EEG assessment and reporting; it will help incorporate the results of computer-assisted analysis into the report, it will make

  19. Computer-Based Tools for Evaluating Graphical User Interfaces

    Science.gov (United States)

    Moore, Loretta A.

    1997-01-01

    The user interface is the component of a software system that connects two very complex systems: humans and computers. Each of these systems imposes certain requirements on the final product. The user is the judge of the usability and utility of the system; the computer software and hardware are the tools with which the interface is constructed. Mistakes are sometimes made in designing and developing user interfaces because designers and developers have limited knowledge of human performance (e.g., problem solving, decision making, planning, and reasoning). Even those trained in user interface design make mistakes, because they are unable to address all of the known requirements and constraints on design. Evaluation of the user interface is therefore a critical phase of the user interface development process. Evaluation should not be considered the final phase of design; it should be part of an iterative design cycle, with the output of evaluation fed back into design. The goal of this research was to develop a set of computer-based tools for objectively evaluating graphical user interfaces. The research was organized into three phases. The first phase resulted in the development of an embedded evaluation tool that evaluates the usability of a graphical user interface based on a user's performance. An expert system to assist in the design and evaluation of user interfaces, based upon rules and guidelines, was developed during the second phase. During the final phase of the research, an automatic layout tool to be used in the initial design of graphical interfaces was developed. The research was coordinated with NASA Marshall Space Flight Center's Mission Operations Laboratory's efforts in developing onboard payload display specifications for the Space Station.

  20. Computer-based systems important to safety (COMPSIS) - Reporting guidelines

    International Nuclear Information System (INIS)

    1999-07-01

    The objective of this procedure is to help the user prepare a COMPSIS report on an event so that important lessons learned are most efficiently transferred to the database. This procedure focuses on the content of the information to be provided in the report rather than on its format. The established procedure follows to a large extent the procedure chosen by the IRS incident reporting system. However, this database is built for I and C equipment, the purpose of the event report database being to collect and disseminate information on events of significance involving computer-based systems important to safety in nuclear power plants, and to feed back conclusions and lessons learned from such events. For events where human performance is the dominant factor in drawing lessons, more detailed guidance on the specific information that should be supplied is spelled out in the present procedure. This guidance differs somewhat from that for the provision of technical information, and takes into account that the engineering world is usually less familiar with human behavioural analysis than with technical analysis. The events to be reported to the COMPSIS database should be based on the national reporting criteria in the participating member countries. The aim is that all reports involving computer-based systems that meet each country's reporting criteria should be reported. The database should give a broad picture of events/incidents occurring in operation with computer control systems. As soon as an event has been identified, the insights and lessons learned to be conveyed to the international nuclear community shall be clearly identified. On the basis of the description of the event, the event shall be analyzed in detail with respect to its direct and potential impact on plant safety functions. The first part should show the common involvement of operation and safety systems, and the second part should show the special aspects of I and C functions, hardware and software

  1. Computerbasiert prüfen [Computer-based Assessment]

    Directory of Open Access Journals (Sweden)

    Frey, Peter

    2006-08-01

    Full Text Available [english] Computer-based testing in medical education offers new perspectives. Advantages include sequential or adaptive testing, the integration of video or sound, rapid feedback to candidates, and the management of web-based question banks. Computer-based testing can also be implemented in an OSCE examination. In e-learning environments, formative self-assessments are often implemented and give helpful feedback to learners. Disadvantages in high-stakes exams are the demanding requirements, both for the quality of testing (e.g., standard setting) and for the information technology, especially security. [german] Computer-based examinations in medical studies open up new possibilities. The advantages of such examinations lie in sequential or adaptive testing, the integration of video or sound, rapid scoring, and central management of examination questions via the Internet. One area of application with reasonable effort is examinations with several stations, such as the OSCE. Computer-based formative self-tests are frequently offered in e-learning; they help learners better assess their level of knowledge and compare their performance with that of others. Summative examinations face limits with respect to the examination location, since cheating is possible at home. Higher clinical competencies, such as examination technique or communication, are hardly suitable for computer-based examinations.

  2. and consequences

    Directory of Open Access Journals (Sweden)

    P. Athanasopoulou

    2011-01-01

    Full Text Available (a) Purpose: The purpose of this research is to identify the types of CSR initiatives employed by sports organisations, their antecedents, and their consequences for the company and society. (b) Design/methodology/approach: This study is exploratory in nature. Two detailed case studies were conducted involving the football team and the basketball team of one professional, premier-league club in Greece and their CSR initiatives. Both teams have the same name; they belong to one of the most popular clubs in Greece, with a large fan population; both have competed in international competitions (UEFA's Champions League; the Final Four of the European Tournament); and both have realised many CSR initiatives in the past. The case studies involved in-depth personal interviews with the managers responsible for CSR in each team. Case study data were triangulated with documentation and searches of published material concerning CSR actions. Data were analysed with content analysis. (c) Findings: Both teams investigated have undertaken various CSR activities in the last 5 years, the football team significantly more than the basketball team. Major factors that affect CSR activity include pressure from leagues, sponsors, the local community, and global organisations; an orientation towards fulfilling their duty to society; and team CSR strategy. Major benefits from CSR include relief of vulnerable groups and philanthropy, as well as a better reputation for the firm, an increase in the fan base, and finding sponsors more easily due to the social profile of the team. However, these benefits are not measured in any way, although both teams observe increases in tickets sold, web site traffic, and TV viewing statistics after CSR activities. Finally, CSR is promoted mainly through web sites, press releases, newspapers, and word-of-mouth communications. (d) Research limitations/implications: This study involves only two case studies and has limited generalisability. Future research can extend the

  3. Consequence analysis

    International Nuclear Information System (INIS)

    Woodard, K.

    1985-01-01

    The objectives of this paper are to: provide a realistic assessment of consequences; account for plant- and site-specific characteristics; adjust accident release characteristics to account for the results of plant-containment analysis; produce conditional risk curves for each of five health effects; and estimate uncertainties

  4. Smart learning services based on smart cloud computing.

    Science.gov (United States)

    Kim, Svetlana; Song, Su-Mi; Yoon, Yong-Ik

    2011-01-01

    Context-aware technologies can make e-learning services smarter and more efficient, since context-aware services are based on the user's behavior. To add those technologies into existing e-learning services, a service architecture model is needed to transform the existing e-learning environment, which is situation-aware, into an environment that understands context as well. The context-awareness in e-learning may include awareness of the user profile and terminal context. In this paper, we propose a new notion of service that provides context-awareness to smart learning content in a cloud computing environment. We suggest adding the elastic four smarts (E4S) concept (smart pull, smart prospect, smart content, and smart push) to cloud services so that smart learning services become possible. The E4S focuses on meeting users' needs by collecting and analyzing their behavior, prospecting future services, building corresponding contents, and delivering the contents through the cloud computing environment. Users' behavior can be collected through mobile devices such as smart phones that have built-in sensors. As a result, the proposed smart e-learning model in a cloud computing environment provides personalized and customized learning services to its users.

  6. [Problem list in computer-based patient records].

    Science.gov (United States)

    Ludwig, C A

    1997-01-14

    Computer-based clinical information systems are capable of effectively processing even large amounts of patient-related data. However, physicians depend on rapid access to summarized, clearly laid-out data on the computer screen to inform themselves about a patient's current clinical situation. In introducing a clinical workplace system, we therefore transformed the problem list, which for decades has been used successfully in clinical information management, into an electronic equivalent and integrated it into the medical record. The table contains a concise overview of diagnoses and problems as well as related findings. Graphical information can also be integrated into the table, and additional space is provided for a summary of planned examinations or interventions. The digital form of the problem list makes it possible to use the entire list, or selected text elements, for generating medical documents: diagnostic terms for medical reports are transferred automatically to the corresponding documents. Computer technology has immense potential for the further development of the problem-list concept. With multimedia applications, sound and images can be included in the problem list. Via hyperlinks, the problem list could become a central information board and table of contents for the medical record, serving as the starting point for database searches and supporting the user in navigating through the medical record.

  7. The extended RBAC model based on grid computing

    Institute of Scientific and Technical Information of China (English)

    CHEN Jian-gang; WANG Ru-chuan; WANG Hai-yan

    2006-01-01

    This article proposes an extended role-based access control (RBAC) model for solving dynamic and multidomain problems in grid computing, and provides a formal description of the model. The introduction of context, together with the context-to-role and context-to-permission mapping relations, helps the model adapt to the dynamic properties of the grid environment. A multidomain role-inheritance relation, realized by an authorization agent service, enables multidomain authorization among autonomous domains. A function is proposed for resolving role-inheritance conflicts that arise during the establishment of the multidomain role-inheritance relation.
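
    As a rough illustration of the idea (not the authors' implementation; all names and rules are hypothetical), the context-to-role and context-to-permission mappings can be modelled as lookup rules that gate role activation and permission checks:

```python
# Hypothetical sketch of context-aware RBAC in the spirit of the extended
# model described above; rule structure and names are illustrative only.
from dataclasses import dataclass

@dataclass(frozen=True)
class Context:
    domain: str       # autonomy domain the request originates from
    time_of_day: str  # e.g. "office_hours" or "off_hours"

# Context-to-role: a user may activate a role only if its rule accepts the context.
ROLE_RULES = {
    "grid_operator": lambda ctx: ctx.time_of_day == "office_hours",
    "guest":         lambda ctx: True,
}

# Context-to-permission: a role's permissions also depend on the context.
PERMISSION_RULES = {
    ("grid_operator", "submit_job"): lambda ctx: ctx.domain in {"domainA", "domainB"},
    ("guest", "read_status"):        lambda ctx: True,
}

def active_roles(user_roles, ctx):
    return {r for r in user_roles if ROLE_RULES.get(r, lambda c: False)(ctx)}

def permitted(user_roles, permission, ctx):
    return any(PERMISSION_RULES.get((r, permission), lambda c: False)(ctx)
               for r in active_roles(user_roles, ctx))

ctx = Context(domain="domainA", time_of_day="office_hours")
print(permitted({"grid_operator"}, "submit_job", ctx))  # True
```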

  8. Computer-Assisted Search Of Large Textual Data Bases

    Science.gov (United States)

    Driscoll, James R.

    1995-01-01

    "QA" denotes high-speed computer system for searching diverse collections of documents including (but not limited to) technical reference manuals, legal documents, medical documents, news releases, and patents. Incorporates previously available and emerging information-retrieval technology to help user intelligently and rapidly locate information found in large textual data bases. Technology includes provision for inquiries in natural language; statistical ranking of retrieved information; artificial-intelligence implementation of semantics, in which "surface level" knowledge found in text used to improve ranking of retrieved information; and relevance feedback, in which user's judgements of relevance of some retrieved documents used automatically to modify search for further information.

  9. Computer-based information management system for interventional radiology

    International Nuclear Information System (INIS)

    Forman, B.H.; Silverman, S.G.; Mueller, P.R.; Hahn, P.F.; Papanicolaou, N.; Tung, G.A.; Brink, J.A.; Ferrucci, J.T.

    1989-01-01

    The authors designed and implemented a computer-based information management system (CBIMS) for the integrated analysis of data from a variety of abdominal nonvascular interventional procedures. The CBIMS improved on their initial handwritten-card system (which listed only patient name, hospital number, and type of procedure) by capturing relevant patient data in an organized fashion and integrating the information for meaningful analysis. Advantages of CBIMS include enhanced compilation of the monthly census, easy access to a patient's interventional history, and a flexible querying capability that allows easy extraction of subsets of information from the patient database.

  10. Cost-effectiveness analysis of computer-based assessment

    Directory of Open Access Journals (Sweden)

    Pauline Loewenberger

    2003-12-01

    Full Text Available The need for more cost-effective and pedagogically acceptable combinations of teaching and learning methods to sustain increasing student numbers means that the use of innovative methods, using technology, is accelerating. There is an expectation that economies of scale might provide greater cost-effectiveness whilst also enhancing student learning. The difficulties and complexities of these expectations are considered in this paper, which explores the challenges faced by those wishing to evaluate the cost-effectiveness of computer-based assessment (CBA). The paper outlines the outcomes of a survey which attempted to gather information about the costs and benefits of CBA.

  11. INFORMATION DISPLAY: CONSIDERATIONS FOR DESIGNING COMPUTER-BASED DISPLAY SYSTEMS

    International Nuclear Information System (INIS)

    O'HARA, J.M.; PIRUS, D.; BELTRATCCHI, L.

    2004-01-01

    This paper discusses the presentation of information in computer-based control rooms. Issues associated with the typical displays currently in use are discussed. It is concluded that these displays should be augmented with new displays designed to better meet the information needs of plant personnel and to minimize the need for interface management tasks (the activities personnel have to do to access and organize the information they need). Several approaches to information design are discussed, specifically addressing: (1) monitoring, detection, and situation assessment; (2) routine task performance; and (3) teamwork, crew coordination, and collaborative work.

  12. Information Security Scheme Based on Computational Temporal Ghost Imaging.

    Science.gov (United States)

    Jiang, Shan; Wang, Yurong; Long, Tao; Meng, Xiangfeng; Yang, Xiulun; Shu, Rong; Sun, Baoqing

    2017-08-09

    An information security scheme based on computational temporal ghost imaging is proposed. A sequence of independent 2D random binary patterns is used as the encryption key and multiplied with the 1D data stream; the ciphertext is obtained by summing the weighted encryption key patterns. Decryption is realized by a correlation measurement between the encrypted information and the encryption key. Due to the inherent high-level randomness of the key, the security of this method is strongly guaranteed. The feasibility of this method and its robustness against both occlusion and additive-noise attacks are discussed with simulations.
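
    A minimal numerical sketch of this encrypt/decrypt cycle (our illustration, not the authors' code): each sample of a 1D signal weights one random binary pattern, the weighted patterns are summed into a single ciphertext image, and each sample is recovered by a pixel-wise covariance between the ciphertext and the corresponding key pattern.

```python
# Toy computational temporal ghost imaging encryption/decryption.
import numpy as np

rng = np.random.default_rng(0)
signal = np.array([3.0, 1.0, 4.0, 1.0, 5.0])           # 1D data stream
keys = rng.integers(0, 2, size=(signal.size, 64, 64))   # independent binary patterns

# Encryption: ciphertext = sum of the key patterns weighted by the data samples.
cipher = np.tensordot(signal, keys, axes=1)

# Decryption: correlation (covariance) between ciphertext and each key pattern.
recovered = np.array([(cipher * k).mean() - cipher.mean() * k.mean()
                      for k in keys])
# For iid 0/1 patterns the pixel variance is 0.25, so rescale:
print(recovered / 0.25)   # approximately [3, 1, 4, 1, 5]
```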

  13. 3-D computer graphics based on integral photography.

    Science.gov (United States)

    Naemura, T; Yoshida, T; Harashima, H

    2001-02-12

    Integral photography (IP), which is one of the ideal 3-D photographic technologies, can be regarded as a method of capturing and displaying the light rays passing through a plane. The NHK Science and Technical Research Laboratories have developed a real-time IP system using an HDTV camera and an optical fiber array. In this paper, the authors propose a method of synthesizing arbitrary views from IP images captured by the HDTV camera. This is a kind of image-based rendering system, founded on a 4-D data-space representation of light rays. Experimental results show the potential to improve the quality of images rendered by computer graphics techniques.

  14. Choices and Consequences.

    Science.gov (United States)

    Thorp, Carmany

    1995-01-01

    Describes student use of HyperStudio computer software to create history adventure games. History came alive while students learned efficient writing skills; learned to understand and manipulate cause, effect, choice, and consequence; and learned to incorporate succinct locational, climatic, and historical detail. (ET)

  15. Sensitive Data Protection Based on Intrusion Tolerance in Cloud Computing

    OpenAIRE

    Jingyu Wang; xuefeng Zheng; Dengliang Luo

    2011-01-01

    Service integration and on-demand supply arising from cloud computing can significantly improve the utilization of computing resources, reduce the power consumption per service, and effectively avoid errors in computing resources. However, cloud computing still faces problems of intrusion tolerance for the cloud computing platform and for the sensitive data of the new enterprise data center. In order to address the problem of intrusion tolerance of cloud computing platform and sensitive data in...

  16. A computer-based measure of resultant achievement motivation.

    Science.gov (United States)

    Blankenship, V

    1987-08-01

    Three experiments were conducted to develop a computer-based measure of individual differences in resultant achievement motivation (RAM) on the basis of level-of-aspiration, achievement motivation, and dynamics-of-action theories. In Experiment 1, the number of atypical shifts and greater responsiveness to incentives on 21 trials with choices among easy, intermediate, and difficult levels of an achievement-oriented game were positively correlated and were found to differentiate the 62 subjects (31 men, 31 women) on the amount of time they spent at a nonachievement task (watching a color design) 1 week later. In Experiment 2, test-retest reliability was established with the use of 67 subjects (15 men, 52 women). Point and no-point trials were offered in blocks, with point trials first for half the subjects and no-point trials first for the other half. Reliability was higher for the atypical-shift measure than for the incentive-responsiveness measure and was higher when points were offered first. In Experiment 3, computer anxiety was manipulated by creating a simulated computer breakdown in the experimental condition. Fifty-nine subjects (13 men, 46 women) were randomly assigned to the experimental condition or to one of two control conditions (an interruption condition and a no-interruption condition). Subjects with low RAM, as demonstrated by a low number of typical shifts, took longer to choose the achievement-oriented task, as predicted by the dynamics-of-action theory. The difference was evident in all conditions and most striking in the computer-breakdown condition. A change of focus from atypical to typical shifts is discussed.

  17. ARAC: a computer-based emergency dose-assessment service

    International Nuclear Information System (INIS)

    Sullivan, T.J.

    1990-01-01

    Over the past 15 years, the Lawrence Livermore National Laboratory's Atmospheric Release Advisory Capability (ARAC) has developed and evolved a computer-based, real-time, radiological-dose-assessment service for the United States Departments of Energy and Defense. This service is built on the integrated components of real-time computer-acquired meteorological data, extensive computer databases, numerical atmospheric-dispersion models, graphical displays, and the expertise of an operational assessment staff. The focus of ARAC is the off-site problem, where regional meteorology and topography are dominant influences on transport and dispersion. Through application to numerous radiological accidents and releases, on scales from small accidental ventings to the Chernobyl reactor disaster, ARAC has developed methods to provide emergency dose assessments from the local to the hemispheric scale. As the power of computers has grown while their cost and size have fallen, ARAC has expanded its service and reduced the response time from hours to minutes for an accident within the United States. Concurrently, the quality of the assessments has improved as more advanced models have been developed and incorporated into the ARAC system. Over the past six years, the number of directly connected facilities has increased from 6 to 73. All major U.S. Federal agencies now have access to ARAC via the Department of Energy, which assures a level of consistency as well as experience. ARAC maintains its real-time skills by participating in approximately 150 exercises per year; it also continuously validates its modeling systems by application to all available tracer experiments and data sets.

  18. Risk-based consequences of extreme natural hazard processes in mountain regions - Multi-hazard analysis in Tyrol (Austria)

    Science.gov (United States)

    Huttenlau, Matthias; Stötter, Johann

    2010-05-01

    Reinsurance companies report a strong increase in natural-hazard-related losses, both insured and economic, over the last decades on a global scale. This ongoing trend can be described as a product of the dynamics in both the natural sphere and the anthroposphere. To analyze the potential impact of natural hazard processes on a certain insurance portfolio, or on society in general, reinsurance companies and risk-management consultants have developed loss models. However, those models generally do not meet the scale-dependent demands of regional analyses, as appropriate (i) for analyses at the scale of a specific province or (ii) for portfolio analyses of regional insurance companies. Moreover, the scientific basis of most of the models is not transparently documented, so scientific evaluation of their methodological concepts is not possible (black box). This is contrary to the scientific principles of transparency and traceability. Analyses must account adequately for the circumstances inherent to mountain regions like the European Alps: (i) their specific small-scale characteristics, (ii) their generally high process dynamics, (iii) the occurrence of gravitative mass movements, which are related to high relief energy and thus exist only in mountain regions, (iv) the small proportion of the area of permanent settlement within the overall area, (v) the high concentration of values on the valley floors, (vi) the exposure of important infrastructures and lifelines, and others. Risk-based analyses therefore estimate the potential consequences of hazard processes on the built environment in a standardized way using the risk components (i) hazard, (ii) elements at risk, and (iii) vulnerability. However, most research and progress have been made in the field of hazard analysis, whereas the other two components are not correspondingly developed. Since these three general components are influencing factors without any

  19. COMPUTING

    CERN Multimedia

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  20. Cyst-based measurements for assessing lymphangioleiomyomatosis in computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Lo, P., E-mail: pechinlo@mednet.edu.ucla; Brown, M. S.; Kim, H.; Kim, H.; Goldin, J. G. [Center for Computer Vision and Imaging Biomarkers, Department of Radiological Sciences, David Geffen School of Medicine, University of California, Los Angeles, California 90024 (United States); Argula, R.; Strange, C. [Division of Pulmonary and Critical Care Medicine, Medical University of South Carolina, Charleston, South Carolina 29425 (United States)

    2015-05-15

    Purpose: To investigate the efficacy of a new family of measurements made on individual pulmonary cysts extracted from computed tomography (CT) for assessing the severity of lymphangioleiomyomatosis (LAM). Methods: CT images were analyzed using thresholding to identify a cystic region of interest from chest CT of LAM patients. Individual cysts were then extracted from the cystic region by the watershed algorithm, which separates individual cysts based on subtle edges within the cystic regions. A family of measurements were then computed, which quantify the amount, distribution, and boundary appearance of the cysts. Sequential floating feature selection was used to select a small subset of features for quantification of the severity of LAM. Adjusted R² from multiple linear regression and R² from linear regression against measurements from spirometry were used to compare the performance of our proposed measurements with currently used density-based CT measurements in the literature, namely, the relative area measure and the D measure. Results: Volumetric CT data, performed at total lung capacity and residual volume, from a total of 49 subjects enrolled in the MILES trial were used in our study. Our proposed measures had adjusted R² ranging from 0.42 to 0.59 when regressing against the spirometry measures, with p < 0.05. For previously used density-based CT measurements in the literature, the best R² was 0.46 (for only one instance), with the majority being lower than 0.3 or having p > 0.05. Conclusions: The proposed family of CT-based cyst measurements have better correlation with spirometric measures than previously used density-based CT measurements. They show potential as a sensitive tool for quantitatively assessing the severity of LAM.
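
    A minimal sketch of the threshold-plus-watershed pipeline described in the Methods (our illustration using SciPy and scikit-image; the threshold and parameters are placeholders, not the study's values):

```python
# Illustrative cyst extraction: threshold a CT slice, then split the cystic
# region into individual cysts with a distance-transform watershed.
import numpy as np
from scipy import ndimage as ndi
from skimage.feature import peak_local_max
from skimage.segmentation import watershed

def extract_cysts(ct_slice_hu, threshold_hu=-950):
    mask = ct_slice_hu < threshold_hu            # low-attenuation cystic ROI
    distance = ndi.distance_transform_edt(mask)  # distance to the ROI boundary
    # Seeds at local maxima of the distance map (approximate cyst centers).
    peaks = peak_local_max(distance, labels=mask, min_distance=3)
    markers = np.zeros_like(distance, dtype=int)
    markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)
    # Watershed separates touching cysts along subtle internal edges.
    return watershed(-distance, markers, mask=mask)  # 0 = background

labels = extract_cysts(np.random.default_rng(1).normal(-800, 150, (128, 128)))
cyst_sizes = np.bincount(labels.ravel())[1:]     # per-cyst pixel counts
```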

  1. The interaction between felt touch and tactile consequences of observed actions: an action-based somatosensory congruency paradigm.

    Science.gov (United States)

    Deschrijver, Eliane; Wiersema, Jan R; Brass, Marcel

    2016-07-01

    Action observation leads to a representation of both the motor aspect of an observed action (motor simulation) and its somatosensory consequences (action-based somatosensory simulation) in the observer's brain. In the current electroencephalography study, we investigated the neuronal interplay of action-based somatosensory simulation and felt touch. We presented index- or middle-finger tapping movements of a human or a wooden hand, while simultaneously presenting 'tap-like' tactile sensations to either the corresponding or the non-corresponding fingertip of the participant. We focused on an early stage of somatosensory processing [P50, N100 and N140 sensory evoked potentials (SEPs)] and on a later stage of higher-order processing (P3-complex). The results revealed an interaction effect of animacy and congruency in the early P50 SEP and an animacy effect in the N100/N140 SEPs. In the P3-complex, we found an interaction effect indicating that the influence of congruency was larger for the human than for the wooden hand. We argue that the P3-complex may reflect higher-order self-other distinction by signaling simulated action-based touch that does not match one's own tactile information. As such, the action-based somatosensory congruency paradigm might help in understanding higher-order social processes from a somatosensory point of view.

  2. Actual directions in study of ecological consequences of a highly toxic 1,1-dimethylhydrazine-based rocket fuel spills

    Directory of Open Access Journals (Sweden)

    Bulat Kenessov

    2012-05-01

    Full Text Available This paper reviews the current directions in the study of the ecological consequences of spills of highly toxic 1,1-dimethylhydrazine-based rocket fuel. Recent results on the transformation processes of 1,1-dimethylhydrazine, the identification of its main metabolites, and the development of analytical methods for their determination are summarized. Modern analytical methods for the determination of 1,1-dimethylhydrazine and its transformation products in environmental samples are characterized. It is shown that in recent years, through the use of the most modern methods of physico-chemical analysis and sample preparation, work in this direction has made significant progress and contributed to the development of studies in adjacent areas. The distribution of transformation products in the soils at the fall sites of the first stages of carrier rockets is described, and the available methods for their remediation are characterized.

  3. Evaluating Emulation-based Models of Distributed Computing Systems

    Energy Technology Data Exchange (ETDEWEB)

    Jones, Stephen T. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Cyber Initiatives; Gabert, Kasimir G. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Cyber Initiatives; Tarman, Thomas D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Emulytics Initiatives

    2017-08-01

    Emulation-based models of distributed computing systems are collections of virtual machines, virtual networks, and other emulation components configured to stand in for operational systems when performing experimental science, training, analysis of design alternatives, test and evaluation, or idea generation. As with any tool, we should carefully evaluate whether our uses of emulation-based models are appropriate and justified. Otherwise, we run the risk of using a model incorrectly and creating meaningless results. The variety of uses of emulation-based models each have their own goals and deserve thoughtful evaluation. In this paper, we enumerate some of these uses and describe approaches that one can take to build an evidence-based case that a use of an emulation-based model is credible. Predictive uses of emulation-based models, where we expect a model to tell us something true about the real world, set the bar especially high, and the principal evaluation method, called validation, is commensurately rigorous. We spend the majority of our time describing and demonstrating the validation of a simple predictive model using a well-established methodology inherited from decades of development in the computational science and engineering community.

  4. Paying for hospital-based care of Kala-azar in Nepal: assessing catastrophic, impoverishment and economic consequences.

    Science.gov (United States)

    Adhikari, Shiva R; Maskay, Nephil M; Sharma, Bishnu P

    2009-03-01

    Households obtaining health care services in developing countries incur substantial costs, despite services generally being provided free of charge by public health institutions. This constitutes an economic burden on low-income households, and contributes to deepening their level of poverty. In addition to the economic burden of obtaining health care, the method of financing these payments has implications for the distribution of household assets. This effect on resource-poor households is amplified since they have decreased access to health insurance. Recent literature, however, ignores the importance of the method of financing health care payments. This paper looks at the case of Nepal and highlights the impact on households of paying for hospital-based care of Kala-azar (KA) by analysing the catastrophic, impoverishment and economic consequences of their coping strategies. The paper utilizes micro-data on a random selection of 50% of the KA-affected households of Siraha and Saptari districts of Nepal. The empirical results suggest that direct costs of hospital-based treatment of KA are catastrophic since they consume 17% of annual household income. This expenditure causes more than 20% of KA-affected households to fall below the poverty line, with the remaining households being pushed into the category of marginal poor; the poverty gap ratio is more than 90%. Further, KA incidence can have prolonged and severe economic consequences for the household economy due to the mechanisms of informal sector financing to which households resort. A heavy burden of loan repayments can lead households on a downward spiral that eventually becomes a poverty trap. In other words, the method of financing health care payments is an important ingredient in understanding the economic burden of disease.
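
    The catastrophic-payment and impoverishment measures used in this literature can be computed along the following lines (an illustrative sketch with standard definitions and made-up numbers, not the study's data):

```python
# Household-level measures of catastrophic and impoverishing health payments.
def catastrophic(income, oop, threshold=0.10):
    """Out-of-pocket (OOP) payments exceed a set share of annual income."""
    return oop / income > threshold

def impoverished(income, oop, poverty_line):
    """Household is above the poverty line before payment, below it after."""
    return income >= poverty_line and (income - oop) < poverty_line

def poverty_gap_ratio(income, oop, poverty_line):
    """Post-payment shortfall relative to the poverty line (0 if not poor)."""
    shortfall = max(0.0, poverty_line - (income - oop))
    return shortfall / poverty_line

# Hypothetical household whose treatment costs consume 17% of annual income,
# the share reported in the study:
income, oop, line = 100_000, 17_000, 90_000
print(catastrophic(income, oop, threshold=0.10),
      impoverished(income, oop, line),
      round(poverty_gap_ratio(income, oop, line), 2))
```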

  5. Many-core computing for space-based stereoscopic imaging

    Science.gov (United States)

    McCall, Paul; Torres, Gildo; LeGrand, Keith; Adjouadi, Malek; Liu, Chen; Darling, Jacob; Pernicka, Henry

    The potential benefits of using parallel computing in real-time visual-based satellite proximity operations missions are investigated. Improvements in performance and relative navigation solutions over single thread systems can be achieved through multi- and many-core computing. Stochastic relative orbit determination methods benefit from the higher measurement frequencies, allowing them to more accurately determine the associated statistical properties of the relative orbital elements. More accurate orbit determination can lead to reduced fuel consumption and extended mission capabilities and duration. Inherent to the process of stereoscopic image processing is the difficulty of loading, managing, parsing, and evaluating large amounts of data efficiently, which may result in delays or highly time consuming processes for single (or few) processor systems or platforms. In this research we utilize the Single-Chip Cloud Computer (SCC), a fully programmable 48-core experimental processor, created by Intel Labs as a platform for many-core software research, provided with a high-speed on-chip network for sharing information along with advanced power management technologies and support for message-passing. The results from utilizing the SCC platform for the stereoscopic image processing application are presented in the form of Performance, Power, Energy, and Energy-Delay-Product (EDP) metrics. Also, a comparison between the SCC results and those obtained from executing the same application on a commercial PC are presented, showing the potential benefits of utilizing the SCC in particular, and any many-core platforms in general for real-time processing of visual-based satellite proximity operations missions.

  6. Computer-based systems for nuclear power stations

    International Nuclear Information System (INIS)

    Humble, P.J.; Welbourne, D.; Belcher, G.

    1995-01-01

    The published intentions of vendors are for extensive touch-screen control and computer-based protection. The software features needed for acceptance in the UK are indicated. The defence in depth needed is analyzed. Current practice in aircraft flight control systems and the software methods available are discussed. Software partitioning and mathematically formal methods are appropriate for the structures and simple logic needed for nuclear power applications. The potential for claims of diversity and independence between two computer-based subsystems of a protection system is discussed. Features needed to meet a single failure criterion applied to software are discussed. Conclusions are given on the main factors which a design should allow for. The work reported was done for the Health and Safety Executive of the UK (HSE), and acknowledgement is given to them, to NNC Ltd and to GEC-Marconi Avionics Ltd for permission to publish. The opinions and recommendations expressed are those of the authors and do not necessarily reflect those of HSE. (Author)

  7. A cloud computing based 12-lead ECG telemedicine service

    Science.gov (United States)

    2012-01-01

    Background Due to the great variability of 12-lead ECG instruments and medical specialists’ interpretation skills, it remains a challenge to deliver rapid and accurate 12-lead ECG reports with senior cardiologists’ decision making support in emergency telecardiology. Methods We create a new cloud and pervasive computing based 12-lead Electrocardiography (ECG) service to realize ubiquitous 12-lead ECG tele-diagnosis. Results This developed service enables ECG to be transmitted and interpreted via mobile phones. That is, tele-consultation can take place while the patient is on the ambulance, between the onsite clinicians and the off-site senior cardiologists, or among hospitals. Most importantly, this developed service is convenient, efficient, and inexpensive. Conclusions This cloud computing based ECG tele-consultation service expands the traditional 12-lead ECG applications onto the collaboration of clinicians at different locations or among hospitals. In short, this service can greatly improve medical service quality and efficiency, especially for patients in rural areas. This service has been evaluated and proved to be useful by cardiologists in Taiwan. PMID:22838382

  8. Computer Vision Based Measurement of Wildfire Smoke Dynamics

    Directory of Open Access Journals (Sweden)

    BUGARIC, M.

    2015-02-01

    Full Text Available This article presents a novel method for the measurement of wildfire smoke dynamics based on computer vision and augmented-reality techniques. Smoke dynamics is an important feature in video smoke detection that can distinguish smoke from visually similar phenomena. However, most existing smoke detection systems are not capable of measuring the real-world size of the detected smoke regions. Using computer vision and GIS-based augmented reality, we measure the real dimensions of smoke plumes and observe the change in size over time. The measurements are performed on offline video data with known camera parameters and location. The observed data are analyzed in order to create a classifier that can eliminate certain categories of false alarms induced by phenomena with dynamics different from smoke. We carried out an offline evaluation in which we measured the improvement in the detection process achieved using the proposed smoke-dynamics characteristics. The results show a significant increase in algorithm performance, especially in terms of reducing the false-alarm rate. It follows that the proposed method for measuring smoke dynamics could be used to improve existing smoke detection algorithms, or taken into account when designing new ones.
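
    With known camera parameters and the camera-to-plume distance (here supplied by the GIS model), converting a pixel extent into a real-world size reduces to pinhole-camera geometry; a simplified sketch (our illustration, assuming a roughly fronto-parallel plume at known range):

```python
# Approximate real-world size of an image region under the pinhole model:
# world_size = (pixel_extent * pixel_pitch) * range / focal_length.
def region_world_size_m(pixel_extent, range_m, focal_mm, pixel_pitch_um):
    focal_m = focal_mm * 1e-3
    pitch_m = pixel_pitch_um * 1e-6
    return pixel_extent * pitch_m * range_m / focal_m

# A detected smoke region 120 px wide, 3 km away, 25 mm lens, 5 um pixels:
width_m = region_world_size_m(120, 3000.0, 25.0, 5.0)
print(round(width_m, 1), "m")  # ~72 m; the growth of this value over time is
                               # the kind of dynamics feature described above
```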

  9. A cloud computing based 12-lead ECG telemedicine service.

    Science.gov (United States)

    Hsieh, Jui-Chien; Hsu, Meng-Wei

    2012-07-28

    Due to the great variability of 12-lead ECG instruments and medical specialists' interpretation skills, it remains a challenge to deliver rapid and accurate 12-lead ECG reports with senior cardiologists' decision making support in emergency telecardiology. We create a new cloud and pervasive computing based 12-lead Electrocardiography (ECG) service to realize ubiquitous 12-lead ECG tele-diagnosis. This developed service enables ECG to be transmitted and interpreted via mobile phones. That is, tele-consultation can take place while the patient is on the ambulance, between the onsite clinicians and the off-site senior cardiologists, or among hospitals. Most importantly, this developed service is convenient, efficient, and inexpensive. This cloud computing based ECG tele-consultation service expands the traditional 12-lead ECG applications onto the collaboration of clinicians at different locations or among hospitals. In short, this service can greatly improve medical service quality and efficiency, especially for patients in rural areas. This service has been evaluated and proved to be useful by cardiologists in Taiwan.

  10. A cloud computing based 12-lead ECG telemedicine service

    Directory of Open Access Journals (Sweden)

    Hsieh Jui-chien

    2012-07-01

    Full Text Available Abstract Background Due to the great variability of 12-lead ECG instruments and medical specialists' interpretation skills, it remains a challenge to deliver rapid and accurate 12-lead ECG reports with senior cardiologists' decision making support in emergency telecardiology. Methods We create a new cloud and pervasive computing based 12-lead Electrocardiography (ECG) service to realize ubiquitous 12-lead ECG tele-diagnosis. Results This developed service enables ECG to be transmitted and interpreted via mobile phones. That is, tele-consultation can take place while the patient is on the ambulance, between the onsite clinicians and the off-site senior cardiologists, or among hospitals. Most importantly, this developed service is convenient, efficient, and inexpensive. Conclusions This cloud computing based ECG tele-consultation service expands the traditional 12-lead ECG applications onto the collaboration of clinicians at different locations or among hospitals. In short, this service can greatly improve medical service quality and efficiency, especially for patients in rural areas. This service has been evaluated and proved to be useful by cardiologists in Taiwan.

  11. Applying computer-based procedures in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira, Mauro V. de; Carvalho, Paulo V.R. de; Santos, Isaac J.A.L. dos; Grecco, Claudio H.S. [Instituto de Engenharia Nuclear (IEN/CNEN-RJ), Rio de Janeiro, RJ (Brazil). Div. de Instrumentacao e Confiabilidade Humana], e-mail: mvitor@ien.gov.br, e-mail: paulov@ien.gov.br, e-mail: luquetti@ien.gov.br, e-mail: grecco@ien.gov.br; Bruno, Diego S. [Universidade Federal do Rio de Janeiro (UFRJ), RJ (Brazil). Escola Politecnica. Curso de Engenharia de Controle e Automacao], e-mail: diegosalomonebruno@gmail.com

    2009-07-01

    Plant operation procedures are used to guide operators in coping with normal, abnormal or emergency situations in a process control system. Historically, plant procedures have been paper-based (PBPs); with the digitalisation trend in these complex systems, computer-based procedures (CBPs) are being developed to support procedure use. This work briefly presents the research on CBPs at the Human-System Interface Laboratory (LABIHS). The emergency operation procedure EOP-0 of the LABIHS NPP simulator was implemented in the ImPRO CBP system. The ImPRO system was chosen for testing because it is available for download on the Internet. A preliminary operation test using the implemented procedure in the CBP system was carried out, and the results were compared to operation using PBPs. (author)

  12. A personal computer based console monitor for a TRIGA reactor

    International Nuclear Information System (INIS)

    Rieke, Phillip E.; Hood, William E.; Razvi, Junaid

    1990-01-01

    Numerous improvements have been made to the Mark F facility to minimize reactor downtime and provide high reactor availability. A program was undertaken to enhance the monitoring capabilities of the instrumentation and control system on this reactor. To that end, a personal computer based console monitoring system has been developed, installed in the control room, and made operational to provide real-time monitoring and display of a variety of reactor operating parameters. This system is based on commercially available hardware and an applications software package developed internally at the GA facility. It has (a) assisted the operator in controlling reactor parameters to maintain the high degree of power stability required during extended runs with thermionic devices in-core, and (b) provided data trending and archiving capabilities on all monitored channels to allow a post-mortem analysis to be performed on any of the monitored parameters.

  13. [Veneer computer aided design based on reverse engineering technology].

    Science.gov (United States)

    Liu, Ming-li; Chen, Xiao-dong; Wang, Yong

    2012-03-01

    To explore a computer aided design (CAD) method for veneer restorations and to assess whether this solution can help prostheses meet esthetic standards of morphology. A volunteer's upper right central incisor needed to be restored with a veneer. Super-hard stone models of the patient's dentition (before and after tooth preparation) were scanned with a three-dimensional laser scanner. The veneer margin was designed as a butt-to-butt type. The veneer was constructed using reverse engineering (RE) software. A technique guideline for veneer CAD was explored based on the RE software, and the resulting veneer was smooth, continuous and symmetrical, which met esthetic construction needs. Reconstructing veneer restorations based on RE technology is a feasible method.

  14. Computer aided fixture design - A case based approach

    Science.gov (United States)

    Tanji, Shekhar; Raiker, Saiesh; Mathew, Arun Tom

    2017-11-01

    Automated fixture design plays an important role in process planning and in the integration of CAD and CAM. An automated fixture setup design system is developed in which, once fixturing surfaces and points are described, modular fixture components are automatically selected to generate fixture units and are placed into position with the assembly conditions satisfied. In the past, various knowledge-based systems have been developed to implement CAFD in practice. In this paper, to obtain an acceptable automated machining fixture design, a case-based reasoning method with a purpose-built retrieval system is proposed. The Visual Basic (VB) programming language is used, integrated with the SolidWorks API (Application Programming Interface) module, to give a better retrieval procedure with reduced computational time. These properties are incorporated in a numerical simulation to determine the best fit for practical use.
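
    Case-based retrieval of fixture designs is, at its core, a nearest-neighbour search over weighted feature similarity; a minimal sketch of that idea (our illustration with hypothetical features and cases, not the paper's VB/SolidWorks system):

```python
# Toy case-based reasoning retrieval: score stored fixture cases against a
# query workpiece by weighted feature similarity and return the best match.
CASES = [
    {"shape": "prismatic", "material": "steel", "n_surfaces": 6,
     "fixture": "vise + side locators"},
    {"shape": "cylindrical", "material": "aluminium", "n_surfaces": 3,
     "fixture": "3-jaw chuck"},
]
WEIGHTS = {"shape": 0.5, "material": 0.2, "n_surfaces": 0.3}

def similarity(query, case):
    s = WEIGHTS["shape"] * (query["shape"] == case["shape"])
    s += WEIGHTS["material"] * (query["material"] == case["material"])
    # Numeric feature: similarity decays with the relative difference.
    diff = abs(query["n_surfaces"] - case["n_surfaces"])
    s += WEIGHTS["n_surfaces"] * max(0.0, 1.0 - diff / 6.0)
    return s

def retrieve(query):
    return max(CASES, key=lambda case: similarity(query, case))

best = retrieve({"shape": "prismatic", "material": "steel", "n_surfaces": 5})
print(best["fixture"])  # the retrieved setup is then adapted to the new part
```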

  15. Knowledge-Based Systems in Biomedicine and Computational Life Science

    CERN Document Server

    Jain, Lakhmi

    2013-01-01

    This book presents a sample of research on knowledge-based systems in biomedicine and computational life science. The contributions include: a personalized stress diagnosis system; an image analysis system for breast cancer diagnosis; analysis of neuronal cell images; structure prediction of proteins; the relationship between two mental disorders; detection of cardiac abnormalities; holistic-medicine-based treatment; and analysis of life-science data.

  16. A wireless computational platform for distributed computing based traffic monitoring involving mixed Eulerian-Lagrangian sensing

    KAUST Repository

    Jiang, Jiming

    2013-06-01

    This paper presents a new wireless platform designed for an integrated traffic monitoring system based on combined Lagrangian (mobile) and Eulerian (fixed) sensing. The sensor platform is built around a 32-bit ARM Cortex M4 micro-controller and a 2.4 GHz 802.15.4 ISM compliant radio module, and can be interfaced with fixed traffic sensors or receive data from vehicle transponders. The platform is specially designed and optimized to be integrated in a solar-powered wireless sensor network in which traffic flow maps are computed by the nodes directly using distributed computing. An MPPT circuit is proposed to increase the power output of the attached solar panel. A self-recovering unit is designed to increase reliability and allow periodic hard resets, an essential requirement for sensor networks. A radio monitoring circuit is proposed to monitor incoming and outgoing transmissions, simplifying software debugging. An ongoing implementation is briefly discussed and compared with existing platforms used in wireless sensor networks.

  17. Description of NORMTRI: a computer program for assessing the off-site consequences from air-borne releases of tritium during normal operation of nuclear facilities

    International Nuclear Information System (INIS)

    Raskob, W.

    1994-10-01

    The computer program NORMTRI has been developed to calculate the behaviour of tritium in the environment after release into the atmosphere during normal operation of nuclear facilities. Two chemical forms can be investigated: tritium gas and tritiated water vapour. The conversion of tritium gas into tritiated water, followed by its re-emission to the atmosphere, as well as the conversion into organically bound tritium, is considered. NORMTRI is based on the statistical Gaussian dispersion model ISOLA, which calculates the near-ground activity concentration in air, and the ground contamination due to dry and wet deposition, at specified locations in a polar grid system. ISOLA requires a four-parameter meteorological statistics derived from one or more years of synoptic recordings of 1-hour averages of wind speed, wind direction, stability class and precipitation intensity. An additional feature of NORMTRI is the choice among several dose-calculation procedures, ranging from the equations of the German regulatory guidelines to a pure specific-equilibrium approach. (orig.)
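
    For orientation, the core of a Gaussian dispersion calculation such as ISOLA's is the standard plume equation; the sketch below shows its ground-level form (a generic textbook illustration with placeholder dispersion parameters, not NORMTRI's actual code):

```python
# Ground-level concentration of a continuous elevated point source:
# C(y) = Q / (pi * u * sy * sz) * exp(-y^2 / (2 sy^2)) * exp(-H^2 / (2 sz^2))
# (the factor 2 from ground reflection cancels the 2 in the denominator at z=0).
import math

def plume_conc(q_bq_s, u_m_s, sigma_y, sigma_z, y_m, stack_h_m):
    lateral = math.exp(-y_m**2 / (2.0 * sigma_y**2))
    vertical = math.exp(-stack_h_m**2 / (2.0 * sigma_z**2))
    return q_bq_s / (math.pi * u_m_s * sigma_y * sigma_z) * lateral * vertical

# 1 GBq/s tritium source, 3 m/s wind, placeholder sigmas for some downwind x:
print(plume_conc(1e9, 3.0, 80.0, 40.0, y_m=0.0, stack_h_m=60.0), "Bq/m^3")
```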

  18. Individual versus Interactive Task-Based Performance through Voice-Based Computer-Mediated Communication

    Science.gov (United States)

    Granena, Gisela

    2016-01-01

    Interaction is a necessary condition for second language (L2) learning (Long, 1980, 1996). Research in computer-mediated communication has shown that interaction opportunities make learners pay attention to form in a variety of ways that promote L2 learning. This research has mostly investigated text-based rather than voice-based interaction. The…

  19. Convincing Conversations : Using a Computer-Based Dialogue System to Promote a Plant-Based Diet

    NARCIS (Netherlands)

    Zaal, Emma; Mills, Gregory; Hagen, Afke; Huisman, Carlijn; Hoeks, Jacobus

    2017-01-01

    In this study, we tested the effectiveness of a computer-based persuasive dialogue system designed to promote a plant-based diet. The production and consumption of meat and dairy has been shown to be a major cause of climate change and a threat to public health, bio-diversity, animal rights and

  20. Region based Brain Computer Interface for a home control application.

    Science.gov (United States)

    Akman Aydin, Eda; Bay, Omer Faruk; Guler, Inan

    2015-08-01

    Environment control is one of the important challenges for disabled people who suffer from neuromuscular diseases. A Brain Computer Interface (BCI) provides a communication channel between the human brain and the environment without requiring any muscular activation. The most important expectations for a home control application are high accuracy and reliable control. The region-based paradigm is a stimulus paradigm based on the oddball principle that requires selection of a target at two levels. This paper presents an application of the region-based paradigm to a smart home control application for people with neuromuscular diseases. In this study, a region-based stimulus interface containing 49 commands was designed. Five non-disabled subjects participated in the experiments. Offline analysis of the experiments yielded 95% accuracy for five flashes. This result shows that the region-based paradigm can be used to select the commands of a smart home control application with high accuracy and a low number of repetitions.
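
    The two-level selection over 49 commands can be pictured as a 7 x 7 layout: the user first selects one of 7 regions, then one of 7 items within it, so each level needs only 7 stimulus classes instead of 49. A trivial sketch of the index arithmetic (our illustration; the actual interface layout is not detailed in the abstract):

```python
# Two-level region-based selection: 49 commands as 7 regions x 7 items.
N_REGIONS = ITEMS_PER_REGION = 7

def to_levels(command_id):
    """Map a flat command index (0..48) to a (region, item) pair."""
    return divmod(command_id, ITEMS_PER_REGION)

def to_command(region, item):
    return region * ITEMS_PER_REGION + item

region, item = to_levels(38)
assert (region, item) == (5, 3) and to_command(region, item) == 38
# Fewer simultaneous oddball targets per level keeps the evoked responses
# discriminable while the overall command set stays large.
```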

  1. Is HIV/AIDS a consequence or divine judgment? Implications for faith-based social services. A Nigerian faith-based university's study.

    Science.gov (United States)

    Olaore, Israel B; Olaore, Augusta Y

    2014-01-01

    A contemporary reading of Romans 1:27 was disguised as a saying by Paul Benjamin, AD 58, and administered to 275 randomly selected members of a private Christian university community in south-western Nigeria in West Africa. Participants were asked to respond to a two-item questionnaire on their perception of the cause of HIV/AIDS, either as a judgment from God or as a consequence of individual lifestyle choices. The apparent consensus drifted in the direction of God as the culprit handing down his judgment to perpetrators of evil who engage in the homosexual lifestyle. The goal of this paper is to examine the implications of a judgmental stance for addressing the psychosocial needs of persons living with HIV/AIDS in religious environments. It also explores how service providers in faith-based environments can work around the judgment-versus-consequence tussle in providing non-discriminatory services to persons diagnosed with HIV/AIDS.

  2. Telemedicine Based on Mobile Devices and Mobile Cloud Computing

    OpenAIRE

    Lidong Wang; Cheryl Ann Alexander

    2014-01-01

    Mobile devices such as smartphones and tablets support various kinds of mobile computing and services. They can access the cloud or offload their computation-intensive parts to cloud computing resources. Mobile cloud computing (MCC) integrates cloud computing into the mobile environment, which extends mobile devices' battery lifetime, improves their data storage capacity and processing power, and improves their reliability and information security. In this paper, the applications of smartphon...

  3. Essential Means for Urban Computing: Specification of Web-Based Computing Platforms for Urban Planning, a Hitchhiker’s Guide

    Directory of Open Access Journals (Sweden)

    Pirouz Nourian

    2018-03-01

    Full Text Available This article provides an overview of the specifications of web-based computing platforms for urban data analytics and computational urban planning practice. There are currently a variety of tools and platforms that can be used in urban computing practices, including scientific computing languages, interactive web languages, data sharing platforms and still many desktop computing environments, e.g., GIS software applications. We have reviewed a list of technologies considering their potential and applicability in urban planning and urban data analytics. This review is not only based on the technical factors such as capabilities of the programming languages but also the ease of developing and sharing complex data processing workflows. The arena of web-based computing platforms is currently under rapid development and is too volatile to be predictable; therefore, in this article we focus on the specification of the requirements and potentials from an urban planning point of view rather than speculating about the fate of computing platforms or programming languages. The article presents a list of promising computing technologies, a technical specification of the essential data models and operators for geo-spatial data processing, and mathematical models for an ideal urban computing platform.

  4. A population-based study of childhood sexual contact in China: prevalence and long-term consequences.

    Science.gov (United States)

    Luo, Ye; Parish, William L; Laumann, Edward O

    2008-07-01

    This study provides national estimates of the prevalence of childhood sexual contact and its association with sexual well-being and psychological distress among adults in China. A national stratified probability sample of 1,519 women and 1,475 men aged 20-64 years in urban China completed a computer-administered survey in 1999-2000. The data from this survey on both adult-to-child and peer-to-peer sexual contact before age 14 were subjected to descriptive and multivariate analyses that were adjusted for both sampling weights and sampling design. The overall prevalence of reported childhood sexual contact was 4.2%, with prevalence higher among men (5.1%) than among women (3.3%) and higher among those aged 20-29 years (8.3%). Childhood sexual contact was associated with multiplex consequences, including hyper-sexuality (high levels of masturbation, thoughts about sex, varieties of sexual practices, partner turnover), adult sexual victimization (unwanted sex, unwanted sexual acts, sexual harassment), sexual difficulties (genito-urinary symptoms, sexually transmitted infections, sexual dysfunctions), and psychological distress. Psychological distress was largely mediated by adult sexual victimization, sexual difficulties, and hyper-sexuality. Despite the relatively modest prevalence of childhood sexual contact among Chinese adults, the association with multiplex adult outcomes suggests that, much as in the West, early sexual contact is a significant issue. The findings underscore the importance of public education about childhood sexual contact and abuse in China. The findings suggest a need for public health campaigns that tackle the stigma associated with being abused and encourage victims to report abusive behavior to proper sources. The findings are also consistent with new efforts to alleviate the negative long-term impact of childhood sexual abuse.

  5. Energy-Aware Computation Offloading of IoT Sensors in Cloudlet-Based Mobile Edge Computing.

    Science.gov (United States)

    Ma, Xiao; Lin, Chuang; Zhang, Han; Liu, Jianwei

    2018-06-15

    Mobile edge computing is proposed as a promising computing paradigm to relieve the excessive burden of data centers and mobile networks, which is induced by the rapid growth of Internet of Things (IoT). This work introduces the cloud-assisted multi-cloudlet framework to provision scalable services in cloudlet-based mobile edge computing. Due to the constrained computation resources of cloudlets and limited communication resources of wireless access points (APs), IoT sensors with identical computation offloading decisions interact with each other. To optimize the processing delay and energy consumption of computation tasks, theoretic analysis of the computation offloading decision problem of IoT sensors is presented in this paper. In more detail, the computation offloading decision problem of IoT sensors is formulated as a computation offloading game and the condition of Nash equilibrium is derived by introducing the tool of a potential game. By exploiting the finite improvement property of the game, the Computation Offloading Decision (COD) algorithm is designed to provide decentralized computation offloading strategies for IoT sensors. Simulation results demonstrate that the COD algorithm can significantly reduce the system cost compared with the random-selection algorithm and the cloud-first algorithm. Furthermore, the COD algorithm can scale well with increasing IoT sensors.
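
    The finite improvement property that the COD algorithm exploits can be demonstrated on a toy congestion-style offloading game: sensors keep making unilateral cost-reducing switches until no one can improve, which in a potential game terminates at a Nash equilibrium. A minimal sketch (our illustration with made-up cost terms, not the paper's model):

```python
# Toy decentralized offloading: each sensor chooses local (0) or cloudlet (1).
# The offloading cost grows with the number of offloaders sharing the AP and
# cloudlet, so the game is a congestion game and admits a potential function.
LOCAL_COST = [5.0, 3.0, 6.0, 4.0]     # per-sensor local processing cost
BASE, CONGESTION = 1.0, 1.5           # offload cost = BASE + CONGESTION * k

def cost(i, decisions):
    if decisions[i] == 0:
        return LOCAL_COST[i]
    k = sum(decisions)                # number of offloaders, including i
    return BASE + CONGESTION * k

def best_response_dynamics(decisions):
    improved = True
    while improved:                   # finite improvement property
        improved = False
        for i in range(len(decisions)):
            flipped = list(decisions)
            flipped[i] = 1 - flipped[i]
            if cost(i, flipped) < cost(i, decisions):
                decisions, improved = flipped, True
    return decisions                  # a pure-strategy Nash equilibrium

print(best_response_dynamics([0, 0, 0, 0]))  # e.g. [1, 0, 1, 0]
```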

  6. Virtual fragment preparation for computational fragment-based drug design.

    Science.gov (United States)

    Ludington, Jennifer L

    2015-01-01

    Fragment-based drug design (FBDD) has become an important component of the drug discovery process. The use of fragments can accelerate both the search for a hit molecule and the development of that hit into a lead molecule for clinical testing. In addition to experimental methodologies for FBDD such as NMR and X-ray Crystallography screens, computational techniques are playing an increasingly important role. The success of the computational simulations is due in large part to how the database of virtual fragments is prepared. In order to prepare the fragments appropriately it is necessary to understand how FBDD differs from other approaches and the issues inherent in building up molecules from smaller fragment pieces. The ultimate goal of these calculations is to link two or more simulated fragments into a molecule that has an experimental binding affinity consistent with the additive predicted binding affinities of the virtual fragments. Computationally predicting binding affinities is a complex process, with many opportunities for introducing error. Therefore, care should be taken with the fragment preparation procedure to avoid introducing additional inaccuracies.This chapter is focused on the preparation process used to create a virtual fragment database. Several key issues of fragment preparation which affect the accuracy of binding affinity predictions are discussed. The first issue is the selection of the two-dimensional atomic structure of the virtual fragment. Although the particular usage of the fragment can affect this choice (i.e., whether the fragment will be used for calibration, binding site characterization, hit identification, or lead optimization), general factors such as synthetic accessibility, size, and flexibility are major considerations in selecting the 2D structure. Other aspects of preparing the virtual fragments for simulation are the generation of three-dimensional conformations and the assignment of the associated atomic point charges.
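
    The conformer-generation and point-charge steps mentioned at the end of this abstract are commonly scripted with a cheminformatics toolkit; a minimal sketch using RDKit (our illustration of the generic workflow, not the chapter's procedure):

```python
# Prepare one virtual fragment: parse, add hydrogens, embed 3D conformers,
# optimize them with a force field, and assign Gasteiger partial charges.
from rdkit import Chem
from rdkit.Chem import AllChem

frag = Chem.MolFromSmiles("c1ccc(O)cc1")   # phenol as a toy fragment
frag = Chem.AddHs(frag)

# Multiple conformers capture fragment flexibility for later simulations.
conf_ids = AllChem.EmbedMultipleConfs(frag, numConfs=10, randomSeed=7)
AllChem.MMFFOptimizeMoleculeConfs(frag)

# Atomic point charges for electrostatics/scoring.
AllChem.ComputeGasteigerCharges(frag)
charges = [a.GetDoubleProp("_GasteigerCharge") for a in frag.GetAtoms()]
print(len(conf_ids), "conformers; net charge", round(sum(charges), 3))
```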

  7. Upgrading of analogue gamma cameras with PC based computer system

    International Nuclear Information System (INIS)

    Fidler, V.; Prepadnik, M.

    2002-01-01

    Full text: Dedicated nuclear medicine computers for the acquisition and processing of images from analogue gamma cameras in developing countries are in many cases faulty and technologically obsolete. The aim of the upgrading project of the International Atomic Energy Agency (IAEA) was to support the development of a PC-based computer system which would cost $5,000 in total. Several research institutions from different countries (China, Cuba, India and Slovenia) were financially supported in this development. The basic demands for the system were: one acquisition card on an ISA bus, image resolution up to 256x256, SVGA graphics, low count loss at high count rates, standard acquisition and clinical protocols incorporated in PIP (Portable Image Processing), on-line energy and uniformity correction, graphic printing, and networking. The most functionally stable acquisition system, tested in several international workshops and university clinics, was the Slovenian one, with a complete set of acquisition and clinical protocols, transfer of scintigraphic data from the acquisition card to the PC through PORT, count loss of less than 1% at a count rate of 120 kc/s, improvement of the integral uniformity index by a factor of 3-5, and reporting, networking and archiving solutions for simple MS network or server-oriented network systems (NT server, etc.). More than 300 gamma cameras in 52 countries were digitized and put into routine work. The project of upgrading the analogue gamma cameras yielded a high promotion of nuclear medicine in the developing countries by replacing the old computer systems, improving the technological knowledge of end users through workshops and training courses, and lowering the maintenance cost of the departments. (author)

  8. Central Computer Science Concepts to Research-Based Teacher Training in Computer Science: An Experimental Study

    Science.gov (United States)

    Zendler, Andreas; Klaudt, Dieter

    2012-01-01

    The significance of computer science for economics and society is undisputed. In particular, computer science is acknowledged to play a key role in schools (e.g., by opening multiple career paths). The provision of effective computer science education in schools is dependent on teachers who are able to properly represent the discipline and whose…

  9. Commentary on: "Toward Computer-Based Support of Metacognitive Skills: A Computational Framework to Coach Self Explanation"

    Science.gov (United States)

    Conati, Cristina

    2016-01-01

    This paper is a commentary on "Toward Computer-Based Support of Meta-Cognitive Skills: a Computational Framework to Coach Self-Explanation", by Cristina Conati and Kurt Vanlehn, published in the "IJAED" in 2000 (Conati and VanLehn 2010). This work was one of the first examples of Intelligent Learning Environments (ILE) that…

  10. An integrated computer-based procedure for teamwork in digital nuclear power plants.

    Science.gov (United States)

    Gao, Qin; Yu, Wenzhu; Jiang, Xiang; Song, Fei; Pan, Jiajie; Li, Zhizhong

    2015-01-01

    Computer-based procedures (CBPs) are expected to improve operator performance in nuclear power plants (NPPs), but they may reduce the openness of interaction between team members and consequently harm teamwork. To support teamwork in the main control room of an NPP, this study proposed a team-level integrated CBP that presents team members' operation status and execution histories to one another. Through a laboratory experiment, we compared the new integrated design with the existing individual CBP design. Sixty participants, randomly divided into twenty teams of three people each, were assigned to the two conditions to perform simulated emergency operating procedures. The results showed that, compared with the existing CBP design, the integrated CBP reduced the effort of team communication and improved team transparency. The results suggest that this novel design is effective in optimizing team processes, but its impact on behavioural outcomes may be moderated by other factors, such as task duration. In sum, the study proposed and evaluated a team-level integrated computer-based procedure that presents team members' operation status and execution histories to one another; the experimental results show that, compared with the traditional procedure design, the integrated design reduces the effort of team communication and improves team transparency.

  11. Computer-based irrigation scheduling for cotton crop

    International Nuclear Information System (INIS)

    Laghari, K.Q.; Memon, H.M.

    2008-01-01

    In this study a real-time irrigation schedule for cotton crop was tested using the Mehran model, a computer-based DSS (Decision Support System). The irrigation schedule was set on selected MAD (Management Allowable Depletion) and the current root depth position. A total of 451 mm of irrigation water was applied to the crop field. The seasonal computed crop ET (evapotranspiration) was estimated at 421.32 mm, while the actual observed value (ET/sub ca/) was 413 mm; the model thus over-estimated seasonal ET by only 1.94%. WUE (Water Use Efficiency) for seed-cotton reached 6.59 kg (ha mm)/sup -1/. The statistical analysis (R/sup 2/=0.96, ARE%=2.00, T=1.17 and F=550.57) showed good agreement between simulated and observed ET values. The Mehran model is quite versatile for irrigation scheduling and can be successfully used as an irrigation DSS tool for various crop types. (author)

  12. Experimental and computational studies on a gasifier based stove

    International Nuclear Information System (INIS)

    Varunkumar, S.; Rajan, N.K.S.; Mukunda, H.S.

    2012-01-01

    Highlights: ► A simple method to calculate the fraction of HHC was devised. ► η_g for the stove is the same as that of a downdraft gasifier. ► Gas from the stove contains 5.5% CH4 equivalent of HHC. ► The effect of vessel size on utilization efficiency is brought out clearly. ► The contribution of radiative heat transfer from the char bed to efficiency is 6%. - Abstract: The work reported here is concerned with a detailed thermochemical evaluation of the flaming mode behaviour of a gasifier-based stove. Determination of the gas composition over the fuel bed and of the surface and gas temperatures in the gasification process constitutes the principal experimental features. A simple atomic balance for the gasification reaction, combined with the gas composition from the experiments, is used to determine the CH4 equivalent of higher hydrocarbons (HHC) and the gasification efficiency (η_g). The components of utilization efficiency, namely gasification-combustion and heat transfer, are explored. Reactive flow computational studies using the measured gas composition over the fuel bed are used to simulate the thermochemical flow field and the heat transfer to the vessel; hitherto-ignored vessel size effects in the extraction of heat from the stove are established clearly. The overall flaming mode efficiency of the stove is 50-54%; the convective and radiative components of heat transfer are established to be 45-47% and 5-7%, respectively. The efficiency estimates from reacting computational fluid dynamics (RCFD) compare well with experiments.
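
    A minimal sketch of the kind of bookkeeping the abstract describes: closing a carbon balance to express higher hydrocarbons (HHC) as a CH4 equivalent, then computing the gasification efficiency as a cold-gas efficiency. All species fractions, heating values and the per-mole fuel energy below are illustrative assumptions, not the paper's data.

```python
# Minimal sketch (not the paper's actual method): estimate the CH4-equivalent
# of higher hydrocarbons (HHC) from a carbon balance, then the gasification
# efficiency eta_g as a cold-gas efficiency. All numbers are illustrative.

# Measured dry-gas mole fractions (assumed values for illustration)
gas = {"CO": 0.18, "H2": 0.15, "CO2": 0.10, "CH4": 0.02, "N2": 0.55}

# Lower heating values in MJ per mole (approximate textbook values)
LHV = {"CO": 0.283, "H2": 0.242, "CH4": 0.803}

carbon_in_per_mol_gas = 0.32   # assumed: mol C entering per mol of dry gas,
                               # from the fuel's ultimate analysis and yield
carbon_counted = gas["CO"] + gas["CO2"] + gas["CH4"]

# Carbon not closed by CO/CO2/CH4 is attributed to HHC, expressed as CH4
hhc_ch4_equiv = carbon_in_per_mol_gas - carbon_counted
print(f"CH4-equivalent of HHC: {hhc_ch4_equiv:.3f} mol/mol gas")

# Cold-gas (gasification) efficiency: chemical energy in gas / energy in fuel
fuel_energy_per_mol_gas = 0.15  # assumed, MJ of fuel energy per mol dry gas
gas_energy = sum(gas[s] * LHV[s] for s in LHV) + hhc_ch4_equiv * LHV["CH4"]
eta_g = gas_energy / fuel_energy_per_mol_gas
print(f"Gasification efficiency eta_g ~ {eta_g:.0%}")
```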

  13. Near infrared spectroscopy based brain-computer interface

    Science.gov (United States)

    Ranganatha, Sitaram; Hoshi, Yoko; Guan, Cuntai

    2005-04-01

    A brain-computer interface (BCI) provides users with an alternative output channel other than the normal output path of the brain. BCI has recently been given much attention as an alternative mode of communication and control for the disabled, such as patients suffering from Amyotrophic Lateral Sclerosis (ALS) or "locked-in" syndrome. BCI may also find applications in military, education and entertainment. Most of the existing BCI systems that rely on the brain's electrical activity use scalp EEG signals. The scalp EEG is an inherently noisy and non-linear signal, detrimentally affected by various artifacts such as EOG, EMG and ECG. EEG is cumbersome to use in practice because of the need to apply conductive gel and the need for the subject to remain immobile. There is an urgent need for a more accessible interface that uses a more direct measure of cognitive function to control an output device. The optical response of Near Infrared Spectroscopy (NIRS), denoting brain activation, can be used as an alternative to electrical signals, with the intention of developing a more practical and user-friendly BCI. In this paper, a new method of brain-computer interface (BCI) based on NIRS is proposed. Preliminary results of our experiments towards developing this system are reported.

  14. A border-ownership model based on computational electromagnetism.

    Science.gov (United States)

    Zainal, Zaem Arif; Satoh, Shunji

    2018-03-01

    The mathematical relation between a vector electric field and its corresponding scalar potential field is useful for formulating computational problems of lower/middle-order visual processing, specifically the assignment of borders to the side of the object: so-called border ownership (BO). BO coding is a key process for extracting objects from the background, allowing one to organize a cluttered scene. We propose that the problem is solvable simultaneously by application of a theorem of electromagnetism: conservative vector fields have zero rotation, or "curl." We hypothesize that (i) the BO signal is definable as a vector electric field with arrowheads pointing to the inner side of perceived objects, and (ii) its corresponding scalar field carries information related to the perceived depth order of occluding/occluded objects. A simple model was developed based on this computational theory. Model results qualitatively agree with the object-side selectivity of BO-coding neurons and with perceptions of object order. The model update rule can be reproduced as a plausible neural network that offers new interpretations of existing physiological results. Results of this study also suggest that T-junction detectors are unnecessary to calculate depth order. Copyright © 2017 Elsevier Ltd. All rights reserved.
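
    The electromagnetism analogy can be written out explicitly; the following is a minimal sketch of the formalization the abstract suggests (the symbols and the depth-order convention are our assumptions):

```latex
% BO signal as a conservative vector field derived from a scalar potential
% (assumed formalization; symbol names are ours):
\[
  \mathbf{B}(x,y) = -\nabla\varphi(x,y)
  \quad\Longrightarrow\quad
  \nabla \times \mathbf{B} = -\nabla \times \nabla\varphi = \mathbf{0},
\]
% i.e. the zero-rotation (curl) condition the abstract cites. The scalar
% field \varphi then carries relative depth order: by our assumed convention,
% a region with larger \varphi is perceived as occluding (in front of) a
% region with smaller \varphi.
```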

  15. A Spread Willingness Computing-Based Information Dissemination Model

    Science.gov (United States)

    Cui, Zhiming; Zhang, Shukui

    2014-01-01

    This paper constructs a spread-willingness-computing-based information dissemination model for social networks. The model takes into account the impact of node degree and the dissemination mechanism, combines complex network theory with the dynamics of infectious diseases, and establishes dynamical evolution equations. The equations characterize the evolutionary relationship between different types of nodes over time. The spread willingness computation contains three factors that affect a user's spread behavior: the strength of the relationship between nodes, views identity, and frequency of contact. Simulation results show that nodes of different degrees show the same trend in the network, and even if the degree of a node is very small, there is a likelihood of a large area of information dissemination. The weaker the relationship between nodes, the higher the probability of views selection and the higher the frequency of contact with information, so that information spreads rapidly and leads to a wide range of dissemination. As the dissemination probability and immune probability change, the speed of information dissemination changes accordingly. The studies match social networking features and can help to master the behavior of users and to understand and analyze the characteristics of information dissemination in social networks. PMID:25110738

  16. A spread willingness computing-based information dissemination model.

    Science.gov (United States)

    Huang, Haojing; Cui, Zhiming; Zhang, Shukui

    2014-01-01

    This paper constructs a spread-willingness-computing-based information dissemination model for social networks. The model takes into account the impact of node degree and the dissemination mechanism, combines complex network theory with the dynamics of infectious diseases, and establishes dynamical evolution equations. The equations characterize the evolutionary relationship between different types of nodes over time. The spread willingness computation contains three factors that affect a user's spread behavior: the strength of the relationship between nodes, views identity, and frequency of contact. Simulation results show that nodes of different degrees show the same trend in the network, and even if the degree of a node is very small, there is a likelihood of a large area of information dissemination. The weaker the relationship between nodes, the higher the probability of views selection and the higher the frequency of contact with information, so that information spreads rapidly and leads to a wide range of dissemination. As the dissemination probability and immune probability change, the speed of information dissemination changes accordingly. The studies match social networking features and can help to master the behavior of users and to understand and analyze the characteristics of information dissemination in social networks.

  17. Safety applications of computer based systems for the process industry

    International Nuclear Information System (INIS)

    Bologna, Sandro; Picciolo, Giovanni; Taylor, Robert

    1997-11-01

    Computer based systems, generally referred to as Programmable Electronic Systems (PESs), are being increasingly used in the process industry, including to perform safety functions. The process industry, as intended in this document, includes, but is not limited to, chemicals, oil and gas production, oil refining and power generation. Starting in the early 1970s, the wide application possibilities and the related development problems of such systems were recognized. Since then, many guidelines and standards have been developed to direct and regulate the application of computers to perform safety functions (EWICS-TC7, IEC, ISA). Lessons learnt in the last twenty years can be summarised as follows: safety is a cultural issue; safety is a management issue; safety is an engineering issue. In particular, safety systems can only be properly addressed in the overall system context. No single method can be considered sufficient to achieve the safety features required in many safety applications. A good safety engineering approach has to address not only hardware and software problems in isolation but also their interfaces and man-machine interface problems. Finally, the economic and industrial aspects of safety applications and the development of PESs in process plants are evidenced throughout the report. The scope of the report is to contribute to the development of an adequate awareness of these problems and to illustrate technical solutions applied or being developed.

  18. A Spread Willingness Computing-Based Information Dissemination Model

    Directory of Open Access Journals (Sweden)

    Haojing Huang

    2014-01-01

    Full Text Available This paper constructs a spread-willingness-computing-based information dissemination model for social networks. The model takes into account the impact of node degree and the dissemination mechanism, combines complex network theory with the dynamics of infectious diseases, and establishes dynamical evolution equations. The equations characterize the evolutionary relationship between different types of nodes over time. The spread willingness computation contains three factors that affect a user's spread behavior: the strength of the relationship between nodes, views identity, and frequency of contact. Simulation results show that nodes of different degrees show the same trend in the network, and even if the degree of a node is very small, there is a likelihood of a large area of information dissemination. The weaker the relationship between nodes, the higher the probability of views selection and the higher the frequency of contact with information, so that information spreads rapidly and leads to a wide range of dissemination. As the dissemination probability and immune probability change, the speed of information dissemination changes accordingly. The studies match social networking features and can help to master the behavior of users and to understand and analyze the characteristics of information dissemination in social networks.
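
    A minimal sketch, under stated assumptions, of the kind of willingness-weighted spreading process the records above describe: an SIR-style simulation on a random graph where each edge's transmission probability is scaled by a willingness score combining relationship strength, views identity and contact frequency. The graph model, factor weights and rates are our illustrative choices, not the paper's equations.

```python
import numpy as np

# Minimal sketch (our construction, not the paper's exact equations): an
# SIR-style spreading process where each edge's transmission probability is
# scaled by a "spread willingness" score built from the three factors named
# in the abstract. All weights and rates below are illustrative.

rng = np.random.default_rng(0)
N = 500
adj = rng.random((N, N)) < 0.02            # Erdos-Renyi adjacency (directed)
np.fill_diagonal(adj, False)

tie_strength = rng.random((N, N))          # relationship strength per edge
views_identity = rng.random((N, N))        # agreement of views per edge
contact_freq = rng.random((N, N))          # contact frequency per edge

# Spread willingness as a weighted combination (weights are assumptions)
willingness = 0.4 * tie_strength + 0.3 * views_identity + 0.3 * contact_freq

beta, mu = 0.5, 0.1                        # base spread / immunization rates
state = np.zeros(N, dtype=int)             # 0=susceptible, 1=spreader, 2=immune
state[rng.choice(N, 5, replace=False)] = 1 # seed spreaders

for t in range(50):
    spreaders = np.flatnonzero(state == 1)
    for i in spreaders:
        # neighbours receive the message with willingness-scaled probability
        targets = np.flatnonzero(adj[i] & (state == 0))
        hit = rng.random(len(targets)) < beta * willingness[i, targets]
        state[targets[hit]] = 1
    # spreaders become immune (stop spreading) at rate mu
    state[(state == 1) & (rng.random(N) < mu)] = 2

print(f"reached {np.mean(state > 0):.0%} of the network")
```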

  19. Localized Ambient Solidity Separation Algorithm Based Computer User Segmentation

    Science.gov (United States)

    Sun, Xiao; Zhang, Tongda; Chai, Yueting; Liu, Yi

    2015-01-01

    Most popular clustering methods typically make strong assumptions about the dataset. For example, k-means implicitly assumes that all clusters come from spherical Gaussian distributions with different means but the same covariance. However, when dealing with datasets that have diverse distribution shapes or high dimensionality, these assumptions may no longer be valid. In order to overcome this weakness, we proposed a new clustering algorithm named the localized ambient solidity separation (LASS) algorithm, using a new isolation criterion called centroid distance. Compared with other density-based isolation criteria, our proposed centroid distance isolation criterion addresses the problems caused by high dimensionality and varying density. The experiment on a designed two-dimensional benchmark dataset shows that our proposed LASS algorithm not only inherits the advantage of the original dissimilarity increments clustering method in separating naturally isolated clusters but can also identify clusters which are adjacent, overlapping, and under background noise. Finally, we compared our LASS algorithm with the dissimilarity increments clustering method on a massive computer user dataset with over two million records containing demographic and behavioral information. The results show that the LASS algorithm works extremely well on this computer user dataset and can gain more knowledge from it. PMID:26221133
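
    A toy sketch of a centroid-distance-style isolation test between two candidate clusters, as one reading of the isolation criterion named in the abstract; the thresholding rule and the isolation factor k are our assumptions, not the published LASS algorithm.

```python
import numpy as np

# Toy sketch (our reading of the idea, not the published LASS algorithm):
# decide whether two candidate clusters are "isolated" by comparing the
# distance between their centroids with their internal spread.

def centroid_distance_isolated(a: np.ndarray, b: np.ndarray, k: float = 2.0) -> bool:
    """a, b: (n_points, n_dims) arrays. k: isolation factor (assumed)."""
    ca, cb = a.mean(axis=0), b.mean(axis=0)
    inter = np.linalg.norm(ca - cb)                   # centroid distance
    spread_a = np.linalg.norm(a - ca, axis=1).mean()  # mean radius of a
    spread_b = np.linalg.norm(b - cb, axis=1).mean()  # mean radius of b
    # Isolated if centroids are far apart relative to the clusters' spreads
    return inter > k * (spread_a + spread_b) / 2

rng = np.random.default_rng(1)
a = rng.normal(loc=0.0, scale=1.0, size=(100, 2))
b = rng.normal(loc=6.0, scale=1.0, size=(100, 2))
print(centroid_distance_isolated(a, b))   # True: well separated
print(centroid_distance_isolated(a, a))   # False: same cloud
```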

  20. Constrained consequence

    CSIR Research Space (South Africa)

    Britz, K

    2011-09-01

    Full Text Available their basic properties and relationship. In Section 3 we present a modal instance of these constructions which also illustrates with an example how to reason abductively with constrained entailment in a causal or action-oriented context. In Section 4 we... of models with the former approach, whereas in Section 3.3 we give an example illustrating ways in which C can be defined with both. Here we employ the following versions of local consequence: Definition 3.4. Given a model M = ⟨W, R, V⟩ and formulas...

  1. Systematic review of prevention and management strategies for the consequences of gender-based violence in refugee settings.

    Science.gov (United States)

    Asgary, Ramin; Emery, Eleanor; Wong, Marcia

    2013-06-01

    Uncertainties continue regarding effective strategies to prevent and address the consequences of gender-based violence (GBV) among refugees. The databases of PubMed, Cochrane Library, Scopus, PsycINFO, Web of Science, Anthropology Plus, EMBASE, DARE, Google Scholar, MSF Field Research, UNHCR and the regional and global indices of the WHO Global Health Library were searched twice within a 6-month period (April and September 2011) for English-language clinical, public health, basic and social science studies evaluating strategies to prevent and manage health sequelae of GBV among refugees before September 2011. Studies not primarily about prevention and treatment, and not describing population, health outcome and interventions, were excluded. The literature searches for the prevention and management arms produced 1212 and 1106 results, respectively. After reviewing the titles and abstracts, 29 and 27 articles were selected for review in their entirety, none of which met the inclusion criteria. Multiple panels of expert recommendations and guidelines were not supported by primary data on actual displaced populations. There is a dire need for research that evaluates the efficacy and effectiveness of various responses to GBV, to ultimately allow a transition from a largely theoretical and expertise-driven field to a more evidence-based one. We recommend strategies to improve data collection and to overcome barriers in primary data-driven research.

  2. The fiscal consequences of ADHD in Germany: a quantitative analysis based on differences in educational attainment and lifetime earnings.

    Science.gov (United States)

    Kotsopoulos, Nikolaos; Connolly, Mark P; Sobanski, Esther; Postma, Maarten J

    2013-03-01

    To estimate the long-term fiscal consequences of attention deficit hyperactivity disorder (ADHD) for the German government and social insurance system, based on differences in educational attainment and the resulting differences in lifetime earnings compared with non-ADHD cohorts. Differences in educational attainment between ADHD and non-ADHD cohorts were linked to education-specific earnings data. Direct and indirect tax rates and social insurance contributions were linked to differences in lifetime, education-specific earnings to derive the lost tax revenue in Germany associated with ADHD. For the ADHD and non-ADHD cohorts we derived the age-specific discounted net taxes paid by deducting lifetime transfers from lifetime gross taxes paid. The lifetime net tax revenue for a non-ADHD individual was approximately EUR 80,000 higher compared with an untreated ADHD individual. The fiscal burden of untreated ADHD, based on a cohort of n=31,844 born in 2010, was estimated at EUR 2.5 billion in net tax revenue losses compared with an equally sized non-ADHD cohort. ADHD interventions providing even a small improvement in educational attainment resulted in fiscal benefits from increases in lifetime tax gains. ADHD results in long-term financial loss due to lower educational attainment, reduced lifetime earnings, and the resulting lower lifetime taxes and social contributions paid. Investments in ADHD interventions allowing more children to achieve their educational potential may offer fiscal benefits generating a positive rate of return.
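
    The headline figure is consistent with multiplying the stated per-person net tax gap by the cohort size; a quick check of the arithmetic:

```latex
\[
  \underbrace{31{,}844}_{\text{2010 cohort}}
  \times
  \underbrace{\text{EUR } 80{,}000}_{\text{net tax gap per person}}
  \;\approx\; \text{EUR } 2.55\times 10^{9}
  \;\approx\; \text{EUR 2.5 billion}
\]
```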

  3. Supporting plant operation through computer-based procedures

    International Nuclear Information System (INIS)

    Martinez, Victor; Medrano, Javier; Mendez, Julio

    2014-01-01

    Digital systems are becoming more important in controlling and monitoring nuclear power plant operations. The capabilities of these systems provide additional functions as well as support operators in making decisions and avoiding errors. Regarding operation support systems, an important way of taking advantage of these features is using computer-based procedure (CBP) tools that enhance plant operation. Integrating digital systems into the analogue controls of nuclear power plants already in operation is an extra challenge, in contrast to the integration of digital control systems in new nuclear power plants. Considering the potential advantages of using this technology, Tecnatom has designed and developed a CBP platform taking currently operating nuclear power plants as its design basis. The result is a powerful tool which combines the advantages of CBPs and conventional analogue control systems, minimizing negative effects during plant operation and integrating operation-aid systems to support operators. (authors)

  4. Edge detection based on computational ghost imaging with structured illuminations

    Science.gov (United States)

    Yuan, Sheng; Xiang, Dong; Liu, Xuemei; Zhou, Xin; Bing, Pibin

    2018-03-01

    Edge detection is one of the most important tools for recognizing the features of an object. In this paper, we propose an optical edge detection method based on computational ghost imaging (CGI) with structured illuminations generated by an interference system. The structured intensity patterns are designed so that the edge of an object is imaged directly from the detected data in CGI. This edge detection method can extract the boundaries of both binary and grayscale objects in any direction at one time. We also numerically test the influence of distance deviations in the interference system on edge extraction, i.e., the tolerance of the optical edge detection system to distance deviation. Hopefully, it may provide a guideline for scholars to build an experimental system.
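
    A minimal sketch of the correlation step that conventional computational ghost imaging rests on, followed by a naive gradient-based edge map. The paper's interference-generated structured patterns, which image the edge directly, are replaced here by random speckle patterns; this illustrates the CGI principle, not the proposed method.

```python
import numpy as np

# Minimal sketch of conventional computational ghost imaging (CGI) recovery by
# intensity correlation, plus a naive edge map for comparison. Random speckle
# patterns stand in for the paper's structured illuminations (an assumption).

rng = np.random.default_rng(42)
n, M = 32, 8000                       # image size n x n, number of patterns

obj = np.zeros((n, n))
obj[8:24, 8:24] = 1.0                 # simple binary square object

patterns = rng.random((M, n, n))      # illumination patterns I_m(x, y)
bucket = (patterns * obj).sum(axis=(1, 2))   # single-pixel bucket signals B_m

# Correlation reconstruction: G(x,y) = <B I(x,y)> - <B><I(x,y)>
G = (bucket[:, None, None] * patterns).mean(axis=0) \
    - bucket.mean() * patterns.mean(axis=0)

# Naive edge map via finite differences of the reconstruction
gy, gx = np.gradient(G)
edges = np.hypot(gx, gy)
print("edge response stronger near the square's border (expected True):",
      edges[7:9, 8:24].mean() > edges[14:18, 12:20].mean())
```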

  5. Personal Computer (PC) based image processing applied to fluid mechanics

    Science.gov (United States)

    Cho, Y.-C.; Mclachlan, B. G.

    1987-01-01

    A PC based image processing system was employed to determine the instantaneous velocity field of a two-dimensional unsteady flow. The flow was visualized using a suspension of seeding particles in water, and a laser sheet for illumination. With a finite time exposure, the particle motion was captured on a photograph as a pattern of streaks. The streak pattern was digitized and processed using various imaging operations, including contrast manipulation, noise cleaning, filtering, statistical differencing, and thresholding. Information concerning the velocity was extracted from the enhanced image by measuring the length and orientation of the individual streaks. The fluid velocities deduced from the randomly distributed particle streaks were interpolated to obtain velocities at uniform grid points. For the interpolation a simple convolution technique with an adaptive Gaussian window was used. The results are compared with a numerical prediction by a Navier-Stokes computation.
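
    A minimal sketch of the final interpolation step, in the spirit of the abstract's "simple convolution technique with an adaptive Gaussian window": scattered streak velocities are gridded by normalized Gaussian-weighted averaging. The rule for adapting the window width is our assumption.

```python
import numpy as np

# Minimal sketch: interpolate scattered streak velocities onto a regular grid
# with a Gaussian convolution window. The adaptation rule for the window
# width sigma is an assumption, not the paper's exact scheme.

def gaussian_interp(pts, vals, grid_x, grid_y, sigma):
    """pts: (N,2) particle positions; vals: (N,) one velocity component."""
    gx, gy = np.meshgrid(grid_x, grid_y)
    out = np.zeros_like(gx)
    for j in range(gx.shape[0]):
        for i in range(gx.shape[1]):
            d2 = (pts[:, 0] - gx[j, i])**2 + (pts[:, 1] - gy[j, i])**2
            w = np.exp(-d2 / (2 * sigma**2))      # Gaussian weights
            out[j, i] = (w @ vals) / w.sum()      # normalized convolution
    return out

rng = np.random.default_rng(3)
pts = rng.random((200, 2))                 # random streak midpoints in [0,1]^2
u = np.sin(2 * np.pi * pts[:, 0])          # synthetic x-velocity samples
grid = np.linspace(0, 1, 16)

# "Adaptive" choice: widen sigma with the mean point spacing (assumed rule)
sigma = 1.5 / np.sqrt(len(pts))
u_grid = gaussian_interp(pts, u, grid, grid, sigma)
print(u_grid.shape)                        # (16, 16) gridded velocity field
```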

  6. Cloud Computing Task Scheduling Based on Cultural Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    Li Jian-Wen

    2016-01-01

    Full Text Available The task scheduling strategy based on a cultural genetic algorithm (CGA) is proposed in order to improve the efficiency of task scheduling on cloud computing platforms, with the target of minimizing the total time and cost of task scheduling. The improved genetic algorithm is used to construct the main population space and the knowledge space under a cultural framework; these evolve independently in parallel, forming a mechanism of mutual promotion to dispatch cloud tasks. Simultaneously, in order to counter the genetic algorithm's tendency to fall into local optima, a non-uniform mutation operator is introduced to improve the search performance of the algorithm. The experimental results show that the CGA reduces the total time and lowers the cost of scheduling, making it an effective algorithm for cloud task scheduling.
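
    A minimal sketch of a cultural genetic algorithm for task scheduling under stated assumptions: a main population of task-to-VM assignments, a small belief (knowledge) space of elites that biases reproduction, and a non-uniform mutation whose rate decays over generations. The cost model and all parameters are illustrative, not the paper's.

```python
import random

# Minimal sketch of a cultural genetic algorithm (CGA) for task scheduling:
# a main population evolves task-to-VM assignments while a small "belief
# space" of elite solutions biases reproduction; non-uniform mutation decays
# over generations. Cost model and parameters are illustrative assumptions.

TASKS, VMS, POP, GENS = 30, 5, 40, 100
vm_speed = [1.0, 1.5, 2.0, 2.5, 3.0]
task_len = [random.uniform(1, 10) for _ in range(TASKS)]

def makespan(ind):                       # total time = slowest VM to finish
    load = [0.0] * VMS
    for t, vm in enumerate(ind):
        load[vm] += task_len[t] / vm_speed[vm]
    return max(load)

def non_uniform_mutate(ind, gen):
    # Mutation probability decays with generation (non-uniform operator)
    p = 0.3 * (1 - gen / GENS)
    return [random.randrange(VMS) if random.random() < p else g for g in ind]

pop = [[random.randrange(VMS) for _ in range(TASKS)] for _ in range(POP)]

for gen in range(GENS):
    pop.sort(key=makespan)
    belief = pop[:5]                     # accept step: update knowledge space
    nxt = belief[:]                      # elitism
    while len(nxt) < POP:
        a, b = random.choice(belief), random.choice(pop[:20])  # influence step
        cut = random.randrange(1, TASKS)
        child = a[:cut] + b[cut:]        # one-point crossover
        nxt.append(non_uniform_mutate(child, gen))
    pop = nxt

print(f"best makespan: {min(map(makespan, pop)):.2f}")
```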

  7. Computer aided design of nickel-base superalloys

    International Nuclear Information System (INIS)

    Lawrence, P.J.

    1988-01-01

    This paper describes a computer aided design process for Ni-base superalloys developed and employed at ASEA Brown Boveri. The technique involves a series of modules, each of which predicts a particular property of a hypothetical new composition. In the first stage of the development of this design technique, modules were produced to predict phase stability, using PHACOMP, and high-temperature creep strength and hot corrosion resistance, using multiple linear regression equations derived from data in the literature. Alloys designed using this technique are also discussed and, in particular, shortcomings of the design process are highlighted. This information was then used to produce a revamped design methodology involving extra modules, including prediction of an alloy's gamma-prime content. (orig.)

  8. Web Pages Content Analysis Using Browser-Based Volunteer Computing

    Directory of Open Access Journals (Sweden)

    Wojciech Turek

    2013-01-01

    Full Text Available Existing solutions to the problem of finding valuable information on the Web suffer from several limitations like simplified query languages, out-of-date information or arbitrary results sorting. In this paper a different approach to this problem is described. It is based on the idea of distributed processing of Web pages content. To provide sufficient performance, the idea of browser-based volunteer computing is utilized, which requires the implementation of text processing algorithms in JavaScript. In this paper the architecture of a Web pages content analysis system is presented, details concerning the implementation of the system and the text processing algorithms are described, and test results are provided.

  9. A Cloud Computing Based Patient Centric Medical Information System

    Science.gov (United States)

    Agarwal, Ankur; Henehan, Nathan; Somashekarappa, Vivek; Pandya, A. S.; Kalva, Hari; Furht, Borko

    This chapter discusses an emerging concept of a cloud computing based Patient Centric Medical Information System framework that will allow various authorized users to securely access patient records from various Care Delivery Organizations (CDOs) such as hospitals, urgent care centers, doctors, laboratories and imaging centers, from any location. Such a system must seamlessly integrate all patient records, including images such as CT scans and MRIs, which can easily be accessed from any location and reviewed by any authorized user. In such a scenario the storage and transmission of medical records will have to be conducted in a totally secure and safe environment with a very high standard of data integrity, protecting patient privacy and complying with all Health Insurance Portability and Accountability Act (HIPAA) regulations.

  10. FPGA based compute nodes for high level triggering in PANDA

    International Nuclear Information System (INIS)

    Kuehn, W; Gilardi, C; Kirschner, D; Lang, J; Lange, S; Liu, M; Perez, T; Yang, S; Schmitt, L; Jin, D; Li, L; Liu, Z; Lu, Y; Wang, Q; Wei, S; Xu, H; Zhao, D; Korcyl, K; Otwinowski, J T; Salabura, P

    2008-01-01

    PANDA is a new universal detector for antiproton physics at the HESR facility at FAIR/GSI. The PANDA data acquisition system has to handle interaction rates of the order of 10^7/s and data rates of several 100 Gb/s. FPGA-based compute nodes with multi-Gb/s bandwidth capability using the ATCA architecture are designed to handle tasks such as event building, feature extraction and high-level trigger processing. Data connectivity is provided via optical links as well as multiple Gb Ethernet ports. The boards will support trigger algorithms such as pattern recognition for RICH detectors, EM shower analysis, fast tracking algorithms and global event characterization. Besides VHDL, high-level C-like hardware description languages will be considered to implement the firmware.

  11. Microarray-based cancer prediction using soft computing approach.

    Science.gov (United States)

    Wang, Xiaosheng; Gotoh, Osamu

    2009-05-26

    One of the difficulties in using gene expression profiles to predict cancer is how to effectively select a few informative genes to construct accurate prediction models from thousands or tens of thousands of genes. We screen highly discriminative genes and gene pairs to create simple prediction models involving single genes or gene pairs on the basis of a soft computing approach and rough set theory. Accurate cancer prediction is obtained when we apply the simple prediction models to four cancer gene expression datasets: CNS tumor, colon tumor, lung cancer and DLBCL. Some genes closely correlated with the pathogenesis of specific or general cancers are identified. In contrast with other models, our models are simple, effective and robust. Meanwhile, our models are interpretable, as they are based on decision rules. Our results demonstrate that very simple models may perform well on cancer molecular prediction and that important gene markers of cancer can be detected if the gene selection approach is chosen reasonably.

  12. Overlapped flowers yield detection using computer-based interface

    Directory of Open Access Journals (Sweden)

    Anuradha Sharma

    2016-09-01

    Full Text Available Precision agriculture has always dealt with accurate and timely information about agricultural products. With the help of computer hardware and software technology, designing a decision support system that can generate flower-yield information and serve as a basis for the management and planning of flower marketing has become much easier. Despite such technologies, some problems still arise; for example, the colour of a specimen cannot be captured homogeneously so as to match the actual colour of the image, and flowers overlap in the image. In this paper the implementation of a new 'counting algorithm' for overlapped flowers is discussed. For implementing this algorithm, techniques and operations such as colour image segmentation, the HSV colour space, and morphological operations have been used. Two of the most popular colour spaces are used in this paper: RGB and HSV. The HSV colour space decouples brightness from the chromatic component in the image, which yields better results in cases of occlusion and overlap.
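
    A minimal OpenCV sketch of the pipeline the abstract outlines (HSV segmentation plus morphological operations), extended with a distance-transform step so that touching or overlapping flowers are counted separately. The file name and HSV thresholds are illustrative assumptions, not the paper's values.

```python
import cv2
import numpy as np

# Minimal sketch of an HSV-segmentation-plus-morphology counting pipeline,
# with a distance-transform split for overlapped flowers. File name and
# thresholds are illustrative assumptions.

img = cv2.imread("flowers.jpg")                     # hypothetical input image
hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)

# Segment "flower-coloured" pixels (example range for yellowish blossoms)
mask = cv2.inRange(hsv, (20, 80, 80), (35, 255, 255))

# Morphological opening removes speckle noise from the mask
kernel = np.ones((5, 5), np.uint8)
mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel, iterations=2)

# Distance transform + threshold yields one marker per flower core, so that
# touching/overlapping flowers are counted separately
dist = cv2.distanceTransform(mask, cv2.DIST_L2, 5)
_, cores = cv2.threshold(dist, 0.5 * dist.max(), 255, cv2.THRESH_BINARY)
n_labels, _ = cv2.connectedComponents(cores.astype(np.uint8))

print(f"estimated flower count: {n_labels - 1}")    # minus background label
```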

  13. PACS-Based Computer-Aided Detection and Diagnosis

    Science.gov (United States)

    Huang, H. K. (Bernie); Liu, Brent J.; Le, Anh HongTu; Documet, Jorge

    The ultimate goal of Picture Archiving and Communication System (PACS)-based Computer-Aided Detection and Diagnosis (CAD) is to integrate CAD results into daily clinical practice so that it becomes a second reader to aid the radiologist's diagnosis. Integration of CAD and Hospital Information System (HIS), Radiology Information System (RIS) or PACS requires certain basic ingredients from Health Level 7 (HL7) standard for textual data, Digital Imaging and Communications in Medicine (DICOM) standard for images, and Integrating the Healthcare Enterprise (IHE) workflow profiles in order to comply with the Health Insurance Portability and Accountability Act (HIPAA) requirements to be a healthcare information system. Among the DICOM standards and IHE workflow profiles, DICOM Structured Reporting (DICOM-SR); and IHE Key Image Note (KIN), Simple Image and Numeric Report (SINR) and Post-processing Work Flow (PWF) are utilized in CAD-HIS/RIS/PACS integration. These topics with examples are presented in this chapter.

  14. Computer Based Procedures for Field Workers - FY16 Research Activities

    International Nuclear Information System (INIS)

    Oxstrand, Johanna; Bly, Aaron

    2016-01-01

    The Computer-Based Procedure (CBP) research effort is a part of the Light-Water Reactor Sustainability (LWRS) Program, which provides the technical foundations for licensing and managing the long-term, safe, and economical operation of current nuclear power plants. One of the primary missions of the LWRS program is to help the U.S. nuclear industry adopt new technologies and engineering solutions that facilitate the continued safe operation of the plants and extension of the current operating licenses. One area that could yield tremendous savings in increased efficiency and safety is in improving procedure use. A CBP provides the opportunity to incorporate context-driven job aids, such as drawings, photos, and just-in-time training. The presentation of information in CBPs can be much more flexible and tailored to the task, actual plant condition, and operation mode. The dynamic presentation of the procedure will guide the user down the path of relevant steps, thus minimizing time spent by the field worker to evaluate plant conditions and decisions related to the applicability of each step. This dynamic presentation of the procedure also minimizes the risk of conducting steps out of order and/or incorrectly assessed applicability of steps. This report provides a summary of the main research activities conducted in the Computer-Based Procedures for Field Workers effort since 2012. The main focus of the report is on the research activities conducted in fiscal year 2016. The activities discussed are the Nuclear Electronic Work Packages - Enterprise Requirements initiative, the development of a design guidance for CBPs (which compiles all insights gained through the years of CBP research), the facilitation of vendor studies at the Idaho National Laboratory (INL) Advanced Test Reactor (ATR), a pilot study for how to enhance the plant design modification work process, the collection of feedback from a field evaluation study at Plant Vogtle, and path forward to

  15. Computer Based Procedures for Field Workers - FY16 Research Activities

    Energy Technology Data Exchange (ETDEWEB)

    Oxstrand, Johanna [Idaho National Lab. (INL), Idaho Falls, ID (United States); Bly, Aaron [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-09-01

    The Computer-Based Procedure (CBP) research effort is a part of the Light-Water Reactor Sustainability (LWRS) Program, which provides the technical foundations for licensing and managing the long-term, safe, and economical operation of current nuclear power plants. One of the primary missions of the LWRS program is to help the U.S. nuclear industry adopt new technologies and engineering solutions that facilitate the continued safe operation of the plants and extension of the current operating licenses. One area that could yield tremendous savings in increased efficiency and safety is in improving procedure use. A CBP provides the opportunity to incorporate context-driven job aids, such as drawings, photos, and just-in-time training. The presentation of information in CBPs can be much more flexible and tailored to the task, actual plant condition, and operation mode. The dynamic presentation of the procedure will guide the user down the path of relevant steps, thus minimizing time spent by the field worker to evaluate plant conditions and decisions related to the applicability of each step. This dynamic presentation of the procedure also minimizes the risk of conducting steps out of order and/or incorrectly assessed applicability of steps. This report provides a summary of the main research activities conducted in the Computer-Based Procedures for Field Workers effort since 2012. The main focus of the report is on the research activities conducted in fiscal year 2016. The activities discussed are the Nuclear Electronic Work Packages – Enterprise Requirements initiative, the development of a design guidance for CBPs (which compiles all insights gained through the years of CBP research), the facilitation of vendor studies at the Idaho National Laboratory (INL) Advanced Test Reactor (ATR), a pilot study for how to enhance the plant design modification work process, the collection of feedback from a field evaluation study at Plant Vogtle, and path forward to

  16. Simulation-based artifact correction (SBAC) for metrological computed tomography

    Science.gov (United States)

    Maier, Joscha; Leinweber, Carsten; Sawall, Stefan; Stoschus, Henning; Ballach, Frederic; Müller, Tobias; Hammer, Michael; Christoph, Ralf; Kachelrieß, Marc

    2017-06-01

    Computed tomography (CT) is a valuable tool for the metrological assessment of industrial components. However, the application of CT to the investigation of highly attenuating objects or multi-material components is often restricted by the presence of CT artifacts caused by beam hardening, x-ray scatter, off-focal radiation, partial volume effects or the cone-beam reconstruction itself. In order to overcome this limitation, this paper proposes an approach to calculate a correction term that compensates for the contribution of artifacts and thus enables an appropriate assessment of these components using CT. To this end, we make use of computer simulations of the CT measurement process. Based on an appropriate model of the object, e.g. an initial reconstruction or a CAD model, two simulations are carried out. One simulation considers all physical effects that cause artifacts, using dedicated analytic methods as well as Monte Carlo-based models. The other one represents an ideal CT measurement, i.e. a measurement in parallel beam geometry with a monochromatic, point-like x-ray source and no x-ray scattering. Thus, the difference between these simulations is an estimate of the present artifacts and can be used to correct the acquired projection data or the corresponding CT reconstruction, respectively. The performance of the proposed approach is evaluated using simulated as well as measured data of single and multi-material components. Our approach yields CT reconstructions that are nearly free of artifacts and thereby clearly outperforms commonly used artifact reduction algorithms in terms of image quality. A comparison against tactile reference measurements demonstrates the ability of the proposed approach to increase the accuracy of the metrological assessment significantly.
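
    The correction logic reduces to a difference of two simulations; a minimal sketch with stand-in stubs for the realistic and ideal simulators (the stubs are our assumptions, the subtraction scheme follows the abstract):

```python
import numpy as np

# Minimal sketch of simulation-based artifact correction (SBAC) as described:
# simulate the measurement once with all physical effects and once under
# ideal conditions; their difference estimates the artifact contribution,
# which is subtracted from the measured projections. The simulators here are
# toy stubs (assumptions), not actual Monte Carlo / polychromatic models.

def simulate_realistic(model: np.ndarray) -> np.ndarray:
    """Stub: projections including beam hardening, scatter, off-focal..."""
    return model + 0.05 * model**2 + 0.01          # toy nonlinearity + offset

def simulate_ideal(model: np.ndarray) -> np.ndarray:
    """Stub: monochromatic, point-source, scatter-free parallel-beam sim."""
    return model                                    # toy ideal projector

measured = np.array([0.2, 0.5, 0.9, 0.5, 0.2])      # acquired projection data
model = np.array([0.2, 0.5, 0.9, 0.5, 0.2])         # initial recon / CAD model

correction = simulate_realistic(model) - simulate_ideal(model)
corrected = measured - correction                   # artifact-compensated data
print(corrected)
```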

  17. Iris features-based heart disease diagnosis by computer vision

    Science.gov (United States)

    Nguchu, Benedictor A.; Li, Li

    2017-07-01

    The study takes advantage of several new breakthroughs in computer vision technology to develop a new iris-based biomedical platform that processes iris images for early detection of heart disease. Guaranteeing early detection of heart disease offers the possibility of non-surgical treatment, as suggested by biomedical researchers and associated institutions. However, our observations revealed that a clinically practicable solution that is both sensitive and specific for early detection is still lacking. As a result, mortality among the vulnerable population keeps increasing; delayed diagnostic procedures and the inefficiency and complications of available methods are further reasons for this catastrophe. Therefore, this research proposes the novel IFB (Iris Features Based) method for the diagnosis of premature and early-stage heart disease. The method incorporates computer vision and iridology to obtain a robust, non-contact, non-radioactive, and cost-effective diagnostic tool. The method analyzes inherent weaknesses in tissues and changes in the color and patterns of the specific region of the iris that responds to impulses of the heart organ, as per the Bernard Jensen iris chart. Changes in the iris indicate the presence of degenerative abnormalities in the heart. These changes are precisely detected and analyzed by the IFB method, which includes tensor-based gradients (TBG), multi-orientation Gabor filters (GF), textural oriented features (TOF), and speeded-up robust features (SURF). Kernel and multiclass support vector machine classifiers are used for classifying normal and pathological iris features. Experimental results demonstrated that the proposed method not only has better diagnostic performance, but also provides insight for early detection of other diseases.

  18. The Effectiveness of a School-Based Intervention for Adolescents in Reducing Disparities in the Negative Consequences of Substance Use Among Ethnic Groups.

    Science.gov (United States)

    Stewart, David G; Moise-Campbell, Claudine; Chapman, Meredith K; Varma, Malini; Lehinger, Elizabeth

    2017-06-01

    Ethnic minority youth are disproportionately affected by substance use-related consequences, which may be best understood through a social ecological lens. Differences in psychosocial consequences between ethnic majority and minority groups are likely due to underlying social and environmental factors. The current longitudinal study examined the outcomes of a school-based motivational enhancement treatment intervention in reducing disparities in the substance use consequences experienced by some ethnic minority groups, with both between- and within-subjects differences. Students were referred to the intervention through school personnel and participated in a four-session intervention targeting alcohol and drug use. Participants included 122 youth aged 13-19 years. Participants were grouped by ethnicity and likelihood of disparate negative consequences of substance use. African American/Hispanic/Multiethnic youth formed one group, and youth identifying as White or Asian formed a second group. We hypothesized that (1) there would be significant disparities in psychosocial, serious problem behavior, and school-based consequences of substance use between White/Asian students compared to African American/Hispanic/Multiethnic students at baseline; (2) physical dependence consequences would not be disparate at baseline; and (3) overall disparities would be reduced at post-treatment follow-up. Results indicated that African American/Hispanic/Multiethnic adolescents demonstrated statistically significant disparate consequences at baseline, except for physical dependency consequences. Lastly, significant reductions in disparities were evidenced between groups over time. Our findings highlight the efficacy of utilizing school-based substance use interventions in decreasing ethnic health disparities in substance use consequences.

  19. A Computer-based 21st Century Prototype

    Directory of Open Access Journals (Sweden)

    Pannathon Sangarun

    2015-01-01

    Full Text Available Abstract This paper describes a prototype computer-based reading comprehension program. It begins with a short description, at a general level, of theoretical issues relating to the learning of comprehension skills in foreign/second language learning. These issues cover such areas as personal meaning-making on the basis of individual differences and the need for individualized intervention to maximize the comprehension process. Modern technology facilitates this process and enables simultaneous support of large numbers of students. Specifically, from a learning perspective, the program focuses on students' personal understandings while, from a reading perspective, the construction of meaning is based on an interactive model where high-level (global, inferential) structures are elicited/studied as well as low-level structures (e.g. vocabulary, grammar). These principles are strengthened with research findings from studies in awareness and language processing based on eye-movement analysis. As part of its reading comprehension focus, the system also has a strong commitment to the development of critical thinking skills, recognized as one of the most important 21st Century skills. The program is then described in detail, including its ability to store students' responses and to be administered through standard learning management systems. Finally, an outline of planned future developments and enhancements is presented.

  20. Optical computation based on nonlinear total reflectional optical ...

    Indian Academy of Sciences (India)

    School of Education Science, South China Normal University, Guangzhou, 510631, China. *Corresponding ... Before the computation, all the inputs are prepared in the polarization state. The key .... The all-optical computing system described.

  1. Phoneme-based speech segmentation using hybrid soft computing framework

    CERN Document Server

    Sarma, Mousmita

    2014-01-01

    The book discusses intelligent system design using soft computing and similar systems and their interdisciplinary applications. It also focuses on the recent trends to use soft computing as a versatile tool for designing a host of decision support systems.

  2. A computationally efficient approach for template matching-based ...

    Indian Academy of Sciences (India)

    In this paper, a new computationally efficient image registration method is ...... the proposed method requires less computational time as compared to traditional methods. ... Zitová B and Flusser J 2003 Image registration methods: A survey.

  3. Optical computation based on nonlinear total reflectional optical ...

    Indian Academy of Sciences (India)

    Optical computing; beam splitter; optical switch; polarized beams. ... main research direction called quantum information and quantum computation is .... above has several advantages: Firstly, it is easy to be integrated with appropriate.

  4. Comparability of Computer-based and Paper-based Versions of Writing Section of PET in Iranian EFL Context

    OpenAIRE

    Mohammad Mohammadi; Masoud Barzgaran

    2010-01-01

    Computer technology has provided language testing experts with the opportunity to develop computerized versions of traditional paper-based language tests. New generations of TOEFL and Cambridge IELTS, BULATS, KET, and PET are good examples of computer-based language tests. Since this new method of testing introduces new factors into the realm of language assessment (e.g. modes of test delivery, familiarity with computers, etc.), the question may be whether the two modes of computer- and paper-based te...

  5. Brain-computer interface based on intermodulation frequency

    Science.gov (United States)

    Chen, Xiaogang; Chen, Zhikai; Gao, Shangkai; Gao, Xiaorong

    2013-12-01

    Objective. Most recent steady-state visual evoked potential (SSVEP)-based brain-computer interface (BCI) systems have used a single frequency for each target, so that a large number of targets requires a large number of stimulus frequencies and therefore a wider frequency band. However, human beings show good SSVEP responses only in a limited range of frequencies. Furthermore, this issue is especially problematic if the SSVEP-based BCI uses a PC monitor as a stimulator, which is only capable of generating a limited range of frequencies. To mitigate this issue, this study presents an innovative coding method for SSVEP-based BCI by means of intermodulation frequencies. Approach. Simultaneous modulations of stimulus luminance and color at different frequencies were utilized to induce intermodulation frequencies. Luminance flickered at relatively high frequencies (10, 12, 15 Hz), while color alternated at low frequencies (0.5, 1 Hz). An attractive feature of the proposed method is that it substantially increases the number of targets available at a single flickering frequency by altering the color-modulation frequency. Based on this method, the BCI system presented in this study realized eight targets using merely three flickering frequencies. Main results. The online results obtained from 15 subjects (14 healthy and 1 with stroke) revealed that an average classification accuracy of 93.83% and an information transfer rate (ITR) of 33.80 bit min-1 were achieved using our proposed SSVEP-based BCI system. Specifically, 5 out of the 15 subjects exhibited an ITR of 40.00 bit min-1 with a classification accuracy of 100%. Significance. These results suggest that intermodulation frequencies can be adopted as steady responses in BCI, for which our system could serve as a practical BCI system.
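
    A minimal sketch of the target-coding arithmetic: joint luminance flicker at f_lum and color alternation at f_col evoke nonlinear responses at intermodulation frequencies f_lum ± f_col, so each (f_lum, f_col) pair labels a distinct target without adding flicker frequencies. The frequencies follow the abstract; the exact mapping that yields eight targets is an assumption.

```python
# Minimal sketch (coding arithmetic only): each target is a (f_lum, f_col)
# pair; nonlinear visual responses appear at intermodulation frequencies
# f_lum +/- f_col, which a classifier can separate even though only three
# flicker frequencies are used. Frequencies follow the abstract.

lum_freqs = [10.0, 12.0, 15.0]      # Hz, luminance flicker
col_freqs = [0.5, 1.0]              # Hz, color alternation

for f_lum in lum_freqs:
    for f_col in col_freqs:
        lo, hi = f_lum - f_col, f_lum + f_col   # first-order IM sidebands
        print(f"target ({f_lum} Hz, {f_col} Hz): IM at {lo} Hz and {hi} Hz")

# 3 x 2 = 6 intermodulation-coded targets; together with pure-flicker targets
# this reaches the eight targets the abstract reports from only three
# flickering frequencies (the exact mapping to 8 targets is our assumption).
```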

  6. Comparison of computed tomography scout based reference point localization to conventional film and axial computed tomography.

    Science.gov (United States)

    Jiang, Lan; Templeton, Alistair; Turian, Julius; Kirk, Michael; Zusag, Thomas; Chu, James C H

    2011-01-01

    Identification of source positions after implantation is an important step in brachytherapy planning. Reconstruction is traditionally performed from films taken by conventional simulators, but these are gradually being replaced in the clinic by computed tomography (CT) simulators. The present study explored the use of a scout image-based reconstruction algorithm that replaces the use of traditional film while exhibiting low sensitivity to the metal-induced artifacts that can appear in 3D CT methods. In addition, the accuracy of an in-house graphical software implementation of scout-based reconstruction was compared with seed location reconstructions for 2 phantoms by conventional simulator and CT measurements. One phantom was constructed using a planar fixed grid of 1.5-mm diameter ball bearings (BBs) with 40-mm spacing. The second was a Fletcher-Suit applicator embedded in Styrofoam (Dow Chemical Co., Midland, MI) with one 3.2-mm diameter BB inserted into each of 6 surrounding holes. Conventional simulator, kilovoltage CT (kVCT), megavoltage CT, and scout-based methods were evaluated by their ability to calculate the distance between seeds (40 mm for the fixed grid, 30-120 mm in the Fletcher-Suit). All methods were able to reconstruct the fixed grid distances with an average deviation of <1%. The worst single deviations (approximately 6%) were exhibited by the 2 volumetric CT methods. In the Fletcher-Suit phantom, the intermodality agreement was within approximately 3%, with the conventional simulator measuring marginally larger distances and kVCT the smallest. All of the established reconstruction methods exhibited similar abilities to detect the distances between BBs. The 3D CT-based methods, with lower axial resolution, showed more variation, particularly with the smaller BBs. With a software implementation, scout-based reconstruction is an appealing approach because it simplifies data acquisition over film-based reconstruction without requiring any specialized equipment.

  7. Remedial action assessment system (RAAS) - A computer-based methodology for conducting feasibility studies

    International Nuclear Information System (INIS)

    Buelt, J.L.; Stottlemyre, J.A.; White, M.K.

    1991-01-01

    Because of the great complexity and number of potential waste sites facing the US Department of Energy (DOE) for potential cleanup, the DOE is supporting the development of a computer-based methodology to streamline the remedial investigation/feasibility study process required for DOE operable units. DOE operable units are generally more complex in nature because of the existence of multiple waste sites within many of the operable units and the presence of mixed radioactive and hazardous chemical wastes. Consequently, Pacific Northwest Laboratory (PNL) is developing the Remedial Action Assessment System (RAAS), which is aimed at screening, linking, and evaluating established technology process options in support of conducting feasibility studies under the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA). It is also intended to do the same in support of corrective measures studies required by the Resource Conservation and Recovery Act (RCRA). One of the greatest attributes of the RAAS project is that the computer interface with the user is being designed to be friendly, intuitive, and interactive. Consequently, the user interface employs menus, windows, help features, and graphical information while RAAS is in operation. During operation, each technology process option is represented by an "object" module. Object-oriented programming is then used to link these unit processes into remedial alternatives. In this way, various object modules representing technology process options can communicate so that a linked set of compatible processes forms an appropriate remedial alternative. Once the remedial alternatives are formed, they can be evaluated in terms of effectiveness, implementability, and cost

  8. Concordance-based Kendall's Correlation for Computationally-Light vs. Computationally-Heavy Centrality Metrics: Lower Bound for Correlation

    Directory of Open Access Journals (Sweden)

    Natarajan Meghanathan

    2017-01-01

    Full Text Available We identify three different levels of correlation (pair-wise relative ordering, network-wide ranking and linear regression) that could be assessed between a computationally-light centrality metric and a computationally-heavy centrality metric for real-world networks. The Kendall's concordance-based correlation measure can be used to quantitatively assess how well we may take the relative ordering of two vertices vi and vj with respect to a computationally-light centrality metric as their relative ordering with respect to a computationally-heavy centrality metric. We hypothesize that the pair-wise relative ordering (concordance-based) assessment of the correlation between centrality metrics is the strictest of the three levels of correlation, and claim that the Kendall's concordance-based correlation coefficient will be lower than the correlation coefficients observed with the more relaxed levels of correlation (the linear regression-based Pearson's product-moment correlation coefficient and the network-wide ranking-based Spearman's correlation coefficient). We validate our hypothesis by evaluating the three correlation coefficients between two sets of centrality metrics: the computationally-light degree and local clustering coefficient complement-based degree centrality metrics, and the computationally-heavy eigenvector centrality, betweenness centrality and closeness centrality metrics, for a diverse collection of 50 real-world networks.
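
    A minimal sketch of computing the three correlation levels for one light/heavy metric pair on a synthetic scale-free graph (the paper uses 50 real-world networks; the graph and metric pair here are illustrative):

```python
import networkx as nx
from scipy.stats import kendalltau, pearsonr, spearmanr

# Minimal sketch of the three correlation levels between one computationally
# light metric (degree centrality) and one computationally heavy metric
# (betweenness centrality), on a synthetic scale-free graph for illustration.

G = nx.barabasi_albert_graph(n=200, m=3, seed=7)
dc = nx.degree_centrality(G)            # cheap: O(V + E)
bc = nx.betweenness_centrality(G)       # expensive: shortest paths per node

light = [dc[v] for v in G.nodes]
heavy = [bc[v] for v in G.nodes]

tau, _ = kendalltau(light, heavy)   # level 1: pair-wise relative ordering
rho, _ = spearmanr(light, heavy)    # level 2: network-wide ranking
r, _ = pearsonr(light, heavy)       # level 3: linear regression
print(f"Kendall tau = {tau:.3f}, Spearman rho = {rho:.3f}, Pearson r = {r:.3f}")
# Per the hypothesis, tau is expected to be the lowest of the three, since
# concordance over all vertex pairs is the strictest notion of agreement.
```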

  9. Choice & Consequence

    DEFF Research Database (Denmark)

    Khan, Azam

    to support hypothesis generation, hypothesis testing, and decision making. In addition to sensors in buildings, infrastructure, or the environment, we also propose the instrumentation of user interfaces to help measure performance in decision making applications. We show the benefits of applying principles...... between cause and effect in complex systems complicates decision making. To address this issue, we examine the central role that data-driven decision making could play in critical domains such as sustainability or medical treatment. We developed systems for exploratory data analysis and data visualization...... of data analysis and instructional interface design, to both simulation systems and decision support interfaces. We hope that projects such as these will help people to understand the link between their choices and the consequences of their decisions....

  10. Reconfigurable computing the theory and practice of FPGA-based computation

    CERN Document Server

    Hauck, Scott

    2010-01-01

    Reconfigurable Computing marks a revolutionary and hot topic that bridges the gap between the separate worlds of hardware and software design; the key feature of reconfigurable computing is its groundbreaking ability to perform computations in hardware to increase performance while retaining the flexibility of a software solution. Reconfigurable computers serve as affordable, fast, and accurate tools for developing designs ranging from single-chip architectures to multi-chip and embedded systems. Scott Hauck and Andre DeHon have assembled a group of the key experts in the fields of both hardwa

  11. Hand eczema in hairdressers: a Danish register-based study of the prevalence of hand eczema and its career consequences.

    Science.gov (United States)

    Lysdal, Susan Hovmand; Søsted, Heidi; Andersen, Klaus Ejner; Johansen, Jeanne Duus

    2011-09-01

    Occupational hand eczema is common in hairdressers, owing to wet work and hairdressing chemicals. To estimate the prevalence of hand eczema and its career consequences among hairdressers in Denmark. A register-based study was conducted, comprising all graduates from hairdressing vocational schools from 1985 to 2007 (n = 7840). The participants received a self-administered postal questionnaire including questions on hand eczema, atopic dermatitis, and career change. A response rate of 67.9% (n = 5324) was obtained. Of the respondents, 44.3% no longer worked as hairdressers and had worked for an average of 8.4 years in the profession before leaving it. Hand eczema was more common among ex-hairdressers (48.4%) than among current hairdressers (37.6%) (p reason for career change. In this group, logistic regression analysis showed that chronic hand eczema contributed the most to the decision to change career (odds ratio 50.12; 95% confidence interval 18.3-137). Hairdressers work an average of 8.4 years in the profession before leaving it, and hand eczema contributes significantly to this career change. © 2011 John Wiley & Sons A/S.

  12. Risk factors and consequences of maternal anaemia and elevated haemoglobin levels during pregnancy: a population-based prospective cohort study.

    Science.gov (United States)

    Gaillard, Romy; Eilers, Paul H C; Yassine, Siham; Hofman, Albert; Steegers, Eric A P; Jaddoe, Vincent W V

    2014-05-01

    To determine sociodemographic and life style-related risk factors and trimester specific maternal, placental, and fetal consequences of maternal anaemia and elevated haemoglobin levels in pregnancy. In a population-based prospective cohort study of 7317 mothers, we measured haemoglobin levels in early pregnancy [gestational age median 14.4 weeks (inter-quartile-range 12.5-17.5)]. Anaemia (haemoglobin ≤11 g/dl) and elevated haemoglobin levels (haemoglobin ≥13.2 g/dl) were defined according to the WHO criteria. Maternal blood pressure, placental function and fetal growth were measured in each trimester. Data on gestational hypertensive disorders and birth outcomes was collected from hospitals. Older maternal age, higher body mass index, primiparity and European descent were associated with higher haemoglobin levels (P pregnancy (mean differences 5.1 mmHg, 95% confidence interval [CI] 3.8, 6.5 and 4.1 mmHg, 95% CI 3.0, 5.2, respectively) and with a higher risk of third trimester uterine artery notching (RR 1.3, 95% CI 1.0, 1.7). As compared with maternal normal haemoglobin levels, not anaemia, but elevated haemoglobin levels were associated with fetal head circumference, length, and weight growth restriction from third trimester onwards (P pregnancy. Elevated haemoglobin levels are associated with increased risks of maternal, placental, and fetal complications. © 2014 John Wiley & Sons Ltd.

  13. Computer-based multisensory learning in children with developmental dyslexia.

    Science.gov (United States)

    Kast, Monika; Meyer, Martin; Vögeli, Christian; Gross, Markus; Jäncke, Lutz

    2007-01-01

    Several attempts have been made to remediate developmental dyslexia using various training environments. Based on the well-known retrieval structure model, the memory strength of phonemes and graphemes should be strengthened by visual and auditory associations between graphemes and phonemes. Using specifically designed training software, we examined whether establishing a multitude of visuo-auditory associations might help to mitigate writing errors in children with developmental dyslexia. Forty-three children with developmental dyslexia and 37 carefully matched normal-reading children performed computer-based writing training (15-20 minutes, 4 days a week) for three months, with the aim of recoding a sequential textual input string into a multi-sensory representation comprising visual and auditory codes (including musical tones). The study included four matched groups: a group of children with developmental dyslexia (n=20) and a control group (n=18) practiced with the training software in the first period (3 months, 15-20 minutes, 4 days a week), while a second group of children with developmental dyslexia (n=23) (waiting group) and a second control group (n=19) received no training during the first period. In the second period, the children with developmental dyslexia and controls who did not receive training during the first period took part in the training. Children with developmental dyslexia who did not perform computer-based training during the first period hardly improved their writing skills (post-pre improvement of 0-9%), whereas the dyslexic children receiving training strongly improved their writing skills (post-pre improvement of 19-35%). The group who did the training during the second period also showed improved writing skills (post-pre improvement of 27-35%). Interestingly, we noticed a strong transfer from trained to non-trained words, in that the children who underwent the training were also better able to write words correctly that were not part

  14. Computer-Based Auditory Training Programs for Children with Hearing Impairment – A Scoping Review

    Science.gov (United States)

    Nanjundaswamy, Manohar; Prabhu, Prashanth; Rajanna, Revathi Kittur; Ningegowda, Raghavendra Gulaganji; Sharma, Madhuri

    2018-01-01

    Introduction  Communication breakdown, a consequence of hearing impairment (HI), has been fought by fitting amplification devices and providing auditory training since the inception of audiology. The advances in both audiology and rehabilitation programs have led to the advent of computer-based auditory training programs (CBATPs). Objective  To review the existing literature documenting evidence-based CBATPs for children with HI. Since there was only one such article, we also chose to review the commercially available CBATPs for children with HI. The strengths and weaknesses of the existing literature were reviewed in order to inform further research. Data Synthesis  Google Scholar and PubMed databases were searched using various combinations of keywords. The participant, intervention, control, outcome and study design (PICOS) criteria were used for the inclusion of articles. Out of 124 article abstracts reviewed, 5 studies were shortlisted for detailed reading. One of them satisfied all the criteria and was taken for review. The commercially available programs were chosen based on an extensive search in Google. The reviewed article was well-structured, with appropriate outcomes. The commercially available programs cover many aspects of auditory training through a wide range of stimuli and activities. Conclusions  There is a dire need for extensive research in the field of CBATPs to establish their efficacy and to establish them as evidence-based practices. PMID:29371904

  15. Markov analysis of different standby computer based systems

    International Nuclear Information System (INIS)

    Srinivas, G.; Guptan, Rajee; Mohan, Nalini; Ghadge, S.G.; Bajaj, S.S.

    2006-01-01

    As against the conventional triplicated hardware systems of the early Indian PHWRs, in which control signals for the actuator elements were generated by redundant hardwired median circuits, a new approach of generating control signals in software by a redundant system of computers is introduced in the advanced/current generation of Indian PHWRs. Reliability is increased by fault diagnostics and automatic switchover of all loads to one computer in case of total failure of the other computer. Independent processing by a redundant CPU in each system enables inter-comparison to quickly identify system failure, in addition to the other self-diagnostic features provided. Combinatorial models such as reliability block diagrams and fault trees are frequently used to predict the reliability, maintainability, and safety of complex systems. Unfortunately, these methods cannot accurately model dynamic system behavior. Because of its unique ability to handle dynamic cases, Markov analysis can be a powerful tool in the reliability, maintainability, and safety (RMS) analyses of dynamic systems. A Markov model breaks the system configuration into a number of states, each connected to all other states by transition rates. It then utilizes transition matrices to evaluate the reliability and safety of the system, either through matrix manipulation or other analytical solution methods, such as Laplace transforms. Markov analysis thus allows the analyst to model complex, dynamic, highly distributed, fault-tolerant systems that would otherwise be very difficult to model using classical techniques like the fault tree method. The Dual Processor Hot Standby Process Control System (DPHS-PCS) and the Computerized Channel Temperature Monitoring System (CCTM) are typical examples of hot standby systems in the Indian PHWRs. While such systems currently in use in Indian PHWR
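    As a rough sketch of the technique the abstract describes, the fragment below builds a three-state Markov model of a dual hot-standby system (both computers up, one up, none up) and propagates the state probabilities through the generator matrix; the failure and repair rates are placeholders, not values from the paper.

```python
import numpy as np
from scipy.linalg import expm

lam, mu = 1e-4, 1e-2                      # hypothetical failure/repair rates (per hour)
Q = np.array([[-2*lam,      2*lam,  0.0],   # generator matrix: Q[i, j] is the
              [    mu, -(mu+lam),  lam],    # transition rate from state i to j;
              [   0.0,        mu,  -mu]])   # each row sums to zero

p0 = np.array([1.0, 0.0, 0.0])            # start with both computers up
t = 1000.0                                 # mission time in hours
pt = p0 @ expm(Q * t)                      # transient state probabilities at time t
availability = pt[0] + pt[1]               # at least one computer still up
print(f"state probabilities: {pt}, availability: {availability:.6f}")
```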

  16. Comparison of Computer-Based Versus Counselor-Based Occupational Information Systems with Disadvantaged Vocational Students

    Science.gov (United States)

    Maola, Joseph; Kane, Gary

    1976-01-01

    Subjects, who were Occupational Work Experience students, were randomly assigned to individual guidance from either a computerized occupational information system, to a counselor-based information system or to a control group. Results demonstrate a hierarchical learning effect: The computer group learned more than the counseled group, which…

  17. Computational Model-Based Design of Leadership Support Based on Situational Leadership Theory

    NARCIS (Netherlands)

    Bosse, T.; Duell, R.; Memon, Z.A.; Treur, J.; van der Wal, C.N.

    2017-01-01

    This paper introduces the design of an agent-based leadership support system exploiting a computational model for development of individuals or groups. It is to be used, for example, as a basis for systems to support a group leader in the development of individual group members or a group as a

  18. Evaluating Computer-Based Assessment in a Risk-Based Model

    Science.gov (United States)

    Zakrzewski, Stan; Steven, Christine; Ricketts, Chris

    2009-01-01

    There are three purposes for evaluation: evaluation for action to aid the decision-making process, evaluation for understanding to further enhance enlightenment, and evaluation for control to ensure compliance with standards. This article argues that the primary function of evaluation in the "Catherine Wheel" computer-based assessment (CBA)…

  19. USE OF ONTOLOGIES FOR KNOWLEDGE BASES CREATION TUTORING COMPUTER SYSTEMS

    Directory of Open Access Journals (Sweden)

    Cheremisina Lyubov

    2014-11-01

    Full Text Available This paper deals with the use of ontologies in the design and development of intelligent tutoring systems. We consider the shortcomings of educational software and distance-learning systems and the advantages of using ontologies in their design; these issues motivate the creation of educational computer systems based on systematic knowledge. We consider the classification, properties, uses, and benefits of ontologies, and characterize two approaches to the ontology-mapping problem: the first is manual mapping; the second compares the names of concepts based on their lexical similarity, using special dictionaries. The languages available for the formal description of ontologies are analysed. A formal mathematical model of ontologies is considered, along with the ontology-consistency problem: different developers may create syntactically or semantically heterogeneous ontologies for the same domain, and their joint use then requires a compatible translation or mapping. An algorithm for combining ontologies is presented. The practical value of developing an ontology for electronic educational resources is characterized, and recommendations for further research and development are given, such as the implementation of other components of system integration, formalization of the integration processes, and development of universal algorithms for extending ontologies.

  1. A computer vision based candidate for functional balance test.

    Science.gov (United States)

    Nalci, Alican; Khodamoradi, Alireza; Balkan, Ozgur; Nahab, Fatta; Garudadri, Harinath

    2015-08-01

    Balance in humans is a motor skill based on complex multimodal sensing, processing, and control. The ability to maintain balance in activities of daily living (ADL) is compromised by aging, diseases, injuries, and environmental factors. The Centers for Disease Control and Prevention (CDC) estimated the cost of falls among older adults at $34 billion in 2013, a figure expected to reach $54.9 billion in 2020. In this paper, we present a brief review of balance impairments, followed by the subjective and objective tools currently used in clinical settings for human balance assessment. We propose a novel computer vision (CV) based approach as a candidate for a functional balance test. The test takes less than a minute to administer and is expected to be objective, repeatable, and highly discriminative in quantifying the ability to maintain posture and balance. We present an informal study with preliminary data from 10 healthy volunteers and compare performance with a balance assessment system called the BTrackS Balance Assessment Board. Our results show a high degree of correlation with BTrackS. The proposed system promises to be a good candidate for objective functional balance tests and warrants further investigation to assess validity in clinical settings, including acute care, long-term care, and assisted-living care facilities. Our long-term goals include non-intrusive approaches to assess balance competence during ADL in independent living environments.

  2. Towards SSVEP-based, portable, responsive Brain-Computer Interface.

    Science.gov (United States)

    Kaczmarek, Piotr; Salomon, Pawel

    2015-08-01

    A Brain-Computer Interface in motion control applications requires high system responsiveness and accuracy. An SSVEP interface consisting of 2-8 stimuli and a 2-channel EEG amplifier is presented in this paper. The observed stimulus is recognized based on a canonical correlation calculated in a 1-second window, ensuring high interface responsiveness. A threshold classifier with hysteresis (T-H) is proposed for recognition purposes. The obtained results suggest that the T-H classifier significantly increases classification performance (resulting in an accuracy of 76%, while maintaining an average false-positive detection rate for stimuli other than the observed one of 2-13%, depending on stimulus frequency). It is shown that the parameters of the T-H classifier that maximize the true-positive rate can be estimated by gradient-based search, since a single maximum was observed. Moreover, preliminary results obtained on a test group (N=4) suggest that for the T-H classifier there exists a set of parameters for which the system accuracy is similar to that obtained with a user-trained classifier.
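    The recognition pipeline described above (canonical correlation over a short window, then a threshold classifier with hysteresis) can be sketched in a few lines of numpy; the reference-harmonic count and the two thresholds below are placeholder choices, not the authors' values.

```python
import numpy as np

def canonical_corr(X, Y):
    """Largest canonical correlation between the column spaces of X and Y."""
    Qx, _ = np.linalg.qr(X - X.mean(axis=0))
    Qy, _ = np.linalg.qr(Y - Y.mean(axis=0))
    return np.linalg.svd(Qx.T @ Qy, compute_uv=False)[0]

def ssvep_scores(eeg, freqs, fs):
    """CCA score per stimulus frequency for one short analysis window.
    eeg: (n_samples, n_channels); a 1-second window keeps the interface responsive."""
    t = np.arange(eeg.shape[0]) / fs
    refs = [np.column_stack([np.sin(2*np.pi*f*t), np.cos(2*np.pi*f*t),
                             np.sin(4*np.pi*f*t), np.cos(4*np.pi*f*t)])  # 2 harmonics
            for f in freqs]
    return np.array([canonical_corr(eeg, r) for r in refs])

def threshold_hysteresis(score, active, t_on=0.45, t_off=0.35):
    """T-H decision: turn on above t_on, stay on until the score drops below t_off."""
    return score > (t_off if active else t_on)
```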

  3. Diagnostic reliability of MMPI-2 computer-based test interpretations.

    Science.gov (United States)

    Pant, Hina; McCabe, Brian J; Deskovitz, Mark A; Weed, Nathan C; Williams, John E

    2014-09-01

    Reflecting the common use of the MMPI-2 to provide diagnostic considerations, computer-based test interpretations (CBTIs) also typically offer diagnostic suggestions. However, these diagnostic suggestions can sometimes be shown to vary widely across different CBTI programs, even for identical MMPI-2 profiles. The present study evaluated the diagnostic reliability of 6 commercially available CBTIs using a 20-item Q-sort task developed for this study. Four raters each sorted diagnostic classifications based on these 6 CBTI reports for 20 MMPI-2 profiles. Two questions were addressed. First, do users of CBTIs understand the diagnostic information contained within the reports similarly? Overall, diagnostic sorts of the CBTIs showed moderate inter-interpreter diagnostic reliability (mean r = .56), with sorts for the 1/2/3 profile showing the highest inter-interpreter diagnostic reliability (mean r = .67). Second, do different CBTI programs vary with respect to diagnostic suggestions? It was found that diagnostic sorts of the CBTIs had a mean inter-CBTI diagnostic reliability of r = .56, indicating moderate but not strong agreement across CBTIs in terms of diagnostic suggestions. The strongest inter-CBTI diagnostic agreement was found for sorts of the 1/2/3 profile CBTIs (mean r = .71). Limitations and future directions are discussed. PsycINFO Database Record (c) 2014 APA, all rights reserved.
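    The reliability index used above, a mean inter-rater correlation over Q-sorts, reduces to averaging pairwise Pearson coefficients. A minimal sketch with random stand-in ratings (not the study's data):

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(0)
sorts = rng.normal(size=(20, 6))   # 20 Q-sort items x 6 CBTI-derived sorts (stand-in)

pairwise_r = [np.corrcoef(sorts[:, i], sorts[:, j])[0, 1]
              for i, j in combinations(range(sorts.shape[1]), 2)]
print(f"mean inter-sort r = {np.mean(pairwise_r):.2f}")
```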

  4. Computer Based Porosity Design by Multi Phase Topology Optimization

    Science.gov (United States)

    Burblies, Andreas; Busse, Matthias

    2008-02-01

    A numerical simulation technique called Multi Phase Topology Optimization (MPTO), based on the finite element method, has been developed and refined by Fraunhofer IFAM during the last five years. MPTO is able to determine the optimum distribution of two or more different materials in components under thermal and mechanical loads. The objective of the optimization is to minimize the component's elastic energy. Conventional topology optimization methods, which simulate adaptive bone mineralization, have the disadvantage that mass changes continuously through growth processes. MPTO keeps all initial material concentrations and uses methods adapted from molecular dynamics to find the energy minimum. Applying MPTO to mechanically loaded components with a high number of different material densities, the optimization results show graded and sometimes anisotropic porosity distributions which are very similar to natural bone structures. It is now possible to design the macro- and microstructure of a mechanical component in one step. Computer-based porosity-design structures can be manufactured by new rapid prototyping technologies. Fraunhofer IFAM has successfully applied 3D printing and selective laser sintering methods in order to produce very stiff, lightweight components with graded porosities calculated by MPTO.

  5. Computer based aids for operator support in nuclear power plants

    International Nuclear Information System (INIS)

    1990-04-01

    In the framework of the Agency's programme on nuclear safety, a survey based on a questionnaire was carried out to collect information on computer-based aids for operator support in nuclear power plants in Member States. The intention was to put together a state-of-the-art report describing different systems under development or already implemented. This activity was also supported by an INSAG (International Nuclear Safety Advisory Group) recommendation. Two consultants' meetings were convened, and their work is reflected in the two sections of the technical document. The first section, produced during the first meeting, provides general background material on the overall usability of Computerized Operator Decision Aids (CODAs), their advantages and shortcomings. During this first meeting, the first draft of the questionnaire was also produced. The second section presents the evaluation of the 40 questionnaires received from 11 Member States and comprises a short description of each system and some statistical and comparative observations. The ultimate goal of this activity was to inform Member States, particularly those considering implementation of a CODA, of the status of related developments elsewhere. 8 refs, 10 figs, 4 tabs

  6. ARCHITECTURE OF WEB BASED COMPUTER-AIDED MANUFACTURING SYSTEM

    Directory of Open Access Journals (Sweden)

    N. E. Filyukov

    2014-09-01

    Full Text Available The paper deals with the design of a web-based system for Computer-Aided Manufacturing (CAM). Remote applications and databases located in the "private cloud" are proposed as the basis of such a system. The suggested approach comprises: a service-oriented architecture, the use of web applications and web services as modules, multi-agent technologies for implementing information exchange between the components of the system, and the use of a PDM system for managing technology projects within the CAM. The proposed architecture converts the CAM into a corporate information system that provides coordinated functioning of subsystems based on a common information space, parallelizes collective work on technology projects, and provides effective control of production planning. A system has been developed within this architecture that makes it possible to connect technological subsystems rather simply and to implement their interaction. The system makes it possible to configure the CAM for a particular company from the set of developed subsystems and databases, specifying appropriate access rights for the company's employees. The proposed approach simplifies maintenance of software and information support for CAM subsystems owing to their central location in the data center. The results can be used as a basis for CAM design and testing within the learning process, for development and modernization of the system algorithms, and can then be tested in the extended enterprise.

  7. fNIRS-based brain-computer interfaces: a review

    Directory of Open Access Journals (Sweden)

    Noman eNaseer

    2015-01-01

    Full Text Available A brain-computer interface (BCI) is a communication system that allows the use of brain activity to control computers or other external devices. By bypassing the peripheral nervous system, it can provide a means of communication for people suffering from severe motor disabilities or in a persistent vegetative state. In this paper, brain-signal generation tasks, noise removal methods, feature extraction/selection schemes, and classification techniques for fNIRS-based BCI are reviewed. The most common brain areas for fNIRS BCI are the primary motor cortex and the prefrontal cortex. In relation to the motor cortex, motor imagery tasks were preferred to motor execution tasks, since possible proprioceptive feedback could be avoided. In relation to the prefrontal cortex, fNIRS showed a significant advantage in detecting cognitive tasks like mental arithmetic, music imagery, and emotion induction, since no hair interferes with the optodes there. For removing physiological noise in fNIRS data, band-pass filtering was mostly used. However, more advanced techniques like adaptive filtering, independent component analysis, and multi-optode arrangements are being pursued to overcome the problem that a band-pass filter cannot be used when both brain and physiological signals occur within a close band. For extracting features related to the desired brain signal, the mean, variance, peak value, slope, skewness, and kurtosis of the noise-removed hemodynamic response were used. For classification, the linear discriminant analysis method provided simple but good performance compared with others such as the support vector machine, hidden Markov model, and artificial neural network. fNIRS will be more widely used to monitor the occurrence of neuro-plasticity after neuro-rehabilitation and neuro-stimulation. Technical breakthroughs in the future are expected via bundled-type probes, hybrid EEG-fNIRS BCI, and the detection of initial dips.

  8. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  9. The Evolution of Computer Based Learning Software Design: Computer Assisted Teaching Unit Experience.

    Science.gov (United States)

    Blandford, A. E.; Smith, P. R.

    1986-01-01

    Describes the style of design of computer simulations developed by Computer Assisted Teaching Unit at Queen Mary College with reference to user interface, input and initialization, input data vetting, effective display screen use, graphical results presentation, and need for hard copy. Procedures and problems relating to academic involvement are…

  10. Activity-based computing: computational management of activities reflecting human intention

    DEFF Research Database (Denmark)

    Bardram, Jakob E; Jeuris, Steven; Houben, Steven

    2015-01-01

    paradigm that has been applied in personal information management applications as well as in ubiquitous, multidevice, and interactive surface computing. ABC has emerged as a response to the traditional application- and file-centered computing paradigm, which is oblivious to a notion of a user’s activity...

  11. GeauxDock: Accelerating Structure-Based Virtual Screening with Heterogeneous Computing

    Science.gov (United States)

    Fang, Ye; Ding, Yun; Feinstein, Wei P.; Koppelman, David M.; Moreno, Juana; Jarrell, Mark; Ramanujam, J.; Brylinski, Michal

    2016-01-01

    Computational modeling of drug binding to proteins is an integral component of direct drug design. Particularly, structure-based virtual screening is often used to perform large-scale modeling of putative associations between small organic molecules and their pharmacologically relevant protein targets. Because of a large number of drug candidates to be evaluated, an accurate and fast docking engine is a critical element of virtual screening. Consequently, highly optimized docking codes are of paramount importance for the effectiveness of virtual screening methods. In this communication, we describe the implementation, tuning and performance characteristics of GeauxDock, a recently developed molecular docking program. GeauxDock is built upon the Monte Carlo algorithm and features a novel scoring function combining physics-based energy terms with statistical and knowledge-based potentials. Developed specifically for heterogeneous computing platforms, the current version of GeauxDock can be deployed on modern, multi-core Central Processing Units (CPUs) as well as massively parallel accelerators, Intel Xeon Phi and NVIDIA Graphics Processing Unit (GPU). First, we carried out a thorough performance tuning of the high-level framework and the docking kernel to produce a fast serial code, which was then ported to shared-memory multi-core CPUs yielding a near-ideal scaling. Further, using Xeon Phi gives 1.9× performance improvement over a dual 10-core Xeon CPU, whereas the best GPU accelerator, GeForce GTX 980, achieves a speedup as high as 3.5×. On that account, GeauxDock can take advantage of modern heterogeneous architectures to considerably accelerate structure-based virtual screening applications. GeauxDock is open-sourced and publicly available at www.brylinski.org/geauxdock and https://figshare.com/articles/geauxdock_tar_gz/3205249. PMID:27420300
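    At the heart of a Monte Carlo docking engine of this kind sits a Metropolis accept/reject step over ligand poses. A minimal sketch of that step (the pose representation, energy function, and temperature below are hypothetical, not GeauxDock's internals):

```python
import math
import random

def metropolis_step(energy_fn, pose, propose, temperature=0.6):
    """One Metropolis move: perturb the pose, always accept downhill moves,
    and accept uphill moves with Boltzmann probability exp(-dE/T)."""
    candidate = propose(pose)                     # e.g. jitter position/orientation
    dE = energy_fn(candidate) - energy_fn(pose)   # scoring-function difference
    if dE <= 0 or random.random() < math.exp(-dE / temperature):
        return candidate                          # accepted
    return pose                                   # rejected: keep the current pose
```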

  12. Accident consequence assessment code development

    International Nuclear Information System (INIS)

    Homma, T.; Togawa, O.

    1991-01-01

    This paper describes the new computer code system OSCAAR, developed for off-site consequence assessment of a potential nuclear accident. OSCAAR consists of several modules with modelling capabilities in atmospheric transport, foodchain transport, dosimetry, emergency response, and radiological health effects. The major modules of the consequence assessment code are described, highlighting the validation and verification of the models. (author)

  13. Reciprocity in computer-human interaction: source-based, norm-based, and affect-based explanations.

    Science.gov (United States)

    Lee, Seungcheol Austin; Liang, Yuhua Jake

    2015-04-01

    Individuals often apply social rules when they interact with computers, and this is known as the Computers Are Social Actors (CASA) effect. Following previous work, one approach to understanding the mechanism responsible for CASA is to utilize computer agents and have the agents attempt to gain human compliance (e.g., completing a pattern recognition task). The current study focuses on three key factors frequently cited to influence traditional notions of compliance: evaluations toward the source (competence and warmth), normative influence (reciprocity), and affective influence (mood). Structural equation modeling assessed the effects of these factors on human compliance with a computer agent's request. The final model shows that norm-based influence (reciprocity) increased the likelihood of compliance, while evaluations toward the computer agent did not significantly influence compliance.

  14. Improving Patient Satisfaction Through Computer-Based Questionnaires.

    Science.gov (United States)

    Smith, Matthew J; Reiter, Michael J; Crist, Brett D; Schultz, Loren G; Choma, Theodore J

    2016-01-01

    Patient-reported outcome measures are helping clinicians to use evidence-based medicine in decision making. The use of computer-based questionnaires to gather such data may offer advantages over traditional paper-based methods. These advantages include consistent presentation, prompts for missed questions, reliable scoring, and simple and accurate transfer of information into databases without manual data entry. The authors enrolled 308 patients over a 16-month period from 3 orthopedic clinics: spine, upper extremity, and trauma. Patients were randomized to complete either electronic or paper validated outcome forms during their first visit, and they completed the opposite modality at their second visit, which was approximately 7 weeks later. For patients with upper-extremity injuries, the Penn Shoulder Score (PSS) was used. For patients with lower-extremity injuries, the Foot Function Index (FFI) was used. For patients with lumbar spine symptoms, the Oswestry Disability Index (ODI) was used. All patients also were asked to complete the 36-Item Short Form Health Survey (SF-36) Health Status Survey, version 1. The authors assessed patient satisfaction with each survey modality and determined potential advantages and disadvantages for each. No statistically significant differences were found between the paper and electronic versions for patient-reported outcome data. However, patients strongly preferred the electronic surveys. Additionally, the paper forms had significantly more missed questions for the FFI (P<.0001), ODI (P<.0001), and PSS (P=.008), and patients were significantly less likely to complete these forms (P<.0001). Future research should focus on limiting the burden on responders, individualizing forms and questions as much as possible, and offering alternative environments for completion (home or mobile platforms). Copyright 2016, SLACK Incorporated.

  15. Students' Mathematics Word Problem-Solving Achievement in a Computer-Based Story

    Science.gov (United States)

    Gunbas, N.

    2015-01-01

    The purpose of this study was to investigate the effect of a computer-based story, designed in an anchored-instruction framework, on sixth-grade students' mathematics word problem-solving achievement. Problems were embedded in a story presented on a computer as a computer story, and then compared with the paper-based version of the same story…

  16. Enhancing Lecture Presentations in Introductory Biology with Computer-Based Multimedia.

    Science.gov (United States)

    Fifield, Steve; Peifer, Rick

    1994-01-01

    Uses illustrations and text to discuss convenient ways to organize and present computer-based multimedia to students in lecture classes. Includes the following topics: (1) Effects of illustrations on learning; (2) Using computer-based illustrations in lecture; (3) MacPresents-Multimedia Presentation Software; (4) Advantages of computer-based…

  17. Interactive, Computer-Based Training Program for Radiological Workers

    International Nuclear Information System (INIS)

    Trinoskey, P.A.; Camacho, P.I.; Wells, L.

    2000-01-01

    Lawrence Livermore National Laboratory (LLNL) is redesigning its Computer-Based Training (CBT) program for radiological workers. The redesign represents a major effort to produce a single, highly interactive and flexible CBT program that will meet the training needs of a wide range of radiological workers--from researchers and x-ray operators to individuals working in tritium, uranium, plutonium, and accelerator facilities. The new CBT program addresses the broad diversity of backgrounds found at a national laboratory. When a training audience is homogeneous in terms of education level and type of work performed, it is difficult to duplicate the effectiveness of a flexible, technically competent instructor who can tailor a course to the express needs and concerns of a course's participants. Unfortunately, such homogeneity is rare. At LLNL, they have a diverse workforce engaged in a wide range of radiological activities, from the fairly common to the quite exotic. As a result, the Laboratory must offer a wide variety of radiological worker courses. These include a general contamination-control course in addition to radioactive-material-handling courses for both low-level laboratory (i.e., bench-top) activities as well as high-level work in tritium, uranium, and plutonium facilities. They also offer training courses for employees who work with radiation-generating devices--x-ray, accelerator, and E-beam operators, for instance. However, even with the number and variety of courses the Laboratory offers, they are constrained by the diversity of backgrounds (i.e., knowledge and experience) of those to be trained. Moreover, time constraints often preclude in-depth coverage of site- and/or task-specific details. In response to this situation, several years ago LLNL began moving toward computer-based training for radiological workers. Today, that CBT effort includes a general radiological safety course developed by the Department of Energy's Hanford facility and a

  18. Incorporating electronic-based and computer-based strategies: graduate nursing courses in administration.

    Science.gov (United States)

    Graveley, E; Fullerton, J T

    1998-04-01

    The use of electronic technology allows faculty to improve their course offerings. Four graduate courses in nursing administration were contemporized to incorporate fundamental computer-based skills that would be expected of graduates in the work setting. Principles of adult learning offered a philosophical foundation that guided course development and revision. Course delivery strategies included computer-assisted instructional modules, e-mail interactive discussion groups, and use of the electronic classroom. Classroom seminar discussions and two-way interactive video conferencing focused on group resolution of problems derived from employment settings and assigned readings. Using these electronic technologies, a variety of courses can be revised to accommodate the learners' needs.

  19. Memristor-based nanoelectronic computing circuits and architectures

    CERN Document Server

    Vourkas, Ioannis

    2016-01-01

    This book considers the design and development of nanoelectronic computing circuits, systems and architectures focusing particularly on memristors, which represent one of today’s latest technology breakthroughs in nanoelectronics. The book studies, explores, and addresses the related challenges and proposes solutions for the smooth transition from conventional circuit technologies to emerging computing memristive nanotechnologies. Its content spans from fundamental device modeling to emerging storage system architectures and novel circuit design methodologies, targeting advanced non-conventional analog/digital massively parallel computational structures. Several new results on memristor modeling, memristive interconnections, logic circuit design, memory circuit architectures, computer arithmetic systems, simulation software tools, and applications of memristors in computing are presented. High-density memristive data storage combined with memristive circuit-design paradigms and computational tools applied t...

  1. Population-level consequences of spatially heterogeneous exposure to heavy metals in soil: An individual-based model of springtails

    DEFF Research Database (Denmark)

    Meli, Mattia; Auclerc, Apolline; Palmqvist, Annemette

    2013-01-01

    Contamination of soil with toxic heavy metals poses a major threat to the environment and human health. Anthropogenic sources include smelting of ores, municipal wastes, fertilizers, and pesticides. In assessing soil quality and the environmental and ecological risk of contamination with heavy metals, homogeneous contamination of the soil is often assumed. However, soils are very heterogeneous environments. Consequently, both contamination and the response of soil organisms can be assumed to be heterogeneous. This might have consequences for the exposure of soil organisms...

  2. The consequences of "Culture's consequences"

    DEFF Research Database (Denmark)

    Knudsen, Fabienne; Froholdt, Lisa Loloma

    2009-01-01

    In this article, it is claimed that research on cross-cultural crews is dominated by one specific understanding of the concept of culture, which is static, evenly distributed, and context-independent. Such a conception of culture may bring some basic order while facing an unknown culture... review of the theory of Geert Hofstede, the most renowned representative of this theoretical approach. The practical consequences of using such a concept of culture are then analysed by means of a critical review of an article applying Hofstede to cross-cultural crews in seafaring. Finally, alternative... views on culture are presented. The aim of the article is, rather than to promote any specific theory, to reflect on diverse perspectives of cultural sense-making in cross-cultural encounters. Publication date: October...

  3. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  4. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies ramping to 50 M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS specific tests to be included in Service Availa...

  5. Acoustic and Perceptual Effects of Left-Right Laryngeal Asymmetries Based on Computational Modeling

    Science.gov (United States)

    Samlan, Robin A.; Story, Brad H.; Lotto, Andrew J.; Bunton, Kate

    2014-01-01

    Purpose: Computational modeling was used to examine the consequences of 5 different laryngeal asymmetries on acoustic and perceptual measures of vocal function. Method: A kinematic vocal fold model was used to impose 5 laryngeal asymmetries: adduction, edge bulging, nodal point ratio, amplitude of vibration, and starting phase. Thirty /a/ and /?/…

  6. Electro-encephalogram based brain-computer interface: improved performance by mental practice and concentration skills.

    Science.gov (United States)

    Mahmoudi, Babak; Erfanian, Abbas

    2006-11-01

    Mental imagination is an essential part of most EEG-based communication systems. Thus, the quality of mental rehearsal, the degree of imagined effort, and mind controllability should have a major effect on the performance of an electroencephalogram (EEG) based brain-computer interface (BCI). It is now well established that mental practice using motor imagery improves motor skills. The effects of mental practice on motor skill learning are the result of practice on central motor programming. According to this view, it seems logical that mental practice should modify the neuronal activity in the primary sensorimotor areas and consequently change the performance of an EEG-based BCI. For developing a practical BCI system, recognizing the resting state with eyes open and the imagined voluntary movement is important. For this purpose, the mind should be able to focus on a single goal for a period of time, without deviation to another context. In this work, we examine the role of mental practice and concentration skills on EEG control during imagined hand movements. The results show that mental practice and concentration can generally improve the classification accuracy of the EEG patterns. It is found that mental training has a significant effect on the classification accuracy over the primary motor cortex and the frontal area.

  7. Trajectory Evaluation of Rotor-Flying Robots Using Accurate Inverse Computation Based on Algorithm Differentiation

    Directory of Open Access Journals (Sweden)

    Yuqing He

    2014-01-01

    Full Text Available Autonomous maneuvering flight control of rotor-flying robots (RFR) is a challenging problem due to the highly complicated structure of the robot's model and significant uncertainties in many aspects of the field. As a consequence, it is difficult in many cases to decide whether or not a flight maneuver trajectory is feasible, and it is necessary to analyse the flight maneuvering ability of an RFR prior to test flight. Our aim in this paper is to use a numerical method called algorithm differentiation (AD) to solve this problem. The basic idea is to compute the internal state (i.e., attitude angles and angular rates) and input profiles based on predetermined maneuvering trajectory information denoted by the outputs (i.e., positions and yaw angle) and their higher-order derivatives. For this purpose, we first present a model of the RFR system and show that it is flat. We then cast the procedure for obtaining the required states/inputs from the desired outputs as a static optimization problem, which is solved using AD and a derivative-based optimization algorithm. Finally, we test our proposed method using a flight maneuver trajectory to verify its performance.
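    The abstract's key computational ingredient, algorithm differentiation, propagates exact derivatives through ordinary program code. Below is a minimal forward-mode sketch using dual numbers, applied to a made-up one-dimensional trajectory (the paper itself differentiates the full flat outputs, positions and yaw):

```python
class Dual:
    """Forward-mode algorithm differentiation via dual numbers (minimal)."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot            # value and derivative
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.dot + o.dot)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val * o.val, self.val * o.dot + self.dot * o.val)
    __rmul__ = __mul__

def position(t):
    return 2.0 + 3.0 * t + 0.5 * t * t           # hypothetical output trajectory x(t)

t = Dual(1.5, 1.0)                                # seed dt/dt = 1
x = position(t)
print(x.val, x.dot)                               # x(1.5) and exact dx/dt = 3 + t = 4.5
```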

  8. Students Perception on the Use of Computer Based Test

    Science.gov (United States)

    Nugroho, R. A.; Kusumawati, N. S.; Ambarwati, O. C.

    2018-02-01

    Teaching nowadays may use technology to disseminate science and knowledge. As part of teaching, the way of evaluating study progress and results has also benefited from this rapid progress in IT. The computer-based test (CBT) has been introduced to replace the more conventional paper-and-pencil test (PPT). CBTs are considered more advantageous than PPTs: more efficient, transparent, and able to minimise fraud in cognitive evaluation. Current studies reflect an ongoing debate over CBT versus PPT usage. Most of the current research compares the two methods without exploring the students' perception of the tests. This study fills that gap in the literature by providing students' perceptions of the two test methods. A survey approach was used to obtain the data. The sample was collected in two identical classes on a similar subject at a public university in Indonesia. The Mann-Whitney U test was used to analyse the data. The results indicate a significant difference between the two groups of students regarding CBT usage: students tended to prefer a test method other than the one they were given. Further discussion and research implications are presented in the paper.
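    For two independent groups of ordinal survey scores, the test named above is one line with scipy; the scores below are invented stand-ins for the questionnaire data.

```python
import numpy as np
from scipy.stats import mannwhitneyu

cbt_group = np.array([4, 5, 4, 3, 5, 4, 4, 5, 3, 4])   # hypothetical Likert scores
ppt_group = np.array([3, 2, 4, 3, 3, 2, 4, 3, 3, 2])

stat, p = mannwhitneyu(cbt_group, ppt_group, alternative="two-sided")
print(f"U = {stat}, p = {p:.4f}")   # p below the chosen alpha -> groups differ
```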

  9. Flow-based model of computer hackers' motivation.

    Science.gov (United States)

    Voiskounsky, Alexander E; Smyslova, Olga V

    2003-04-01

    Hackers' psychology, widely discussed in the media, is almost entirely unexplored by psychologists. In this study, hackers' motivation is investigated using the flow paradigm. Flow is likely to motivate hackers, according to views expressed by researchers and by hackers themselves. Taking it for granted that hackers experience flow, it was hypothesized that flow increases with the increase of hackers' competence in IT use. Self-selected subjects were recruited on specialized web sources; 457 hackers filled out a web questionnaire covering competence in IT use, specific flow experience, and demographic data. The online study was administered within the Russian-speaking community (though one third of the subjects are non-residents of the Russian Federation); since hacking seems to be international, the results are believed to be universal. The hypothesis was not confirmed: flow motivation characterizes the least and the most competent hackers, while the members of an intermediate group, that is, averagely competent subjects, report a "flow crisis": no (or less) flow experience. Two differing strategies of task choice were self-reported by the subjects: a step-by-step increase in the difficulty of choices leads to a match of challenges and skills (and to preserving the flow experience); making choices irrespective of the likelihood of solution leads to a "flow crisis." The findings give productive hints on the processes of hackers' motivational development. A flow-based model of computer hackers' motivation was developed, combining both empirically confirmed and theoretically possible paths of hackers' "professional" growth.

  10. Learning styles: individualizing computer-based learning environments

    Directory of Open Access Journals (Sweden)

    Tim Musson

    1995-12-01

    Full Text Available While the need to adapt teaching to the needs of a student is generally acknowledged (see Corno and Snow, 1986, for a wide review of the literature), little is known about the impact of individual learner differences on the quality of learning attained within computer-based learning environments (CBLEs). What evidence there is appears to support the notion that individual differences have implications for the degree of success or failure experienced by students (Ford and Ford, 1992) and by trainee end-users of software packages (Bostrom et al, 1990). The problem is to identify the way in which specific individual characteristics of a student interact with particular features of a CBLE, and how the interaction affects the quality of the resultant learning. Teaching in a CBLE is likely to require a subset of teaching strategies different from that appropriate to more traditional environments, and the use of a machine may elicit different behaviours from those normally arising in a classroom context.

  11. Brain-computer interface based on generation of visual images.

    Directory of Open Access Journals (Sweden)

    Pavel Bobrov

    Full Text Available This paper examines the task of recognizing EEG patterns that correspond to performing three mental tasks: relaxation and the imagining of two types of pictures, faces and houses. The experiments were performed using two EEG headsets: BrainProducts ActiCap and Emotiv EPOC. The Emotiv headset is becoming widely used in consumer BCI applications, allowing for large-scale EEG experiments in the future. Since classification accuracy significantly exceeded the level of random classification during the first three days of the experiment with the EPOC headset, a control experiment was performed on the fourth day using the ActiCap. The control experiment showed that utilization of high-quality research equipment can enhance classification accuracy (up to 68% in some subjects) and that the accuracy is independent of the presence of EEG artifacts related to blinking and eye movement. This study also shows that a computationally inexpensive Bayesian classifier based on covariance matrix analysis yields classification accuracy in this problem similar to that of a more sophisticated multi-class Common Spatial Patterns (MCSP) classifier.
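    A Bayesian classifier "based on covariance matrix analysis" can be read as a Gaussian classifier with a per-class mean and covariance; the sketch below illustrates that general idea (the authors' exact variant may differ).

```python
import numpy as np

class CovarianceBayes:
    """Gaussian Bayes classifier using per-class covariance matrices."""
    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.params_ = {}
        for c in self.classes_:
            Xc = X[y == c]
            cov = np.cov(Xc, rowvar=False) + 1e-6 * np.eye(X.shape[1])  # regularize
            self.params_[c] = (Xc.mean(axis=0), np.linalg.inv(cov),
                               np.linalg.slogdet(cov)[1])
        return self

    def predict(self, X):
        scores = []
        for c in self.classes_:
            mu, icov, logdet = self.params_[c]
            d = X - mu
            maha = np.einsum('ij,jk,ik->i', d, icov, d)   # squared Mahalanobis
            scores.append(-0.5 * (maha + logdet))          # Gaussian log-likelihood
        return self.classes_[np.argmax(scores, axis=0)]
```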

  12. Computer-based mechanical design of overhead lines

    Science.gov (United States)

    Rusinaru, D.; Bratu, C.; Dinu, R. C.; Manescu, L. G.

    2016-02-01

    Besides performance, a safety level that meets current standards is a compulsory condition for distribution grid operation. Some of the measures for improving overhead line reliability call for modernization of the installations. The constraints imposed on new line components refer to technical aspects such as thermal stress or voltage drop, and to economic efficiency as well. The mechanical sizing of overhead lines is, after all, an optimization problem. More precisely, the task in designing an overhead line profile is to size poles, cross-arms, and stays and to locate poles along a line route so that the total cost of the line's structure is minimized and the technical and safety constraints are fulfilled. The authors present in this paper an application for computer-based mechanical design of overhead lines and the features of the corresponding Visual Basic program, adjusted to distribution lines. The constraints of the optimization problem reflect the existing weather and loading conditions of Romania. The outputs of the software application for mechanical design of overhead lines are: the list of components chosen for the line (poles, cross-arms, stays); the list of conductor tensions and forces for each pole, cross-arm, and stay under different weather conditions; and the line profile drawings. The main features of the mechanical overhead line design software are interactivity, a local optimization function, and a high-level user interface.

  13. Computer based core monitoring system for an operating CANDU reactor

    International Nuclear Information System (INIS)

    Yoon, Moon Young; Kwon, O Hwan; Kim, Kyung Hwa; Yeom, Choong Sub

    2004-01-01

    The research was performed to develop a CANDU-6 Core Monitoring System (CCMS) for Wolsong nuclear power plant unit 1 that enables operators to manage the core efficiently by monitoring the core power distribution, burnup distribution, and other important core variables, and by managing the past core history. The CCMS uses the Reactor Fueling Simulation Program (RFSP, developed by AECL) for continuous core calculation, integrating validated algorithms and assumptions, and uses information taken from the Digital Control Computer (DCC) to produce basic input data. The CCMS has two modules: the CCMS server program and the CCMS client program. The CCMS server program performs automatic and continuous core calculation and manages the overall output, controlled by a DataBase Management System. The CCMS client program enables users to monitor current and past core status in a predefined GUI (Graphic-User Interface) environment. To verify the effectiveness of the CCMS, we compared field-test data with the data used for Wolsong unit 1 operation. In the verification, the mean percent differences of both cases were the same (0.008%), which showed that the CCMS can monitor core behaviors well.

  14. Computational modeling of a carbon nanotube-based DNA nanosensor

    Energy Technology Data Exchange (ETDEWEB)

    Kalantari-Nejad, R; Bahrami, M [Mechanical Engineering Department, Amirkabir University of Technology, Tehran (Iran, Islamic Republic of); Rafii-Tabar, H [Department of Medical Physics and Biomedical Engineering and Research Centre for Medical Nanotechnology and Tissue Engineering, Shahid Beheshti University of Medical Sciences, Evin, Tehran (Iran, Islamic Republic of); Rungger, I; Sanvito, S, E-mail: mbahrami@aut.ac.ir [School of Physics and CRANN, Trinity College, Dublin 2 (Ireland)

    2010-11-05

    During the last decade the design of biosensors, based on quantum transport in one-dimensional nanostructures, has developed as an active area of research. Here we investigate the sensing capabilities of a DNA nanosensor, designed as a semiconductor single walled carbon nanotube (SWCNT) connected to two gold electrodes and functionalized with a DNA strand acting as a bio-receptor probe. In particular, we have considered both covalent and non-covalent bonding between the DNA probe and the SWCNT. The optimized atomic structure of the sensor is computed both before and after the receptor attaches itself to the target, which consists of another DNA strand. The sensor's electrical conductance and transmission coefficients are calculated at the equilibrium geometries via the non-equilibrium Green's function scheme combined with the density functional theory in the linear response limit. We demonstrate a sensing efficiency of 70% for the covalently bonded bio-receptor probe, which drops to about 19% for the non-covalently bonded one. These results suggest that a SWCNT may be a promising candidate for a bio-molecular FET sensor.
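    In the linear-response (Landauer) picture used above, the zero-bias conductance follows directly from the computed transmission at the Fermi level, and the sensing efficiency can be read as the relative conductance change on target binding. A toy numerical sketch (the transmission curves are invented, chosen only to reproduce a roughly 70% drop):

```python
import numpy as np

G0 = 7.748e-5                                  # conductance quantum 2e^2/h (siemens)
E = np.linspace(-1.0, 1.0, 401)                # energy grid around E_F = 0 (eV)
T_probe = 0.40 * np.exp(-(E - 0.1)**2 / 0.05)  # functionalized sensor (made up)
T_bound = 0.12 * np.exp(-(E - 0.1)**2 / 0.05)  # after target DNA binds (made up)

G_probe = G0 * np.interp(0.0, E, T_probe)      # G = G0 * T(E_F) at zero bias
G_bound = G0 * np.interp(0.0, E, T_bound)
print(f"relative conductance change = {1 - G_bound / G_probe:.0%}")   # ~70%
```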

  16. The transesophageal echocardiography simulator based on computed tomography images.

    Science.gov (United States)

    Piórkowski, Adam; Kempny, Aleksander

    2013-02-01

    Simulators are a new tool in education in many fields, including medicine, where they greatly improve familiarity with medical procedures, reduce costs, and, importantly, cause no harm to patients. This is the case for transesophageal echocardiography (TEE), in which a simulator facilitates spatial orientation and helps in case studies. The aim of the project described in this paper is to simulate a TEE examination. The research uses available computed tomography data to simulate the corresponding echocardiographic view. This paper describes the essential characteristics that distinguish the two modalities and the key principles of the wave phenomena that should be considered in the simulation process, taking into account the conditions specific to echocardiography. The construction of CT2TEE, a Web-based TEE simulator, is also presented. The considerations include ray-tracing and ray-casting techniques in the context of ultrasound beam and artifact simulation. Important aspects of interaction with the user are also addressed.
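
    The ray-casting idea at the heart of such a simulator can be reduced to a 1D sketch: convert Hounsfield units to acoustic impedance, then at each voxel interface compute the reflected fraction R = ((Z2 - Z1)/(Z2 + Z1))^2 and attenuate the transmitted beam. The HU-to-impedance mapping and attenuation constant below are assumptions for illustration, not the CT2TEE calibration.

    ```python
    import numpy as np

    def hu_to_impedance(hu):
        """Crude linear HU -> acoustic impedance (MRayl) mapping; an assumed
        calibration, anchored only at water (~1.48 MRayl at HU = 0)."""
        return 1.48 * (1.0 + hu / 1000.0)

    def cast_ray(hu_profile, attenuation_per_voxel=0.02):
        """Return echo amplitude per voxel along one ray through CT data."""
        z = hu_to_impedance(np.asarray(hu_profile, dtype=float))
        echoes = np.zeros(len(z))
        beam = 1.0
        for i in range(len(z) - 1):
            r = ((z[i + 1] - z[i]) / (z[i + 1] + z[i])) ** 2  # interface reflection
            echoes[i] = beam * r                 # energy returned to the transducer
            beam *= (1.0 - r)                    # transmitted fraction continues
            beam *= np.exp(-attenuation_per_voxel)  # tissue absorption
        return echoes

    # Toy ray: blood (~40 HU) into myocardium (~50 HU), then a calcified region
    # (~800 HU) that produces a strong echo and shadows what lies behind it.
    profile = [40] * 30 + [50] * 20 + [800] * 5 + [40] * 30
    print(cast_ray(profile).round(6))
    ```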

  17. Implementing Computer-Based Procedures: Thinking Outside the Paper Margins

    Energy Technology Data Exchange (ETDEWEB)

    Oxstrand, Johanna; Bly, Aaron

    2017-06-01

    In the past year there has been increased interest from the nuclear industry in adopting electronic work packages and computer-based procedures (CBPs) in the field. The goal is to use technology to meet the Nuclear Promise requirements of reducing costs, improving efficiency, and decreasing human error rates in plant operations. Researchers, together with the nuclear industry, have been investigating the benefits an electronic work package system, and specifically CBPs, would have over current paper-based procedure practices. There are several classifications of CBPs, ranging from a straight copy of the paper-based procedure in PDF format to a more intelligent, dynamic CBP. A CBP system offers a wide variety of improvements, such as context-driven job aids, integrated human performance tools (e.g., placekeeping and correct component verification), and dynamic step presentation. The latter means that the CBP system displays only the steps relevant to the operating mode, plant status, and the task at hand, as sketched below. These improvements can reduce the worker's workload and human error by allowing the worker to focus more on the task at hand. A team of human factors researchers at the Idaho National Laboratory studied and developed design concepts for CBPs for field workers between 2012 and 2016. The focus of the research was to present information in a procedure in a manner that leveraged the dynamic and computational capabilities of a handheld device, allowing the worker to focus more on the task at hand than on the administrative processes currently applied when conducting work in the plant. As part of the research, the team identified types of work, instructions, and scenarios where the transition to a dynamic CBP system might not be as beneficial as it would be for other types of work in the plant. In most cases the decision to use a dynamic CBP system and its dynamic capabilities will be beneficial to the worker.
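
    One way to realize the dynamic step presentation described above is to attach an applicability predicate to each procedure step and render only the steps whose predicate holds for the current plant state. A minimal sketch; the step texts, state fields, and predicates are invented for illustration.

    ```python
    from dataclasses import dataclass
    from typing import Callable, List

    @dataclass
    class PlantState:
        operating_mode: str   # e.g. "startup", "full_power", "shutdown"
        pump_a_running: bool

    @dataclass
    class Step:
        text: str
        applies: Callable[[PlantState], bool]

    PROCEDURE: List[Step] = [
        Step("Verify pump A is available.", lambda s: s.operating_mode != "shutdown"),
        Step("Start pump A.", lambda s: not s.pump_a_running),
        Step("Record pump A discharge pressure.", lambda s: s.pump_a_running),
    ]

    def render(procedure: List[Step], state: PlantState) -> List[str]:
        """Display only the steps relevant to the current mode and plant status."""
        return [step.text for step in procedure if step.applies(state)]

    print(render(PROCEDURE, PlantState(operating_mode="startup", pump_a_running=False)))
    ```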

  18. Consequences of increased use of computed tomography imaging for trauma patients in rural referring hospitals prior to transfer to a regional trauma centre.

    Science.gov (United States)

    Berkseth, Timothy J; Mathiason, Michelle A; Jafari, Mary Ellen; Cogbill, Thomas H; Patel, Nirav Y

    2014-05-01

    Computed tomography (CT) plays an integral role in the evaluation and management of trauma patients. As the number of referring hospital (RH)-based CT scanners has increased, so has their utilization in trauma patients before transfer. We hypothesized that this has resulted in increased time at the RH, image duplication, and radiation dose. A retrospective chart review was completed for trauma activations transferred to an ACS-verified Level II Trauma Centre (TC) during two time periods: 2002-2004 (group 1) and 2006-2008 (group 2). 2005 data were excluded as this marked the transition period for acquisition of hospital-based CT scanners at RHs. Statistical analysis included t tests and χ² analysis; P < … was considered significant. There were … patients in group 1 and 514 in group 2. Mean age was greater in group 2 than in group 1 (40.3 versus 37.4 years, respectively; P=0.028). There were 115 patients in group 1 versus 202 patients in group 2 who underwent CT imaging at the RH (P < …). … patients in group 1 had CT scans performed at the TC versus 258 patients in group 2 (P < …). Mean time at the RH was similar between the groups (117.1 and 112.3 min for groups 1 and 2, respectively; P=0.561). However, when comparing patients with and without a pretransfer CT at the RH, the median time at the RH was 140 versus 67 min, respectively (P < …). Duplicate CT imaging (n=… in group 1 and n=42 in group 2) was not significantly different between the two time periods (P=0.392). Head CTs comprised the majority of duplicate CT imaging in both time periods (82.4% in group 1 and 90.5% in group 2). Mean total estimated radiation dose per patient was not significantly different between the two groups (group 1: 8.4 mSv versus group 2: 7.8 mSv; P=0.192). A significant increase in CT imaging at the RH prior to transfer to the TC was observed over the study periods. No associated increases in mean time at the RH, image duplication at the TC, total estimated radiation dose per patient, or mortality rate were observed.
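
    The comparisons reported above are standard two-sample tests. The sketch below shows how they could be reproduced with SciPy; the patient-level data are not available, so the time values are synthetic, and the group 1 denominator (not recoverable from the abstract) is a placeholder.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)

    # Synthetic stand-ins for time at the RH (minutes), not the study data.
    time_group1 = rng.normal(117.1, 40.0, size=300)
    time_group2 = rng.normal(112.3, 40.0, size=300)
    t_stat, p_time = stats.ttest_ind(time_group1, time_group2, equal_var=False)

    # Chi-squared test on counts with/without pretransfer CT at the RH.
    # 115, 202, and 514 are from the abstract; 300 is a placeholder total.
    contingency = np.array([[115, 300 - 115],    # group 1: CT at RH vs not
                            [202, 514 - 202]])   # group 2: CT at RH vs not
    chi2, p_ct, dof, _ = stats.chi2_contingency(contingency)

    print(f"time at RH: P={p_time:.3f}; pretransfer CT use: P={p_ct:.4f}")
    ```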

  19. Prior-based artifact correction (PBAC) in computed tomography

    International Nuclear Information System (INIS)

    Heußer, Thorsten; Brehm, Marcus; Ritschl, Ludwig; Sawall, Stefan; Kachelrieß, Marc

    2014-01-01

    Purpose: Image quality in computed tomography (CT) often suffers from artifacts which may reduce the diagnostic value of the image. In many cases, these artifacts result from missing or corrupt regions in the projection data, e.g., in the case of metal, truncation, and limited angle artifacts. The authors propose a generalized correction method for different kinds of artifacts resulting from missing or corrupt data by making use of available prior knowledge to perform data completion. Methods: The proposed prior-based artifact correction (PBAC) method requires prior knowledge in the form of a planning CT of the same patient or a CT scan of a different patient showing the same body region. In both cases, the prior image is registered to the patient image using a deformable transformation. The registered prior is forward projected, and data completion of the patient projections is performed using smooth sinogram inpainting. The completed projection data are used to reconstruct the corrected image. Results: The authors investigate metal and truncation artifacts in patient data sets acquired with a clinical CT scanner and limited angle artifacts in an anthropomorphic head phantom data set acquired with a gantry-based flat detector CT device. In all cases, the corrected images obtained by PBAC are nearly artifact-free. Compared to conventional correction methods, PBAC achieves better artifact suppression while preserving the patient-specific anatomy. Further, the authors show that prominent anatomical details in the prior image seem to have only minor impact on the correction result. Conclusions: The results show that PBAC has the potential to effectively correct for metal, truncation, and limited angle artifacts if adequate prior data are available. Since the proposed method uses a generalized algorithm, PBAC may also be applicable to other artifacts resulting from missing or corrupt data.
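
    The PBAC pipeline reduces to three steps: forward project the registered prior, replace the missing or corrupt sinogram entries with prior data, and reconstruct. The sketch below uses scikit-image's Radon tools, substitutes a hard replacement for the paper's smooth sinogram inpainting, and skips the deformable registration step entirely.

    ```python
    import numpy as np
    from skimage.transform import radon, iradon

    def pbac_reconstruct(patient_sino, prior_image, corrupt_mask, theta):
        """Prior-based data completion: fill corrupt sinogram bins from the
        (already registered) prior, then reconstruct by filtered backprojection.
        A hard replacement stands in for the paper's smooth inpainting."""
        prior_sino = radon(prior_image, theta=theta)     # forward projection
        completed = np.where(corrupt_mask, prior_sino, patient_sino)
        return iradon(completed, theta=theta)

    # Toy demonstration: a disc phantom with a wedge of missing angles.
    n = 128
    yy, xx = np.mgrid[:n, :n]
    phantom = ((xx - n / 2) ** 2 + (yy - n / 2) ** 2 < (n / 4) ** 2).astype(float)
    theta = np.linspace(0.0, 180.0, 180, endpoint=False)
    sino = radon(phantom, theta=theta)

    mask = np.zeros_like(sino, dtype=bool)
    mask[:, 60:90] = True             # 30 degrees of "limited angle" data loss
    prior = phantom.copy()            # here the prior happens to match exactly

    recon = pbac_reconstruct(np.where(mask, 0.0, sino), prior, mask, theta)
    print(abs(recon - phantom).mean())
    ```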

  20. Agent-Based Computational Modeling of Cell Culture ...

    Science.gov (United States)

    Quantitative characterization of cellular dose in vitro is needed to align doses in vitro and in vivo. We used the agent-based software CompuCell3D (CC3D) to provide a stochastic description of cell growth in culture. The model was configured so that isolated cells assumed a “fried egg” shape but became increasingly cuboidal with increasing confluency. The surface area each cell presents to the overlying medium varies from cell to cell and is a determinant of the diffusional flux of toxicant from the medium into the cell. Thus, dose varies among cells for a given concentration of toxicant in the medium. Computer code describing diffusion of H2O2 from the medium into each cell and clearance of H2O2 was calibrated against H2O2 time-course data (25, 50, or 75 µM H2O2 for 60 min) obtained with the Amplex Red assay for the medium and the H2O2-sensitive fluorescent reporter, HyPer, for the cytosol. Cellular H2O2 concentrations peaked at about 5 min and were near baseline by 10 min. The model predicted a skewed distribution of surface areas, with between-cell variation usually 2-fold or less. Predicted variability in cellular dose was in rough agreement with the variation in the HyPer data. These results are preliminary, as the model was not calibrated to the morphology of a specific cell type. Future work will involve calibrating the morphology model against human bronchial epithelial (BEAS-2B) cells. Our results show, however, the potential of agent-based modeling.
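
    The per-cell dosimetry idea (influx scaling with each cell's exposed surface area) can be written as a mass balance per cell: dC_i/dt = (P * A_i / V) * (C_medium - C_i) - k_clear * C_i. The sketch below integrates this with explicit Euler; all parameter values and the log-normal area distribution are assumptions for illustration, not the calibrated CC3D constants.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    N_CELLS = 1000
    P = 1e-4         # membrane permeability, cm/s (assumed)
    K_CLEAR = 10.0   # first-order intracellular clearance, 1/s (assumed)
    V_CELL = 2e-9    # cell volume, cm^3 (assumed)
    C_MEDIUM = 25.0  # H2O2 in medium, uM (one of the tested concentrations)

    # Skewed (log-normal) distribution of exposed surface areas, cm^2 per cell.
    areas = rng.lognormal(mean=np.log(1e-5), sigma=0.15, size=N_CELLS)

    def simulate(minutes=60.0, dt=0.05):
        """Explicit-Euler mass balance: influx scales with each cell's area."""
        c = np.zeros(N_CELLS)  # intracellular H2O2, uM
        for _ in range(int(minutes * 60 / dt)):
            influx = P * areas / V_CELL * (C_MEDIUM - c)
            c += dt * (influx - K_CLEAR * c)
        return c

    c_final = simulate()
    print(f"between-cell dose spread: {c_final.max() / c_final.min():.2f}-fold")
    ```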