WorldWideScience

Sample records for computer based measure

  1. Blind topological measurement-based quantum computation.

    Science.gov (United States)

    Morimae, Tomoyuki; Fujii, Keisuke

    2012-01-01

    Blind quantum computation is a novel secure quantum-computing protocol that enables Alice, who does not have sufficient quantum technology at her disposal, to delegate her quantum computation to Bob, who has a fully fledged quantum computer, in such a way that Bob cannot learn anything about Alice's input, output and algorithm. A recent proof-of-principle experiment demonstrating blind quantum computation in an optical system has raised new challenges regarding the scalability of blind quantum computation in realistic noisy conditions. Here we show that fault-tolerant blind quantum computation is possible in a topologically protected manner using the Raussendorf-Harrington-Goyal scheme. The error threshold of our scheme is 4.3 × 10⁻³, which is comparable to that (7.5 × 10⁻³) of non-blind topological quantum computation. As an error rate per gate of the order of 10⁻³ has already been achieved in some experimental systems, our result implies that secure cloud quantum computation is within reach.

  2. Self-guaranteed measurement-based quantum computation

    Science.gov (United States)

    Hayashi, Masahito; Hajdušek, Michal

    2018-05-01

    In order to guarantee the output of a quantum computation, we usually assume that the component devices are trusted. However, when the total computation process is large, it is not easy to guarantee the whole system when we have scaling effects, unexpected noise, or unaccounted-for correlations between several subsystems. If we do not trust the measurement basis or the prepared entangled state, we need to be worried about such uncertainties. To this end, we propose a self-guaranteed protocol for verification of quantum computation under the scheme of measurement-based quantum computation where no prior-trusted devices (measurement basis or entangled state) are needed. The approach we present enables the implementation of verifiable quantum computation using the measurement-based model in the context of a particular instance of delegated quantum computation where the server prepares the initial computational resource and sends it to the client, who drives the computation by single-qubit measurements. Applying self-testing procedures, we are able to verify the initial resource as well as the operation of the quantum devices and hence the computation itself. The overhead of our protocol scales as the fourth power of the size of the initial resource state, times the natural logarithm of the initial state's size.

  3. Transitions in the computational power of thermal states for measurement-based quantum computation

    International Nuclear Information System (INIS)

    Barrett, Sean D.; Bartlett, Stephen D.; Jennings, David; Doherty, Andrew C.; Rudolph, Terry

    2009-01-01

    We show that the usefulness of the thermal state of a specific spin-lattice model for measurement-based quantum computing exhibits a transition between two distinct 'phases' - one in which every state is a universal resource for quantum computation, and another in which any local measurement sequence can be simulated efficiently on a classical computer. Remarkably, this transition in computational power does not coincide with any phase transition, classical or quantum, in the underlying spin-lattice model.

  4. Novel Schemes for Measurement-Based Quantum Computation

    International Nuclear Information System (INIS)

    Gross, D.; Eisert, J.

    2007-01-01

    We establish a framework which allows one to construct novel schemes for measurement-based quantum computation. The technique develops tools from many-body physics, based on finitely correlated or projected entangled pair states, to go beyond the cluster-state-based one-way computer. We identify resource states radically different from the cluster state, in that they exhibit nonvanishing correlations, can be prepared using nonmaximally entangling gates, or have very different local entanglement properties. In the computational models, randomness is compensated in a different manner. It is shown that there exist resource states which are locally arbitrarily close to a pure state. We comment on the possibility of tailoring computational models to specific physical systems.

  5. Novel schemes for measurement-based quantum computation.

    Science.gov (United States)

    Gross, D; Eisert, J

    2007-06-01

    We establish a framework which allows one to construct novel schemes for measurement-based quantum computation. The technique develops tools from many-body physics, based on finitely correlated or projected entangled pair states, to go beyond the cluster-state-based one-way computer. We identify resource states radically different from the cluster state, in that they exhibit nonvanishing correlations, can be prepared using nonmaximally entangling gates, or have very different local entanglement properties. In the computational models, randomness is compensated in a different manner. It is shown that there exist resource states which are locally arbitrarily close to a pure state. We comment on the possibility of tailoring computational models to specific physical systems.

  6. Towards minimal resources of measurement-based quantum computation

    International Nuclear Information System (INIS)

    Perdrix, Simon

    2007-01-01

    We improve the upper bound on the minimal resources required for measurement-only quantum computation (M A Nielsen 2003 Phys. Lett. A 308 96-100; D W Leung 2004 Int. J. Quantum Inform. 2 33; S Perdrix 2005 Int. J. Quantum Inform. 3 219-23). Minimizing the resources required for this model is a key issue for the experimental realization of a quantum computer based on projective measurements. This new upper bound also allows one to reply in the negative to the open question presented by Perdrix (2004 Proc. Quantum Communication Measurement and Computing) about the existence of a trade-off between observable and ancillary qubits in measurement-only QC.

  7. Cyst-based measurements for assessing lymphangioleiomyomatosis in computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Lo, P., E-mail: pechinlo@mednet.ucla.edu; Brown, M. S.; Kim, H.; Kim, H.; Goldin, J. G. [Center for Computer Vision and Imaging Biomarkers, Department of Radiological Sciences, David Geffen School of Medicine, University of California, Los Angeles, California 90024 (United States); Argula, R.; Strange, C. [Division of Pulmonary and Critical Care Medicine, Medical University of South Carolina, Charleston, South Carolina 29425 (United States)

    2015-05-15

    Purpose: To investigate the efficacy of a new family of measurements made on individual pulmonary cysts extracted from computed tomography (CT) for assessing the severity of lymphangioleiomyomatosis (LAM). Methods: CT images were analyzed using thresholding to identify a cystic region of interest from chest CT of LAM patients. Individual cysts were then extracted from the cystic region by the watershed algorithm, which separates individual cysts based on subtle edges within the cystic regions. A family of measurements was then computed, which quantify the amount, distribution, and boundary appearance of the cysts. Sequential floating feature selection was used to select a small subset of features for quantification of the severity of LAM. Adjusted R² from multiple linear regression and R² from linear regression against measurements from spirometry were used to compare the performance of our proposed measurements with currently used density-based CT measurements in the literature, namely, the relative area measure and the D measure. Results: Volumetric CT data, performed at total lung capacity and residual volume, from a total of 49 subjects enrolled in the MILES trial were used in our study. Our proposed measures had adjusted R² ranging from 0.42 to 0.59 when regressing against the spirometry measures, with p < 0.05. For previously used density-based CT measurements in the literature, the best R² was 0.46 (for only one instance), with the majority being lower than 0.3 or p > 0.05. Conclusions: The proposed family of CT-based cyst measurements has better correlation with spirometric measures than previously used density-based CT measurements. They show potential as a sensitive tool for quantitatively assessing the severity of LAM.
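
    The pipeline just described (thresholding, watershed separation of individual cysts, then per-cyst measurements) can be sketched with standard image-processing tools. The sketch below is illustrative only: the HU threshold, the distance-transform marker strategy and the scikit-image calls are assumptions, not the authors' implementation.

      import numpy as np
      from scipy import ndimage as ndi
      from skimage.feature import peak_local_max
      from skimage.segmentation import watershed

      def extract_cysts(ct_slice, threshold_hu=-950.0):
          """Label individual cysts in a CT slice (2D array of HU values)."""
          cystic = (ct_slice < threshold_hu).astype(int)   # low-attenuation region
          distance = ndi.distance_transform_edt(cystic)
          # One marker per local maximum of the distance map, so that the
          # watershed splits the cystic region along its subtle internal edges.
          peaks = peak_local_max(distance, labels=cystic, min_distance=3)
          markers = np.zeros_like(cystic)
          markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)
          return watershed(-distance, markers, mask=cystic)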

  8. Computer vision based nacre thickness measurement of Tahitian pearls

    Science.gov (United States)

    Loesdau, Martin; Chabrier, Sébastien; Gabillon, Alban

    2017-03-01

    The Tahitian pearl is the most valuable export product of French Polynesia, contributing over 61 million euros, more than 50% of the total export income. To maintain its excellent reputation on the international market, an obligatory quality control for every pearl destined for exportation has been established by the local government. One of the controlled quality parameters is the pearl's nacre thickness. The evaluation is currently done manually by experts who visually analyze X-ray images of the pearls. In this article, a computer vision based approach to automate this procedure is presented. Even though computer vision based approaches for pearl nacre thickness measurement exist in the literature, the very specific features of the Tahitian pearl, namely the large shape variety and the occurrence of cavities, have so far not been considered. The presented work closes this gap. Our method consists of segmenting the pearl from X-ray images with a model-based approach, segmenting the pearl's nucleus with a custom heuristic circle detection, and segmenting possible cavities with region growing. From the obtained boundaries, the 2-dimensional nacre thickness profile can be calculated. A certainty measure that accounts for imaging and segmentation imprecisions is included in the procedure. The proposed algorithms are tested on 298 manually evaluated Tahitian pearls, showing that it is generally possible to automatically evaluate the nacre thickness of Tahitian pearls with computer vision. Furthermore, the results show that the automatic measurement is more precise and faster than the manual one.
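
    The nucleus-detection step lends itself to a compact illustration. The authors use their own heuristic circle detection, which the abstract does not spell out; the sketch below substitutes a standard Hough-transform circle search from scikit-image, with an arbitrary radius range. The nacre thickness along a given direction is then the radial gap between the detected nucleus circle and the pearl boundary.

      import numpy as np
      from skimage.feature import canny
      from skimage.transform import hough_circle, hough_circle_peaks

      def find_nucleus(xray, radii=np.arange(40, 120, 2)):
          """Locate the roughly circular nucleus in a pearl X-ray image."""
          edges = canny(xray, sigma=2.0)
          accumulator = hough_circle(edges, radii)
          _, cx, cy, r = hough_circle_peaks(accumulator, radii, total_num_peaks=1)
          return cx[0], cy[0], r[0]   # centre (x, y) and radius in pixels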

  9. Computer Vision Based Measurement of Wildfire Smoke Dynamics

    Directory of Open Access Journals (Sweden)

    BUGARIC, M.

    2015-02-01

    This article presents a novel method for measurement of wildfire smoke dynamics based on computer vision and augmented reality techniques. The aspect of smoke dynamics is an important feature in video smoke detection that could distinguish smoke from visually similar phenomena. However, most of the existing smoke detection systems are not capable of measuring the real-world size of the detected smoke regions. Using computer vision and GIS-based augmented reality, we measure the real dimensions of smoke plumes and observe the change in size over time. The measurements are performed on offline video data with known camera parameters and location. The observed data are analyzed in order to create a classifier that could be used to eliminate certain categories of false alarms induced by phenomena with different dynamics than smoke. We carried out an offline evaluation where we measured the improvement in the detection process achieved using the proposed smoke dynamics characteristics. The results show a significant increase in algorithm performance, especially in terms of reducing the false alarm rate. From this it follows that the proposed method for measurement of smoke dynamics could be used to improve existing smoke detection algorithms, or taken into account when designing new ones.
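
    Measuring the "real dimensions" of a plume means converting image-space extents into world units. The GIS-based augmented reality mentioned above supplies the camera pose and hence the distance to the detected region; given that distance, the conversion follows the pinhole camera model. Below is a minimal sketch under a fronto-parallel assumption, with every parameter value hypothetical.

      def plume_width_m(width_px, distance_m, focal_mm, sensor_mm, image_px):
          """Real-world width of an object seen width_px pixels wide at a
          known distance, using the pinhole camera model."""
          focal_px = focal_mm / sensor_mm * image_px   # focal length in pixels
          return width_px * distance_m / focal_px

      # A 120 px wide plume, 2 km away, 8 mm lens on a 6.4 mm wide sensor,
      # 1920 px image width: about 100 m across. Tracking this value across
      # frames gives the growth rate used as a smoke-dynamics feature.
      print(plume_width_m(120, 2000.0, 8.0, 6.4, 1920))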

  10. A computer-based measure of resultant achievement motivation.

    Science.gov (United States)

    Blankenship, V

    1987-08-01

    Three experiments were conducted to develop a computer-based measure of individual differences in resultant achievement motivation (RAM) on the basis of level-of-aspiration, achievement motivation, and dynamics-of-action theories. In Experiment 1, the number of atypical shifts and greater responsiveness to incentives on 21 trials with choices among easy, intermediate, and difficult levels of an achievement-oriented game were positively correlated and were found to differentiate the 62 subjects (31 men, 31 women) on the amount of time they spent at a nonachievement task (watching a color design) 1 week later. In Experiment 2, test-retest reliability was established with the use of 67 subjects (15 men, 52 women). Point and no-point trials were offered in blocks, with point trials first for half the subjects and no-point trials first for the other half. Reliability was higher for the atypical-shift measure than for the incentive-responsiveness measure and was higher when points were offered first. In Experiment 3, computer anxiety was manipulated by creating a simulated computer breakdown in the experimental condition. Fifty-nine subjects (13 men, 46 women) were randomly assigned to the experimental condition or to one of two control conditions (an interruption condition and a no-interruption condition). Subjects with low RAM, as demonstrated by a low number of typical shifts, took longer to choose the achievement-oriented task, as predicted by the dynamics-of-action theory. The difference was evident in all conditions and most striking in the computer-breakdown condition. A change of focus from atypical to typical shifts is discussed.

  11. Generalized flow and determinism in measurement-based quantum computation

    Energy Technology Data Exchange (ETDEWEB)

    Browne, Daniel E [Department of Materials, University of Oxford, Parks Road, Oxford OX1 3PU (United Kingdom); Kashefi, Elham [Computing Laboratory and Christ Church College, University of Oxford, Parks Road, Oxford OX1 3QD (United Kingdom); Mhalla, Mehdi [Laboratoire d'Informatique de Grenoble, CNRS - Centre national de la recherche scientifique, Universite de Grenoble (France); Perdrix, Simon [Preuves, Programmes et Systemes (PPS), Universite Paris Diderot, Paris (France)

    2007-08-15

    We extend the notion of quantum information flow defined by Danos and Kashefi (2006 Phys. Rev. A 74 052310) for the one-way model (Raussendorf and Briegel 2001 Phys. Rev. Lett. 86 910) and present a necessary and sufficient condition for stepwise uniformly deterministic computation in this model. The generalized flow also applies in the extended model with measurements in the (X, Y), (X, Z) and (Y, Z) planes. We apply both the measurement calculus and the stabiliser formalism to derive our main theorem, which for the first time gives a full characterization of stepwise uniformly deterministic computation in the one-way model. We present several examples to show how our result improves over the traditional notion of flow, such as geometries (entanglement graphs with input and output) with no flow but with generalized flow, and we discuss how they lead to an optimal implementation of the unitaries. More importantly, one can also obtain a better quantum computation depth with generalized flow than with flow. We believe our characterization result is particularly valuable for the study of algorithms and complexity in the one-way model.

  12. Generalized flow and determinism in measurement-based quantum computation

    International Nuclear Information System (INIS)

    Browne, Daniel E; Kashefi, Elham; Mhalla, Mehdi; Perdrix, Simon

    2007-01-01

    We extend the notion of quantum information flow defined by Danos and Kashefi (2006 Phys. Rev. A 74 052310) for the one-way model (Raussendorf and Briegel 2001 Phys. Rev. Lett. 86 910) and present a necessary and sufficient condition for stepwise uniformly deterministic computation in this model. The generalized flow also applies in the extended model with measurements in the (X, Y), (X, Z) and (Y, Z) planes. We apply both the measurement calculus and the stabiliser formalism to derive our main theorem, which for the first time gives a full characterization of stepwise uniformly deterministic computation in the one-way model. We present several examples to show how our result improves over the traditional notion of flow, such as geometries (entanglement graphs with input and output) with no flow but with generalized flow, and we discuss how they lead to an optimal implementation of the unitaries. More importantly, one can also obtain a better quantum computation depth with generalized flow than with flow. We believe our characterization result is particularly valuable for the study of algorithms and complexity in the one-way model.

  13. A Computer Based Moire Technique To Measure Very Small Displacements

    Science.gov (United States)

    Sciammarella, Cesar A.; Amadshahi, Mansour A.; Subbaraman, B.

    1987-02-01

    The accuracy that can be achieved in the measurement of very small displacements with techniques such as moire, holography and speckle is limited by the noise inherent to the optical devices used. To reduce the noise-to-signal ratio, the moire method can be utilized. Two systems of carrier fringes are introduced: an initial system before the load is applied and a final system when the load is applied. The moire pattern of these two systems contains the sought displacement information, and the noise common to the two patterns is eliminated. The whole process is performed by a computer on digitized versions of the patterns. Examples of application are given.
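
    The noise-cancellation idea, forming the moire of the two carrier-fringe systems so that noise common to both patterns drops out, can be reproduced digitally. The abstract does not give the exact processing chain; the sketch below uses one standard route: multiply the two digitized patterns and low-pass filter, keeping only the difference-frequency (moire) fringes. The cutoff frequency is an arbitrary assumption.

      import numpy as np

      def moire(initial, final, cutoff=0.05):
          """Moire of two carrier-fringe images (2D arrays of equal shape).

          The product of the two patterns contains sum- and difference-
          frequency terms; low-pass filtering keeps the difference term
          (the moire fringes encoding the displacement), while noise
          common to both patterns is suppressed."""
          spectrum = np.fft.fft2(initial * final)
          fy = np.fft.fftfreq(initial.shape[0])[:, None]
          fx = np.fft.fftfreq(initial.shape[1])[None, :]
          lowpass = (fx ** 2 + fy ** 2) < cutoff ** 2
          return np.real(np.fft.ifft2(spectrum * lowpass))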

  14. Automatic calibration system of the temperature instrument display based on computer vision measuring

    Science.gov (United States)

    Li, Zhihong; Li, Jinze; Bao, Changchun; Hou, Guifeng; Liu, Chunxia; Cheng, Fang; Xiao, Nianxin

    2010-07-01

    With the development of computers and of image processing and optical measurement techniques, measuring methods based on optical image processing have gradually matured and come into practical use. Building on many years of experience in temperature measurement and on the practical need for it, we propose a fully automatic calibration method for temperature instrument displays that integrates computer vision measurement. The system acquires the displayed reading synchronously with the reference temperature value, which improves calibration efficiency. Based on the least-squares fitting principle, and combining data processing with optimization theory, it realizes fast and accurate automatic acquisition and calibration of temperature.
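
    In its simplest form, the least-squares step amounts to fitting a correction curve that maps the displayed reading to the reference temperature. A minimal sketch with hypothetical data (a linear calibration; a higher polynomial degree would follow the same pattern):

      import numpy as np

      displayed = np.array([10.2, 20.1, 30.4, 40.2, 50.5])   # instrument readings
      reference = np.array([10.0, 20.0, 30.0, 40.0, 50.0])   # reference values

      gain, offset = np.polyfit(displayed, reference, deg=1)  # least-squares fit
      corrected = gain * displayed + offset
      print(np.max(np.abs(corrected - reference)))            # residual error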

  15. Radiologic total lung capacity measurement. Development and evaluation of a computer-based system

    Energy Technology Data Exchange (ETDEWEB)

    Seeley, G.W.; Mazzeo, J.; Borgstrom, M.; Hunter, T.B.; Newell, J.D.; Bjelland, J.C.

    1986-11-01

    The development of a computer-based radiologic total lung capacity (TLC) measurement system designed to be used by non-physician personnel is detailed. Four operators tested the reliability and validity of the system by measuring inspiratory PA and lateral pediatric chest radiographs with a Graf spark pen interfaced to a DEC VAX 11/780 computer. First results suggest that the ultimate goal of developing an accurate and easy-to-use TLC measurement system for non-physician personnel is attainable.

  16. Fault-tolerant measurement-based quantum computing with continuous-variable cluster states.

    Science.gov (United States)

    Menicucci, Nicolas C

    2014-03-28

    A long-standing open question about Gaussian continuous-variable cluster states is whether they enable fault-tolerant measurement-based quantum computation. The answer is yes. Initial squeezing in the cluster above a threshold value of 20.5 dB ensures that errors from finite squeezing acting on encoded qubits are below the fault-tolerance threshold of known qubit-based error-correcting codes. By concatenating with one of these codes and using ancilla-based error correction, fault-tolerant measurement-based quantum computation of theoretically indefinite length is possible with finitely squeezed cluster states.

  17. Lung nodule assessment in computed tomography. Precision of attenuation measurement based on computer-aided volumetry

    International Nuclear Information System (INIS)

    Knoess, Naomi; Hoffmann, B.; Fabel, M.; Wiese, C.; Bolte, H.; Heller, M.; Biederer, J.; Jochens, A.

    2009-01-01

    Purpose: to compare the reproducibility (r) of CT value measurements of pulmonary nodules using volumetry software (LungCare, LC) and manual ROIs (mROI). Materials and methods: 54 artificial nodules in a chest phantom were scanned three times with CT. CT values were measured with LC and mROI. The intrascan-r was assessed with three measurements in the first scan, and the interscan-r with measurements in three consecutive scans (one observer). Intrascan-r and interobserver-r (two observers) were assessed in the first scan and in contrast-enhanced CT of 51 nodules from 15 patients (convolution kernels b50f and b80f). Intrascan-r and interscan-r were described as the mean range and interobserver-r as the mean difference of CT values. The significance of differences was tested using the t-test and the sign test. Results: reproducibility was significantly higher for volumetry-based measurements in both artificial and patient nodules (range 0.11 vs. 6.16 HU for intrascan-r, 2.22 vs. 7.03 HU for interscan-r, difference 0.11 vs. 18.42 HU for interobserver-r; patients: 1.78 vs. 13.19 HU [b50f kernel] and 1.88 vs. 27.4 HU [b80f kernel] for intrascan-r, 3.71 vs. 22.43 HU for interobserver-r). Absolute CT values differed significantly between convolution kernels (patients/mROI: 29.3 HU [b50f] and 151.9 HU [b80f]; patients/LC: 5 HU [b50f] and 147 HU [b80f]). Conclusion: the reproducibility of volumetry-based measurements of CT values in pulmonary nodules is significantly higher and should therefore be recommended, e.g., in dynamic chest CT protocols. Reproducibility does not depend on absolute CT values. (orig.)

  18. Computer based methods for measurement of joint space width: update of an ongoing OMERACT project

    NARCIS (Netherlands)

    Sharp, John T.; Angwin, Jane; Boers, Maarten; Duryea, Jeff; von Ingersleben, Gabriele; Hall, James R.; Kauffman, Joost A.; Landewé, Robert; Langs, Georg; Lukas, Cédric; Maillefert, Jean-Francis; Bernelot Moens, Hein J.; Peloschek, Philipp; Strand, Vibeke; van der Heijde, Désirée

    2007-01-01

    Computer-based methods of measuring joint space width (JSW) could potentially have advantages over scoring joint space narrowing, with regard to increased standardization, sensitivity, and reproducibility. In an early exercise, 4 different methods showed good agreement on measured change in JSW over

  19. Computer-based measurement and automatization application research in nuclear technology fields

    International Nuclear Information System (INIS)

    Jiang Hongfei; Zhang Xiangyang

    2003-01-01

    This paper introduces research on the application of computer-based measurement and automation in nuclear technology fields. The emphasis is on the role of software in the development of such systems and on a network measurement and control software model with a promising application outlook. Application examples from research and development are presented. (authors)

  20. Computing derivative-based global sensitivity measures using polynomial chaos expansions

    International Nuclear Information System (INIS)

    Sudret, B.; Mai, C.V.

    2015-01-01

    In the field of computer experiments, sensitivity analysis aims at quantifying the relative importance of each input parameter (or combinations thereof) of a computational model with respect to the model output uncertainty. Variance decomposition methods leading to the well-known Sobol' indices are recognized as accurate techniques, albeit at a rather high computational cost. The use of polynomial chaos expansions (PCE) to compute Sobol' indices has made it possible to alleviate this computational burden. However, when dealing with large-dimensional input vectors, it is good practice to first use screening methods in order to discard unimportant variables. The derivative-based global sensitivity measures (DGSMs) have been developed recently in this respect. In this paper we show how polynomial chaos expansions may be used to compute DGSMs analytically, as a mere post-processing step. This requires the analytical derivation of the derivatives of the orthonormal polynomials which enter PC expansions. Closed-form expressions for Hermite, Legendre and Laguerre polynomial expansions are given. The efficiency of the approach is illustrated on two well-known benchmark problems in sensitivity analysis. - Highlights: • Derivative-based global sensitivity measures (DGSM) have been developed for screening purposes. • Polynomial chaos expansions (PCE) are used as a surrogate model of the original computational model. • From a PC expansion the DGSMs can be computed analytically. • The paper provides the derivatives of Hermite, Legendre and Laguerre polynomials for this purpose.
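
    The analytical post-processing rests on the fact that the derivative of an orthonormal polynomial is again an expansion in the same family; for probabilists' Hermite polynomials, d/dx ψ_n(x) = √n ψ_{n-1}(x), so the partial derivative of a PC expansion is itself a PC expansion, and the DGSM ν_i = E[(∂f/∂x_i)²] follows from Parseval's identity. The sketch below covers only the Gaussian/Hermite case; the sparse dict-of-multi-indices representation is an illustrative choice, not the paper's code.

      import numpy as np
      from collections import defaultdict

      def dgsm_hermite(coeffs):
          """DGSMs nu_i = E[(df/dx_i)^2] of a PCE in orthonormal Hermite
          polynomials; coeffs maps multi-index tuples to coefficients."""
          dim = len(next(iter(coeffs)))
          nu = np.zeros(dim)
          for i in range(dim):
              deriv = defaultdict(float)        # PCE of the i-th derivative
              for alpha, a in coeffs.items():
                  if alpha[i] > 0:              # d/dx psi_n = sqrt(n) psi_(n-1)
                      beta = alpha[:i] + (alpha[i] - 1,) + alpha[i + 1:]
                      deriv[beta] += np.sqrt(alpha[i]) * a
              nu[i] = sum(c * c for c in deriv.values())   # Parseval
          return nu

      # f(x1, x2) = x1 + x1*x2 (psi_1(x) = x): DGSMs are [2.0, 1.0].
      print(dgsm_hermite({(1, 0): 1.0, (1, 1): 1.0}))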

  1. Impedance computations and beam-based measurements: A problem of discrepancy

    Science.gov (United States)

    Smaluk, Victor

    2018-04-01

    High intensity of particle beams is crucial for high-performance operation of modern electron-positron storage rings, both colliders and light sources. The beam intensity is limited by the interaction of the beam with self-induced electromagnetic fields (wake fields) proportional to the vacuum chamber impedance. For a new accelerator project, the total broadband impedance is computed by element-wise wake-field simulations using computer codes. For a machine in operation, the impedance can be measured experimentally using beam-based techniques. In this article, a comparative analysis of impedance computations and beam-based measurements is presented for 15 electron-positron storage rings. The measured data and the predictions based on the computed impedance budgets show a significant discrepancy. Three possible reasons for the discrepancy are discussed: interference of the wake fields excited by a beam in adjacent components of the vacuum chamber, effect of computation mesh size, and effect of insufficient bandwidth of the computed impedance.

  2. PERFORMANCE OF A COMPUTER-BASED ASSESSMENT OF COGNITIVE FUNCTION MEASURES IN TWO COHORTS OF SENIORS

    Science.gov (United States)

    Espeland, Mark A.; Katula, Jeffrey A.; Rushing, Julia; Kramer, Arthur F.; Jennings, Janine M.; Sink, Kaycee M.; Nadkarni, Neelesh K.; Reid, Kieran F.; Castro, Cynthia M.; Church, Timothy; Kerwin, Diana R.; Williamson, Jeff D.; Marottoli, Richard A.; Rushing, Scott; Marsiske, Michael; Rapp, Stephen R.

    2013-01-01

    Background Computer-administered assessment of cognitive function is being increasingly incorporated in clinical trials; however, its performance in these settings has not been systematically evaluated. Design The Seniors Health and Activity Research Program (SHARP) pilot trial (N=73) developed a computer-based tool for assessing memory performance and executive functioning. The Lifestyle Interventions and Independence for Seniors (LIFE) investigators incorporated this battery in a full-scale multicenter clinical trial (N=1635). We describe relationships that test scores have with those from interviewer-administered cognitive function tests and risk factors for cognitive deficits and describe performance measures (completeness, intra-class correlations). Results Computer-based assessments of cognitive function had consistent relationships across the pilot and full-scale trial cohorts with interviewer-administered assessments of cognitive function, age, and a measure of physical function. In the LIFE cohort, their external validity was further demonstrated by associations with other risk factors for cognitive dysfunction: education, hypertension, diabetes, and physical function. Acceptable levels of data completeness (>83%) were achieved on all computer-based measures; however, rates of missing data were higher among older participants (odds ratio=1.06 for each additional year; p<0.001) and those who reported no current computer use (odds ratio=2.71; p<0.001). Intra-class correlations among clinics were at least as low (ICC≤0.013) as for interviewer measures (ICC≤0.023), reflecting good standardization. All cognitive measures loaded onto the first principal component (global cognitive function), which accounted for 40% of the overall variance. Conclusion Our results support the use of computer-based tools for assessing cognitive function in multicenter clinical trials of older individuals. PMID:23589390

  3. Computer based system for measuring the minority carrier lifetime in the solar cells

    International Nuclear Information System (INIS)

    Morales A, A.; Casados C, G.

    1994-01-01

    We show the development of a computer-based system for measuring the minority carrier lifetime in the base of silicon solar cells. The system allows the use of two different techniques for this kind of measurement: the open-circuit voltage decay (OCVD) and the surface voltage decay (SVD). The equipment is based on internal cards for IBM-PC or compatible computers that work as an oscilloscope and as a function generator, in addition to a synchronization and signal conditioning circuit. The system is fully controlled by a C-language program that optimizes the use of the instrument built in this way and performs the analysis of the measurement data by curve-fitting techniques. We show typical results obtained with silicon solar cells made in our laboratories. (Author)

  4. Non-adaptive measurement-based quantum computation and multi-party Bell inequalities

    International Nuclear Information System (INIS)

    Hoban, Matty J; Campbell, Earl T; Browne, Dan E; Loukopoulos, Klearchos

    2011-01-01

    Quantum correlations exhibit behaviour that cannot be resolved with a local hidden variable picture of the world. In quantum information, they are also used as resources for information processing tasks, such as measurement-based quantum computation (MQC). In MQC, universal quantum computation can be achieved via adaptive measurements on a suitable entangled resource state. In this paper, we look at a version of MQC in which we remove the adaptivity of measurements and aim to understand what computational abilities remain in the resource. We show that there are explicit connections between this model of computation and the question of non-classicality in quantum correlations. We demonstrate this by focusing on deterministic computation of Boolean functions, in which natural generalizations of the Greenberger-Horne-Zeilinger paradox emerge; we then explore probabilistic computation, via which multipartite Bell inequalities can be defined. We use this correspondence to define families of multi-party Bell inequalities, which we show to have a number of interesting contrasting properties.

  5. Non-adaptive measurement-based quantum computation and multi-party Bell inequalities

    Energy Technology Data Exchange (ETDEWEB)

    Hoban, Matty J; Campbell, Earl T; Browne, Dan E [Department of Physics and Astronomy, University College London, Gower Street, London WC1E 6BT (United Kingdom); Loukopoulos, Klearchos, E-mail: m.hoban@ucl.ac.uk [Department of Materials, Oxford University, Parks Road, Oxford OX1 4PH (United Kingdom)

    2011-02-15

    Quantum correlations exhibit behaviour that cannot be resolved with a local hidden variable picture of the world. In quantum information, they are also used as resources for information processing tasks, such as measurement-based quantum computation (MQC). In MQC, universal quantum computation can be achieved via adaptive measurements on a suitable entangled resource state. In this paper, we look at a version of MQC in which we remove the adaptivity of measurements and aim to understand what computational abilities remain in the resource. We show that there are explicit connections between this model of computation and the question of non-classicality in quantum correlations. We demonstrate this by focusing on deterministic computation of Boolean functions, in which natural generalizations of the Greenberger-Horne-Zeilinger paradox emerge; we then explore probabilistic computation, via which multipartite Bell inequalities can be defined. We use this correspondence to define families of multi-party Bell inequalities, which we show to have a number of interesting contrasting properties.

  6. Comparing computer adaptive and curriculum-based measures of math in progress monitoring.

    Science.gov (United States)

    Shapiro, Edward S; Dennis, Minyi Shih; Fu, Qiong

    2015-12-01

    The purpose of the study was to compare the use of a Computer Adaptive Test and Curriculum-Based Measurement in the assessment of mathematics. This study also investigated the degree to which slope or rate of change predicted student outcomes on the annual state assessment of mathematics above and beyond scores of single-point screening assessments (i.e., the computer adaptive test or the CBM assessment just before the administration of the state assessment). Repeated measurements of mathematics once per month across a 7-month period using a Computer Adaptive Test (STAR-Math) and Curriculum-Based Measurement (CBM, AIMSweb Math Computation, AIMSweb Math Concepts/Applications) were collected for a maximum total of 250 third-, fourth-, and fifth-grade students. Results showed STAR-Math in all 3 grades and AIMSweb Math Concepts/Applications in the third and fifth grades had primarily linear growth patterns in mathematics. AIMSweb Math Computation in all grades and AIMSweb Math Concepts/Applications in Grade 4 had decelerating positive trends. Predictive validity evidence showed the strongest relationships were between STAR-Math and outcomes for third and fourth grade students. The blockwise multiple regression by grade revealed that slopes accounted for only a very small proportion of additional variance above and beyond what was explained by the scores obtained on a single point of assessment just prior to the administration of the state assessment. (c) 2015 APA, all rights reserved.
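
    The blockwise logic, asking how much variance the growth slope explains beyond the final single-point screening score, is easy to make concrete. The sketch below uses synthetic data with invented effect sizes and plain least squares, standing in for the authors' statistical software.

      import numpy as np

      def r_squared(y, X):
          beta, *_ = np.linalg.lstsq(X, y, rcond=None)
          return 1.0 - np.var(y - X @ beta) / np.var(y)

      rng = np.random.default_rng(0)
      n = 250
      screen = rng.normal(size=n)                 # last screening score
      slope = rng.normal(size=n)                  # monthly growth slope
      state = 0.8 * screen + 0.05 * slope + rng.normal(scale=0.5, size=n)

      ones = np.ones((n, 1))
      block1 = r_squared(state, np.column_stack([ones, screen]))
      block2 = r_squared(state, np.column_stack([ones, screen, slope]))
      print(block2 - block1)   # small incremental variance explained by slope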

  7. Study on the algorithm of computational ghost imaging based on discrete fourier transform measurement matrix

    Science.gov (United States)

    Zhang, Leihong; Liang, Dong; Li, Bei; Kang, Yi; Pan, Zilan; Zhang, Dawei; Gao, Xiumin; Ma, Xiuhua

    2016-07-01

    On the basis of analyzing the cosine light field with a determined analytic expression and the pseudo-inverse method, the object is illuminated by a preset light field with a determined discrete Fourier transform measurement matrix, and the object image is reconstructed by the pseudo-inverse method. The analytic expression of the algorithm of computational ghost imaging based on a discrete Fourier transform measurement matrix is deduced theoretically and compared with the algorithm of compressive computational ghost imaging based on a random measurement matrix. The reconstruction process and the reconstruction error are analyzed. On this basis, a simulation is done to verify the theoretical analysis. When the number of sampling measurements is similar to the number of object pixels, the rank of the discrete Fourier transform matrix is the same as that of the random measurement matrix, the PSNRs of the reconstructed images of the FGI and PGI algorithms are similar, and the reconstruction error of the traditional CGI algorithm is lower than that of reconstructed images based on the FGI and PGI algorithms. As the number of sampling measurements decreases, the PSNR of the reconstructed image based on the FGI algorithm decreases slowly, while the PSNRs of the reconstructed images based on the PGI and CGI algorithms decrease sharply. The reconstruction time of the FGI algorithm is lower than that of the other algorithms and is not affected by the number of sampling measurements. The FGI algorithm can effectively filter out random white noise through a low-pass filter and realize denoising in the reconstruction, with a higher denoising capability than the CGI algorithm. The FGI algorithm can improve the reconstruction accuracy and the reconstruction speed of computational ghost imaging.
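
    The reconstruction itself is plain linear algebra: the object is illuminated with preset patterns (the rows of the measurement matrix), a bucket detector records one value per pattern, and the image is recovered with the pseudo-inverse. The sketch below uses a DCT-type cosine pattern set as a full-rank stand-in for the paper's cosine light field; the dimensions and object are hypothetical.

      import numpy as np

      n = 64                                    # object pixels (flattened)
      rng = np.random.default_rng(1)
      x = rng.random(n)                         # hypothetical object

      # Cosine illumination patterns, one row per structured measurement.
      j = np.arange(n)[:, None]
      k = np.arange(n)[None, :]
      A = np.cos(np.pi * j * (2 * k + 1) / (2 * n))

      y = A @ x                                 # bucket-detector signals
      x_hat = np.linalg.pinv(A) @ y             # pseudo-inverse reconstruction
      print(np.max(np.abs(x_hat - x)))          # ~0 when the matrix is full rank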

  8. Tundish Cover Flux Thickness Measurement Method and Instrumentation Based on Computer Vision in Continuous Casting Tundish

    Directory of Open Access Journals (Sweden)

    Meng Lu

    2013-01-01

    Thickness of tundish cover flux (TCF) plays an important role in the continuous casting (CC) steelmaking process. The traditional measurement methods for TCF thickness are single/double wire methods, which have several problems such as personal safety, being easily affected by operators, and poor repeatability. To solve all these problems, in this paper we designed and built an instrumentation and present a novel method to measure the TCF thickness. The instrumentation is composed of a measurement bar, a mechanical device, a high-definition industrial camera, a Siemens S7-200 programmable logic controller (PLC), and a computer. Our measurement method is based on computer vision algorithms, including an image denoising method, a monocular range measurement method, the scale invariant feature transform (SIFT), and an image gray gradient detection method. Using the present instrumentation and method, images in the CC tundish can be collected by the camera and transferred to the computer for image processing. Experiments showed that our instrumentation and method worked well on site at steel plants, can accurately measure the thickness of TCF, and overcome the disadvantages of the traditional measurement methods, or even replace the traditional ones.

  9. Did Installed Base Give an Incumbent Any (Measurable) Advantages in Federal Computer Procurement?

    OpenAIRE

    Shane M. Greenstein

    1993-01-01

    This research examines the relative strength and significance of the status of "incumbent contractor" in federal computer procurement. One finding, as expected, is that an agency is likely to acquire a system from an incumbent vendor. Another finding, perhaps more interesting, is that the (in)compatibility between a buyer's installed base and a potential system also influences the vendor choice; a result that may be the first econometric measurement of the competitive effects of incompatibili...

  10. Relative resilience to noise of standard and sequential approaches to measurement-based quantum computation

    Science.gov (United States)

    Gallagher, C. B.; Ferraro, A.

    2018-05-01

    A possible alternative to the standard model of measurement-based quantum computation (MBQC) is offered by the sequential model of MBQC—a particular class of quantum computation via ancillae. Although these two models are equivalent under ideal conditions, their relative resilience to noise in practical conditions is not yet known. We analyze this relationship for various noise models in the ancilla preparation and in the entangling-gate implementation. The comparison of the two models is performed utilizing both the gate infidelity and the diamond distance as figures of merit. Our results show that in the majority of instances the sequential model outperforms the standard one in regard to a universal set of operations for quantum computation. Further investigation is made into the performance of sequential MBQC in experimental scenarios, thus setting benchmarks for possible cavity-QED implementations.

  11. Improving the psychometric properties of dot-probe attention measures using response-based computation.

    Science.gov (United States)

    Evans, Travis C; Britton, Jennifer C

    2018-09-01

    Abnormal threat-related attention in anxiety disorders is most commonly assessed and modified using the dot-probe paradigm; however, poor psychometric properties of reaction-time measures may contribute to inconsistencies across studies. Typically, standard attention measures are derived using average reaction-times obtained in experimentally-defined conditions. However, current approaches based on experimentally-defined conditions are limited. In this study, the psychometric properties of a novel response-based computation approach to analyze dot-probe data are compared to standard measures of attention. 148 adults (19.19 ± 1.42 years, 84 women) completed a standardized dot-probe task including threatening and neutral faces. We generated both standard and response-based measures of attention bias, attentional orientation, and attentional disengagement. We compared overall internal consistency, number of trials necessary to reach internal consistency, test-retest reliability (n = 72), and criterion validity obtained using each approach. Compared to standard attention measures, response-based measures demonstrated uniformly high levels of internal consistency with relatively few trials and varying improvements in test-retest reliability. Additionally, response-based measures demonstrated specific evidence of anxiety-related associations above and beyond both standard attention measures and other confounds. Future studies are necessary to validate this approach in clinical samples. Response-based attention measures demonstrate superior psychometric properties compared to standard attention measures, which may improve the detection of anxiety-related associations and treatment-related changes in clinical samples. Copyright © 2018 Elsevier Ltd. All rights reserved.

  12. Statistical length of DNA based on AFM image measured by a computer

    International Nuclear Information System (INIS)

    Chen Xinqing; Qiu Xijun; Zhang Yi; Hu Jun; Wu Shiying; Huang Yibo; Ai Xiaobai; Li Minqian

    2001-01-01

    Taking advantage of image processing technology, the contour length of DNA molecules was measured automatically by a computer. Based on the AFM image of DNA, the topography of DNA was simulated by a curve. Then the DNA length was measured automatically by inserting mode. It was shown that the experimental length of naturally deposited DNA (180.4 ± 16.4 nm) was well consistent with the theoretical length (185.0 nm). Compared to other methods, the present approach has the advantages of precision and automatism. The stretched DNA was also measured. It was shown that the experimental length (343.6 ± 20.7 nm) was much longer than the theoretical length (307.0 nm). This result indicated that the stretching process has a distinct effect on the DNA length. However, the method provided here avoided the DNA-stretching effect.
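
    Once the backbone has been traced, the core measurement is the length of a digitized curve. The "inserting mode" used by the authors is not detailed in the abstract; the sketch below simply sums straight-line segments between consecutive traced points, with the nanometre-per-pixel scale as an input.

      import numpy as np

      def contour_length_nm(points, nm_per_pixel):
          """Length of a traced DNA backbone given (row, col) pixel points."""
          steps = np.diff(np.asarray(points, dtype=float), axis=0)
          return np.hypot(steps[:, 0], steps[:, 1]).sum() * nm_per_pixel

      # e.g. a three-point trace at 2 nm/pixel:
      print(contour_length_nm([(0, 0), (3, 4), (6, 8)], 2.0))   # 20.0 nm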

  13. Computational Strategy for Quantifying Human Pesticide Exposure based upon a Saliva Measurement

    Directory of Open Access Journals (Sweden)

    Charles Timchalk

    2015-05-01

    Quantitative exposure data are important for evaluating toxicity risk, and biomonitoring is a critical tool for evaluating human exposure. Direct personal monitoring provides the most accurate estimation of a subject's true dose, and non-invasive methods are advocated for quantifying exposure to xenobiotics. In this regard, there is a need to identify chemicals that are cleared in saliva at concentrations that can be quantified to support the implementation of this approach. This manuscript reviews the computational modeling approaches that are coupled to in vivo and in vitro experiments to predict salivary uptake and clearance of xenobiotics and provides additional insight on species-dependent differences in partitioning that are of key importance for extrapolation. The primary mechanism by which xenobiotics leave the blood and enter saliva involves paracellular transport, passive transcellular diffusion, or transcellular active transport, with the majority of xenobiotics transferred by passive diffusion. The transcellular or paracellular diffusion of unbound chemicals in plasma to saliva has been computationally modeled using compartmental and physiologically based approaches. Of key importance for determining the plasma:saliva partitioning was the utilization of the Schmitt algorithm that calculates partitioning based upon the tissue composition, pH, chemical pKa and plasma protein-binding. Sensitivity analysis identified that both protein-binding and pKa (for weak acids and bases) have a significant impact on determining partitioning, along with species-dependent differences based upon physiological variance. Future strategies are focused on an in vitro salivary acinar cell based system to experimentally determine and computationally predict salivary gland uptake and clearance of xenobiotics. It is envisioned that a combination of salivary biomonitoring and computational modeling will enable the non-invasive measurement of chemical exposures in humans.
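
    The conclusion that protein binding and pKa dominate the plasma:saliva partitioning can be illustrated with the classical pH-partition estimate, in which only the unbound, un-ionized fraction of a weak acid diffuses into saliva. This is a deliberately simplified stand-in for the Schmitt tissue-composition algorithm named above; the pKa, unbound fractions and pH values are hypothetical.

      def saliva_plasma_ratio_acid(pKa, fu_plasma, fu_saliva=1.0,
                                   pH_plasma=7.4, pH_saliva=6.5):
          """pH-partition estimate of the saliva:plasma ratio for a weak acid
          (Henderson-Hasselbalch ionization on each side of the epithelium)."""
          ionized_s = 1.0 + 10.0 ** (pH_saliva - pKa)
          ionized_p = 1.0 + 10.0 ** (pH_plasma - pKa)
          return (ionized_s / ionized_p) * (fu_plasma / fu_saliva)

      # A strongly protein-bound weak acid partitions poorly into saliva:
      print(saliva_plasma_ratio_acid(pKa=4.5, fu_plasma=0.05))   # ~0.006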

  14. A measurement-based X-ray source model characterization for CT dosimetry computations.

    Science.gov (United States)

    Sommerville, Mitchell; Poirier, Yannick; Tambasco, Mauro

    2015-11-08

    within the experimental uncertainties associated with measurement reproducibility and chamber volume effects for the PMMA phantom. The agreement between calculation and measurement was within experimental uncertainty for 19 out of 20 simulation conditions at five points of interest in the anthropomorphic thorax phantom for the four beam energies modeled. The source model and characterization technique based on HVL measurements and nominal kVp can be used to accurately compute CT dose. This accuracy provides experimental validation of kVDoseCalc for computing CT dose.

  15. Programming Non-Trivial Algorithms in the Measurement Based Quantum Computation Model

    Energy Technology Data Exchange (ETDEWEB)

    Alsing, Paul [United States Air Force Research Laboratory, Wright-Patterson Air Force Base; Fanto, Michael [United States Air Force Research Laboratory, Wright-Patterson Air Force Base; Lott, Capt. Gordon [United States Air Force Research Laboratory, Wright-Patterson Air Force Base; Tison, Christoper C. [United States Air Force Research Laboratory, Wright-Patterson Air Force Base

    2014-01-01

    We provide a set of prescriptions for implementing a quantum circuit model algorithm as a measurement-based quantum computing (MBQC) algorithm [1, 2] via a large cluster state. As a means of illustration we draw upon our numerical modeling experience to describe a large graph state capable of searching a logical 8-element list (a non-trivial version of Grover's algorithm [3] with feedforward). We develop several prescriptions based on analytic evaluation of cluster states and graph state equations which can be generalized into any circuit model operations. Such a resulting cluster state will be able to carry out the desired operation with appropriate measurements and feedforward error correction. We also discuss the physical implementation and the analysis of the principal 3-qubit entangling gate (Toffoli) required for a non-trivial feedforward realization of an 8-element Grover search algorithm.

  16. Universal resources for approximate and stochastic measurement-based quantum computation

    International Nuclear Information System (INIS)

    Mora, Caterina E.; Piani, Marco; Miyake, Akimasa; Van den Nest, Maarten; Duer, Wolfgang; Briegel, Hans J.

    2010-01-01

    We investigate which quantum states can serve as universal resources for approximate and stochastic measurement-based quantum computation in the sense that any quantum state can be generated from a given resource by means of single-qubit (local) operations assisted by classical communication. More precisely, we consider the approximate and stochastic generation of states, resulting, for example, from a restriction to finite measurement settings or from possible imperfections in the resources or local operations. We show that entanglement-based criteria for universality obtained in M. Van den Nest et al. [New J. Phys. 9, 204 (2007)] for the exact, deterministic case can be lifted to the much more general approximate, stochastic case. This allows us to move from the idealized situation (exact, deterministic universality) considered in previous works to the practically relevant context of nonperfect state preparation. We find that any entanglement measure fulfilling some basic requirements needs to reach its maximum value on some element of an approximate, stochastic universal family of resource states, as the resource size grows. This allows us to rule out various families of states as being approximate, stochastic universal. We prove that approximate, stochastic universality is in general a weaker requirement than deterministic, exact universality and provide resources that are efficient approximate universal, but not exact deterministic universal. We also study the robustness of universal resources for measurement-based quantum computation under realistic assumptions about the (imperfect) generation and manipulation of entangled states, giving an explicit expression for the impact that errors made in the preparation of the resource have on the possibility to use it for universal approximate and stochastic state preparation. Finally, we discuss the relation between our entanglement-based criteria and recent results regarding the uselessness of states with a high

  17. Simple area-based measurement for multidetector computed tomography to predict left ventricular size

    International Nuclear Information System (INIS)

    Schlett, Christopher L.; Kwait, Dylan C.; Mahabadi, Amir A.; Hoffmann, Udo; Bamberg, Fabian; O'Donnell, Christopher J.; Fox, Caroline S.

    2010-01-01

    Measures of left ventricular (LV) mass and dimensions are independent predictors of morbidity and mortality. We determined whether an axial area-based method by computed tomography (CT) provides an accurate estimate of LV mass and volume. A total of 45 subjects (49% female, 56.0 ± 12 years) with a wide range of LV geometry underwent contrast-enhanced 64-slice CT. LV mass and volume were derived from 3D data. 2D images were analysed to determine LV area, the direct transverse cardiac diameter (dTCD) and the cardiothoracic ratio (CTR). Furthermore, feasibility was confirmed in 100 Framingham Offspring Cohort subjects. 2D measures of LV area, dTCD and CTR were 47.3 ± 8 cm², 14.7 ± 1.5 cm and 0.54 ± 0.05, respectively. 3D-derived LV volume (end-diastolic) and mass were 148.9 ± 45 cm³ and 124.2 ± 34 g, respectively. Excellent inter- and intra-observer agreement was shown for 2D LV area measurements (both intraclass correlation coefficients (ICC) = 0.99, p > 0.27). Compared with the traditionally used CTR, LV size can be accurately predicted based on a simple and highly reproducible axial LV area-based measurement. (orig.)

  18. Accuracy of volumetric measurement of simulated root resorption lacunas based on cone beam computed tomography.

    Science.gov (United States)

    Wang, Y; He, S; Guo, Y; Wang, S; Chen, S

    2013-08-01

    To evaluate the accuracy of volumetric measurement of simulated root resorption cavities based on cone beam computed tomography (CBCT), in comparison with micro-computed tomography (micro-CT), which served as the reference. Setting: The State Key Laboratory of Oral Diseases at Sichuan University. Thirty-two bovine teeth were included for standardized CBCT scanning and micro-CT scanning before and after the simulation of different degrees of root resorption. The teeth were divided into three groups according to the depths of the root resorption cavity (group 1: 0.15, 0.2, 0.3 mm; group 2: 0.6, 1.0 mm; group 3: 1.5, 2.0, 3.0 mm). Each depth included four specimens. Differences in tooth volume before and after simulated root resorption were then calculated from CBCT and micro-CT scans, respectively. The overall between-method agreement of the measurements was evaluated using the concordance correlation coefficient (CCC). For the first group, the average volume of the resorption cavities was 1.07 mm³, and the between-method agreement of measurement for the volume changes was low (CCC = 0.098). For the second and third groups, the average volumes of the resorption cavities were 3.47 and 6.73 mm³, respectively, and the between-method agreements were good (CCC = 0.828 and 0.895, respectively). The accuracy of 3-D quantitative volumetric measurement of simulated root resorption based on CBCT was fairly good in detecting simulated resorption cavities larger than 3.47 mm³, while it was not sufficient for measuring resorption cavities smaller than 1.07 mm³. This method could be applied in future studies of root resorption, although further studies are required to improve its accuracy. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
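
    The agreement statistic used here, the concordance correlation coefficient, is straightforward to compute and, unlike plain correlation, penalizes systematic offsets between the two methods. A minimal sketch of Lin's population form:

      import numpy as np

      def concordance_cc(x, y):
          """Lin's concordance correlation coefficient between paired
          measurements, e.g. CBCT vs. micro-CT volume differences."""
          x = np.asarray(x, dtype=float)
          y = np.asarray(y, dtype=float)
          covariance = np.mean((x - x.mean()) * (y - y.mean()))
          return 2.0 * covariance / (x.var() + y.var() + (x.mean() - y.mean()) ** 2)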

  19. Computational strategy for quantifying human pesticide exposure based upon a saliva measurement

    Energy Technology Data Exchange (ETDEWEB)

    Timchalk, Charles; Weber, Thomas J.; Smith, Jordan N.

    2015-05-27

    The National Research Council of the National Academies report, Toxicity Testing in the 21st Century: A Vision and Strategy, highlighted the importance of quantitative exposure data for evaluating human toxicity risk and noted that biomonitoring is a critical tool for quantitatively evaluating exposure from both environmental and occupational settings. Direct measurement of chemical exposures using personal monitoring provides the most accurate estimation of a subject's true exposure, and non-invasive methods have also been advocated for quantifying the pharmacokinetics and bioavailability of drugs and xenobiotics. In this regard, there is a need to identify chemicals that are readily cleared in saliva at concentrations that can be quantified to support the implementation of this approach. The current manuscript describes the use of computational modeling approaches that are closely coupled to in vivo and in vitro experiments to predict salivary uptake and clearance of xenobiotics. The primary mechanism by which xenobiotics leave the blood and enter saliva is thought to involve paracellular transport, passive transcellular diffusion, or transcellular active transport, with the majority of drugs and xenobiotics cleared from plasma into saliva by passive diffusion. The transcellular or paracellular diffusion of unbound chemicals in plasma to saliva has been computationally modeled using a combination of compartmental and physiologically based approaches. Of key importance for determining the plasma:saliva partitioning was the utilization of a modified Schmitt algorithm that calculates partitioning based upon the tissue composition, pH, chemical pKa and plasma protein-binding. Sensitivity analysis of key model parameters specifically identified that both protein-binding and pKa (for weak acids and bases) had the most significant impact on the determination of partitioning and that there were clear species dependent differences based upon physiological variance between

  20. Glossiness of Colored Papers based on Computer Graphics Model and Its Measuring Method

    Science.gov (United States)

    Aida, Teizo

    In the case of colored papers, the color of the surface strongly affects the gloss of the paper. A new glossiness measure for such colored papers is suggested in this paper. First, using achromatic and chromatic Munsell colored chips, the author obtained experimental equations which represent the relation between lightness V (or V and saturation C) and psychological glossiness Gph of these chips. Then, the author defined a new glossiness G for colored papers, based on the above-mentioned experimental equations for Gph and the Cook-Torrance reflection model, which is widely used in the field of computer graphics. This new glossiness is shown to be nearly proportional to the psychological glossiness Gph. The measuring system for the new glossiness G is furthermore described. The measuring time for one specimen is within 1 minute.

  1. First experiences with model based iterative reconstructions influence on quantitative plaque volume and intensity measurements in coronary computed tomography angiography

    DEFF Research Database (Denmark)

    Precht, Helle; Kitslaar, Pieter H.; Broersen, Alexander

    2017-01-01

    Purpose: Investigate the influence of adaptive statistical iterative reconstruction (ASIR) and the model-based IR (Veo) reconstruction algorithm in coronary computed tomography angiography (CCTA) images on quantitative measurements in coronary arteries for plaque volumes and intensities. Methods...

  2. Performance of a computer-based assessment of cognitive function measures in two cohorts of seniors

    Science.gov (United States)

    Computer-administered assessment of cognitive function is being increasingly incorporated in clinical trials; however, its performance in these settings has not been systematically evaluated. The Seniors Health and Activity Research Program (SHARP) pilot trial (N=73) developed a computer-based tool f...

  3. Measurement and Evidence of Computer-Based Task Switching and Multitasking by "Net Generation" Students

    Science.gov (United States)

    Judd, Terry; Kennedy, Gregor

    2011-01-01

    Logs of on-campus computer and Internet usage were used to conduct a study of computer-based task switching and multitasking by undergraduate medical students. A detailed analysis of over 6000 individual sessions revealed that while a majority of students engaged in both task switching and multitasking behaviours, they did so less frequently than…

  4. A dental implant-based registration method for measuring mandibular kinematics using cone beam computed tomography-based fluoroscopy.

    Science.gov (United States)

    Lin, Cheng-Chung; Chen, Chien-Chih; Chen, Yunn-Jy; Lu, Tung-Wu; Hong, Shih-Wun

    2014-01-01

    This study aimed to develop and evaluate experimentally an implant-based registration method for measuring three-dimensional (3D) kinematics of the mandible and dental implants in the mandible based on dental cone beam computed tomography (CBCT), modified to include fluoroscopic function. The proposed implant-based registration method was based on the registration of CBCT data of implants/bones with single-plane fluoroscopy images. Seven registration conditions that included one to three implants were evaluated experimentally for their performance in a cadaveric porcine head model. The implant-based registration method was shown to have measurement errors (SD) of less than -0.2 (0.3) mm, 1.1 (2.2) mm, and 0.7 degrees (1.3 degrees) for the in-plane translation, out-of-plane translation, and all angular components, respectively, regardless of the number of implants used. The corresponding errors were reduced to less than -0.1 (0.1) mm, -0.3 (1.7) mm, and 0.5 degrees (0.4 degrees) when three implants were used. An implant-based registration method was developed to measure the 3D kinematics of the mandible/implants. With its high accuracy and reliability, the new method will be useful for measuring the 3D motion of the bones/implants for relevant applications.

  5. Reliability of lower limb alignment measures using an established landmark-based method with a customized computer software program

    Science.gov (United States)

    Sled, Elizabeth A.; Sheehy, Lisa M.; Felson, David T.; Costigan, Patrick A.; Lam, Miu; Cooke, T. Derek V.

    2010-01-01

    The objective of the study was to evaluate the reliability of frontal plane lower limb alignment measures using a landmark-based method by (1) comparing inter- and intra-reader reliability between measurements of alignment obtained manually with those using a computer program, and (2) determining inter- and intra-reader reliability of computer-assisted alignment measures from full-limb radiographs. An established method for measuring alignment was used, involving selection of 10 femoral and tibial bone landmarks. 1) To compare manual and computer methods, we used digital images and matching paper copies of five alignment patterns simulating healthy and malaligned limbs drawn using AutoCAD. Seven readers were trained in each system. Paper copies were measured manually and repeat measurements were performed daily for 3 days, followed by a similar routine with the digital images using the computer. 2) To examine the reliability of computer-assisted measures from full-limb radiographs, 100 images (200 limbs) were selected as a random sample from 1,500 full-limb digital radiographs which were part of the Multicenter Osteoarthritis (MOST) Study. Three trained readers used the software program to measure alignment twice from the batch of 100 images, with two or more weeks between batch handling. Manual and computer measures of alignment showed excellent agreement (intraclass correlations [ICCs] 0.977 – 0.999 for computer analysis; 0.820 – 0.995 for manual measures). The computer program applied to full-limb radiographs produced alignment measurements with high inter- and intra-reader reliability (ICCs 0.839 – 0.998). In conclusion, alignment measures using a bone landmark-based approach and a computer program were highly reliable between multiple readers. PMID:19882339
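
    Since the reliability results above hinge on the intraclass correlation coefficient, a minimal sketch of a two-way random-effects, single-measure ICC(2,1) computed from its ANOVA decomposition follows; the alignment angles in the example are made up for illustration and are not the study's data.

```python
import numpy as np

def icc_2_1(x: np.ndarray) -> float:
    """Two-way random-effects, absolute-agreement, single-measure ICC(2,1).
    x is an (n subjects, k readers) matrix of measurements."""
    n, k = x.shape
    grand = x.mean()
    ms_rows = k * ((x.mean(axis=1) - grand) ** 2).sum() / (n - 1)  # subjects
    ms_cols = n * ((x.mean(axis=0) - grand) ** 2).sum() / (k - 1)  # readers
    resid = x - x.mean(axis=1, keepdims=True) - x.mean(axis=0, keepdims=True) + grand
    ms_err = (resid ** 2).sum() / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (
        ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)

# Hypothetical limb alignment angles (degrees), 5 limbs x 3 readers:
x = np.array([[178.2, 178.4, 178.1],
              [182.9, 183.1, 183.0],
              [175.5, 175.2, 175.6],
              [180.0, 180.3, 179.9],
              [184.7, 184.5, 184.8]])
print(round(icc_2_1(x), 3))
```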

  6. Measuring scientific reasoning through behavioral analysis in a computer-based problem solving exercise

    Science.gov (United States)

    Mead, C.; Horodyskyj, L.; Buxner, S.; Semken, S. C.; Anbar, A. D.

    2016-12-01

    Developing scientific reasoning skills is a common learning objective for general-education science courses. However, effective assessments for such skills typically involve open-ended questions or tasks, which must be hand-scored and may not be usable online. Using computer-based learning environments, reasoning can be assessed automatically by analyzing student actions within the learning environment. We describe such an assessment under development and present pilot results. In our content-neutral instrument, students solve a problem by collecting and interpreting data in a logical, systematic manner. We then infer reasoning skill automatically based on student actions. Specifically, students investigate why Earth has seasons, a scientifically simple but commonly misunderstood topic. Students are given three possible explanations and asked to select a set of locations on a world map from which to collect temperature data. They then explain how the data support or refute each explanation. The best approaches will use locations in both the Northern and Southern hemispheres to argue that the contrasting seasonality of the hemispheres supports only the correct explanation. We administered a pilot version to students at the beginning of an online, introductory science course (n = 223) as an optional extra credit exercise. We were able to categorize students' data collection decisions as more and less logically sound. Students who chose the most logical measurement locations earned higher course grades, but not significantly higher. This result is encouraging, but not definitive. In the future, we will clarify our results in two ways. First, we plan to incorporate more open-ended interactions into the assessment to improve the resolving power of this tool. Second, to avoid relying on course grades, we will independently measure reasoning skill with one of the existing hand-scored assessments (e.g., Critical Thinking Assessment Test) to cross-validate our new

  7. A direct approach to fault-tolerance in measurement-based quantum computation via teleportation

    International Nuclear Information System (INIS)

    Silva, Marcus; Danos, Vincent; Kashefi, Elham; Ollivier, Harold

    2007-01-01

    We discuss a simple variant of the one-way quantum computing model (Raussendorf R and Briegel H-J 2001 Phys. Rev. Lett. 86 5188), called the Pauli measurement model, where measurements are restricted to be along the eigenbases of the Pauli X and Y operators, while qubits can be initially prepared both in the |+_π/4⟩ := (1/√2)(|0⟩ + e^(iπ/4)|1⟩) state and the usual |+⟩ := (1/√2)(|0⟩ + |1⟩) state. We prove the universality of this quantum computation model, and establish a standardization procedure which permits all entanglement and state preparation to be performed at the beginning of computation. This leads us to develop a direct approach to fault-tolerance by simple transformations of the entanglement graph and preparation operations, while error correction is performed naturally via syndrome-extracting teleportations

  8. High level language for measurement complex control based on the computer E-100I

    Science.gov (United States)

    Zubkov, B. V.

    1980-01-01

    A high-level language was designed to control the process of conducting an experiment using the computer "Elektronika-100I". Program examples are given to control the measuring and actuating devices. The procedure for including these programs in the suggested high-level language is described.

  9. Development of a Computer-Based Measure of Listening Comprehension of Science Talk

    Science.gov (United States)

    Lin, Sheau-Wen; Liu, Yu; Chen, Shin-Feng; Wang, Jing-Ru; Kao, Huey-Lien

    2015-01-01

    The purpose of this study was to develop a computer-based assessment for elementary school students' listening comprehension of science talk within an inquiry-oriented environment. The development procedure had 3 steps: a literature review to define the framework of the test, collecting and identifying key constructs of science talk, and…

  10. Feature-based analysis for quality assessment of x-ray computed tomography measurements

    International Nuclear Information System (INIS)

    Nardelli, Vitor C; Arenhart, Francisco A; Donatelli, Gustavo D; Porath, Maurício C; Niggemann, Christian; Schmitt, Robert

    2012-01-01

    This paper presents an approach to assess the quality of the data extracted with computed tomography (CT) measuring systems to perform geometrical evaluations. The approach consists of analyzing the error features introduced by the CT measuring system during the extraction operation. The analysis of the features is performed qualitatively (using graphical analysis tools) and/or quantitatively (by means of the root-mean-square deviation parameter of the error features). The approach was used to analyze four sets of measurements performed with an industrial x-ray cone beam CT measuring system. Three test parts were used in the experiments: a high-accuracy manufacturing multi-wave standard, a calibrated step cylinder and a calibrated production part. The results demonstrate the usefulness of the approach for gaining knowledge of CT measuring processes and improving the quality of CT geometrical evaluations. Advantages and limitations of the approach are discussed. (paper)

  11. Volume Measurement Algorithm for Food Product with Irregular Shape using Computer Vision based on Monte Carlo Method

    Directory of Open Access Journals (Sweden)

    Joko Siswantoro

    2014-11-01

    Full Text Available Volume is one of the important issues in the production and processing of food products. Traditionally, volume measurement can be performed using the water displacement method based on Archimedes' principle. The water displacement method is inaccurate and considered destructive. Computer vision offers an accurate and nondestructive method for measuring the volume of food products. This paper proposes an algorithm for volume measurement of irregularly shaped food products using computer vision based on the Monte Carlo method. Five images of the object were acquired from five different views and then processed to obtain the silhouettes of the object. From the silhouettes, the Monte Carlo method was performed to approximate the volume of the object. The simulation results show that the algorithm produces high accuracy and precision for volume measurement.
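
    A minimal sketch of the core Monte Carlo idea follows: sample uniform random points in a bounding box and scale the box volume by the fraction of points falling inside the object. A sphere stands in for the silhouette-based inside test that the paper derives from the five camera views.

```python
import numpy as np

rng = np.random.default_rng(0)

def monte_carlo_volume(inside, bounds, n_samples=1_000_000):
    """Estimate volume as (fraction of random points inside) * box volume.
    inside: vectorized predicate over an (n, 3) array of points.
    bounds: ((xmin, xmax), (ymin, ymax), (zmin, zmax)) bounding box."""
    lows = np.array([b[0] for b in bounds], dtype=float)
    highs = np.array([b[1] for b in bounds], dtype=float)
    pts = rng.uniform(lows, highs, size=(n_samples, 3))
    return inside(pts).mean() * np.prod(highs - lows)

# A sphere of radius 3 stands in for the silhouette-based inside test.
in_sphere = lambda p: (p ** 2).sum(axis=1) <= 3.0 ** 2
est = monte_carlo_volume(in_sphere, bounds=((-3, 3), (-3, 3), (-3, 3)))
print(est, "vs. analytic", 4.0 / 3.0 * np.pi * 3.0 ** 3)
```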

  12. New computer security measures

    CERN Multimedia

    IT Department

    2008-01-01

    As a part of the long-term strategy to improve computer security at CERN, and especially given the attention focused on CERN by the start-up of the LHC, two additional security measures concerning DNS and Tor will shortly be introduced. These are described in the following texts and will affect only a small number of users. "PHISHING" ATTACKS CONTINUE CERN computer users continue to be subjected to attacks by people trying to infect our machines and obtain passwords and other confidential information by social engineering trickery. Recent examples include an e-mail message sent from "La Poste" entitled "Colis Postal" on 21 August, a fake mail sent from web and mail services on 8 September, and an e-mail purporting to come from Hallmark Cards announcing the arrival of an electronic postcard. However, there are many other examples and there are reports of compromised mail accounts being used for more realistic site-specific phishing attempts. Given the increased publicity rela...

  13. Model-based segmentation in orbital volume measurement with cone beam computed tomography and evaluation against current concepts.

    Science.gov (United States)

    Wagner, Maximilian E H; Gellrich, Nils-Claudius; Friese, Karl-Ingo; Becker, Matthias; Wolter, Franz-Erich; Lichtenstein, Juergen T; Stoetzer, Marcus; Rana, Majeed; Essig, Harald

    2016-01-01

    Objective determination of the orbital volume is important in the diagnostic process and in evaluating the efficacy of medical and/or surgical treatment of orbital diseases. Tools designed to measure orbital volume with computed tomography (CT) often cannot be used with cone beam CT (CBCT) because of inferior tissue representation, although CBCT has the benefit of greater availability and lower patient radiation exposure. Therefore, a model-based segmentation technique is presented as a new method for measuring orbital volume and compared to alternative techniques. Both eyes from thirty subjects with no known orbital pathology who had undergone CBCT as a part of routine care were evaluated (n = 60 eyes). Orbital volume was measured with manual, atlas-based, and model-based segmentation methods. Volume measurements, volume determination time, and usability were compared between the three methods. Differences in means were tested for statistical significance using two-tailed Student's t tests. Neither atlas-based (26.63 ± 3.15 mm³) nor model-based (26.87 ± 2.99 mm³) measurements were significantly different from manual volume measurements (26.65 ± 4.0 mm³). However, the time required to determine orbital volume was significantly longer for manual measurements (10.24 ± 1.21 min) than for atlas-based (6.96 ± 2.62 min, p < 0.001) or model-based (5.73 ± 1.12 min, p < 0.001) measurements. All three orbital volume measurement methods examined can accurately measure orbital volume, although atlas-based and model-based methods seem to be more user-friendly and less time-consuming. The new model-based technique achieves fully automated segmentation results, whereas all atlas-based segmentations at least required manipulations to the anterior closing. Additionally, model-based segmentation can provide reliable orbital volume measurements when CT image quality is poor.

  14. A computer aided measurement method for unstable pelvic fractures based on standardized radiographs

    International Nuclear Information System (INIS)

    Zhao, Jing-xin; Zhao, Zhe; Zhang, Li-cheng; Su, Xiu-yun; Du, Hai-long; Zhang, Li-ning; Zhang, Li-hai; Tang, Pei-fu

    2015-01-01

    To set up a method for measuring radiographic displacement of unstable pelvic ring fractures based on standardized X-ray images and then test its reliability and validity using a software-based measurement technique. Twenty-five patients that were diagnosed as AO/OTA type B or C pelvic fractures with unilateral pelvis fractured and dislocated were eligible for inclusion by a review of medical records in our clinical centre. Based on the input pelvic preoperative CT data, the standardized X-ray images, including inlet, outlet, and anterior-posterior (AP) radiographs, were simulated using Armira software (Visage Imaging GmbH, Berlin, Germany). After representative anatomic landmarks were marked on the standardized X-ray images, the 2-dimensional (2D) coordinates of these points could be revealed in Digimizer software (Model: Mitutoyo Corp., Tokyo, Japan). Subsequently, we developed a formula that indicated the translational and rotational displacement patterns of the injured hemipelvis. Five separate observers calculated the displacement outcomes using the established formula and determined the rotational patterns using a 3D-CT model based on their overall impression. We performed 3D reconstruction of all the fractured pelvises using Mimics (Materialise, Haasrode, Belgium) and determined the translational and rotational displacement using 3-matic suite. The interobserver reliability of the new method was assessed by comparing the continuous measure and categorical outcomes using intraclass correlation coefficient (ICC) and kappa statistic, respectively. The interobserver reliability of the new method for translational and rotational measurement was high, with both ICCs above 0.9. Rotational outcome assessed by the new method was the same as that concluded by 3-matic software. The agreement for rotational outcome among orthopaedic surgeons based on overall impression was poor (kappa statistic, 0.250 to 0.426). Compared with the 3D reconstruction outcome, the

  15. Universal Quantum Computing with Measurement-Induced Continuous-Variable Gate Sequence in a Loop-Based Architecture.

    Science.gov (United States)

    Takeda, Shuntaro; Furusawa, Akira

    2017-09-22

    We propose a scalable scheme for optical quantum computing using measurement-induced continuous-variable quantum gates in a loop-based architecture. Here, time-bin-encoded quantum information in a single spatial mode is deterministically processed in a nested loop by an electrically programmable gate sequence. This architecture can process any input state and an arbitrary number of modes with almost minimum resources, and offers a universal gate set for both qubits and continuous variables. Furthermore, quantum computing can be performed fault tolerantly by a known scheme for encoding a qubit in an infinite-dimensional Hilbert space of a single light mode.

  16. Quantum supremacy in constant-time measurement-based computation: A unified architecture for sampling and verification

    Science.gov (United States)

    Miller, Jacob; Sanders, Stephen; Miyake, Akimasa

    2017-12-01

    While quantum speed-up in solving certain decision problems by a fault-tolerant universal quantum computer has been promised, a timely research interest includes how far one can reduce the resource requirement to demonstrate a provable advantage in quantum devices without demanding quantum error correction, which is crucial for prolonging the coherence time of qubits. We propose a model device made of locally interacting multiple qubits, designed such that simultaneous single-qubit measurements on it can output probability distributions whose average-case sampling is classically intractable, under similar assumptions as the sampling of noninteracting bosons and instantaneous quantum circuits. Notably, in contrast to these previous unitary-based realizations, our measurement-based implementation has two distinctive features. (i) Our implementation involves no adaptation of measurement bases, leading output probability distributions to be generated in constant time, independent of the system size. Thus, it could be implemented in principle without quantum error correction. (ii) Verifying the classical intractability of our sampling is done by changing the Pauli measurement bases only at certain output qubits. Our usage of random commuting quantum circuits in place of computationally universal circuits allows a unique unification of sampling and verification, so they require the same physical resource requirements in contrast to the more demanding verification protocols seen elsewhere in the literature.

  17. A Computer-Based Glucose Management System Reduces the Incidence of Forgotten Glucose Measurements: A Retrospective Observational Study.

    Science.gov (United States)

    Okura, Tsuyoshi; Teramoto, Kei; Koshitani, Rie; Fujioka, Yohei; Endo, Yusuke; Ueki, Masaru; Kato, Masahiko; Taniguchi, Shin-Ichi; Kondo, Hiroshi; Yamamoto, Kazuhiro

    2018-04-17

    Frequent glucose measurements are needed for good blood glucose control in hospitals; however, this requirement means that measurements can be forgotten. We developed a novel glucose management system using an iPod® and electronic health records. A time schedule system for glucose measurement was developed using point-of-care testing, an iPod®, and electronic health records. The system contains the glucose measurement schedule and an alarm sounds if a measurement is forgotten. The number of times measurements were forgotten was analyzed. Approximately 7000 glucose measurements were recorded per month. Before implementation of the system, the average number of times measurements were forgotten was 4.8 times per month. This significantly decreased to 2.6 times per month after the system started. We also analyzed the incidence of forgotten glucose measurements as a proportion of the total number of measurements for each period and found a significant difference between the two 9-month periods (43/64,049 vs. 24/65,870; P = 0.014, chi-squared test). This computer-based blood glucose monitoring system is useful for the management of glucose monitoring in hospitals. Johnson & Johnson Japan.
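
    The before/after comparison reported above is a standard chi-squared test on a 2 × 2 table; below is a minimal sketch using the counts from the abstract (scipy, without Yates' continuity correction, which approximately reproduces the reported P = 0.014).

```python
from scipy.stats import chi2_contingency

# Forgotten vs. completed measurements before and after the alarm system,
# using the counts reported in the abstract (43/64,049 and 24/65,870).
table = [[43, 64_049 - 43],
         [24, 65_870 - 24]]

# Without Yates' continuity correction, this approximates the reported P = 0.014.
chi2, p, dof, expected = chi2_contingency(table, correction=False)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")
```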

  18. Optimizing a micro-computed tomography-based surrogate measurement of bone-implant contact.

    Science.gov (United States)

    Meagher, Matthew J; Parwani, Rachna N; Virdi, Amarjit S; Sumner, Dale R

    2018-03-01

    Histology and backscatter scanning electron microscopy (bSEM) are the current gold standard methods for quantifying bone-implant contact (BIC), but are inherently destructive. Microcomputed tomography (μCT) is a non-destructive alternative, but attempts to validate μCT-based assessment of BIC in animal models have produced conflicting results. We previously showed in a rat model using a 1.5 mm diameter titanium implant that the extent of the metal-induced artefact precluded accurate measurement of bone sufficiently close to the interface to assess BIC. Recently introduced commercial laboratory μCT scanners have smaller voxels and improved imaging capabilities, possibly overcoming this limitation. The goals of the present study were to establish an approach for optimizing μCT imaging parameters and to validate μCT-based assessment of BIC. In an empirical parametric study using a 1.5 mm diameter titanium implant, we determined 90 kVp, 88 µA, 1.5 μm isotropic voxel size, 1600 projections/180°, and 750 ms integration time to be optimal. Using specimens from an in vivo rat experiment, we found significant correlations between bSEM and μCT for BIC with the manufacturer's automated analysis routine (r = 0.716, p = 0.003) or a line-intercept method (r = 0.797, p = 0.010). Thus, this newer generation scanner's improved imaging capability reduced the extent of the metal-induced artefact zone enough to permit assessment of BIC. © 2017 Orthopaedic Research Society. Published by Wiley Periodicals, Inc. J Orthop Res 36:979-986, 2018.

  19. 2-D Low Energy Electron Beam Profile Measurement Based on Computer Tomography Algorithm with Multi-Wire Scanner

    CERN Document Server

    Yu, Nengjie; Li Qing Feng; Tang, Chuan-Xiang; Zheng, Shuxin

    2005-01-01

    A new method for low-energy electron beam profile measurement is presented, which yields a full 2-D beam profile distribution rather than the traditional one constructed from 1-D vertical and horizontal beam profiles. The method is based on the CT (Computed Tomography) algorithm. Multiple sets of 1-D beam profile projections are obtained by rotating the multi-wire scanner. A 2-D beam profile is then reconstructed from these projections with the CT algorithm. The principle of this method is presented. The simulation and experimental results are compared and analyzed in detail.
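
    A minimal sketch of the reconstruction step: filtered back-projection of 1-D projections taken at a set of rotation angles, with scikit-image's radon/iradon (API as in recent scikit-image versions) standing in for the rotating multi-wire scanner and a synthetic Gaussian spot standing in for the beam.

```python
import numpy as np
from skimage.transform import radon, iradon

# Synthetic 2-D "beam profile": an off-centre Gaussian spot.
y, x = np.mgrid[-64:64, -64:64]
profile = np.exp(-((x - 10) ** 2 / 200.0 + (y + 5) ** 2 / 350.0))

# 1-D projections at each scanner rotation angle (the sinogram).
angles = np.linspace(0.0, 180.0, 36, endpoint=False)
sinogram = radon(profile, theta=angles)

# Reconstruct the full 2-D profile from the 1-D projections.
reconstruction = iradon(sinogram, theta=angles, filter_name="ramp")
print("max absolute reconstruction error:", np.abs(reconstruction - profile).max())
```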

  20. Automated Computer-Based Facility for Measurement of Near-Field Structure of Microwave Radiators and Scatterers

    DEFF Research Database (Denmark)

    Mishra, Shantnu R.; Pavlasek, Tomas J. F.; Muresan, Letitia V.

    1980-01-01

    An automatic facility for measuring the three-dimensional structure of the near fields of microwave radiators and scatterers is described. The amplitude and phase for different polarization components can be recorded in analog and digital form using a microprocessor-based system. The stored data... are transferred to a large high-speed computer for bulk processing and for the production of isophot and equiphase contour maps or profiles. The performance of the system is demonstrated through results for a single conical horn, for interacting rectangular horns, for multiple cylindrical scatterers...

  1. A program to compute geographical positions of underwater artifact based on linear measurements

    Digital Repository Service at National Institute of Oceanography (India)

    Ganesan, P.

    System) or any hydrographic post-processing software, excellent site plans and other related maps can be prepared on any convenient scale. This user-friendly program enables marine archaeologists to process their field measurements much faster...

  2. Measurement Properties of Two Innovative Item Formats in a Computer-Based Test

    Science.gov (United States)

    Wan, Lei; Henly, George A.

    2012-01-01

    Many innovative item formats have been proposed over the past decade, but little empirical research has been conducted on their measurement properties. This study examines the reliability, efficiency, and construct validity of two innovative item formats--the figural response (FR) and constructed response (CR) formats used in a K-12 computerized…

  3. HiFi-MBQC High Fidelitiy Measurement-Based Quantum Computing using Superconducting Detectors

    Science.gov (United States)

    2016-04-04

    optical elements such as beam splitters and phase shifters is not possible. Our result provides useful guidance for the design of optical quantum...a tuneable beam splitter and further investigate the measurement-induced interactions of a simulated four-spin system by comparing the entanglement...student Lorenzo Procopio, whose salary is funded by this project, was supporting them with the design. In January 2014 the company could finally reach

  4. Variability of bronchial measurements obtained by sequential CT using two computer-based methods

    International Nuclear Information System (INIS)

    Brillet, Pierre-Yves; Fetita, Catalin I.; Mitrea, Mihai; Preteux, Francoise; Capderou, Andre; Dreuil, Serge; Simon, Jean-Marc; Grenier, Philippe A.

    2009-01-01

    This study aimed to evaluate the variability of lumen (LA) and wall area (WA) measurements obtained on two successive MDCT acquisitions using energy-driven contour estimation (EDCE) and full width at half maximum (FWHM) approaches. Both methods were applied to a database of segmental and subsegmental bronchi with LA > 4 mm² containing 42 bronchial segments of 10 successive slices that best matched on each acquisition. For both methods, the 95% confidence interval between repeated MDCT was between -1.59 and 1.5 mm² for LA, and -3.31 and 2.96 mm² for WA. The values of the coefficient of measurement variation (CV10, i.e., the percentage ratio of the standard deviation obtained from the 10 successive slices to their mean value) were strongly correlated between repeated MDCT data acquisitions (r > 0.72; p 2, whereas WA values were lower for bronchi with WA 2; no systematic EDCE underestimation or overestimation was observed for thicker-walled bronchi. In conclusion, variability between CT examinations and assessment techniques may impair measurements. Therefore, new parameters such as CV10 need to be investigated to study bronchial remodeling. Finally, EDCE and FWHM are not interchangeable in longitudinal studies. (orig.)
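
    As an illustration of one of the two techniques compared above, the following is a minimal sketch of a full-width-at-half-maximum measurement on a 1-D attenuation profile cast across an airway wall, with linear interpolation at the two half-maximum crossings; the Gaussian profile is synthetic, not CT data.

```python
import numpy as np

def fwhm(profile: np.ndarray) -> float:
    """Full width at half maximum (in sample units) of a single-peaked 1-D
    profile, with linear interpolation at the two half-maximum crossings."""
    half = (profile.max() + profile.min()) / 2.0
    above = np.where(profile >= half)[0]
    i, j = above[0], above[-1]          # first and last samples above half-max
    left = (i - 1) + (half - profile[i - 1]) / (profile[i] - profile[i - 1])
    right = j + (profile[j] - half) / (profile[j] - profile[j + 1])
    return right - left

# Synthetic Gaussian "wall" profile; analytic FWHM = 2.3548 * sigma ~ 2.83.
xs = np.linspace(-5.0, 5.0, 201)
profile = np.exp(-xs ** 2 / (2.0 * 1.2 ** 2))
print(fwhm(profile) * (xs[1] - xs[0]))
```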

  5. Measuring consumers' information acquisition and decision behavior with the computer-based information-display-matrix

    DEFF Research Database (Denmark)

    Aschemann-Witzel, Jessica; Hamm, Ulrich

    2011-01-01

    development of the method: starting points are choice of location, increased relevance of choice, individual adjustment of task structure, simplified navigation and realistic layout. Used in multi-measurement approaches, the IDM can provide detailed background information about consumer information behaviour... prior to decisions reached in interviews or choice experiments. The contribution introduces the method and its development, use and (dis-)advantages. Results of a survey illustrate the options for analysis and indicate that consumer behaviour in the IDM, compared to face-to-face interviews, is less...

  6. A computer-assisted test for the electrophysiological and psychophysical measurement of dynamic visual function based on motion contrast.

    Science.gov (United States)

    Wist, E R; Ehrenstein, W H; Schrauf, M; Schraus, M

    1998-03-13

    A new test is described that allows for electrophysiological and psychophysical measurement of visual function based on motion contrast. In a computer-generated random-dot display, completely camouflaged Landolt rings become visible only when dots within the target area are moved briefly while those of the background remain stationary. Thus, detection of contours and the location of the gap in the ring rely on motion contrast (form-from-motion) instead of luminance contrast. A standard version of this test has been used to assess visual performance in relation to age, in screening professional groups (truck drivers) and in clinical groups (glaucoma patients). Aside from this standard version, the computer program easily allows for various modifications. These include the option of a synchronizing trigger signal to allow for recording of time-locked motion-onset visual-evoked responses, the reversal of target and background motion, and the displacement of random-dot targets across stationary backgrounds. In all instances, task difficulty is manipulated by changing the percentage of moving dots within the target (or background). The present test offers a short, convenient method to probe dynamic visual functions relying on suprathreshold motion-contrast stimuli and complements other routine tests of form, contrast, depth, and color vision.

  7. Characterization of cardiac quiescence from retrospective cardiac computed tomography using a correlation-based phase-to-phase deviation measure

    Energy Technology Data Exchange (ETDEWEB)

    Wick, Carson A.; McClellan, James H. [School of Electrical and Computer Engineering, Georgia Institute of Technology, 777 Atlantic Drive Northwest, Atlanta, Georgia 30332 (United States); Arepalli, Chesnal D. [Department of Radiology, University of British Columbia, 3350-950 West 10th Avenue, Vancouver, British Columbia V5Z 4E3 (Canada); Auffermann, William F.; Henry, Travis S. [Department of Radiology and Imaging Sciences, Emory University, Division of Cardiothoracic Imaging, 1364 Clifton Road Northeast, Suite 309, Atlanta, Georgia 30322 (United States); Khosa, Faisal [Department of Radiology and Imaging Sciences, Emory University, Division of Emergency Radiology, 550 Peachtree Street Northeast, Atlanta, Georgia 30308 (United States); Coy, Adam M. [School of Medicine, Emory University, 100 Woodruff Circle, Atlanta, Georgia 30322 (United States); Tridandapani, Srini, E-mail: stridan@emory.edu [Department of Radiology and Imaging Sciences, Emory University, Winship Cancer Institute, 1701 Uppergate Drive Northeast, Suite 5018, Atlanta, Georgia 30322 and School of Electrical and Computer Engineering, Georgia Institute of Technology, 777 Atlantic Drive Northwest, Atlanta, Georgia 30332 (United States)

    2015-02-15

    Purpose: Accurate knowledge of cardiac quiescence is crucial to the performance of many cardiac imaging modalities, including computed tomography coronary angiography (CTCA). To accurately quantify quiescence, a method for detecting the quiescent periods of the heart from retrospective cardiac computed tomography (CT) using a correlation-based, phase-to-phase deviation measure was developed. Methods: Retrospective cardiac CT data were obtained from 20 patients (11 male, 9 female, 33–74 yr) and the left main, left anterior descending, left circumflex, right coronary artery (RCA), and interventricular septum (IVS) were segmented for each phase using a semiautomated technique. Cardiac motion of individual coronary vessels as well as the IVS was calculated using phase-to-phase deviation. As an easily identifiable feature, the IVS was analyzed to assess how well it predicts vessel quiescence. Finally, the diagnostic quality of the reconstructed volumes from the quiescent phases determined using the deviation measure from the vessels in aggregate and the IVS was compared to that from quiescent phases calculated by the CT scanner. Three board-certified radiologists, fellowship-trained in cardiothoracic imaging, graded the diagnostic quality of the reconstructions using a Likert response format: 1 = excellent, 2 = good, 3 = adequate, 4 = nondiagnostic. Results: Systolic and diastolic quiescent periods were identified for each subject from the vessel motion calculated using the phase-to-phase deviation measure. The motion of the IVS was found to be similar to the aggregate vessel (AGG) motion. The diagnostic quality of the coronary vessels for the quiescent phases calculated from the aggregate vessel (P_AGG) and IVS (P_IVS) deviation signal using the proposed methods was comparable to the quiescent phases calculated by the CT scanner (P_CT). The one exception was the RCA, which improved for P_AGG for 18 of the 20 subjects when compared to P

  8. Preclinical validation of automated dual-energy X-ray absorptiometry and computed tomography-based body composition measurements

    International Nuclear Information System (INIS)

    DEVRIESE, Joke; Pottel, Hans; BEELS, Laurence; VAN DE WIELE, Christophe; MAES, Alex; GHEYSENS, Olivier

    2016-01-01

    The aim of this study was to determine and validate a set of Hounsfield unit (HU) ranges to segment computed tomography (CT) images into tissue types and to test the validity of dual-energy X-ray absorptiometry (DXA) tissue segmentation on pure, unmixed porcine tissues. This preclinical prospective study was approved by the local ethical committee. Different quantities of porcine bone tissue (BT), lean tissue (LT) and adipose tissue (AT) were scanned using DXA and CT. Tissue type segmentation in DXA was performed via the standard clinical protocol and in CT through different sets of HU ranges. Percent coefficients of variation (%CV) were used to assess precision, while % differences of observed masses were tested against zero using the Wilcoxon signed-rank test. Total mass DXA measurements differed little but significantly (P=0.016) from true mass, while total mass CT measurements based on literature values showed non-significant (P=0.69) differences of 1.7% and 2.0%. BT mass estimates with DXA differed more from true mass (median -78.2 to -75.8%) than other tissue types (median -11.3 to -8.1%). Tissue mass estimates with CT and literature HU ranges showed small differences from true mass for every tissue type (median -10.4 to 8.8%). CT is the most suitable method for automated tissue segmentation and can become a valuable tool in quantitative nuclear medicine.
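
    A minimal sketch of the CT side of this segmentation follows: classify voxels into tissue types by HU windows and convert voxel counts to volumes. The HU ranges are illustrative literature-style values, not the thresholds validated in the study, which reported masses rather than volumes.

```python
import numpy as np

# Illustrative HU windows only -- not the study's validated thresholds.
HU_RANGES = {
    "adipose tissue": (-190, -30),
    "lean tissue":    (-29, 150),
    "bone tissue":    (151, 3000),
}

def segment_tissue_volumes(ct: np.ndarray, voxel_volume_ml: float) -> dict:
    """Classify voxels by HU range and return volume (ml) per tissue type."""
    return {name: float(((ct >= lo) & (ct <= hi)).sum()) * voxel_volume_ml
            for name, (lo, hi) in HU_RANGES.items()}

# Example on a random phantom with 1 x 1 x 1 mm voxels (0.001 ml each).
ct = np.random.default_rng(1).integers(-1000, 2000, size=(64, 64, 64))
print(segment_tissue_volumes(ct, voxel_volume_ml=0.001))
```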

  9. Computer Based Expert Systems.

    Science.gov (United States)

    Parry, James D.; Ferrara, Joseph M.

    1985-01-01

    Claims knowledge-based expert computer systems can meet needs of rural schools for affordable expert advice and support and will play an important role in the future of rural education. Describes potential applications in prediction, interpretation, diagnosis, remediation, planning, monitoring, and instruction. (NEC)

  10. A new characterization procedure for computed radiography performance levels based on EPS, SNR and basic spatial resolution measurements

    International Nuclear Information System (INIS)

    Ewert, Uwe; Zscherpel, Uwe; Baer, Sylke

    2016-01-01

    The standards EN 14784-1:2005 and ISO 16371-1:2011 describe the classification of Computed Radiography systems for industrial applications. After 10 years of classification experience, it can be concluded that all certified NDT CR systems achieve the best classification result: IP 1. The measured basic spatial resolution differs depending on the manufacturer's brand and the IP used. Therefore, a revision was recommended to obtain a better gradation for the different brands. Users in the USA and Europe classify CR systems based on different parameters. Consequently, a new revision of ASTM E 2446-15 was finalized in 2015, which describes the characterization of CR systems based on CR performance levels. The key parameters are the normalized signal-to-noise ratio (SNR_N), the interpolated basic spatial resolution (iSR_b^detector) and the achieved equivalent penetrameter sensitivity (aEPS). A series of further tests is required for complete characterization by manufacturers or certifying laboratories. This includes, e.g., geometric distortion, laser jitter, PMT non-linearity, scanner slippage, shading or banding, erasure, burn-in, spatial linearity, artefacts, imaging plate response variation and imaging plate fading. ASTM E 2445-15 describes several tests for users to perform periodic quality assurance. The measurement procedures are described and the resulting values such as CR speed, achieved contrast sensitivity and efficiency are discussed. The results will be presented graphically in a spider net graph in the qualification/certification statement. A revision of the related CEN and ISO standards is discussed.
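
    The normalization behind SNR_N is a simple rescaling of the measured SNR to a reference basic spatial resolution of 88.6 μm, as used in CR/DR classification standards; a minimal sketch follows, with illustrative values.

```python
def normalized_snr(snr_measured: float, basic_spatial_resolution_um: float) -> float:
    """Normalize a measured SNR to the 88.6 um reference basic spatial
    resolution used in CR/DR classification standards."""
    return snr_measured * 88.6 / basic_spatial_resolution_um

# Example: SNR of 150 measured at an interpolated basic spatial
# resolution of 130 um gives an SNR_N of about 102.
print(round(normalized_snr(150.0, 130.0), 1))
```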

  11. Radon migration from soil to a house. Computer modelling and its verification on the base of measurements in living houses

    International Nuclear Information System (INIS)

    Janik, M.

    2005-09-01

    In this thesis the Loureiro model of radon migration from soil into a house has been verified. For this purpose the computer code TRIAD has been used. The input data for this code are the ground permeability and porosity. Methods for the determination of these parameters have been developed, and equipment for their measurement has been manufactured and tested

  12. Spintronics-based computing

    CERN Document Server

    Prenat, Guillaume

    2015-01-01

    This book provides a comprehensive introduction to spintronics-based computing for the next generation of ultra-low-power, highly reliable logic, which is widely considered a promising candidate to replace conventional, pure CMOS-based logic. It covers aspects from the device to the system level, including magnetic memory cells, device modeling, hybrid circuit structure, design methodology, CAD tools, and technological integration methods. This book is accessible to a variety of readers; little or no background in magnetism and spin electronics is required to understand its content. The multidisciplinary team of expert authors from circuits, devices, computer architecture, CAD and system design reveal to readers the potential of spintronics nanodevices to reduce power consumption, improve reliability and enable new functionality.

  13. First experiences with model based iterative reconstructions influence on quantitative plaque volume and intensity measurements in coronary computed tomography angiography

    International Nuclear Information System (INIS)

    Precht, H.; Kitslaar, P.H.; Broersen, A.; Gerke, O.; Dijkstra, J.; Thygesen, J.; Egstrup, K.; Lambrechtsen, J.

    2017-01-01

    Purpose: Investigate the influence of adaptive statistical iterative reconstruction (ASIR) and the model-based IR (Veo) reconstruction algorithm in coronary computed tomography angiography (CCTA) images on quantitative measurements in coronary arteries for plaque volumes and intensities. Methods: Three patients each had three independent dose-reduced CCTA scans performed and reconstructed with 30% ASIR (CTDIvol 6.7 mGy), 60% ASIR (CTDIvol 4.3 mGy) and Veo (CTDIvol 1.9 mGy). Coronary plaque analysis was performed for each CCTA, measuring volumes, plaque burden and intensities. Results: Plaque volume and plaque burden show a decreasing tendency from ASIR to Veo: the median plaque volume is 314 mm³ and 337 mm³ for ASIR versus 252 mm³ for Veo, and the plaque burden is 42% and 44% for ASIR versus 39% for Veo. The lumen and vessel volumes decrease slightly from 30% ASIR to 60% ASIR, from 498 mm³ to 391 mm³ for lumen volume and from 939 mm³ to 830 mm³ for vessel volume. The intensities did not change overall between the different reconstructions for either lumen or plaque. Conclusion: We found a tendency of decreasing plaque volumes and plaque burden but no change in intensities with the use of low-dose Veo CCTA (1.9 mGy) compared to dose-reduced ASIR CCTA (6.7 mGy and 4.3 mGy), although more studies are warranted. - Highlights: • Veo decreases plaque volumes and plaque burden using low-dose CCTA. • Moving from ASIR 30% and ASIR 60% to Veo did not appear to influence the plaque intensities. • Studies with larger sample sizes are needed to investigate the effect on plaque.

  14. Ammonia-based quantum computer

    International Nuclear Information System (INIS)

    Ferguson, Andrew J.; Cain, Paul A.; Williams, David A.; Briggs, G. Andrew D.

    2002-01-01

    We propose a scheme for quantum computation using two eigenstates of ammonia or similar molecules. Individual ammonia molecules are confined inside fullerenes and used as two-level qubit systems. Interaction between these ammonia qubits takes place via the electric dipole moments, and in particular we show how a controlled-NOT gate could be implemented. After computation the qubit is measured with a single-electron electrometer sensitive enough to differentiate between the dipole moments of different states. We also discuss a possible implementation based on a quantum cellular automaton

  15. Efficient quantum computing with weak measurements

    International Nuclear Information System (INIS)

    Lund, A P

    2011-01-01

    Projective measurements with high quantum efficiency are often assumed to be required for efficient circuit-based quantum computing. We argue that this is not the case and show that this fact was actually known previously but was not deeply explored. We examine this issue by giving an example of how to perform the quantum order-finding algorithm efficiently using non-local weak measurements, considering that the measurements used are of bounded weakness and that some fixed but arbitrary probability of success less than unity is required. We also show that it is possible to perform the same computation with only local weak measurements, but this must necessarily introduce an exponential overhead.

  16. Data-processing system for bubble-chamber photographs based on PUOS-4 measuring projectors and an ES-1045 computer

    International Nuclear Information System (INIS)

    Ermolov, P.F.; Kozlov, V.V.; Rukovichkin, V.P.

    1988-01-01

    A system is described that was developed at the Scientific-Research Institute of Nuclear Physics for processing of the data recorded on stereoscopic photographs from large bubble chambers and hybrid spectrometers using PUOS-4 measuring projectors, an Elektronika-60 microcomputer, and an ES-1045 computer. The system structure, the main programmable interfaces, and the intercomputer communications are examined. The mean-square error of the measuring channels of the system, determined from calibration measurements, is within 1.3-3.5 μm; the standard deviation of the coordinates of the measured points with respect to the track in the plane of the photograph is 6 μm. The system is widely used at the institute for analysis of data from experiments in high-energy physics performed with the European Hybrid Spectrometer and the Mirabel large bubble chamber. Approximately 80,000 stereoscopic photographs have been processed and the system is being prepared to process data from the Skat bubble chamber and a spectrometer with a vertex detector that is under construction

  17. Reducing overlay sampling for APC-based correction per exposure by replacing measured data with computational prediction

    Science.gov (United States)

    Noyes, Ben F.; Mokaberi, Babak; Oh, Jong Hun; Kim, Hyun Sik; Sung, Jun Ha; Kea, Marc

    2016-03-01

    One of the keys to successful mass production of sub-20nm nodes in the semiconductor industry is the development of an overlay correction strategy that can meet specifications, reduce the number of layers that require dedicated chuck overlay, and minimize measurement time. Three important aspects of this strategy are: correction per exposure (CPE), integrated metrology (IM), and the prioritization of automated correction over manual subrecipes. The first and third aspects are accomplished through an APC system that uses measurements from production lots to generate CPE corrections that are dynamically applied to future lots. The drawback of this method is that production overlay sampling must be extremely high in order to provide the system with enough data to generate CPE. That drawback makes IM particularly difficult because of the throughput impact that can be created on expensive bottleneck photolithography process tools. The goal is to realize the cycle time and feedback benefits of IM coupled with the enhanced overlay correction capability of automated CPE without impacting process tool throughput. This paper will discuss the development of a system that sends measured data with reduced sampling via an optimized layout to the exposure tool's computational modelling platform to predict and create "upsampled" overlay data in a customizable output layout that is compatible with the fab user CPE APC system. The result is dynamic CPE without the burden of extensive measurement time, which leads to increased utilization of IM.

  18. Operators manual for a computer controlled impedance measurement system

    Science.gov (United States)

    Gordon, J.

    1987-02-01

    Operating instructions for a computer-controlled impedance measurement system based on Hewlett Packard instrumentation are given. Hardware details, program listings, flowcharts and a practical application are included.

  19. Measurement stand for diagnosis of semiconductor detectors based on IBM PC/XT computer (4-way spectrometric analysis of pulses)

    International Nuclear Information System (INIS)

    Gruszecki, M.

    1990-01-01

    The technical assumptions and partial realization of our technological stand for quality inspection of semiconductor detectors for ionizing radiation manufactured at the INP in Cracow are described. To increase the efficiency of the measurements, simultaneous checking of 4 semiconductor chips or finished products is suggested. In order to justify this measurement technique, a review of possible variants of the measurement apparatus is presented for systems consisting of home-made units. Comparative parameters for the component modules and for complete measuring systems are given. The construction and operation of the data acquisition system based on an IBM PC/XT are described. The system ensures simultaneous registration of pulses obtained from 4 detectors at a maximal rate of up to 500 × 10³ pulses/s. 42 refs., 6 figs., 3 tabs. (author)

  20. Wavefront measurement using computational adaptive optics.

    Science.gov (United States)

    South, Fredrick A; Liu, Yuan-Zhi; Bower, Andrew J; Xu, Yang; Carney, P Scott; Boppart, Stephen A

    2018-03-01

    In many optical imaging applications, it is necessary to correct for aberrations to obtain high quality images. Optical coherence tomography (OCT) provides access to the amplitude and phase of the backscattered optical field for three-dimensional (3D) imaging of samples. Computational adaptive optics (CAO) modifies the phase of the OCT data in the spatial frequency domain to correct optical aberrations without using a deformable mirror, as is commonly done in hardware-based adaptive optics (AO). This provides improvement of image quality throughout the 3D volume, enabling imaging across greater depth ranges and in highly aberrated samples. However, the CAO aberration correction has a complicated relation to the imaging pupil and is not a direct measurement of the pupil aberrations. Here we present new methods for recovering the wavefront aberrations directly from the OCT data without the use of hardware adaptive optics. This enables both computational measurement and correction of optical aberrations.
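
    A minimal sketch of the core CAO operation described above: apply a phase-only filter in the 2-D spatial-frequency domain of a complex en face OCT plane and transform back. A simple quadratic (defocus-like) phase stands in for the aberration correction that the paper recovers directly from the data.

```python
import numpy as np

def apply_cao_phase(field: np.ndarray, defocus: float) -> np.ndarray:
    """Multiply the 2-D spatial-frequency spectrum of a complex OCT plane by
    a quadratic (defocus-like) phase filter and transform back."""
    ny, nx = field.shape
    fy = np.fft.fftfreq(ny)[:, None]
    fx = np.fft.fftfreq(nx)[None, :]
    phase_filter = np.exp(-1j * defocus * (fx ** 2 + fy ** 2))
    return np.fft.ifft2(np.fft.fft2(field) * phase_filter)

# Example on a random complex "en face" plane.
rng = np.random.default_rng(2)
plane = rng.standard_normal((256, 256)) + 1j * rng.standard_normal((256, 256))
corrected = apply_cao_phase(plane, defocus=40.0)
print(corrected.shape, corrected.dtype)
```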

  1. Multiple computer-based methods of measuring joint space width can discriminate between treatment arms in the COBRA trial -- Update of an ongoing OMERACT project.

    Science.gov (United States)

    Sharp, John T; Angwin, Jane; Boers, Maarten; Duryea, Jeff; Finckh, Axel; Hall, James R; Kauffman, Joost A; Landewé, Robert; Langs, Georg; Lukas, Cédric; Moens, H J Bernelot; Peloschek, Philipp; Strand, C Vibeke; van der Heijde, Désirée

    2009-08-01

    Previously reported data on 5 computer-based programs for measurement of joint space width, focusing on discriminating ability and reproducibility, are updated here with new data. Four of the 5 programs were more discriminating than observer scoring for change in narrowing in the 0-12 month interval. Three of 4 programs were more discriminating than observer scoring for the 0-18 month interval. The program that failed to discriminate in the 0-12 month interval was not the same program that failed in the 0-18 month interval. The committee agreed at an interim meeting in November 2007 that an important goal for computer-based measurement programs is a 90% success rate in making measurements of joint pairs in followup studies. This means that the same joint must be measured in images from both timepoints in order to assess change over time in serial radiographs. None of the programs met this 90% threshold, but 3 programs achieved an 85%-90% success rate. Intraclass correlation coefficients for assessing change in joint space width in individual joints were 0.98 or 0.99 for 4 programs. The smallest detectable change was < 0.2 mm for 4 of the 5 programs, representing 29%-36% of the change within the 99th percentile of measurements.
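
    The smallest-detectable-change figures quoted above follow from the standard error of measurement; below is a minimal sketch of the usual SDC = 1.96 · √2 · SEM calculation with SEM = SD · √(1 − ICC), using illustrative numbers rather than the trial's data.

```python
import math

def smallest_detectable_change(sd: float, icc: float) -> float:
    """SDC = 1.96 * sqrt(2) * SEM, with SEM = SD * sqrt(1 - ICC)."""
    sem = sd * math.sqrt(1.0 - icc)
    return 1.96 * math.sqrt(2.0) * sem

# Illustrative numbers: joint space width SD of 1.0 mm, ICC of 0.99.
print(round(smallest_detectable_change(1.0, 0.99), 3), "mm")  # ~0.277 mm
```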

  2. Evaluating measurement invariance across assessment modes of phone interview and computer self-administered survey for the PROMIS measures in a population-based cohort of localized prostate cancer survivors.

    Science.gov (United States)

    Wang, Mian; Chen, Ronald C; Usinger, Deborah S; Reeve, Bryce B

    2017-11-01

    To evaluate measurement invariance (phone interview vs computer self-administered survey) of 15 PROMIS measures in a population-based cohort of localized prostate cancer survivors. Participants were part of the North Carolina Prostate Cancer Comparative Effectiveness and Survivorship Study. Of the 952 men who took the phone interview at 24 months post-treatment, 401 also completed the same survey online using a home computer. Unidimensionality of the PROMIS measures was examined using single-factor confirmatory factor analysis (CFA) models. Measurement invariance testing was conducted using longitudinal CFA via a model comparison approach. For strongly or partially strongly invariant measures, changes in the latent factors and factor autocorrelations were also estimated and tested. Six measures (sleep disturbance, sleep-related impairment, diarrhea, illness impact-negative, illness impact-positive, and global satisfaction with sex life) had locally dependent items, and therefore model modifications had to be made to these domains prior to measurement invariance testing. Overall, seven measures achieved strong invariance (all items had equal loadings and thresholds), and four measures achieved partial strong invariance (each measure had one item with unequal loadings and thresholds). Three measures (pain interference, interest in sexual activity, and global satisfaction with sex life) failed to establish configural invariance due to between-mode differences in factor patterns. This study supports the use of phone-based live interviewers in lieu of PC-based assessment (when needed) for many of the PROMIS measures.

  3. Computation of integral bases

    NARCIS (Netherlands)

    Bauch, J.H.P.

    2015-01-01

    Let $A$ be a Dedekind domain, $K$ the fraction field of $A$, and $f\in A[x]$ a monic irreducible separable polynomial. For a given non-zero prime ideal $\mathfrak{p}$ of $A$ we present in this paper a new method to compute a $\mathfrak{p}$-integral basis of the extension of $K$ determined by $f$.

  4. Exploring the use of tablet computer-based electronic data capture system to assess patient reported measures among patients with chronic kidney disease: a pilot study.

    Science.gov (United States)

    Wong, Dorothy; Cao, Shen; Ford, Heather; Richardson, Candice; Belenko, Dmitri; Tang, Evan; Ugenti, Luca; Warsmann, Eleanor; Sissons, Amanda; Kulandaivelu, Yalinie; Edwards, Nathaniel; Novak, Marta; Li, Madeline; Mucsi, Istvan

    2017-12-06

    Collecting patient reported outcome measures (PROMs) via a computer-based electronic data capture system may improve feasibility and facilitate implementation in clinical care. We report our initial experience with the acceptability of touch-screen tablet computer-based, self-administered questionnaires among patients with chronic kidney disease (CKD), including stage 5 CKD treated with renal replacement therapies (RRT) (either dialysis or transplant). We enrolled a convenience sample of patients with stage 4 and 5 CKD (including patients on dialysis or after kidney transplant) in a single-centre, cross-sectional pilot study. Participants completed validated questionnaires programmed on an electronic data capture system (DADOS, Techna Inc., Toronto) on tablet computers. The primary objective was to evaluate the acceptability and feasibility of using tablet-based electronic data capture in patients with CKD. Descriptive statistics, Fisher's exact test and multivariable logistic regression models were used for data analysis. One hundred and twenty-one patients (55% male, mean age (± SD) of 58 (±14) years, 49% Caucasian) participated in the study. Ninety-two percent of the respondents indicated that the computer tablet was acceptable and 79% of the participants required no or minimal help in completing the questionnaires. Acceptance of tablets was lower among patients 70 years or older (75% vs. 95%; p = 0.011) and among those with little previous computer experience (81% vs. 96%; p = 0.05). Furthermore, a greater level of assistance was more frequently required by patients who were older (45% vs. 15%; p = 0.009), had a lower level of education (33% vs. 14%; p = 0.027), low health literacy (79% vs. 12%; p = 0.027), and little previous experience with computers (52% vs. 10%; p = 0.027). Tablet computer-based electronic data capture to administer PROMs was acceptable and feasible for most respondents and could therefore be used to systematically assess PROMs

  5. Refractive index based measurements

    DEFF Research Database (Denmark)

    2014-01-01

    A refractive-index-based measurement of a property of a fluid is made in an apparatus comprising a variable-wavelength coherent light source (16), a sample chamber (12), a wavelength controller (24), a light sensor (20), a data recorder (26) and a computation apparatus (28), by - directing coherent light having a wavelength along an input light path, - producing scattering of said light from each of a plurality of interfaces within said apparatus, including interfaces between said fluid and a surface bounding said fluid, said scattering producing an interference pattern formed by said scattered light, - cyclically varying the wavelength of said light in said input light path over a 1 nm to 20 nm wide range of wavelengths at a rate of from 10 Hz to 50 kHz, - recording variation of intensity of the interfering light with change in wavelength of the light at an angle of observation

  6. Advanced computer-based training

    Energy Technology Data Exchange (ETDEWEB)

    Fischer, H D; Martin, H D

    1987-05-01

    The paper presents new techniques of computer-based training for personnel of nuclear power plants. Training on full-scope simulators is further increased by use of dedicated computer-based equipment. An interactive communication system runs on a personal computer linked to a video disc; a part-task simulator runs on 32 bit process computers and shows two versions: as functional trainer or as on-line predictor with an interactive learning system (OPAL), which may be well-tailored to a specific nuclear power plant. The common goal of both developments is the optimization of the cost-benefit ratio for training and equipment.

  7. Advanced computer-based training

    International Nuclear Information System (INIS)

    Fischer, H.D.; Martin, H.D.

    1987-01-01

    The paper presents new techniques of computer-based training for personnel of nuclear power plants. Training on full-scope simulators is further enhanced by the use of dedicated computer-based equipment. An interactive communication system runs on a personal computer linked to a video disc; a part-task simulator runs on 32-bit process computers and comes in two versions: as a functional trainer or as an on-line predictor with an interactive learning system (OPAL), which may be well tailored to a specific nuclear power plant. The common goal of both developments is the optimization of the cost-benefit ratio for training and equipment. (orig.) [de

  8. Capability-based computer systems

    CERN Document Server

    Levy, Henry M

    2014-01-01

    Capability-Based Computer Systems focuses on computer programs and their capabilities. The text first elaborates capability- and object-based system concepts, including capability-based systems, object-based approach, and summary. The book then describes early descriptor architectures and explains the Burroughs B5000, Rice University Computer, and Basic Language Machine. The text also focuses on early capability architectures. Dennis and Van Horn's Supervisor; CAL-TSS System; MIT PDP-1 Timesharing System; and Chicago Magic Number Machine are discussed. The book then describes Plessey System 25

  9. Computer Based Modelling and Simulation

    Indian Academy of Sciences (India)

    GENERAL ARTICLE. Computer Based Modelling and Simulation ... universities, and later did system analysis ... personal computers (PC) and low cost software packages and tools. They can serve as useful learning experience through student projects. Models are ... Let us consider a numerical example: to calculate the velocity of a trainer aircraft ...

  10. Measurement-only topological quantum computation without forced measurements

    International Nuclear Information System (INIS)

    Zheng, Huaixiu; Dua, Arpit; Jiang, Liang

    2016-01-01

    We investigate the measurement-only topological quantum computation (MOTQC) approach proposed by Bonderson et al (2008 Phys. Rev. Lett. 101 010501) where the braiding operation is shown to be equivalent to a series of topological charge ‘forced measurements’ of anyons. In a forced measurement, the charge measurement is forced to yield the desired outcome (e.g. charge 0) via repeatedly measuring charges in different bases. This is a probabilistic process with a certain success probability for each trial. In practice, the number of measurements needed will vary from run to run. We show that such an uncertainty associated with forced measurements can be removed by simulating the braiding operation using a fixed number of three measurements supplemented by a correction operator. Furthermore, we demonstrate that in practice we can avoid applying the correction operator in hardware by implementing it in software. Our findings greatly simplify the MOTQC proposal and only require the capability of performing charge measurements to implement topologically protected transformations generated by braiding exchanges without physically moving anyons. (paper)
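
    For intuition, a toy Monte Carlo contrast between the two schemes described above: a "forced measurement" repeats a probabilistic charge measurement until the desired outcome occurs, so the number of trials is geometric rather than fixed. The success probability p below is an illustrative assumption:

      # Toy Monte Carlo contrasting a probabilistic "forced measurement"
      # (repeat until the desired charge outcome occurs) with the fixed
      # three-measurement scheme; the success probability p is illustrative.
      import random

      def mean_forced_attempts(p=0.5, runs=100_000):
          total = 0
          for _ in range(runs):
              attempts = 1
              while random.random() > p:   # retry until the desired outcome
                  attempts += 1
              total += attempts
          return total / runs

      print("forced measurement, mean attempts:", mean_forced_attempts())  # ~1/p
      print("fixed scheme: 3 measurements plus a software correction")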

  11. Computer controlled quality of analytical measurements

    International Nuclear Information System (INIS)

    Clark, J.P.; Huff, G.A.

    1979-01-01

    A PDP 11/35 computer system is used in evaluating analytical chemistry measurement quality control data at the Barnwell Nuclear Fuel Plant. This computerized measurement quality control system has several features which are not available in manual systems, such as real-time measurement control, computer-calculated bias corrections and standard deviation estimates, surveillance applications, evaluation of measurement system variables, records storage, immediate analyst recertification, and the elimination of routine analysis of known bench standards. The effectiveness of the Barnwell computer system has been demonstrated in gathering and assimilating the measurements of over 1100 quality control samples obtained during a recent plant demonstration run. These data were used to determine equations for predicting measurement reliability estimates (bias and precision); to evaluate the measurement system; and to provide direction for modification of chemistry methods. The analytical chemistry measurement quality control activities represented 10% of the total analytical chemistry effort.

  12. Paper-Based and Computer-Based Concept Mappings: The Effects on Computer Achievement, Computer Anxiety and Computer Attitude

    Science.gov (United States)

    Erdogan, Yavuz

    2009-01-01

    The purpose of this paper is to compare the effects of paper-based and computer-based concept mappings on the computer hardware achievement, computer anxiety and computer attitude of eighth-grade secondary school students. The students were randomly allocated to three groups and were given instruction on computer hardware. The teaching methods used…

  13. Phantom-less bone mineral density (BMD) measurement using dual energy computed tomography-based 3-material decomposition

    Science.gov (United States)

    Hofmann, Philipp; Sedlmair, Martin; Krauss, Bernhard; Wichmann, Julian L.; Bauer, Ralf W.; Flohr, Thomas G.; Mahnken, Andreas H.

    2016-03-01

    Osteoporosis is a degenerative bone disease usually diagnosed at the manifestation of fragility fractures, which severely endanger the health of the elderly in particular. To ensure timely therapeutic countermeasures, noninvasive and widely applicable diagnostic methods are required. Currently the primary quantifiable indicator for bone stability, bone mineral density (BMD), is obtained either by DEXA (dual-energy X-ray absorptiometry) or qCT (quantitative CT). Both have respective advantages and disadvantages, with DEXA considered the gold standard. For timely diagnosis of osteoporosis, another CT-based method is presented. A Dual Energy CT reconstruction workflow is being developed to evaluate BMD from lumbar spine (L1-L4) DE-CT images. The workflow is ROI-based and automated for practical use. A dual energy 3-material decomposition algorithm is used to differentiate bone from soft tissue and fat attenuation. The algorithm uses material attenuation coefficients at different beam energy levels. The bone fraction of the three different tissues is used to calculate the amount of hydroxylapatite in the trabecular bone of the corpus vertebrae inside a predefined ROI. Calibrations have been performed to obtain volumetric bone mineral density (vBMD) without having to add a calibration phantom or to use special scan protocols or hardware. Accuracy and precision are dependent on image noise and comparable to qCT images. Clinical indications are in accordance with the DEXA gold standard. The decomposition-based workflow shows bone degradation effects normally not visible on standard CT images which would induce errors in normal qCT results.

  14. On Elasticity Measurement in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Wei Ai

    2016-01-01

    Elasticity is the foundation of cloud performance and can be considered a great advantage and a key benefit of cloud computing. However, there is no clear, concise, and formal definition of elasticity measurement, and thus no effective approach to elasticity quantification has been developed so far. Existing work on elasticity lacks a solid and technical way of defining elasticity measurement, and existing definitions of elasticity metrics have not been accurate enough to capture the essence of elasticity measurement. In this paper, we present a new definition of elasticity measurement and propose a quantifying and measuring method using a continuous-time Markov chain (CTMC) model, which is easy to use for precise calculation of the elasticity value of a cloud computing platform. Our numerical results demonstrate the basic parameters affecting elasticity as measured by the proposed measurement approach. Furthermore, our simulation and experimental results validate that the proposed measurement approach is not only correct but also robust, and is effective in computing and comparing the elasticity of cloud platforms. Our research in this paper makes a significant contribution to the quantitative measurement of elasticity in cloud computing.
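
    The CTMC machinery such a method builds on can be sketched in a few lines: solve the balance equations pi Q = 0 together with the normalisation sum(pi) = 1. The three-state generator below is invented for illustration and is not the paper's model:

      # Sketch of the underlying CTMC computation: solve pi @ Q = 0 with
      # sum(pi) = 1 for a toy three-state generator matrix Q (states and
      # rates are invented; they are not the paper's model).
      import numpy as np

      Q = np.array([[-0.6,  0.4,  0.2],
                    [ 0.3, -0.5,  0.2],
                    [ 0.1,  0.4, -0.5]])

      # stack the balance equations Q^T pi = 0 with the normalisation row
      A = np.vstack([Q.T, np.ones(3)])
      b = np.array([0.0, 0.0, 0.0, 1.0])
      pi, *_ = np.linalg.lstsq(A, b, rcond=None)
      print("steady-state distribution:", pi)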

  15. Measuring Weld Profiles By Computer Tomography

    Science.gov (United States)

    Pascua, Antonio G.; Roy, Jagatjit

    1990-01-01

    A noncontacting, nondestructive computer tomography system determines internal and external contours of welded objects. The system makes it unnecessary to take metallurgical sections (a destructive technique) or to take silicone impressions of hidden surfaces (a technique that contaminates) to inspect them. Measurements of contours via tomography were performed 10 times as fast as measurements via impression molds, and tomography does not contaminate inspected parts.

  16. Computer-Based Career Interventions.

    Science.gov (United States)

    Mau, Wei-Cheng

    The possible utilities and limitations of computer-assisted career guidance systems (CACG) have been widely discussed although the effectiveness of CACG has not been systematically considered. This paper investigates the effectiveness of a theory-based CACG program, integrating Sequential Elimination and Expected Utility strategies. Three types of…

  17. Computer Based Modelling and Simulation

    Indian Academy of Sciences (India)

    Home; Journals; Resonance – Journal of Science Education; Volume 6; Issue 3. Computer Based Modelling and Simulation - Modelling Deterministic Systems. N K Srinivasan. General Article Volume 6 Issue 3 March 2001 pp 46-54. Fulltext. Click here to view fulltext PDF. Permanent link:

  18. Brain connectivity measures: computation and comparison

    Directory of Open Access Journals (Sweden)

    Jovanović Aleksandar

    2013-12-01

    In this article, the computation and comparison of causality measures used in the determination of brain connectivity patterns are investigated. The main analyzed examples include published computations and comparisons of the Directed Transfer Function (DTF) and Partial Directed Coherence (PDC). It is shown that serious methodological mistakes were involved in the measure computations and comparisons, that the neighborhood of zero is of accented importance in such evaluations, and that issues of semantic stability have to be treated with more attention. Published results on the relationship of these two important measures are partly unstable with small changes of the zero threshold, and pictures of involved brain structures deduced from the cited articles have to be corrected. An analysis of the operators involved in evaluation and comparisons is given, with suggestions for their improvement and complementary additional actions.

  19. Kr-85m activity as burnup measurement indicator in a pebble bed reactor based on ORIGEN2.1 Computer Simulation

    Science.gov (United States)

    Husnayani, I.; Udiyani, P. M.; Bakhri, S.; Sunaryo, G. R.

    2018-02-01

    The Pebble Bed Reactor (PBR) is a high temperature gas-cooled reactor which employs graphite as a moderator and helium as a coolant. In a multi-pass PBR, the burnup of each fuel pebble must be measured in each cycle by online measurement in order to determine whether the fuel pebble should be reloaded into the core for another cycle or moved out of the core into spent fuel storage. One of the well-known methods for measuring burnup is based on the activity of radionuclide decay inside the fuel pebble. In this work, the activity and gamma emission of Kr-85m were studied in order to investigate the feasibility of Kr-85m as a burnup measurement indicator in a PBR. The activity and gamma emission of Kr-85m were estimated using the ORIGEN2.1 computer code. The parameters of HTR-10 were taken as a case study in performing the ORIGEN2.1 simulation. The results show that the activity evolution of Kr-85m has a good relationship with the burnup of the pebble fuel in each cycle. The Kr-85m activity reduction in each burnup step, in the range of 12% to 4%, is considered sufficient to show the burnup level in each cycle. The gamma emission of Kr-85m is also sufficiently high, on the order of 10^10 photons/second. From these results, it can be concluded that Kr-85m is suitable to be used as a burnup measurement indicator in a pebble bed reactor.
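
    As a back-of-envelope illustration (not the ORIGEN2.1 calculation): Kr-85m decays with a half-life of about 4.48 h, so its emission after a pebble leaves the core falls off as A(t) = A0·exp(-λt). The initial rate below is an assumed value of the order quoted above:

      # Back-of-envelope decay of Kr-85m (half-life about 4.48 h); the
      # initial emission rate is an assumed value of the order quoted
      # above, not an ORIGEN2.1 result.
      import math

      HALF_LIFE_H = 4.48                    # Kr-85m half-life, hours
      lam = math.log(2) / HALF_LIFE_H       # decay constant, 1/h

      a0 = 1.0e10                           # assumed initial rate, photons/s
      for t in (0, 1, 2, 4, 8):
          print(f"t = {t:2d} h : {a0 * math.exp(-lam * t):.2e} photons/s")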

  20. Computational methods for industrial radiation measurement applications

    International Nuclear Information System (INIS)

    Gardner, R.P.; Guo, P.; Ao, Q.

    1996-01-01

    Computational methods have been used with considerable success to complement radiation measurements in solving a wide range of industrial problems. The almost exponential growth of computer capability and applications in the last few years leads to a "black box" mentality for radiation measurement applications. If a black box is defined as any radiation measurement device that is capable of measuring the parameters of interest when a wide range of operating and sample conditions may occur, then the development of computational methods for industrial radiation measurement applications should now be focused on the black box approach and the deduction of properties of interest from the response with acceptable accuracy and reasonable efficiency. Nowadays, increasingly better understanding of radiation physical processes, more accurate and complete fundamental physical data, and more advanced modeling and software/hardware techniques have made it possible to make giant strides in that direction with new ideas implemented with computer software. The Center for Engineering Applications of Radioisotopes (CEAR) at North Carolina State University has been working on a variety of projects in the area of radiation analyzers and gauges for accomplishing this for quite some time, and they are discussed here with emphasis on current accomplishments.

  1. Verifiable Measurement-Only Blind Quantum Computing with Stabilizer Testing.

    Science.gov (United States)

    Hayashi, Masahito; Morimae, Tomoyuki

    2015-11-27

    We introduce a simple protocol for verifiable measurement-only blind quantum computing. Alice, a client, can perform only single-qubit measurements, whereas Bob, a server, can generate and store entangled many-qubit states. Bob generates copies of a graph state, which is a universal resource state for measurement-based quantum computing, and sends Alice each qubit of them one by one. Alice adaptively measures each qubit according to her program. If Bob is honest, he generates the correct graph state, and, therefore, Alice can obtain the correct computation result. Regarding the security, whatever Bob does, Bob cannot get any information about Alice's computation because of the no-signaling principle. Furthermore, malicious Bob does not necessarily send the copies of the correct graph state, but Alice can check the correctness of Bob's state by directly verifying the stabilizers of some copies.
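
    For intuition only, a classical numpy sketch of the stabilizer property at the heart of the verification: build a small path-graph state and confirm that each stabilizer K_a = X_a ∏_{b∈N(a)} Z_b leaves it invariant. (Alice's actual test uses single-qubit measurements; this simply verifies a state classically.)

      # Classical toy check of graph-state stabilizers with numpy.
      import numpy as np
      from functools import reduce

      I2 = np.eye(2)
      X = np.array([[0., 1.], [1., 0.]])
      Z = np.diag([1., -1.])

      def kron(*ops):
          return reduce(np.kron, ops)

      def cz(i, j, n=3):
          # diagonal controlled-Z between qubits i and j (qubit 0 leftmost)
          d = np.ones(2 ** n)
          for b in range(2 ** n):
              if (b >> (n - 1 - i)) & 1 and (b >> (n - 1 - j)) & 1:
                  d[b] = -1.0
          return np.diag(d)

      plus = np.ones(2) / np.sqrt(2)
      state = cz(1, 2) @ cz(0, 1) @ kron(plus, plus, plus)  # path graph 0-1-2

      for K in (kron(X, Z, I2), kron(Z, X, Z), kron(I2, Z, X)):
          print(np.allclose(K @ state, state))              # True, True, True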

  2. Relation of Structural and Vibratory Kinematics of the Vocal Folds to Two Acoustic Measures of Breathy Voice Based on Computational Modeling

    Science.gov (United States)

    Samlan, Robin A.; Story, Brad H.

    2011-01-01

    Purpose: To relate vocal fold structure and kinematics to 2 acoustic measures: cepstral peak prominence (CPP) and the amplitude of the first harmonic relative to the second (H1-H2). Method: The authors used a computational, kinematic model of the medial surfaces of the vocal folds to specify features of vocal fold structure and vibration in a…

  3. ZIVIS: A City Computing Platform Based on Volunteer Computing

    International Nuclear Information System (INIS)

    Antoli, B.; Castejon, F.; Giner, A.; Losilla, G.; Reynolds, J. M.; Rivero, A.; Sangiao, S.; Serrano, F.; Tarancon, A.; Valles, R.; Velasco, J. L.

    2007-01-01

    Volunteer computing has emerged as a new form of distributed computing. Unlike other computing paradigms like Grids, which are usually based on complex architectures, volunteer computing has demonstrated a great ability to integrate dispersed, heterogeneous computing resources with ease. This article presents ZIVIS, a project which aims to deploy a city-wide computing platform in Zaragoza (Spain). ZIVIS is based on BOINC (Berkeley Open Infrastructure for Network Computing), a popular open-source framework to deploy volunteer and desktop grid computing systems. A scientific code which simulates the trajectories of particles moving inside a stellarator fusion device has been chosen as the pilot application of the project. In this paper we describe the approach followed to port the code to the BOINC framework, as well as some novel techniques, based on standard Grid protocols, we have used to access the output data present in the BOINC server from a remote visualizer. (Author)

  4. Evaluation of linear measurements of implant sites based on head orientation during acquisition: An ex vivo study using cone-beam computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Sabban, Hanadi; Mahdian, Mina; Dhingra, Ajay; Lurie, Alan G.; Tadinada, Aditya [University of Connecticut School of Dental Medicine, Farmington (United States)

    2015-06-15

    This study evaluated the effect of various head orientations during cone-beam computed tomography (CBCT) image acquisition on linear measurements of potential implant sites. Six dry human skulls with a total of 28 implant sites were evaluated for seven different head orientations. The scans were acquired using a Hitachi CB-MercuRay CBCT machine. The scanned volumes were reconstructed. Horizontal and vertical measurements were made and were compared to measurements made after simulating the head position to corrected head angulations. Data were analyzed using a two-way ANOVA test. Statistical analysis revealed a significant interaction between the mean errors in vertical measurements, with a marked difference observed at the extension head position (P<0.05). Statistical analysis failed to yield any significant interaction between the mean errors in horizontal measurements at various head positions. Head orientation could significantly affect the vertical measurements in CBCT scans. The main head position influencing the measurements is extension.

  5. Performance Measurements in a High Throughput Computing Environment

    CERN Document Server

    AUTHOR|(CDS)2145966; Gribaudo, Marco

    The IT infrastructures of companies and research centres are implementing new technologies to satisfy the increasing need for computing resources for big data analysis. In this context, resource profiling plays a crucial role in identifying areas where improvement of utilisation efficiency is needed. In order to deal with the profiling and optimisation of computing resources, two complementary approaches can be adopted: the measurement-based approach and the model-based approach. The measurement-based approach gathers and analyses performance metrics by executing benchmark applications on computing resources. Instead, the model-based approach implies the design and implementation of a model as an abstraction of the real system, selecting only those aspects relevant to the study. This Thesis originates from a project carried out by the author within the CERN IT department. CERN is an international scientific laboratory that conducts fundamental research in the domain of elementary particle physics. The p...

  6. Function Package for Computing Quantum Resource Measures

    Science.gov (United States)

    Huang, Zhiming

    2018-05-01

    In this paper, we present a function package to calculate quantum resource measures and the dynamics of open systems. Our package includes common operators and operator lists, and frequently-used functions for computing quantum entanglement, quantum correlation, quantum coherence, quantum Fisher information and dynamics in noisy environments. We briefly explain the functions of the package and illustrate how to use it with several typical examples. We expect this package to be a useful tool for future research and education.

  7. Measures of agreement between computation and experiment:validation metrics.

    Energy Technology Data Exchange (ETDEWEB)

    Barone, Matthew Franklin; Oberkampf, William Louis

    2005-08-01

    With the increasing role of computational modeling in engineering design, performance estimation, and safety assessment, improved methods are needed for comparing computational results and experimental measurements. Traditional methods of graphically comparing computational and experimental results, though valuable, are essentially qualitative. Computable measures are needed that can quantitatively compare computational and experimental results over a range of input, or control, variables and sharpen assessment of computational accuracy. This type of measure has been recently referred to as a validation metric. We discuss various features that we believe should be incorporated in a validation metric and also features that should be excluded. We develop a new validation metric that is based on the statistical concept of confidence intervals. Using this fundamental concept, we construct two specific metrics: one that requires interpolation of experimental data and one that requires regression (curve fitting) of experimental data. We apply the metrics to three example problems: thermal decomposition of a polyurethane foam, a turbulent buoyant plume of helium, and compressibility effects on the growth rate of a turbulent free-shear layer. We discuss how the present metrics are easily interpretable for assessing computational model accuracy, as well as the impact of experimental measurement uncertainty on the accuracy assessment.
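
    A minimal sketch of a confidence-interval flavoured comparison in the spirit described above (not the authors' exact metric): estimate the mean model-experiment difference and a t-based interval around it, with made-up data:

      # Mean model-experiment difference with a 90% t-interval; all data
      # are invented, and this is not the authors' exact formulation.
      import numpy as np
      from scipy import stats

      y_exp = np.array([1.02, 0.98, 1.05, 1.01, 0.97])   # made-up measurements
      y_model = 1.00                                     # made-up prediction

      diff = y_exp - y_model
      mean, sem = diff.mean(), stats.sem(diff)
      lo, hi = stats.t.interval(0.90, df=len(diff) - 1, loc=mean, scale=sem)
      print(f"mean error = {mean:+.3f}, 90% CI = ({lo:+.3f}, {hi:+.3f})")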

  8. Inversion based on computational simulations

    International Nuclear Information System (INIS)

    Hanson, K.M.; Cunningham, G.S.; Saquib, S.S.

    1998-01-01

    A standard approach to solving inversion problems that involve many parameters uses gradient-based optimization to find the parameters that best match the data. The authors discuss enabling techniques that facilitate application of this approach to large-scale computational simulations, which are the only way to investigate many complex physical phenomena. Such simulations may not seem to lend themselves to calculation of the gradient with respect to numerous parameters. However, adjoint differentiation allows one to efficiently compute the gradient of an objective function with respect to all the variables of a simulation. When combined with advanced gradient-based optimization algorithms, adjoint differentiation permits one to solve very large problems of optimization or parameter estimation. These techniques will be illustrated through the simulation of the time-dependent diffusion of infrared light through tissue, which has been used to perform optical tomography. The techniques discussed have a wide range of applicability to modeling including the optimization of models to achieve a desired design goal

  9. Computation of Difference Gröbner Bases

    Directory of Open Access Journals (Sweden)

    Vladimir P. Gerdt

    2012-07-01

    This paper is an updated and extended version of our note [GR'06] (cf. also [GR-ACAT]). To compute difference Gröbner bases of ideals generated by linear polynomials we adapt to difference polynomial rings the involutive algorithm based on Janet-like division. The algorithm has been implemented in Maple in the form of the package LDA (Linear Difference Algebra) and we describe the main features of the package. Its applications are illustrated by the generation of finite difference approximations to linear partial differential equations and by the reduction of Feynman integrals. We also present the algorithm for an ideal generated by a finite set of nonlinear difference polynomials. If the algorithm terminates, then it constructs a Gröbner basis of the ideal.
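
    The LDA package itself is Maple code and handles difference ideals; purely as a generic illustration of a Gröbner-basis computation, here is an ordinary (non-difference) example with SymPy:

      # Generic illustration only: an ordinary polynomial Groebner basis.
      from sympy import groebner, symbols

      x, y = symbols('x y')
      G = groebner([x**2 + y**2 - 1, x - y], x, y, order='lex')
      print(G)   # basis of the ideal <x^2 + y^2 - 1, x - y>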

  10. Refractive index based measurements

    DEFF Research Database (Denmark)

    2014-01-01

    In a method for performing a refractive index based measurement of a property of a fluid such as chemical composition or temperature by observing an apparent angular shift in an interference fringe pattern produced by back or forward scattering interferometry, ambiguities in the measurement caused by the apparent shift being consistent with one of a number of numerical possibilities for the real shift which differ by 2π are resolved by combining measurements performed on the same sample using light paths therethrough of differing lengths.
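
    A toy numerical sketch of the disambiguation idea: the apparent shift is known only modulo 2π, and a second, longer light path singles out the physical shift. The shift value and the path-length ratio below are invented:

      # Toy version of the 2*pi-ambiguity resolution with two path lengths.
      import math

      TWO_PI = 2 * math.pi

      def resolve(app1, app2, ratio, n_max=10):
          """Choose n so that ratio*(app1 + 2*pi*n) matches app2 mod 2*pi."""
          def mismatch(n):
              d = (ratio * (app1 + TWO_PI * n) - app2) % TWO_PI
              return min(d, TWO_PI - d)          # circular distance
          best = min(range(-n_max, n_max + 1), key=mismatch)
          return app1 + TWO_PI * best

      true_shift = 3 * TWO_PI + 0.7              # invented "real" shift, path L1
      app1 = true_shift % TWO_PI                 # apparent shift on L1
      app2 = (1.37 * true_shift) % TWO_PI        # apparent shift on L2 = 1.37*L1
      print(resolve(app1, app2, ratio=1.37))     # ~19.55, the real shift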

  11. Fast Computations for Measures of Phylogenetic Beta Diversity.

    Directory of Open Access Journals (Sweden)

    Constantinos Tsirogiannis

    For many applications in ecology, it is important to examine the phylogenetic relations between two communities of species. More formally, let T be a phylogenetic tree and let A and B be two samples of its tips, representing the examined communities. We want to compute a value that expresses the phylogenetic diversity between A and B in T. There exist several measures that can do this; these are the so-called phylogenetic beta diversity (β-diversity) measures. Two popular measures of this kind are the Community Distance (CD) and the Common Branch Length (CBL). In most applications, it is not sufficient to compute the value of a beta diversity measure for two communities A and B; we also want to know if this value is relatively large or small compared to all possible pairs of communities in T that have the same size. To decide this, the ideal approach is to compute a standardised index that involves the mean and the standard deviation of this measure among all pairs of species samples that have the same number of elements as A and B. However, no method exists for computing this index exactly and efficiently for CD and CBL. We present analytical expressions for computing the expectation and the standard deviation of CD and CBL. Based on these expressions, we describe efficient algorithms for computing the standardised indices of the two measures. Using standard algorithmic analysis, we provide guarantees on the theoretical efficiency of our algorithms. We implemented our algorithms and measured their efficiency in practice. Our implementations compute the standardised indices of CD and CBL in less than twenty seconds for a hundred pairs of samples on trees with 7 ⋅ 10^4 tips. Our implementations are available through the R package PhyloMeasures.
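
    The standardised index described is a z-score of the observed measure against its null mean and standard deviation over equal-size tip samples. A minimal sketch with illustrative numbers (the real moments come from the paper's analytical expressions):

      # Z-score of a beta-diversity value against its null distribution;
      # the numbers are invented for illustration.
      def standardised_index(observed, expectation, std_dev):
          return (observed - expectation) / std_dev

      print(standardised_index(observed=12.4, expectation=9.8, std_dev=1.3))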

  12. Field microcomputerized multichannel γ ray spectrometer based on notebook computer

    International Nuclear Information System (INIS)

    Jia Wenyi; Wei Biao; Zhou Rongsheng; Li Guodong; Tang Hong

    1996-01-01

    Currently, field γ ray spectrometry cannot rapidly measure a full γ ray spectrum, so a field microcomputerized multichannel γ ray spectrometer based on a notebook computer is described, with which the full γ ray spectrum can be rapidly measured in the field.

  13. Development of a tomographic system adapted to 3D measurement of contaminated wounds based on the Cacao concept (Computer aided collimation Gamma Camera)

    International Nuclear Information System (INIS)

    Douiri, A.

    2002-03-01

    The computer aided collimation gamma camera (CACAO in French) is a gamma camera using a collimator with large holes, a supplementary linear scanning motion during the acquisition, and a dedicated reconstruction program taking full account of the source depth. The CACAO system was introduced to improve both the sensitivity and the resolution in nuclear medicine. This thesis focuses on the design of a fast and robust reconstruction algorithm for the CACAO project. We start with an overview of tomographic imaging techniques in nuclear medicine. After modelling the physical CACAO system, we present the complete reconstruction program, which involves three steps: 1) shift and sum; 2) deconvolution and filtering; 3) rotation and sum. The deconvolution is the critical step, as it decreases the signal-to-noise ratio of the reconstructed images. We propose a regularized multi-channel algorithm to solve the deconvolution problem. We also present a fast algorithm, based on spline functions, that preserves the high quality of the reconstructed images in the shift and rotation steps. Comparisons of simulated reconstructed images in 2D and 3D for the conventional system (CPHC) and CACAO demonstrate the ability of the CACAO system to increase the quality of SPECT images. Finally, the study concludes with an experimental approach using a pixellated detector conceived for 3D measurement of contaminated wounds. This experiment demonstrates the possible advantages of coupling the CACAO project with pixellated detectors. A variety of applications could fully benefit from the CACAO system, such as low-activity imaging, the use of high-energy gamma isotopes and the visualization of deep organs. Moreover, the combination of the CACAO system with a pixel detector may open up further possibilities for the future of nuclear medicine. (author)
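
    For intuition, a one-dimensional toy version of the first reconstruction step, "shift and sum": each projection is shifted to focus a chosen depth plane and the results are averaged. The real program uses 2-D projections with depth-dependent shifts:

      # One-dimensional toy "shift and sum" with invented projections.
      import numpy as np

      def shift_and_sum(projections, shifts):
          acc = np.zeros_like(projections[0], dtype=float)
          for proj, s in zip(projections, shifts):
              acc += np.roll(proj, s)            # shift, then accumulate
          return acc / len(projections)

      projections = [np.array([0, 0, 1, 5, 1, 0, 0]) for _ in range(3)]
      print(shift_and_sum(projections, shifts=[-1, 0, 1]))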

  14. Computer-supported analysis of scientific measurements

    NARCIS (Netherlands)

    de Jong, Hidde

    1998-01-01

    In the past decade, large-scale databases and knowledge bases have become available to researchers working in a range of scientific disciplines. In many cases these databases and knowledge bases contain measurements of properties of physical objects which have been obtained in experiments or at

  15. Refractive index based measurements

    DEFF Research Database (Denmark)

    2014-01-01

    In a method for performing a refractive index based measurement of a property of a fluid such as chemical composition or temperature, a chirp in the local spatial frequency of interference fringes of an interference pattern is reduced by mathematical manipulation of the recorded light intensity...

  16. A computer-aided surface roughness measurement system

    International Nuclear Information System (INIS)

    Hughes, F.J.; Schankula, M.H.

    1983-11-01

    A diamond stylus profilometer with a computer-based data acquisition/analysis system is being used to characterize surfaces of reactor components and materials, and to examine the effects of surface topography on thermal contact conductance. The current system is described; measurement problems and system development are discussed in general terms and possible future improvements are outlined.

  17. Measuring techniques in emission computed tomography

    International Nuclear Information System (INIS)

    Jordan, K.; Knoop, B.

    1988-01-01

    The chapter reviews the historical development of emission computed tomography and its basic principles, proceeds to SPECT and PET and special techniques of emission tomography, and concludes with a comprehensive discussion of the mathematical fundamentals of reconstruction and quantitative activity determination in vivo, dealing with the Radon transform and the projection slice theorem, methods of image reconstruction such as analytical and algebraic methods, and limiting conditions in real systems such as a limited number of measured data, noise enhancement, absorption, stray radiation, and random coincidence. (orig./HP) With 111 figs., 6 tabs [de

  18. Organization of a library of standard relocatable programmes for a measurement data processing module based on computers of the TRA type

    International Nuclear Information System (INIS)

    Dadi, K.; Dadi, L.; Mateeva, A.; Salamatin, I.M.

    1976-01-01

    The paper describes the organization of a library of standard programs with binary code. The library was developed for a measurement module on the basis of a TRA-1001-i computer (Elektronika-100, PDP-8). The library is placed on an external memory (magnetic disk) and has a module structure. The external memory assigned to the library is divided into pages. When loaded into the computer's internal memory, several pages are taken as one whole to represent the loading module. The magnetic disk storage capacity being 1.25 million words, the library has a total of ca. 50 10 thousand words (eight cylinders). The work provides regulations for compiling standard programs in SLANG. The library is characterized by the following main features: possibility of being used in memory dynamic distribution mode; possibility of being used for computers with an internal memory capacity of 4K; no need for intermediary-language coding of displaced programs; and possibility of autonomous shift of standard programs. The above library is compared with a library comprising DES programs.

  19. Blind quantum computation protocol in which Alice only makes measurements

    Science.gov (United States)

    Morimae, Tomoyuki; Fujii, Keisuke

    2013-05-01

    Blind quantum computation is a new secure quantum computing protocol which enables Alice (who does not have sufficient quantum technology) to delegate her quantum computation to Bob (who has a full-fledged quantum computer) in such a way that Bob cannot learn anything about Alice's input, output, and algorithm. In previous protocols, Alice needs to have a device which generates quantum states, such as single-photon states. Here we propose another type of blind computing protocol where Alice does only measurements, such as the polarization measurements with a threshold detector. In several experimental setups, such as optical systems, the measurement of a state is much easier than the generation of a single-qubit state. Therefore our protocols ease Alice's burden. Furthermore, the security of our protocol is based on the no-signaling principle, which is more fundamental than quantum physics. Finally, our protocols are device independent in the sense that Alice does not need to trust her measurement device in order to guarantee the security.

  20. Development of a computational technique to measure cartilage contact area.

    Science.gov (United States)

    Willing, Ryan; Lapner, Michael; Lalone, Emily A; King, Graham J W; Johnson, James A

    2014-03-21

    Computational measurement of joint contact distributions offers the benefit of non-invasive measurements of joint contact without the use of interpositional sensors or casting materials. This paper describes a technique for indirectly measuring joint contact based on overlapping of articular cartilage computer models derived from CT images and positioned using in vitro motion capture data. The accuracy of this technique when using the physiological nonuniform cartilage thickness distribution, or simplified uniform cartilage thickness distributions, is quantified through comparison with direct measurements of contact area made using a casting technique. The efficacy of using indirect contact measurement techniques for measuring the changes in contact area resulting from hemiarthroplasty at the elbow is also quantified. Using the physiological nonuniform cartilage thickness distribution reliably measured contact area (ICC=0.727), but not better than the assumed bone specific uniform cartilage thicknesses (ICC=0.673). When a contact pattern agreement score (s(agree)) was used to assess the accuracy of cartilage contact measurements made using physiological nonuniform or simplified uniform cartilage thickness distributions in terms of size, shape and location, their accuracies were not significantly different (p>0.05). The results of this study demonstrate that cartilage contact can be measured indirectly based on the overlapping of cartilage contact models. However, the results also suggest that in some situations, inter-bone distance measurement and an assumed cartilage thickness may suffice for predicting joint contact patterns. Copyright © 2014 Elsevier Ltd. All rights reserved.
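
    A simplified voxel sketch of the overlap idea (the paper overlaps surface models derived from CT): count the voxels where two binary cartilage masks intersect. The masks and voxel size below are invented:

      # Contact area as the intersection of two binary cartilage masks.
      import numpy as np

      voxel_area_mm2 = 0.25                      # assumed in-plane voxel area
      cart_a = np.zeros((50, 50), dtype=bool)
      cart_b = np.zeros((50, 50), dtype=bool)
      cart_a[10:30, 10:30] = True
      cart_b[20:40, 20:40] = True

      contact = cart_a & cart_b                  # overlapping region
      print("contact area:", contact.sum() * voxel_area_mm2, "mm^2")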

  1. Computer-Based Linguistic Analysis.

    Science.gov (United States)

    Wright, James R.

    Noam Chomsky's transformational-generative grammar model may effectively be translated into an equivalent computer model. Phrase-structure rules and transformations are tested as to their validity and ordering by the computer via the process of random lexical substitution. Errors appearing in the grammar are detected and rectified, and formal…

  2. Suggested Approaches to the Measurement of Computer Anxiety.

    Science.gov (United States)

    Toris, Carol

    Psychologists can gain insight into human behavior by examining what people feel about, know about, and do with, computers. Two extreme reactions to computers are computer phobia, or anxiety, and computer addiction, or "hacking". A four-part questionnaire was developed to measure computer anxiety. The first part is a projective technique which…

  3. Computer-Based Learning in Chemistry Classes

    Science.gov (United States)

    Pietzner, Verena

    2014-01-01

    Currently not many people would doubt that computers play an essential role in both public and private life in many countries. However, somewhat surprisingly, evidence of computer use is difficult to find in German state schools although other countries have managed to implement computer-based teaching and learning in their schools. This paper…

  4. Computed Tomography Based Three-dimensional Measurements of Spine Shortening Distance After Posterior Three-column Osteotomies for the Treatment of Severe and Stiff Scoliosis.

    Science.gov (United States)

    Li, Xue-Shi; Huang, Zi-Fang; Deng, Yao-Long; Fan, Heng-Wei; Sui, Wen-Yuan; Wang, Chong-Wen; Yang, Jun-Lin

    2017-07-15

    Retrospective study. The aim of this study is to measure and analyze the changes in three-dimensional (3D) distances of the spinal column and spinal canal at the three-column osteotomy sites and address their clinical and neurologic significance. Three-column osteotomies were developed to treat severe and stiff spine deformities with insufficient understanding of the safe limit of spine shortening and the relationship between the shortening distance of the spinal column and that of the spinal canal. Records of 52 consecutive patients with severe and stiff scoliosis treated with three-column spine osteotomies at our institution from July 2013 to June 2015 were reviewed. The preoperative spinal cord function classifications were type A in 31 cases, type B in 10 cases, and type C in 11 cases. The types of osteotomies carried out were extended pedicle subtraction osteotomy in nine patients and posterior vertebral column resection in 43 patients. Multimodality neuromonitoring strategies were adopted intraoperatively. 3D pre- and postoperative spine models were reconstructed from the computed tomography (CT) scans. The distances of convex and concave spinal column and spinal canal shortening were measured and analyzed. The spinal column shortening distance (SCSD) measured on the 3D models (27.8 mm) was statistically shorter than that measured intraoperatively (32.8 mm) (P < 0.05). ... column strut graft than in those with bone-on-bone fusion (P < 0.05). The shortening of the spinal column cannot represent that of the central spinal canal in patients with severe scoliosis. The spinal column shortening procedure in appropriately selected patient groups with bone-on-bone fusion is a viable option, with the CCSD being significantly shorter than the convex SCSD. Level of Evidence: 4.

  5. Computer Simulation of Angle-measuring System of Photoelectric Theodolite

    International Nuclear Information System (INIS)

    Zeng, L; Zhao, Z W; Song, S L; Wang, L T

    2006-01-01

    In this paper, a virtual test platform based on malfunction phenomena is designed, using the methods of computer simulation and numerical mask. It is used in the simulation training of the angle-measuring system of a photoelectric theodolite. Actual application proves that this platform provides good conditions for in-depth simulation training of technicians and presents a useful approach for the establishment of other large-equipment simulation platforms.

  6. Short-term Reproducibility of Computed Tomography-based Lung Density Measurements in Alpha-1 Antitrypsin Deficiency and Smokers with Emphysema

    International Nuclear Information System (INIS)

    Shaker, S.B.; Dirksen, A.; Laursen, L.C.; Maltbaek, N.; Christensen, L.; Sander, U.; Seersholm, N.; Skovgaard, L.T.; Nielsen, L.; Kok-Jensen, A.

    2004-01-01

    Purpose: To study the short-term reproducibility of lung density measurements by multi-slice computed tomography (CT) using three different radiation doses and three reconstruction algorithms. Material and Methods: Twenty-five patients with smoker's emphysema and 25 patients with alpha-1 antitrypsin deficiency underwent 3 scans at 2-week intervals. A low-dose protocol was applied, and images were reconstructed with bone, detail, and soft algorithms. Total lung volume (TLV), 15th percentile density (PD-15), and relative area at -910 Hounsfield units (RA-910) were obtained from the images using Pulmo-CMS software. The reproducibility of PD-15 and RA-910 and the influence of radiation dose, reconstruction algorithm, and type of emphysema were then analysed. Results: The overall coefficient of variation of volume-adjusted PD-15 for all combinations of radiation dose and reconstruction algorithm was 3.7%. The overall standard deviation of volume-adjusted RA-910 was 1.7% (corresponding to a coefficient of variation of 6.8%). Radiation dose, reconstruction algorithm, and type of emphysema had no significant influence on the reproducibility of PD-15 and RA-910. However, the bone algorithm and very low radiation dose result in overestimation of the extent of emphysema. Conclusion: Lung density measurement by CT is a sensitive marker for quantitating both subtypes of emphysema. A CT protocol with radiation dose down to 16 mAs and a soft or detail reconstruction algorithm is recommended.
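
    Both density indices can be computed directly from a flattened array of lung-voxel HU values. A minimal sketch with simulated values, taking RA-910 as the relative area below -910 HU:

      # PD-15 and RA-910 from toy lung-voxel HU values (simulated data).
      import numpy as np

      rng = np.random.default_rng(0)
      hu = rng.normal(-860, 40, size=100_000)    # toy lung voxel HU values

      pd15 = np.percentile(hu, 15)               # 15th percentile density
      ra910 = (hu < -910).mean() * 100           # relative area at -910 HU, %
      print(f"PD-15 = {pd15:.1f} HU, RA-910 = {ra910:.1f}%")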

  7. Measurement of kidney by computed tomography

    International Nuclear Information System (INIS)

    Hamada, Tatsumi; Nakagawa, Kenichi; Tamura, Kenji; Yoshida, Akio; Fujii, Koichi

    1983-01-01

    Several measurements of normal kidneys in vivo were obtained from computed tomography and correlated with age, sex and body dimensions. Forty-four males and 21 females without a history of renal disease were studied. 1. Angle between renal coronal section and body frontal plane (degrees): The mean value (± SD) of the angle was 44.0 ± 11.1 for the right kidney and 42.3 ± 11.2 for the left, with a low correlation coefficient. The angle had no significant correlation with age or sex. 2. The largest width of kidney (cm): The mean value of the width was 4.6 ± 0.6 for the male right kidney, 5.1 ± 0.6 for the male left, 4.6 ± 0.7 for the female right and 4.7 ± 1.0 for the female left. The values correlated positively with age under 40 and negatively over 40. 3. Renal volume (cm³): Renal volume was calculated by adding together the area measurements obtained from successive 1 cm thick scans, excluding the renal sinus. The mean volume was 107 ± 27 for the male right kidney, 114 ± 24 for the male left, 101 ± 33 for the female right and 111 ± 41 for the female left. The correlation coefficient of right versus left renal volume was significantly high. Total renal volume, i.e. left + right renal volume, had a significant negative correlation with age over 40. 4. CT numbers of kidney: The average value of the right kidney was 31.4 ± 6.0 and that of the left was 30.7 ± 5.9. Though the correlation coefficient between right and left was nearly 1, no significant correlation was found with other values. (author)

  8. Measurement method of cardiac computed tomography (CT)

    International Nuclear Information System (INIS)

    Watanabe, Shigeru; Yamamoto, Hironori; Yumura, Yasuo; Yoshida, Hideo; Morooka, Nobuhiro

    1980-01-01

    The CT was carried out in 126 cases consisting of 31 normals, 17 cases of mitral stenosis (MS), 8 cases of mitral regurgitation (MR), 11 cases of aortic stenosis (AS), 9 cases of aortic regurgitation (AR), 20 cases of myocardial infarction (MI), 8 cases of atrial septal defect (ASD) and 22 hypertensives. The 20-second scans were performed every 1.5 cm from the 2nd intercostal space to the 5th or 6th intercostal space. The computed tomograms obtained were classified into 8 levels by cross-sectional anatomy: levels of (1) the aortic arch, (2) just beneath the aortic arch, (3) the pulmonary artery bifurcation, (4) the right atrial appendage or the upper right atrium, (5) the aortic root, (6) the upper left ventricle, (7) the mid left ventricle, and (8) the lower left ventricle. The diameter (anteroposterior and transverse) and cross-sectional area were measured for the ascending aorta (Ao), descending aorta (AoD), superior vena cava (SVC), inferior vena cava (IVC), pulmonary artery branch (PA), main pulmonary artery (mPA), left atrium (LA), right atrium (RA), and right ventricular outflow tract (RVOT) on each level where they were clearly distinguished. However, it was difficult to separate the cardiac wall from the cardiac cavity because there was little difference in X-ray attenuation coefficient between the myocardium and blood. Therefore, at the mid ventricular level, the diameter and area of the total cardiac shadow were measured, and the cardiac ratios to the thorax were then calculated. The normal range of these values is shown in the table, and abnormal characteristics in cardiac disease were exhibited in comparison with normal values. In MS, the diameter and area of the LA were significantly larger than normal. In MS and ASD, all of the right cardiac system was larger than normal, especially RA and SVC in MS, and PA and RVOT in ASD. The diameter and area of the aortic root were larger than normal, in the order of AR, AS and HT. (author)

  9. Measurements of computed tomography radiation scatter

    International Nuclear Information System (INIS)

    Van Every, B.; Petty, R.J.

    1992-01-01

    This paper describes the measurement of scattered radiation from a computed tomography (CT) scanner in a clinical situation and compares the results with those obtained from a CT performance phantom and with data obtained from CT manufacturers. The results are presented as iso-dose contours. There are significant differences between the data obtained and that supplied by manufacturers, both in the shape of the iso-dose contours and in the nominal values. The observed scatter in a clinical situation (for an abdominal scan) varied between 3% and 430% of the manufacturers' stated values, with a marked reduction in scatter noted at the head and feet of the patient. These differences appear to be due to the fact that manufacturers use CT phantoms to obtain scatter data and these phantoms do not provide the same scatter absorption geometry as patients. CT scatter was observed to increase as scan field size and slice thickness increased, whilst there was little change in scatter with changes in gantry tilt and table slew. Using the iso-dose contours, the orientation of the CT scanner can be optimised with regard to the location and shielding requirements of doors and windows. Additionally, the positioning of staff who must remain in the room during scanning can be optimised to minimise their exposure. It is estimated that the data presented allows for realistic radiation protection assessments to be made. 13 refs., 5 tabs., 6 figs

  10. Computer Based Modelling and Simulation

    Indian Academy of Sciences (India)

    where x increases from zero to N, the saturation value. Box 1. Matrix Methods ... such as Laplace transforms and non-linear differential equations with ... atomic bomb project in the US in the early ... his work on game theory and computers.

  11. A Computer Controlled Precision High Pressure Measuring System

    Science.gov (United States)

    Sadana, S.; Yadav, S.; Jha, N.; Gupta, V. K.; Agarwal, R.; Bandyopadhyay, A. K.; Saxena, T. K.

    2011-01-01

    A microcontroller (AT89C51)-based electronics unit has been designed and developed for a high-precision calibrator based on a Digiquartz pressure transducer (DQPT) for the measurement of high hydrostatic pressure up to 275 MPa. The input signal from the DQPT is converted into a square waveform and its frequency is multiplied more than tenfold through a frequency multiplier circuit; this multiplication by a factor of ten is implemented with a phase-locked loop. An octal buffer is used to store the calculated frequency, which in turn is fed to the microcontroller AT89C51, interfaced with a liquid crystal display for the display of the frequency as well as the corresponding pressure in user-friendly units. The electronics is interfaced with a computer using RS232 for automatic data acquisition, computation and storage; the data acquisition software is written in Visual Basic 6.0. The system is capable of measuring frequency up to 4 MHz with a resolution of 0.01 Hz and pressure up to 275 MPa with a resolution of 0.001 MPa within a measurement uncertainty of 0.025%. Details of the hardware of the pressure measuring system, the associated electronics, the software and the calibration are discussed in this paper.
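
    An illustrative Python sketch (the authors' software is Visual Basic 6.0): read a frequency over RS232 with pyserial and convert it to pressure with an assumed calibration polynomial; the port name and the coefficients are invented:

      # Hypothetical RS232 read plus frequency-to-pressure conversion.
      import serial  # pyserial

      def frequency_to_pressure(freq_hz, coeffs=(-1.23e2, 4.56e-5, 7.89e-12)):
          a0, a1, a2 = coeffs                    # assumed quadratic calibration
          return a0 + a1 * freq_hz + a2 * freq_hz ** 2

      with serial.Serial('/dev/ttyUSB0', baudrate=9600, timeout=1) as port:
          reading = float(port.readline().decode().strip())  # e.g. "4000000.01"
          print(f"{frequency_to_pressure(reading):.3f} MPa")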

  12. Identity-Based Authentication for Cloud Computing

    Science.gov (United States)

    Li, Hongwei; Dai, Yuanshun; Tian, Ling; Yang, Haomiao

    Cloud computing is a recently developed technology for complex systems with massive-scale service sharing among numerous users. Therefore, authentication of both users and services is a significant issue for the trust and security of cloud computing. The SSL Authentication Protocol (SAP), once applied in cloud computing, becomes so complicated that users face a heavy load in both computation and communication. This paper, based on the identity-based hierarchical model for cloud computing (IBHMCC) and its corresponding encryption and signature schemes, presents a new identity-based authentication protocol for cloud computing and services. Simulation testing shows that the authentication protocol is more lightweight and efficient than SAP, especially on the user side. This merit, together with the model's great scalability, makes it well suited to the massive-scale cloud.

  13. Computed Tomographic Measurement of Splenic Size in

    International Nuclear Information System (INIS)

    Sung, Nak Kwan; Woo, Seong Ku; Ko, Young Tae; Kim, Soon Young

    2010-01-01

    The authors analyzed 72 cases of abdominal computed tomography of Korean adults who had no medical reason to believe the spleen was abnormal. The following criteria were measured with multiple transverse scans of the entire length of the spleen (height, breadth, thickness), together with its relationship to a fixed midline structure, the spine (the shortest distance from the midline to the medial edge of the spleen, and the longest distance from the anterior margin of the vertebral body to the anterior tip of the spleen). The results were as follows: 1. The average size in adult males was 8.0±1.5cm in height, 8.6±1.2cm in breadth and 3.4±0.6cm in thickness; in adult females, 7.8±1.1cm, 8.4±1.0cm and 3.4±0.6cm, respectively; total average, 7.9±1.3cm, 8.5±1.1cm and 3.4±0.6cm, respectively. No remarkable difference was noted between the sexes or age groups. 2. The shortest distance from the midline to the medial edge of the spleen was 4.1±1.1cm in males and 3.6±1.0cm in females, with a total average of 3.9±1.1cm. There was a remarkable difference between the sexes (P<0.005) but not between age groups. 3. The longest distance from the anterior margin of the vertebral body to the anterior tip of the spleen was 2.3±1.7cm in males and 2.0±1.4cm in females, with a total average of 2.2±1.6cm. No remarkable difference was seen between the sexes or age groups.

  14. Measurement-only topological quantum computation via anyonic interferometry

    International Nuclear Information System (INIS)

    Bonderson, Parsa; Freedman, Michael; Nayak, Chetan

    2009-01-01

    We describe measurement-only topological quantum computation using both projective and interferometrical measurement of topological charge. We demonstrate how anyonic teleportation can be achieved using 'forced measurement' protocols for both types of measurement. Using this, it is shown how topological charge measurements can be used to generate the braiding transformations used in topological quantum computation, and hence that the physical transportation of computational anyons is unnecessary. We give a detailed discussion of the anyonics for implementation of topological quantum computation (particularly, using the measurement-only approach) in fractional quantum Hall systems

  15. Music Learning Based on Computer Software

    OpenAIRE

    Baihui Yan; Qiao Zhou

    2017-01-01

    In order to better develop and improve students' music learning, the authors propose a method of music learning based on computer software. Using computer music software to assist teaching is still a new field. We conducted an in-depth analysis of computer-enabled music learning and the current status of music learning in secondary schools, obtaining specific analytical data. The survey data show that students have many cognitive problems in the current music classroom, and yet teach...

  16. Game based learning for computer science education

    NARCIS (Netherlands)

    Schmitz, Birgit; Czauderna, André; Klemke, Roland; Specht, Marcus

    2011-01-01

    Schmitz, B., Czauderna, A., Klemke, R., & Specht, M. (2011). Game based learning for computer science education. In G. van der Veer, P. B. Sloep, & M. van Eekelen (Eds.), Computer Science Education Research Conference (CSERC '11) (pp. 81-86). Heerlen, The Netherlands: Open Universiteit.

  17. Computer-based feedback in formative assessment

    NARCIS (Netherlands)

    van der Kleij, Fabienne

    2013-01-01

    Formative assessment concerns any assessment that provides feedback that is intended to support learning and can be used by teachers and/or students. Computers could offer a solution to overcoming obstacles encountered in implementing formative assessment. For example, computer-based assessments

  18. Evidence for lower variability of coronary artery calcium mineral mass measurements by multi-detector computed tomography in a community-based cohort-Consequences for progression studies

    International Nuclear Information System (INIS)

    Hoffmann, Udo; Siebert, Uwe; Bull-Stewart, Arabella; Achenbach, Stephan; Ferencik, Maros; Moselewski, Fabian; Brady, Thomas J.; Massaro, Joseph M.; O'Donnell, Christopher J.

    2006-01-01

    Purpose: To compare the measurement variability of coronary artery calcium (CAC) measurements using mineral mass with that of a modified Agatston score (AS) or volume score (VS) with multi-detector CT (MDCT) scanning, and to estimate the potential impact of these methods on the design of CAC progression studies. Materials and methods: We studied 162 consecutive subjects (83 women, 79 men, mean age 51 ± 11 years) from a general Caucasian community-based cohort (Framingham Heart Study) with duplicate runs of prospective electrocardiographically-triggered MDCT scanning. Each scan was independently evaluated for the presence of CAC by four experienced observers who determined a 'modified' AS, VS and mineral mass. Results: Of the 162 subjects, CAC was detected in both scans in 69 (42%) and no CAC was detected in either scan in 72 (45%). Calcium scores were low in the 21/162 subjects (12%) for whom CAC was present in one but not the other scan (modified AS 0.96). However, the mean interscan variability was significantly different between mineral mass, modified AS, and VS (coefficient of variation 26 ± 19%, 41 ± 28% and 34 ± 25%, respectively; p < 0.04), with significantly smaller mean differences in pair-wise comparisons for mineral mass compared with modified AS (p < 0.002) or with VS (p < 0.03). The amount of CAC, but not heart rate, was an independent predictor of interscan variability (r = -0.638, -0.614 and -0.577 for AS, VS, and mineral mass, respectively; all p < 0.0001). The decreased interscan variability of mineral mass would allow a sample size reduction of 5.5% compared with modified AS for observational studies of CAC progression and for randomized clinical trials. Conclusion: There is significantly reduced interscan variability of CAC measurements with mineral mass compared with the modified AS or VS. However, the measurement variability of all quantification methods is predicted by the amount of CAC and is inversely correlated to the extent of partial

  19. Investigation of measuring strategies in computed tomography

    DEFF Research Database (Denmark)

    Müller, Pavel; Hiller, Jochen; Cantatore, Angela

    2011-01-01

    Computed tomography entered the industrial world in the 1980s as a technique for non-destructive testing and has nowadays become a revolutionary tool for dimensional metrology, suitable for actual/nominal comparison and verification of geometrical and dimensional tolerances. This paper evaluates...

  20. MTA Computer Based Evaluation System.

    Science.gov (United States)

    Brenner, Lisa P.; And Others

    The MTA PLATO-based evaluation system, which has been implemented by a consortium of schools of medical technology, is designed to be general-purpose, modular, data-driven, and interactive, and to accommodate other national and local item banks. The system provides a comprehensive interactive item-banking system in conjunction with online student…

  1. Computation and measurement of calandria tube sag in PHWR

    International Nuclear Information System (INIS)

    Kim, Tae Ryong; Sohn, Seok Man

    2003-01-01

    Calandria tubes and liquid injection shutdown system (LISS) tubes in a pressurized heavy water reactor (PHWR) are known to sag due to irradiation creep and growth during plant operation. When the sag of a calandria tube becomes large, the calandria tube may come into contact with the LISS tube crossing beneath it. The contact may subsequently damage the calandria tube, resulting in an unplanned outage of the plant. It is therefore necessary to check the gap between the two tubes with a proper measuring method throughout the plant life in order to periodically confirm that no contact occurs. An ultrasonic gap-measuring probe assembly which can be inserted into two viewing ports of the calandria was developed in Korea and utilized to measure the sags of both tubes in the PHWR. It was found that the centerlines of the calandria tubes and the liquid injection shutdown system tubes can be precisely detected by ultrasonic waves. The gaps between the two tubes were easily obtained from the relative distance of the measured centerline elevations of the tubes. Based on the irradiation creep equation and the measurement data, a computer program to calculate the sags was also developed. With this program, the sag at the end of plant life was predicted. (author)

  2. Computer-based multi-channel analyzer based on internet

    International Nuclear Information System (INIS)

    Zhou Xinzhi; Ning Jiaoxian

    2001-01-01

    Combining Internet technology with the computer-based multi-channel analyzer, a new kind of browser-based multi-channel analyzer system is presented. Its framework and principle, as well as its implementation, are discussed.

  3. Music Learning Based on Computer Software

    Directory of Open Access Journals (Sweden)

    Baihui Yan

    2017-12-01

    In order to better develop and improve students’ music learning, the authors proposed a method of music learning based on computer software. Using computer music software to assist teaching is still a new field. We therefore conducted an in-depth analysis of computer-enabled music learning and of the current state of music learning in secondary schools, obtaining specific analytical data. Survey data show that students have many cognitive problems in the current music classroom, and teachers have not yet found reasonable countermeasures to them. Against this background, the introduction of computer music software to music learning is a new trial that can not only cultivate students’ initiative in music learning, but also enhance their ability to learn music. Therefore, it is concluded that computer-software-based music learning is of great significance for improving current music learning modes and means.

  4. Knowledge-based computer security advisor

    International Nuclear Information System (INIS)

    Hunteman, W.J.; Squire, M.B.

    1991-01-01

    The rapid expansion of computer security information and technology has included little support to help the security officer identify the safeguards needed to comply with a policy and to secure a computing system. This paper reports that Los Alamos is developing a knowledge-based computer security system to provide expert knowledge to the security officer. This system includes a model for expressing the complex requirements in computer security policy statements. The model is part of an expert system that allows a security officer to describe a computer system and then determine compliance with the policy. The model contains a generic representation that contains network relationships among the policy concepts to support inferencing based on information represented in the generic policy description

  5. CAMAC based computer--computer communications via microprocessor data links

    International Nuclear Information System (INIS)

    Potter, J.M.; Machen, D.R.; Naivar, F.J.; Elkins, E.P.; Simmonds, D.D.

    1976-01-01

    Communications between the central control computer and remote, satellite data acquisition/control stations at The Clinton P. Anderson Meson Physics Facility (LAMPF) is presently accomplished through the use of CAMAC based Data Link Modules. With the advent of the microprocessor, a new philosophy for digital data communications has evolved. Data Link modules containing microprocessor controllers provide link management and communication network protocol through algorithms executed in the Data Link microprocessor

  6. Interval sampling methods and measurement error: a computer simulation.

    Science.gov (United States)

    Wirth, Oliver; Slaven, James; Taylor, Matthew A

    2014-01-01

    A simulation study was conducted to provide a more thorough account of measurement error associated with interval sampling methods. A computer program simulated the application of momentary time sampling, partial-interval recording, and whole-interval recording methods on target events randomly distributed across an observation period. The simulation yielded measures of error for multiple combinations of observation period, interval duration, event duration, and cumulative event duration. The simulations were conducted up to 100 times to yield measures of error variability. Although the present simulation confirmed some previously reported characteristics of interval sampling methods, it also revealed many new findings that pertain to each method's inherent strengths and weaknesses. The analysis and resulting error tables can help guide the selection of the most appropriate sampling method for observation-based behavioral assessments. © Society for the Experimental Analysis of Behavior.
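
    A minimal sketch of such a simulation, assuming uniformly placed events and the three scoring rules named above (momentary time sampling scores the instant at the interval boundary, partial-interval scores any occurrence, whole-interval requires full occupancy); the parameters are illustrative, and this is not the authors' program:

      import random

      def simulate_interval_sampling(obs_s=600, interval_s=10, n_events=20,
                                     event_s=5, seed=0):
          """Score one simulated session with three interval sampling methods."""
          rng = random.Random(seed)
          # Random, possibly overlapping event windows (start, end) in seconds.
          events = []
          for _ in range(n_events):
              start = rng.uniform(0, obs_s - event_s)
              events.append((start, start + event_s))

          def occurring(t):
              return any(s <= t < e for s, e in events)

          def occupancy(lo, hi, step=0.1):
              # Fraction of [lo, hi) occupied by the behavior (grid estimate).
              n = round((hi - lo) / step)
              return sum(occurring(lo + k * step) for k in range(n)) / n

          n_intervals = int(obs_s // interval_s)
          mts = pir = wir = 0
          for i in range(n_intervals):
              lo, hi = i * interval_s, (i + 1) * interval_s
              frac = occupancy(lo, hi)
              mts += occurring(hi)          # momentary: sample at interval end
              pir += frac > 0               # partial: any occurrence scores
              wir += frac >= 0.999          # whole: must fill the interval
          return {"true": occupancy(0, obs_s),
                  "MTS": mts / n_intervals,
                  "PIR": pir / n_intervals,
                  "WIR": wir / n_intervals}

      print(simulate_interval_sampling())

    Consistent with the error patterns such simulations characterize, partial-interval recording tends to overestimate prevalence and whole-interval recording to underestimate it.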

  7. Benchmarking gate-based quantum computers

    Science.gov (United States)

    Michielsen, Kristel; Nocon, Madita; Willsch, Dennis; Jin, Fengping; Lippert, Thomas; De Raedt, Hans

    2017-11-01

    With the advent of public access to small gate-based quantum processors, it becomes necessary to develop a benchmarking methodology such that independent researchers can validate the operation of these processors. We explore the usefulness of a number of simple quantum circuits as benchmarks for gate-based quantum computing devices and show that circuits performing identity operations are very simple, scalable and sensitive to gate errors and are therefore very well suited for this task. We illustrate the procedure by presenting benchmark results for the IBM Quantum Experience, a cloud-based platform for gate-based quantum computing.
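
    The following minimal sketch (plain NumPy, not the paper's benchmark suite) illustrates the idea: paired Hadamards compose to the identity, so the probability of reading back the initial state from an ever-deeper identity circuit directly exposes accumulated gate error. The coherent Z-over-rotation error model and all constants are assumptions:

      import numpy as np

      H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate

      def noisy(gate, eps, rng):
          # Model gate error as a small random Z over-rotation (an assumption).
          theta = rng.normal(0.0, eps)
          err = np.array([[np.exp(-1j * theta / 2), 0],
                          [0, np.exp(1j * theta / 2)]])
          return err @ gate

      def identity_benchmark(depth=100, eps=0.02, shots=1000, seed=1):
          """Run an H,H,H,H,... identity circuit on |0> and report P(measure 0)."""
          rng = np.random.default_rng(seed)
          successes = 0
          for _ in range(shots):
              state = np.array([1.0 + 0j, 0.0 + 0j])
              for _ in range(depth // 2):
                  state = noisy(H, eps, rng) @ state
                  state = noisy(H, eps, rng) @ state   # H @ H = I when noiseless
              successes += rng.random() < abs(state[0]) ** 2
          return successes / shots

      for depth in (2, 20, 100, 400):
          print(depth, identity_benchmark(depth=depth))

    A perfect device would print 1.0 at every depth; the rate at which the success probability decays with depth is the benchmark signal.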

  8. Computer-based diagnostic decisionmaking.

    Science.gov (United States)

    Miller, R A

    1987-12-01

    The three decisionmaking aids described by the authors attack the generic problem of "see no evil, hear no evil, speak no evil"--improving the detection, diagnosis, and therapy of psychiatric disorders in the primary care setting. The three systems represent interventions at different steps in the process of providing appropriate care to psychiatric patients. The DSPW system of Robins and Marcus offers the potential of increasing the recognition of psychiatric disease in the physician's office. Politser's IDS program is representative of the sort of sophisticated microcomputer-based decisionmaking support tools that will become available to physicians in the not-too-distant future. Erdman's study of the impact of explanation capabilities on the acceptability of therapy recommending systems points out the need for careful scientific evaluations of features added to diagnostic and therapeutic systems.

  9. Measuring the Impact of App Inventor for Android and Studio-Based Learning in an Introductory Computer Science Course for Non-Majors

    Science.gov (United States)

    Ahmad, Khuloud Nasser

    2012-01-01

    A reexamination of the traditional instruction of introductory computer science (CS) courses is becoming a necessity. Introductory CS courses tend to have high attrition rates and low success rates. In many universities, the CS department suffered from low enrollment for several years compared to other majors. Multiple studies have linked these…

  10. Reheating breakfast: Age and multitasking on a computer-based and a non-computer-based task

    OpenAIRE

    Feinkohl, I.; Cress, U.; Kimmerle, J.

    2016-01-01

    Computer-based assessments are popular means to measure individual differences, including age differences, in cognitive ability, but are rarely tested for the extent to which they correspond to more realistic behavior. In the present study, we explored the extent to which performance on an existing computer-based task of multitasking ('cooking breakfast') may be generalizable by comparing it with a newly developed version of the same task that required interaction with physical objects. Twent...

  11. Efficient Computation of Popular Phylogenetic Tree Measures

    DEFF Research Database (Denmark)

    Tsirogiannis, Constantinos; Sandel, Brody Steven; Cheliotis, Dimitris

    2012-01-01

    Given a phylogenetic tree $\mathcal{T}$ of n nodes, and a sample R of its tips (leaf nodes), a very common problem in ecological and evolutionary research is to evaluate a distance measure for the elements in R. Two of the most common measures of this kind are the Mean Pairwise Distance...

  12. Semantic computing and language knowledge bases

    Science.gov (United States)

    Wang, Lei; Wang, Houfeng; Yu, Shiwen

    2017-09-01

    With the proposal of the next-generation Web, the Semantic Web, semantic computing has been drawing more and more attention in academia and industry. A lot of research has been conducted on the theory and methodology of the subject, and potential applications have also been investigated and proposed in many fields. The progress of semantic computing made so far cannot be detached from its supporting pivot, language resources, for instance, language knowledge bases. This paper proposes three perspectives on semantic computing from a macro view and describes the current state of affairs regarding the construction of language knowledge bases and the related research and applications that have been carried out on the basis of these resources, via a case study in the Institute of Computational Linguistics at Peking University.

  13. Computer vision based room interior design

    Science.gov (United States)

    Ahmad, Nasir; Hussain, Saddam; Ahmad, Kashif; Conci, Nicola

    2015-12-01

    This paper introduces a new application of computer vision. To the best of the authors' knowledge, it is the first attempt to incorporate computer vision techniques into room interior design. Computer vision based interior design is achieved in two steps: object identification and color assignment. An image segmentation approach is used for the identification of the objects in the room, and different color schemes are used for color assignment to these objects. The proposed approach is applied to simple as well as complex images from online sources. The proposed approach not only accelerates the process of interior design but also makes it very efficient by giving multiple alternatives.

  14. Agent-Based Computing: Promise and Perils

    OpenAIRE

    Jennings, N. R.

    1999-01-01

    Agent-based computing represents an exciting new synthesis both for Artificial Intelligence (AI) and, more generally, Computer Science. It has the potential to significantly improve the theory and practice of modelling, designing and implementing complex systems. Yet, to date, there has been little systematic analysis of what makes an agent such an appealing and powerful conceptual model. Moreover, even less effort has been devoted to exploring the inherent disadvantages that stem from adoptin...

  15. Using a micro computer based test bank

    International Nuclear Information System (INIS)

    Hamel, R.T.

    1987-01-01

    Utilizing a micro computer based test bank offers a training department many advantages and can have a positive impact upon training procedures and examination standards. Prior to data entry, Training Department management must pre-review the examination questions and answers to ensure compliance with examination standards and to verify the validity of all questions. Management must adhere to the TSD format since all questions require an enabling objective numbering scheme. Each question is entered under the enabling objective upon which it is based. Then the question is selected via the enabling objective. This eliminates any instructor bias because a random number generator chooses the test question. However, the instructor may load specific questions to create an emphasis theme for any test. The examination, answer and cover sheets are produced and printed within minutes. The test bank eliminates the large amount of time that is normally required for an instructor to formulate an examination. The need for clerical support is reduced by the elimination of typing examinations and also by the software's ability to maintain and generate student/course lists, attendance sheets, and grades. Software security measures limit access to the test bank, and the impromptu method used to generate and print an examination enhances its security.
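
    A minimal sketch of the selection logic described above; the bank contents, enabling-objective numbering, and helper names are hypothetical:

      import random

      # Hypothetical test bank keyed by enabling objective number.
      bank = {
          "EO-1.2": ["Q101", "Q102", "Q103"],
          "EO-1.3": ["Q201", "Q202"],
          "EO-2.1": ["Q301", "Q302", "Q303", "Q304"],
      }

      def build_exam(objectives, per_objective=1, forced=(), seed=None):
          """Draw questions at random under each enabling objective, which
          removes instructor bias; `forced` lets an instructor load specific
          questions to create an emphasis theme for a test."""
          rng = random.Random(seed)
          exam = list(forced)
          for eo in objectives:
              pool = [q for q in bank[eo] if q not in exam]
              exam.extend(rng.sample(pool, per_objective))
          return exam

      print(build_exam(["EO-1.2", "EO-2.1"], forced=["Q202"], seed=42))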

  16. Quantitative computed tomography in measurement of vertebral trabecular bone mass

    International Nuclear Information System (INIS)

    Nilsson, M.; Johnell, O.; Jonsson, K.; Redlund-Johnell, I.

    1988-01-01

    Measurement of bone mineral concentration (BMC) can be done by several modalities. Quantitative computed tomography (QCT) can be used for measurements at different sites and with different types of bone (trabecular-cortical). This study presents a modified method reducing the influence of fat. Determination of BMC was made from measurements with single-energy computed tomography (CT) of the mean Hounsfield number in the trabecular part of the L1 vertebra. The method takes into account the age-dependent composition of the trabecular part of the vertebra. As the amount of intravertebral fat increases with age, the effective atomic number for these parts decreases. This results in a non-linear calibration curve for single-energy CT. Comparison of BMC values using the non-linear calibration curve or the traditional linear calibration with those obtained with a pixel-by-pixel based electron density calculation method (theoretically better) showed results clearly in favor of the non-linear method. The material consisted of 327 patients aged 6 to 91 years, of whom 197 were considered normal. The normal data show a sharp decrease in trabecular bone after the age of 50 in women. In men a slower decrease was found. The vertebrae were larger in men than in women. (orig.)

  17. Measuring Preferred Services from Cloud Computing Providers ...

    African Journals Online (AJOL)


    2018, 10(5S), 207-212. Published online: 22 March 2018. ... introduces a general service selection and ranking model with QoS ... to facilitate adding, removing, and prioritizing services in selection.

  18. Effective Fault-Tolerant Quantum Computation with Slow Measurements

    International Nuclear Information System (INIS)

    DiVincenzo, David P.; Aliferis, Panos

    2007-01-01

    How important is fast measurement for fault-tolerant quantum computation? Using a combination of existing and new ideas, we argue that measurement times as long as even 1000 gate times or more have a very minimal effect on the quantum accuracy threshold. This shows that slow measurement, which appears to be unavoidable in many implementations of quantum computing, poses no essential obstacle to scalability

  19. Computer simulation of pitting potential measurements

    International Nuclear Information System (INIS)

    Laycock, N.J.; Noh, J.S.; White, S.P.; Krouse, D.P.

    2005-01-01

    A deterministic model for the growth of single pits in stainless steel has been combined with a purely stochastic model of pit nucleation. Monte-Carlo simulations have been used to compare the predictions of this model with potentiodynamic experimental measurements of the pitting potential. The quantitative agreement between model and experiment is reasonable for both 304 and 316 stainless steel, and the effects of varying surface roughness, solution chloride concentration and potential sweep rate have been considered
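
    A minimal sketch of the model's structure, combining stochastic nucleation with deterministic growth during a potentiodynamic sweep; every rate law and constant below is an illustrative assumption, not the published model:

      import math
      import random

      def simulated_pitting_potential(sweep_rate=0.5e-3,  # V/s
                                      lam0=1e-4,          # nucleation prefactor, 1/s
                                      beta=25.0,          # potential sensitivity, 1/V
                                      i_crit=1e-5,        # detection current, A
                                      dt=1.0, seed=None):
          """One Monte-Carlo sweep: pits nucleate stochastically at a
          potential-dependent rate, grow deterministically, and the pitting
          potential is recorded when the total current exceeds i_crit."""
          rng = random.Random(seed)
          t, E, pit_births = 0.0, 0.0, []
          while E < 1.0:
              rate = lam0 * math.exp(beta * E)
              if rng.random() < 1.0 - math.exp(-rate * dt):
                  pit_births.append(t)
              # Deterministic growth: each pit's current rises with its age.
              current = sum(2e-6 * math.sqrt(t - tb) for tb in pit_births)
              if current > i_crit:
                  return E
              t += dt
              E = sweep_rate * t
          return None  # no stable pitting below 1 V in this sweep

      samples = [simulated_pitting_potential(seed=k) for k in range(200)]
      hits = [s for s in samples if s is not None]
      print(len(hits), sum(hits) / len(hits))

    Repeating the sweep many times yields a distribution of pitting potentials, which is the kind of output the paper compares against potentiodynamic measurements.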

  20. Solid-State Quantum Computer Based on Scanning Tunneling Microscopy

    Energy Technology Data Exchange (ETDEWEB)

    Berman, G. P.; Brown, G. W.; Hawley, M. E.; Tsifrinovich, V. I.

    2001-08-27

    We propose a solid-state nuclear-spin quantum computer based on application of scanning tunneling microscopy (STM) and well-developed silicon technology. It requires the measurement of tunneling-current modulation caused by the Larmor precession of a single electron spin. Our envisioned STM quantum computer would operate at high magnetic field (∼10 T) and at low temperature ∼1 K.

  1. Solid-State Quantum Computer Based on Scanning Tunneling Microscopy

    International Nuclear Information System (INIS)

    Berman, G. P.; Brown, G. W.; Hawley, M. E.; Tsifrinovich, V. I.

    2001-01-01

    We propose a solid-state nuclear-spin quantum computer based on application of scanning tunneling microscopy (STM) and well-developed silicon technology. It requires the measurement of tunneling-current modulation caused by the Larmor precession of a single electron spin. Our envisioned STM quantum computer would operate at the high magnetic field (∼10 T) and at low temperature ∼1 K

  2. Cluster-based localization and tracking in ubiquitous computing systems

    CERN Document Server

    Martínez-de Dios, José Ramiro; Torres-González, Arturo; Ollero, Anibal

    2017-01-01

    Localization and tracking are key functionalities in ubiquitous computing systems and techniques. In recent years a very high variety of approaches, sensors and techniques for indoor and GPS-denied environments have been developed. This book briefly summarizes the current state of the art in localization and tracking in ubiquitous computing systems focusing on cluster-based schemes. Additionally, existing techniques for measurement integration, node inclusion/exclusion and cluster head selection are also described in this book.

  3. Computer Based Road Accident Reconstruction Experiences

    Directory of Open Access Journals (Sweden)

    Milan Batista

    2005-03-01

    Since road accident analyses and reconstructions are increasingly based on specific computer software for simulation of vehicle driving dynamics and collision dynamics, and for simulation of a set of trial runs from which the model that best describes a real event can be selected, the paper presents an overview of some computer software and methods available to accident reconstruction experts. Besides being time-saving, when properly used such computer software can provide more authentic and more trustworthy accident reconstruction. Therefore, practical experiences while using computer software tools for road accident reconstruction obtained in the Transport Safety Laboratory at the Faculty for Maritime Studies and Transport of the University of Ljubljana are presented and discussed. This paper also addresses software technology for extracting maximum information from the accident photo-documentation to support accident reconstruction based on the simulation software, as well as the field work of reconstruction experts or police on the road accident scene defined by this technology.

  4. Measurement of tibial torsion by computer tomography

    Energy Technology Data Exchange (ETDEWEB)

    Jend, H.H.; Heller, M.; Dallek, M.; Schoettle, H. (Hamburg Univ. (Germany, F.R.))

    1981-01-01

    A CT procedure for objective measurements of tibial torsion independent of axial rotation in the nearby joints is described. Transverse sections in defined planes of the tibia permit easy calculation of normal and abnormal congenital or posttraumatic angles of torsion. In 69 limbs normal tibial torsion was 40° ± 9°. In a series of 42 limbs with complicated healing of a fracture of both bones of the leg it is shown that tibial maltorsion is a deformity which in most cases leads to arthrosis of the ankle joint.

  5. Measurement of tibial torsion by computer tomography

    International Nuclear Information System (INIS)

    Jend, H.-H.; Heller, M.; Dallek, M.; Schoettle, H.

    1981-01-01

    A CT procedure for objective measurements of tibial torsion independent of axial rotation in the nearby joints is described. Transverse sections in defined planes of the tibia permit easy calculation of normal and abnormal congenital or posttraumatic angles of torsion. In 69 limbs normal tibial torsion was 40° ± 9°. In a series of 42 limbs with complicated healing of a fracture of both bones of the leg it is shown that tibial maltorsion is a deformity which in most cases leads to arthrosis of the ankle joint. (Auth.)

  6. From computing with numbers to computing with words. From manipulation of measurements to manipulation of perceptions.

    Science.gov (United States)

    Zadeh, L A

    2001-04-01

    Interest in issues relating to consciousness has grown markedly during the last several years. And yet, nobody can claim that consciousness is a well-understood concept that lends itself to precise analysis. It may be argued that, as a concept, consciousness is much too complex to fit into the conceptual structure of existing theories based on Aristotelian logic and probability theory. An approach suggested in this paper links consciousness to perceptions and perceptions to their descriptors in a natural language. In this way, those aspects of consciousness which relate to reasoning and concept formation are linked to what is referred to as the methodology of computing with words (CW). Computing, in its usual sense, is centered on manipulation of numbers and symbols. In contrast, computing with words, or CW for short, is a methodology in which the objects of computation are words and propositions drawn from a natural language (e.g., small, large, far, heavy, not very likely, the price of gas is low and declining, Berkeley is near San Francisco, it is very unlikely that there will be a significant increase in the price of oil in the near future, etc.). Computing with words is inspired by the remarkable human capability to perform a wide variety of physical and mental tasks without any measurements and any computations. Familiar examples of such tasks are parking a car, driving in heavy traffic, playing golf, riding a bicycle, understanding speech, and summarizing a story. Underlying this remarkable capability is the brain's crucial ability to manipulate perceptions--perceptions of distance, size, weight, color, speed, time, direction, force, number, truth, likelihood, and other characteristics of physical and mental objects. Manipulation of perceptions plays a key role in human recognition, decision and execution processes. As a methodology, computing with words provides a foundation for a computational theory of perceptions: a theory which may have an important

  7. Is a computer based measurement method superior to a recommended manual method by the ROHO® Group to assess pressure in the sitting position?

    DEFF Research Database (Denmark)

    Andreasen, Jane; Olesen, Christian Gammelgaard; Rasmussen, John

    2013-01-01

    ... at the Department of Occupational Therapy and Physiotherapy, Aalborg Hospital, Aarhus University Hospital, Denmark. Participants: 20 healthy and able-minded participants aged between 18 and 65. Procedure: The outcome measures were obtained using a pressure imaging system that could register pressure distribution in the sitting area. The system was an XSENSOR Pressure Mapping System™. The cushion used was a Roho Quatro select® high profile. All subjects were tested twice with an interval of 24 hours by Testers 1 and 2, who were experienced occupational therapists. Main outcome measures: Risk factor defined as a scalar norm, R...

  8. Spectrophotometer-Based Color Measurements

    Science.gov (United States)

    2017-10-24

    equipment. There are several American Society for Testing and Materials (ASTM) chapters covering the use of spectrometers for color measurements (refs. 3... Perkin Elmer software and procedures described in ASTM chapter E308 (ref. 3). All spectral data was stored on the computer. A summary of the color... similarity, or lack thereof, between two colors (ref. 5). In this report, the Euclidean distance metric, ΔE, is used and recommended in ASTM D2244.

  9. On the complexity of computing two nonlinearity measures

    DEFF Research Database (Denmark)

    Find, Magnus Gausdal

    2014-01-01

    We study the computational complexity of two Boolean nonlinearity measures: the nonlinearity and the multiplicative complexity. We show that if one-way functions exist, no algorithm can compute the multiplicative complexity in time $2^{O(n)}$ given the truth table of length $2^n$; in fact, under the same ...

  10. Computational Psychometrics for the Measurement of Collaborative Problem Solving Skills

    Science.gov (United States)

    Polyak, Stephen T.; von Davier, Alina A.; Peterschmidt, Kurt

    2017-01-01

    This paper describes a psychometrically-based approach to the measurement of collaborative problem solving skills, by mining and classifying behavioral data both in real-time and in post-game analyses. The data were collected from a sample of middle school children who interacted with a game-like, online simulation of collaborative problem solving tasks. In this simulation, a user is required to collaborate with a virtual agent to solve a series of tasks within a first-person maze environment. The tasks were developed following the psychometric principles of Evidence Centered Design (ECD) and are aligned with the Holistic Framework developed by ACT. The analyses presented in this paper are an application of an emerging discipline called computational psychometrics which is growing out of traditional psychometrics and incorporates techniques from educational data mining, machine learning and other computer/cognitive science fields. In the real-time analysis, our aim was to start with limited knowledge of skill mastery, and then demonstrate a form of continuous Bayesian evidence tracing that updates sub-skill level probabilities as new conversation flow event evidence is presented. This is performed using Bayes' rule and conversation item conditional probability tables. The items are polytomous and each response option has been tagged with a skill at a performance level. In our post-game analysis, our goal was to discover unique gameplay profiles by performing a cluster analysis of user's sub-skill performance scores based on their patterns of selected dialog responses. PMID:29238314
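
    A minimal sketch of the Bayes-rule update against a conversation-item conditional probability table; the response options and probabilities below are invented for illustration and are not the study's calibrated values:

      # CPT: response option -> (P(option | skill mastered), P(option | not mastered))
      cpt = {
          "ask_partner_for_goal": (0.60, 0.20),
          "act_without_sharing":  (0.15, 0.50),
          "share_own_status":     (0.25, 0.30),
      }

      def bayes_update(p_mastery, response):
          """One Bayes-rule update of P(mastery) from an observed response."""
          p_r_m, p_r_not_m = cpt[response]
          num = p_r_m * p_mastery
          return num / (num + p_r_not_m * (1.0 - p_mastery))

      p = 0.5  # uninformative prior on the sub-skill
      for r in ["share_own_status", "ask_partner_for_goal", "ask_partner_for_goal"]:
          p = bayes_update(p, r)
          print(f"after {r}: P(mastery) = {p:.3f}")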

  11. Computational Psychometrics for the Measurement of Collaborative Problem Solving Skills

    Directory of Open Access Journals (Sweden)

    Stephen T. Polyak

    2017-11-01

    This paper describes a psychometrically-based approach to the measurement of collaborative problem solving skills, by mining and classifying behavioral data both in real-time and in post-game analyses. The data were collected from a sample of middle school children who interacted with a game-like, online simulation of collaborative problem solving tasks. In this simulation, a user is required to collaborate with a virtual agent to solve a series of tasks within a first-person maze environment. The tasks were developed following the psychometric principles of Evidence Centered Design (ECD) and are aligned with the Holistic Framework developed by ACT. The analyses presented in this paper are an application of an emerging discipline called computational psychometrics which is growing out of traditional psychometrics and incorporates techniques from educational data mining, machine learning and other computer/cognitive science fields. In the real-time analysis, our aim was to start with limited knowledge of skill mastery, and then demonstrate a form of continuous Bayesian evidence tracing that updates sub-skill level probabilities as new conversation flow event evidence is presented. This is performed using Bayes' rule and conversation item conditional probability tables. The items are polytomous and each response option has been tagged with a skill at a performance level. In our post-game analysis, our goal was to discover unique gameplay profiles by performing a cluster analysis of user's sub-skill performance scores based on their patterns of selected dialog responses.

  12. Computational Psychometrics for the Measurement of Collaborative Problem Solving Skills.

    Science.gov (United States)

    Polyak, Stephen T; von Davier, Alina A; Peterschmidt, Kurt

    2017-01-01

    This paper describes a psychometrically-based approach to the measurement of collaborative problem solving skills, by mining and classifying behavioral data both in real-time and in post-game analyses. The data were collected from a sample of middle school children who interacted with a game-like, online simulation of collaborative problem solving tasks. In this simulation, a user is required to collaborate with a virtual agent to solve a series of tasks within a first-person maze environment. The tasks were developed following the psychometric principles of Evidence Centered Design (ECD) and are aligned with the Holistic Framework developed by ACT. The analyses presented in this paper are an application of an emerging discipline called computational psychometrics which is growing out of traditional psychometrics and incorporates techniques from educational data mining, machine learning and other computer/cognitive science fields. In the real-time analysis, our aim was to start with limited knowledge of skill mastery, and then demonstrate a form of continuous Bayesian evidence tracing that updates sub-skill level probabilities as new conversation flow event evidence is presented. This is performed using Bayes' rule and conversation item conditional probability tables. The items are polytomous and each response option has been tagged with a skill at a performance level. In our post-game analysis, our goal was to discover unique gameplay profiles by performing a cluster analysis of user's sub-skill performance scores based on their patterns of selected dialog responses.

  13. Computer-Game-Based Tutoring of Mathematics

    Science.gov (United States)

    Ke, Fengfeng

    2013-01-01

    This in-situ, descriptive case study examined the potential of implementing computer mathematics games as an anchor for tutoring of mathematics. Data were collected from middle school students at a rural pueblo school and an urban Hispanic-serving school, through in-field observation, content analysis of game-based tutoring-learning interactions,…

  14. A CAMAC-based laboratory computer system

    International Nuclear Information System (INIS)

    Westphal, G.P.

    1975-01-01

    A CAMAC-based laboratory computer network is described. By sharing a common mass memory, it offers distinct advantages over slow and core-consuming single-processor installations. A fast compiler-BASIC, with extensions for CAMAC and real-time operation, provides a convenient means for interactive experiment control.

  15. Computer-Based Testing: Test Site Security.

    Science.gov (United States)

    Rosen, Gerald A.

    Computer-based testing places great burdens on all involved parties to ensure test security. A task analysis of test site security might identify the areas of protecting the test, protecting the data, and protecting the environment as essential issues in test security. Protecting the test involves transmission of the examinations, identifying the…

  16. Computer Based Training Authors' and Designers' training

    Directory of Open Access Journals (Sweden)

    Frédéric GODET

    2016-03-01

    This communication, through a couple of studies carried out over the past 10 years, tries to show how important the training of authors is in Computer Based Training (CBT). We submit here an approach to preparing designers to master interactive multimedia modules in this domain. Which institutions are really dedicating their efforts to training authors and designers in this area of CBT? Television and broadcast organisations have offered a first support for distance learning since the 1960s. New media and new information and communication technologies (NICT) allowed several public and private organisations to start distance learning projects. As usual, some of them met their training objectives and others failed. Did they really fail? Currently, nobody has the right answer. Today, we do not have efficient enough tools to evaluate trainees' acquisition in the short term. Training evaluation needs more than 10 to 20 years of elapsed time to provide reliable measures. Nevertheless, given the high investments already made in this area, we cannot wait for the final results of the pedagogical evaluation. Many analyses have revealed relevant issues that can be used as directions for training CBT authors and designers. A caveat: our studies and the derived conclusions are mainly based on projects carried out in the field. We additionally bring our several years of experience in training film authors in the design of interactive multimedia products. Some of our examples are extracted from vocational training projects in which we were involved in all development phases, from the analysis of needs to the evaluation of acquisition within the trainee's or employee's job. Obviously, we cannot offer an exhaustive approach in a domain where so many parameters frame the training of CBT interactive multimedia module authors and designers.

  17. Total variation-based neutron computed tomography

    Science.gov (United States)

    Barnard, Richard C.; Bilheux, Hassina; Toops, Todd; Nafziger, Eric; Finney, Charles; Splitter, Derek; Archibald, Rick

    2018-05-01

    We perform the neutron computed tomography reconstruction problem via an inverse problem formulation with a total variation penalty. In the case of highly under-resolved angular measurements, the total variation penalty suppresses high-frequency artifacts which appear in filtered back projections. In order to efficiently compute solutions for this problem, we implement a variation of the split Bregman algorithm; due to the error-forgetting nature of the algorithm, the computational cost of updating can be significantly reduced via very inexact approximate linear solvers. We present the effectiveness of the algorithm in the significantly low-angular sampling case using synthetic test problems as well as data obtained from a high flux neutron source. The algorithm removes artifacts and can even roughly capture small features when an extremely low number of angles are used.
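
    As a sketch of the split Bregman iteration the abstract refers to, the following solves the simpler 1-D total-variation denoising problem; a full tomographic reconstruction would replace the data term (u - f) with (R u - g) for a projection operator R, and all parameters here are illustrative:

      import numpy as np

      def shrink(x, gamma):
          # Soft-thresholding: the proximal map of the l1 norm.
          return np.sign(x) * np.maximum(np.abs(x) - gamma, 0.0)

      def tv_denoise_split_bregman(f, mu=20.0, lam=1.0, n_iter=100):
          """Solve min_u mu/2 ||u - f||^2 + |D u|_1 by split Bregman."""
          n = len(f)
          D = np.diff(np.eye(n), axis=0)       # forward-difference operator
          A = mu * np.eye(n) + lam * D.T @ D   # constant system matrix
          u, d, b = f.copy(), np.zeros(n - 1), np.zeros(n - 1)
          for _ in range(n_iter):
              u = np.linalg.solve(A, mu * f + lam * D.T @ (d - b))
              Du = D @ u
              d = shrink(Du + b, 1.0 / lam)    # closed-form l1 subproblem
              b = b + Du - d                   # Bregman ("error-forgetting") update
          return u

      # Piecewise-constant signal plus noise: TV recovers the sharp edges.
      rng = np.random.default_rng(0)
      truth = np.concatenate([np.zeros(40), np.ones(40), 0.3 * np.ones(40)])
      observed = truth + 0.1 * rng.standard_normal(truth.size)
      print(np.abs(tv_denoise_split_bregman(observed) - truth).mean())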

  18. Computer based training: Technology and trends

    International Nuclear Information System (INIS)

    O'Neal, A.F.

    1986-01-01

    Computer Based Training (CBT) offers great potential for revolutionizing the training environment. Tremendous advances in computer cost performance, instructional design science, and authoring systems have combined to put CBT within the reach of all. The ability of today's CBT systems to implement powerful training strategies, simulate complex processes and systems, and individualize and control the training process make it certain that CBT will now, at long last, live up to its potential. This paper reviews the major technologies and trends involved and offers some suggestions for getting started in CBT

  19. Demonstration of measurement-only blind quantum computing

    International Nuclear Information System (INIS)

    Greganti, Chiara; Roehsner, Marie-Christine; Barz, Stefanie; Walther, Philip; Morimae, Tomoyuki

    2016-01-01

    Blind quantum computing allows for secure cloud networks of quasi-classical clients and a fully fledged quantum server. Recently, a new protocol has been proposed, which requires a client to perform only measurements. We demonstrate a proof-of-principle implementation of this measurement-only blind quantum computing, exploiting a photonic setup to generate four-qubit cluster states for computation and verification. Feasible technological requirements for the client and the device-independent blindness make this scheme very applicable for future secure quantum networks. (paper)

  20. Demonstration of measurement-only blind quantum computing

    Science.gov (United States)

    Greganti, Chiara; Roehsner, Marie-Christine; Barz, Stefanie; Morimae, Tomoyuki; Walther, Philip

    2016-01-01

    Blind quantum computing allows for secure cloud networks of quasi-classical clients and a fully fledged quantum server. Recently, a new protocol has been proposed, which requires a client to perform only measurements. We demonstrate a proof-of-principle implementation of this measurement-only blind quantum computing, exploiting a photonic setup to generate four-qubit cluster states for computation and verification. Feasible technological requirements for the client and the device-independent blindness make this scheme very applicable for future secure quantum networks.

  1. Measuring the Computer-Related Self-Concept

    Science.gov (United States)

    Langheinrich, Jessica; Schönfelder, Mona; Bogner, Franz X.

    2016-01-01

    A positive self-concept supposedly affects a student's well-being as well as his or her perception of individual competence at school. As computer-based learning is becoming increasingly important in school, a positive computer-related self-concept (CSC) might help to enhance cognitive achievement. Consequently, we focused on establishing a short,…

  2. Computer-based and web-based radiation safety training

    Energy Technology Data Exchange (ETDEWEB)

    Owen, C., LLNL

    1998-03-01

    The traditional approach to delivering radiation safety training has been to provide a stand-up lecture of the topic, with the possible aid of video, and to repeat the same material periodically. New approaches to meeting training requirements are needed to address the advent of flexible work hours and telecommuting, and to better accommodate individuals learning at their own pace. Computer-based and web-based radiation safety training can provide this alternative. Computer-based and web-based training is an interactive form of learning that the student controls, resulting in enhanced and focused learning at a time most often chosen by the student.

  3. Identifying a Computer Forensics Expert: A Study to Measure the Characteristics of Forensic Computer Examiners

    Directory of Open Access Journals (Sweden)

    Gregory H. Carlton

    2010-03-01

    The usage of digital evidence from electronic devices has been rapidly expanding within litigation, and along with this increased usage, the reliance upon forensic computer examiners to acquire, analyze, and report upon this evidence is also rapidly growing. This growing demand for forensic computer examiners raises questions concerning the selection of individuals qualified to perform this work. While courts have mechanisms for qualifying witnesses that provide testimony based on scientific data, such as digital data, the qualifying criteria cover a wide variety of characteristics including education, experience, training, professional certifications, or other special skills. In this study, we compare task performance responses from forensic computer examiners with an expert review panel and measure the relationship between the characteristics of the examiners and the quality of their responses. The results of this analysis provide insight into identifying forensic computer examiners that provide high-quality responses.

  4. Reliability and validity of a dual-probe personal computer-based muscle viewer for measuring the pennation angle of the medial gastrocnemius muscle in patients who have had a stroke.

    Science.gov (United States)

    Cho, Ji-Eun; Cho, Ki Hun; Yoo, Jun Sang; Lee, Su Jin; Lee, Wan-Hee

    2018-01-01

    Background A dual-probe personal computer-based muscle viewer (DPC-BMW) is advantageous in that it is relatively lightweight and easy to apply. Objective To investigate the reliability and validity of the DPC-BMW in comparison with those of a portable ultrasonography (P-US) device for measuring the pennation angle of the medial gastrocnemius (MG) muscle at rest and during contraction. Methods Twenty-four patients who had a stroke (18 men and 6 women) participated in this study. Using the DPC-BMW and P-US device, the pennation angle of the MG muscle on the affected side was randomly measured. Two examiners randomly obtained the images of all the participants in two separate test sessions, 7 days apart. Intraclass correlation coefficient (ICC), confidence interval, standard error of measurement, Bland-Altman plot, and Pearson correlation coefficient were used to estimate their reliability and validity. Results The ICC for the intrarater reliability of the MG muscle pennation angle measured using the DPC-BMW was > 0.916, indicating excellent reliability, and that for the interrater reliability ranged from 0.964 to 0.994. The P-US device also exhibited good reliability. A high correlation was found between the measurements of MG muscle pennation angle obtained using the DPC-BMW and that obtained using the P-US device (p < 0.01). Conclusion The DPC-BMW can provide clear images for accurate measurements, including measurements using dual probes. It has the advantage of rehabilitative US imaging for individuals who have had a stroke. More research studies are needed to evaluate the usefulness of the DPC-BMW in rehabilitation.
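
    For reference, a minimal sketch of one standard ICC variant, ICC(2,1) (two-way random effects, absolute agreement, single measures, per Shrout and Fleiss); the abstract does not state which ICC model was used, so this choice is an assumption:

      import numpy as np

      def icc_2_1(x):
          """ICC(2,1) for an (n_subjects, k_raters) score matrix."""
          n, k = x.shape
          grand = x.mean()
          ssr = k * ((x.mean(axis=1) - grand) ** 2).sum()  # between subjects
          ssc = n * ((x.mean(axis=0) - grand) ** 2).sum()  # between raters
          sse = ((x - grand) ** 2).sum() - ssr - ssc       # residual
          msr = ssr / (n - 1)
          msc = ssc / (k - 1)
          mse = sse / ((n - 1) * (k - 1))
          return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

      # Two raters' pennation angles (degrees) for five subjects -- toy data.
      angles = np.array([[21.0, 20.5], [18.2, 18.9], [25.1, 24.6],
                         [19.8, 20.2], [22.4, 22.0]])
      print(f"ICC(2,1) = {icc_2_1(angles):.3f}")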

  5. Measuring Emotion Regulation with Single Dry Electrode Brain Computer Interface

    NARCIS (Netherlands)

    van der Wal, C.N.; Irrmischer, M.; Guo, Y.; Friston, K.; Faisal, A.; Hill, S.; Peng, H.

    2015-01-01

    Wireless brain computer interfaces (BCIs) are promising for new intelligent applications in which emotions are detected by measuring brain activity. Applications such as serious games and video game therapy measure and use the user's emotional state in order to determine the intensity

  6. Computer-Based Wireless Advertising Communication System

    Directory of Open Access Journals (Sweden)

    Anwar Al-Mofleh

    2009-10-01

    In this paper we developed a computer based wireless advertising communication system (CBWACS) that enables the user to advertise whatever he wants from his own office to the screen in front of the customer via a wireless communication system. This system consists of two PIC microcontrollers, a transmitter, a receiver, an LCD, a serial cable and an antenna. The main advantages of the system are its wireless structure and its reduced susceptibility to noise and other interference, because it uses digital communication techniques.

  7. Computer-based theory of strategies

    Energy Technology Data Exchange (ETDEWEB)

    Findler, N V

    1983-01-01

    Some of the objectives and working tools of a new area of study, tentatively called theory of strategies, are described. It is based on the methodology of artificial intelligence, decision theory, operations research and digital gaming. The latter refers to computing activity that incorporates model building, simulation and learning programs in conflict situations. Three long-term projects which aim at automatically analyzing and synthesizing strategies are discussed. 27 references.

  8. Silicon CMOS architecture for a spin-based quantum computer.

    Science.gov (United States)

    Veldhorst, M; Eenink, H G J; Yang, C H; Dzurak, A S

    2017-12-15

    Recent advances in quantum error correction codes for fault-tolerant quantum computing and physical realizations of high-fidelity qubits in multiple platforms give promise for the construction of a quantum computer based on millions of interacting qubits. However, the classical-quantum interface remains a nascent field of exploration. Here, we propose an architecture for a silicon-based quantum computer processor based on complementary metal-oxide-semiconductor (CMOS) technology. We show how a transistor-based control circuit together with charge-storage electrodes can be used to operate a dense and scalable two-dimensional qubit system. The qubits are defined by the spin state of a single electron confined in quantum dots, coupled via exchange interactions, controlled using a microwave cavity, and measured via gate-based dispersive readout. We implement a spin qubit surface code, showing the prospects for universal quantum computation. We discuss the challenges and focus areas that need to be addressed, providing a path for large-scale quantum computing.

  9. On-Line Voltage Stability Assessment based on PMU Measurements

    DEFF Research Database (Denmark)

    Garcia-Valle, Rodrigo; P. Da Silva, Luiz C.; Nielsen, Arne Hejde

    2009-01-01

    This paper presents a method for on-line monitoring of the risk of voltage collapse based on synchronised phasor measurements. As there is no room for intensive computation and analysis in real time, the method is based on the combination of off-line computation and on-line monitoring, which are correlat...

  10. Confidential benchmarking based on multiparty computation

    DEFF Research Database (Denmark)

    Damgård, Ivan Bjerre; Damgård, Kasper Lyneborg; Nielsen, Kurt

    We report on the design and implementation of a system that uses multiparty computation to enable banks to benchmark their customers' confidential performance data against a large representative set of confidential performance data from a consultancy house. The system ensures that both the banks' and the consultancy house's data stay confidential; the banks as clients learn nothing but the computed benchmarking score. In the concrete business application, the developed prototype helps Danish banks to find the most efficient customers among a large and challenging group of agricultural customers with too much debt. We propose a model based on linear programming for doing the benchmarking and implement it using the SPDZ protocol by Damgård et al., which we modify using a new idea that allows clients to supply data and get output without having to participate in the preprocessing phase and without keeping...
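
    The abstract does not spell out the linear program; one standard benchmarking formulation it could resemble is input-oriented data envelopment analysis (DEA), sketched here in the clear (in the deployed system the corresponding LP would be evaluated under the SPDZ multiparty protocol, so no party sees the raw data):

      import numpy as np
      from scipy.optimize import linprog

      def dea_efficiency(X, Y, o):
          """Input-oriented CCR efficiency of unit o.
          X: (m_inputs, n_units), Y: (s_outputs, n_units)."""
          m, n = X.shape
          s = Y.shape[0]
          c = np.zeros(1 + n)
          c[0] = 1.0                     # minimise theta over z = [theta, lambda]
          A_ub = np.zeros((m + s, 1 + n))
          b_ub = np.zeros(m + s)
          A_ub[:m, 0] = -X[:, o]         # sum_j lambda_j x_j <= theta * x_o
          A_ub[:m, 1:] = X
          A_ub[m:, 1:] = -Y              # sum_j lambda_j y_j >= y_o
          b_ub[m:] = -Y[:, o]
          res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                        bounds=[(0, None)] * (1 + n), method="highs")
          return res.x[0]

      # Toy data: 2 inputs (debt, costs), 1 output (earnings) for 4 farms.
      X = np.array([[4.0, 2.0, 3.0, 5.0],
                    [3.0, 1.0, 2.0, 4.0]])
      Y = np.array([[2.0, 2.0, 3.0, 2.0]])
      for o in range(4):
          print(f"farm {o}: efficiency = {dea_efficiency(X, Y, o):.3f}")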

  11. "Transit data"-based MST computation

    Directory of Open Access Journals (Sweden)

    Thodoris Karatasos

    2017-10-01

    In this work, we present an innovative image recognition technique which is based on the exploitation of transit data in images or simple photographs of sites of interest. Our objective is to automatically transform real-world images to graphs and then compute Minimum Spanning Trees (MST) in them. We apply this framework and present an application which automatically computes efficient construction plans (for escalators or low-emission hot spots) for connecting all points of interest in cultural sites, i.e., archaeological sites, museums, galleries, etc., aiming to facilitate global physical access to cultural heritage and artistic work and make it accessible to all groups of the population.
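
    A minimal sketch of the MST step, assuming the points of interest have already been extracted from the image (the extraction itself is not shown) and using Euclidean edge weights as a stand-in for walking distance:

      import math

      def kruskal_mst(points):
          """MST edges over the complete Euclidean graph on `points`,
          via Kruskal's algorithm with union-find."""
          n = len(points)
          parent = list(range(n))

          def find(a):
              while parent[a] != a:
                  parent[a] = parent[parent[a]]   # path halving
                  a = parent[a]
              return a

          edges = sorted((math.dist(points[i], points[j]), i, j)
                         for i in range(n) for j in range(i + 1, n))
          mst = []
          for w, i, j in edges:
              ri, rj = find(i), find(j)
              if ri != rj:                        # edge joins two components
                  parent[ri] = rj
                  mst.append((i, j, w))
          return mst

      pois = [(0, 0), (4, 1), (1, 5), (6, 4), (3, 3)]   # points of interest
      for i, j, w in kruskal_mst(pois):
          print(f"connect {i} - {j}  (length {w:.2f})")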

  12. Evolutionary Based Solutions for Green Computing

    CERN Document Server

    Kołodziej, Joanna; Li, Juan; Zomaya, Albert

    2013-01-01

    Today’s highly parameterized large-scale distributed computing systems may be composed of a large number of various components (computers, databases, etc.) and must provide a wide range of services. The users of such systems, located at different (geographical or managerial) network clusters, may have limited access to the system’s services and resources, and different, often conflicting, expectations and requirements. Moreover, the information and data processed in such dynamic environments may be incomplete, imprecise, fragmentary, and overloading. All of the above mentioned issues require some intelligent scalable methodologies for the management of the whole complex structure, which unfortunately may increase the energy consumption of such systems.   This book, in its eight chapters, addresses the fundamental issues related to the energy usage and the optimal low-cost system design in high performance ``green computing’’ systems. The recent evolutionary and general metaheuristic-based solutions ...

  13. Detecting Soft Errors in Stencil based Computations

    Energy Technology Data Exchange (ETDEWEB)

    Sharma, V. [Univ. of Utah, Salt Lake City, UT (United States); Gopalkrishnan, G. [Univ. of Utah, Salt Lake City, UT (United States); Bronevetsky, G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-05-06

    Given the growing emphasis on system resilience, it is important to develop software-level error detectors that help trap hardware-level faults with reasonable accuracy while minimizing false alarms as well as the performance overhead introduced. We present a technique that approaches this idea by taking stencil computations as our target, and synthesizing detectors based on machine learning. In particular, we employ linear regression to generate computationally inexpensive models which form the basis for error detection. Our technique has been incorporated into a new open-source library called SORREL. In addition to reporting encouraging experimental results, we demonstrate techniques that help reduce the size of training data. We also discuss the efficacy of various detectors synthesized, as well as our future plans.
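
    A minimal sketch of the approach on a 1-D three-point stencil: fit a linear-regression detector on fault-free data, then flag updates whose residual is improbably large. The stencil, the threshold rule, and the fault injection are illustrative assumptions, and the SORREL library itself is not used here:

      import numpy as np

      def stencil_step(u):
          # One Jacobi-style smoothing step (the computation being protected).
          v = u.copy()
          v[1:-1] = 0.25 * u[:-2] + 0.5 * u[1:-1] + 0.25 * u[2:]
          return v

      rng = np.random.default_rng(0)
      u = rng.random(1000)
      v = stencil_step(u)

      # Training: regress each output cell on its stencil inputs.
      A = np.stack([u[:-2], u[1:-1], u[2:]], axis=1)
      coef, *_ = np.linalg.lstsq(A, v[1:-1], rcond=None)

      # Detection: inject a bit-flip-like corruption and inspect residuals.
      v_faulty = v.copy()
      v_faulty[500] += 0.37                 # simulated silent data corruption
      resid = np.abs(A @ coef - v_faulty[1:-1])
      threshold = 6 * resid.std() + 1e-12
      print("flagged cells:", np.nonzero(resid > threshold)[0] + 1)  # -> [500]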

  14. Computational chemistry and metal-based radiopharmaceuticals

    International Nuclear Information System (INIS)

    Neves, M.; Fausto, R.

    1998-01-01

    Computer-assisted techniques have found extensive use in the design of organic pharmaceuticals but have not been widely applied to metal complexes, particularly radiopharmaceuticals. Some examples of computer-generated structures of complexes of In, Ga and Tc with N, S, O and P donor ligands are given. Besides parameters directly related to molecular geometries, molecular properties of the predicted structures, such as ionic charges or dipole moments, are considered to be related to biodistribution studies. The structures of a series of oxo neutral Tc-biguanide complexes are predicted by molecular mechanics calculations, and their interactions with water molecules or peptide chains are correlated with experimental data on partition coefficients and percentage of human protein binding. The results stress the interest of using molecular modelling to predict molecular properties of metal-based radiopharmaceuticals, which can be successfully correlated with results of in vitro studies. (author)

  15. Software-based acoustical measurements

    CERN Document Server

    Miyara, Federico

    2017-01-01

    This textbook provides a detailed introduction to the use of software in combination with simple and economical hardware (a sound level meter with calibrated AC output and a digital recording system) to obtain sophisticated measurements usually requiring expensive equipment. It emphasizes the use of free, open source, and multiplatform software. Many commercial acoustical measurement systems use software algorithms as an integral component; however the methods are not disclosed. This book enables the reader to develop useful algorithms and provides insight into the use of digital audio editing tools to document features in the signal. Topics covered include acoustical measurement principles, in-depth critical study of uncertainty applied to acoustical measurements, digital signal processing from the basics, and metrologically-oriented spectral and statistical analysis of signals. The student will gain a deep understanding of the use of software for measurement purposes; the ability to implement software-based...

  16. A Computer-Based Visual Analog Scale,

    Science.gov (United States)

    1992-06-01

    ... keys on the computer keyboard or other input device. The initial position of the arrow is always in the center of the scale to prevent biasing the...

  17. Methodology of modeling and measuring computer architectures for plasma simulations

    Science.gov (United States)

    Wang, L. P. T.

    1977-01-01

    A brief introduction to plasma simulation using computers and the difficulties encountered on currently available computers is given. Through the use of an analyzing and measuring methodology, SARA, the control flow and data flow of the particle simulation model REM2-1/2D are exemplified. After recursive refinements, the total execution time may be greatly shortened and a fully parallel data flow can be obtained. From this data flow, a matched computer architecture or organization can be configured to achieve the computation bound of an application problem. A sequential-type simulation model, an array/pipeline-type simulation model, and a fully parallel simulation model of the code REM2-1/2D are proposed and analyzed. This methodology can be applied to other application problems which have an implicitly parallel nature.

  18. Developing and validating an instrument for measuring mobile computing self-efficacy.

    Science.gov (United States)

    Wang, Yi-Shun; Wang, Hsiu-Yuan

    2008-08-01

    IT-related self-efficacy has been found to have a critical influence on system use. However, traditional measures of computer self-efficacy and Internet-related self-efficacy are perceived to be inapplicable in the context of mobile computing and commerce because they are targeted primarily at either desktop computer or wire-based technology contexts. Based on previous research, this study develops and validates a multidimensional instrument for measuring mobile computing self-efficacy (MCSE). This empirically validated instrument will be useful to researchers in developing and testing the theories of mobile user behavior, and to practitioners in assessing the mobile computing self-efficacy of users and promoting the use of mobile commerce systems.

  19. The HEP Software and Computing Knowledge Base

    Science.gov (United States)

    Wenaus, T.

    2017-10-01

    HEP software today is a rich and diverse domain in itself and exists within the mushrooming world of open source software. As HEP software developers and users we can be more productive and effective if our work and our choices are informed by a good knowledge of what others in our community have created or found useful. The HEP Software and Computing Knowledge Base, hepsoftware.org, was created to facilitate this by serving as a collection point and information exchange on software projects and products, services, training, computing facilities, and relating them to the projects, experiments, organizations and science domains that offer them or use them. It was created as a contribution to the HEP Software Foundation, for which a HEP S&C knowledge base was a much requested early deliverable. This contribution will motivate and describe the system, what it offers, its content and contributions both existing and needed, and its implementation (node.js based web service and javascript client app) which has emphasized ease of use for both users and contributors.

  20. Computational steering of GEM based detector simulations

    Science.gov (United States)

    Sheharyar, Ali; Bouhali, Othmane

    2017-10-01

    Gas based detector R&D relies heavily on full simulation of detectors and their optimization before final prototypes can be built and tested. These simulations, in particular those with complex scenarios such as high detector voltages or gases with larger gains, are computationally intensive and may take several days or weeks to complete. Such long-running simulations usually run on high-performance computers in batch mode. If the results lead to unexpected behavior, the simulation might be rerun with different parameters. However, the simulations (or jobs) may have to wait in a queue until they get a chance to run again, because the supercomputer is a shared resource that maintains a queue of other user programs as well and executes them as time and priorities permit. This can result in inefficient resource utilization and an increase in the turnaround time of the scientific experiment. To overcome this issue, monitoring the behavior of a simulation while it is running (live) is essential. In this work, we employ the computational steering technique by coupling the detector simulations with a visualization package named VisIt to enable the exploration of the live data as it is produced by the simulation.

  1. Measurement of mesothelioma on thoracic CT scans: A comparison of manual and computer-assisted techniques

    International Nuclear Information System (INIS)

    Armato, Samuel G. III; Oxnard, Geoffrey R.; MacMahon, Heber; Vogelzang, Nicholas J.; Kindler, Hedy L.; Kocherginsky, Masha; Starkey, Adam

    2004-01-01

    Our purpose in this study was to evaluate the variability of manual mesothelioma tumor thickness measurements in computed tomography (CT) scans and to assess the relative performance of six computerized measurement algorithms. The CT scans of 22 patients with malignant pleural mesothelioma were collected. In each scan, an initial observer identified up to three sites in each of three CT sections at which tumor thickness measurements were to be made. At each site, five observers manually measured tumor thickness through a computer interface. Three observers repeated these measurements during three separate sessions. Inter- and intra-observer variability in the manual measurement of tumor thickness was assessed. Six automated measurement algorithms were developed based on the geometric relationship between a specified measurement site and the automatically extracted lung regions. Computer-generated measurements were compared with manual measurements. The tumor thickness measurements of different observers were highly correlated (r≥0.99); however, the 95% limits of agreement for relative inter-observer difference spanned a range of 30%. Tumor thickness measurements generated by the computer algorithms also correlated highly with the average of observer measurements (r≥0.93). We have developed computerized techniques for the measurement of mesothelioma tumor thickness in CT scans. These techniques achieved varying levels of agreement with measurements made by human observers
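
    The record's agreement statistics can be illustrated with a short sketch. The following Python snippet (made-up values, not the study's data) computes the inter-observer correlation and Bland-Altman style 95% limits of agreement on relative differences, the two quantities the abstract reports:

```python
import numpy as np

def limits_of_agreement(a, b):
    """95% limits of agreement on the relative difference (percent)."""
    diff = (a - b) / ((a + b) / 2.0) * 100.0
    mean, sd = diff.mean(), diff.std(ddof=1)
    return mean - 1.96 * sd, mean + 1.96 * sd

# two observers' tumor thickness measurements (mm) at the same sites -- made-up values
obs1 = np.array([12.1, 8.4, 15.0, 9.7, 11.2])
obs2 = np.array([11.5, 8.9, 14.2, 10.3, 10.8])
r = np.corrcoef(obs1, obs2)[0, 1]            # inter-observer correlation
lo, hi = limits_of_agreement(obs1, obs2)
print(f"r = {r:.3f}, 95% limits of agreement: [{lo:.1f}%, {hi:.1f}%]")
```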

  2. A computer-based purchase management system

    International Nuclear Information System (INIS)

    Kuriakose, K.K.; Subramani, M.G.

    1989-01-01

    The details are given of a computer-based purchase management system developed to meet the specific requirements of the Madras Regional Purchase Unit (MRPU). However, it can be easily modified to meet the requirements of any other purchase department. It covers various operations of MRPU, starting from indent processing to the preparation of purchase orders and reminders. In order to enable timely management action and control, facilities are provided to generate the necessary management information reports. The scope for further work is also discussed. The system is completely menu driven and user friendly. Appendices A and B contain the menus implemented and sample outputs, respectively. (author)

  3. Secure information transfer based on computing reservoir

    Energy Technology Data Exchange (ETDEWEB)

    Szmoski, R.M.; Ferrari, F.A.S. [Department of Physics, Universidade Estadual de Ponta Grossa, 84030-900, Ponta Grossa (Brazil); Pinto, S.E. de S, E-mail: desouzapinto@pq.cnpq.br [Department of Physics, Universidade Estadual de Ponta Grossa, 84030-900, Ponta Grossa (Brazil); Baptista, M.S. [Institute for Complex Systems and Mathematical Biology, SUPA, University of Aberdeen, Aberdeen (United Kingdom); Viana, R.L. [Department of Physics, Universidade Federal do Parana, 81531-990, Curitiba, Parana (Brazil)

    2013-04-01

    There is a broad area of research to ensure that information is transmitted securely. Within this scope, chaos-based cryptography takes a prominent role due to its nonlinear properties. Using these properties, we propose a secure mechanism for transmitting data that relies on chaotic networks. We use a nonlinear on–off device to cipher the message, and the transfer entropy to retrieve it. We analyze the system capability for sending messages, and we obtain expressions for the operating time. We demonstrate the system efficiency for a wide range of parameters. We find similarities between our method and the reservoir computing.
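
    The abstract's on-off chaotic ciphering can be illustrated, in a much simplified form, by masking a binary message with a keystream generated from a chaotic map. This sketch is not the authors' reservoir/transfer-entropy scheme; the logistic map, its parameters, and XOR masking are stand-in assumptions:

```python
import numpy as np

def logistic_keystream(n, x0=0.632, r=3.99):
    """Binary keystream derived from the chaotic logistic map (illustrative)."""
    xs = np.empty(n)
    x = x0
    for i in range(n):
        x = r * x * (1.0 - x)
        xs[i] = x
    return (xs > 0.5).astype(np.uint8)   # threshold the orbit to bits

message = np.array([1, 0, 1, 1, 0, 0, 1, 0], dtype=np.uint8)
key = logistic_keystream(message.size)
cipher = message ^ key                   # on-off style masking (XOR)
recovered = cipher ^ key                 # a receiver sharing x0 and r re-derives the key
assert np.array_equal(recovered, message)
```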

  4. MEASUREMENTS AND COMPUTATIONS OF FUEL DROPLET TRANSPORT IN TURBULENT FLOWS

    Energy Technology Data Exchange (ETDEWEB)

    Joseph Katz and Omar Knio

    2007-01-10

    The objective of this project is to study the dynamics of fuel droplets in turbulent water flows. The results are essential for development of models capable of predicting the dispersion of slightly light/heavy droplets in isotropic turbulence. Since we presently do not have any experimental data on turbulent diffusion of droplets, existing mixing models have no physical foundations. Such fundamental knowledge is essential for understanding/modeling the environmental problems associated with water-fuel mixing, and/or industrial processes involving mixing of immiscible fluids. The project has had experimental and numerical components: 1. The experimental part of the project has had two components. The first involves measurements of the lift and drag forces acting on a droplet being entrained by a vortex. The experiments and data analysis associated with this phase are still in progress, and the facility, constructed specifically for this project, is described in Section 3. In the second and main part, measurements of fuel droplet dispersion rates have been performed in a special facility with controlled isotropic turbulence. As discussed in detail in Section 2, quantifying and modeling the droplet dispersion rate requires measurements of their three-dimensional trajectories in turbulent flows. To obtain the required data, we have introduced a new technique - high-speed, digital Holographic Particle Image Velocimetry (HPIV). The technique, experimental setup and results are presented in Section 2. Further information is available in Gopalan et al. (2005, 2006). 2. The objectives of the numerical part are: (1) to develop a computational code that combines DNS of isotropic turbulence with Lagrangian tracking of particles based on integration of a dynamical equation of motion that accounts for pressure, added mass, lift and drag forces, (2) to perform extensive computations of both buoyant (bubbles) and slightly buoyant (droplets) particles in turbulence conditions

  5. Strain measurement based battery testing

    Science.gov (United States)

    Xu, Jeff Qiang; Steiber, Joe; Wall, Craig M.; Smith, Robert; Ng, Cheuk

    2017-05-23

    A method and system for strain-based estimation of the state of health of a battery, from an initial state to an aged state, is provided. A strain gauge is applied to the battery. A first strain measurement is performed on the battery, using the strain gauge, at a selected charge capacity of the battery and at the initial state of the battery. A second strain measurement is performed on the battery, using the strain gauge, at the selected charge capacity of the battery and at the aged state of the battery. The capacity degradation of the battery is estimated as the difference between the first and second strain measurements divided by the first strain measurement.
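
    The estimator in the last sentence reduces to one line of arithmetic. A minimal sketch (hypothetical function name; the sign convention is taken directly from the abstract's "difference between the first and second strain measurements divided by the first strain measurement"):

```python
def capacity_degradation(strain_initial, strain_aged):
    """Strain-based estimate of capacity degradation between the initial
    and aged states, both measured at the same selected charge capacity."""
    return (strain_aged - strain_initial) / strain_initial

# e.g. two made-up microstrain readings at the selected charge capacity
print(f"{capacity_degradation(1500.0, 1680.0):.1%}")
```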

  6. Computer Profiling Based Model for Investigation

    OpenAIRE

    Neeraj Choudhary; Nikhil Kumar Singh; Parmalik Singh

    2011-01-01

    Computer profiling is used for computer forensic analysis; this work proposes and elaborates on a novel model for use in computer profiling, the computer profiling object model. The computer profiling object model is an information model which models a computer as objects with various attributes and inter-relationships. These together provide the information necessary for a human investigator or an automated reasoning engine to make judgments as to the probable usage and evidentiary value of a computer.

  7. GPU-based cone beam computed tomography.

    Science.gov (United States)

    Noël, Peter B; Walczak, Alan M; Xu, Jinhui; Corso, Jason J; Hoffmann, Kenneth R; Schafer, Sebastian

    2010-06-01

    The use of cone beam computed tomography (CBCT) is growing in the clinical arena due to its ability to provide 3D information during interventions, its high diagnostic quality (sub-millimeter resolution), and its short scanning times (60 s). In many situations, the short scanning time of CBCT is followed by a time-consuming 3D reconstruction. The standard reconstruction algorithm for CBCT data is the filtered backprojection, which for a volume of size 256³ takes up to 25 min on a standard system. Recent developments in the area of Graphic Processing Units (GPUs) make it possible to have access to high-performance computing solutions at a low cost, allowing their use in many scientific problems. We have implemented an algorithm for 3D reconstruction of CBCT data using the Compute Unified Device Architecture (CUDA) provided by NVIDIA (NVIDIA Corporation, Santa Clara, California), which was executed on a NVIDIA GeForce GTX 280. Our implementation results in improved reconstruction times from minutes, and perhaps hours, to a matter of seconds, while also giving the clinician the ability to view 3D volumetric data at higher resolutions. We evaluated our implementation on ten clinical data sets and one phantom data set to observe if differences occur between CPU and GPU-based reconstructions. By using our approach, the computation time for 256³ is reduced from 25 min on the CPU to 3.2 s on the GPU. The GPU reconstruction time for 512³ volumes is 8.5 s. Copyright 2009 Elsevier Ireland Ltd. All rights reserved.
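
    The speed-up comes from the fact that backprojection is independent per voxel, so each voxel can be handled by one GPU thread. The toy below shows the structure of that inner loop for the simpler parallel-beam case in NumPy; it is unfiltered and is not the cone-beam FDK algorithm or the authors' CUDA code:

```python
import numpy as np

def backproject(sinogram, angles):
    """Toy parallel-beam backprojection: each view is smeared back across the
    image grid. The per-pixel accumulation below is independent, which is
    exactly the work that maps onto one GPU thread per voxel in CUDA."""
    n_views, n_det = sinogram.shape
    xs = np.arange(n_det) - (n_det - 1) / 2.0
    X, Y = np.meshgrid(xs, xs)
    recon = np.zeros((n_det, n_det))
    for view, theta in zip(sinogram, angles):
        t = X * np.cos(theta) + Y * np.sin(theta)   # detector coordinate of each pixel
        recon += np.interp(t, xs, view)             # accumulate this view's contribution
    return recon * np.pi / n_views                  # conventional FBP scale factor

angles = np.linspace(0.0, np.pi, 180, endpoint=False)
sino = np.random.rand(180, 256)                     # stand-in projection data
image = backproject(sino, angles)
```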

  8. Measurement of liver volume by emission computed tomography

    International Nuclear Information System (INIS)

    Kan, M.K.; Hopkins, G.B.

    1979-01-01

    In 22 volunteers without clinical or laboratory evidence of liver disease, liver volume was determined using single-photon emission computed tomography (ECT). This technique provided excellent object contrast between the liver and its surroundings and permitted calculation of liver volume without geometric assumptions about the liver's configuration. Reproducibility of results was satisfactory, with a root-mean-square error of less than 6% between duplicate measurements in 15 individuals. The volume measurements were validated by the use of phantoms
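
    Volume estimation without geometric assumptions amounts to counting the voxels assigned to the organ and multiplying by the voxel volume. A minimal sketch (the threshold segmentation is an assumption; the record does not state the exact criterion used):

```python
import numpy as np

def organ_volume(volume, threshold, voxel_size_mm):
    """Estimate organ volume by counting voxels above an activity threshold,
    with no geometric assumptions about the organ's shape."""
    n_voxels = int(np.count_nonzero(volume > threshold))
    voxel_ml = np.prod(voxel_size_mm) / 1000.0   # mm^3 per voxel -> mL
    return n_voxels * voxel_ml

# reconstructed ECT volume (random data as a stand-in here)
vol = np.random.rand(64, 64, 32)
print(f"{organ_volume(vol, 0.9, (4.0, 4.0, 8.0)):.0f} mL")
```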

  9. Regulation of flow computers for the measurement of biofuels

    Science.gov (United States)

    Almeida, R. O.; Aguiar Júnior, E. A.; Costa-Felix, R. P. B.

    2018-03-01

    This article aims to discuss the need to develop a standard or regulation applicable to flow computers in the measurement of biofuels. International standards and recommendations are presented which are possibly adequate to fill this gap and at the end of the article a way is proposed to obtain a single document on the subject.

  10. A multichannel analyzer computer system for simultaneously measuring 64 spectra

    International Nuclear Information System (INIS)

    Jin Yuheng; Wan Yuqing; Zhang Jiahong; Li Li; Chen Guozhu

    2000-01-01

    The author introduces a multichannel analyzer computer system for simultaneously measuring 64 spectra with 64 coded independent inputs. The system is developed for a double chopper neutron scattering time-of-flight spectrometer. The system structure, coding method, operating principle and performances are presented. The system can also be used for other nuclear physics experiments which need multichannel analyzer with independent coded inputs

  11. Measurement of normal ocular volume by the use of computed ...

    African Journals Online (AJOL)

    Background: Reduction or increase in ocular volume may indicate ocular pathology. Unfortunately the reference values utilized for ocular volume had been that of non-Africans. It is therefore pertinent to have a reference value of normal for Africans. Objective: To document the computer tomography (CT) scan measured ...

  12. CFD computations of the second round of MEXICO rotor measurements

    DEFF Research Database (Denmark)

    Sørensen, Niels N.; Zahle, Frederik; Boorsma, K.

    2016-01-01

    A comparison between selected wind tunnel data from the NEW MEXICO measuring campaign and CFD computations is shown. The present work documents that a state of the art CFD code, including a laminar turbulent transition model, can provide good agreement with experimental data. Good agreement...

  13. Transforming bases to bytes: Molecular computing with DNA

    Indian Academy of Sciences (India)

    Despite the popular image of silicon-based computers for computation, an embryonic field of molecular computation is emerging, where molecules in solution perform computational...

  14. An Overview of Computer-Based Natural Language Processing.

    Science.gov (United States)

    Gevarter, William B.

    Computer-based Natural Language Processing (NLP) is the key to enabling humans and their computer-based creations to interact with machines using natural languages (English, Japanese, German, etc.) rather than formal computer languages. NLP is a major research area in the fields of artificial intelligence and computational linguistics. Commercial…

  15. Computer Animation Based on Particle Methods

    Directory of Open Access Journals (Sweden)

    Rafal Wcislo

    1999-01-01

    The paper presents the main issues of a computer animation of a set of elastic macroscopic objects based on the particle method. The main assumption of the generated animations is to achieve very realistic movements in a scene observed on the computer display. The objects (solid bodies) interact mechanically with each other. The movements and deformations of solids are calculated using the particle method. Phenomena connected with the behaviour of solids in the gravitational field, their deformations caused by collisions, and interactions with an optional liquid medium are simulated. The simulation of the liquid is performed using the cellular automata method. The paper presents both simulation schemes (the particle method and cellular automata rules) and the method of combining them in a single animation program. In order to speed up the execution of the program, a parallel version based on a network of workstations was developed. The paper describes the methods of parallelization and considers the problems of load balancing, collision detection, process synchronization and distributed control of the animation.
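
    The particle method described here boils down to integrating Newton's equations for particles coupled by elastic forces. A minimal mass-spring sketch of one time step (a generic formulation, not the paper's exact model):

```python
import numpy as np

def step(pos, vel, springs, rest, k, m, dt, g=np.array([0.0, -9.81])):
    """One semi-implicit Euler step for particles connected by springs."""
    force = np.tile(m * g, (len(pos), 1))            # gravity on every particle
    for (i, j), r0 in zip(springs, rest):
        d = pos[j] - pos[i]
        L = np.linalg.norm(d)
        f = k * (L - r0) * d / L                     # Hooke's law along the spring
        force[i] += f
        force[j] -= f
    vel = vel + dt * force / m                       # update velocities first
    pos = pos + dt * vel                             # then positions
    return pos, vel

# two particles joined by one spring, falling under gravity
pos = np.array([[0.0, 0.0], [1.2, 0.0]])
vel = np.zeros_like(pos)
springs, rest = [(0, 1)], [1.0]
for _ in range(100):
    pos, vel = step(pos, vel, springs, rest, k=50.0, m=0.1, dt=0.01)
```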

  16. Measurements by activation foils and comparative computations by MCNP code

    International Nuclear Information System (INIS)

    Kyncl, J.

    2008-01-01

    Systematic study of the radioactive waste minimisation problem is the subject of the SPHINX project. Its idea is that burning or transmutation of the problematic part of the waste inventory will be realized in a nuclear reactor whose fuel is in the form of liquid fluorides. In the frame of the project, several experiments have been performed with a so-called inserted experimental channel. The channel was filled with the fluoride mixture, surrounded by six fuel assemblies with moderator, and placed into the LR-0 reactor vessel. This formation was brought to a critical state and measurements with activation foil detectors were carried out at selected positions of the inserted channel. The main aim of the measurements was to determine reaction rates for the detectors mentioned. For experiment evaluation, comparative computations were accomplished with the code MCNP4a. The results obtained show that, quite often, the computed values of reaction rates differ substantially from the values obtained from the experiment. This contribution deals with an analysis of the reasons for these differences from the point of view of computations by the Monte Carlo method. The analysis of concrete cases shows that the inaccuracy of the computed reaction rate is caused mostly by three circumstances: (1) the space region occupied by the detector is relatively very small; (2) the microscopic effective cross-section R(E) of the reaction changes strongly with energy just in the energy interval that gives the greatest contribution to the reaction; (3) in the energy interval that gives the greatest contribution to the reaction rate, the error of the computed neutron flux is great. These circumstances mean that computing the reaction rate to a reasonable accuracy places extreme demands on computing time. (Author)
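
    The quantity at stake is the reaction rate R = ∫ σ(E) φ(E) dE, and the second listed circumstance is precisely the case where the integrand is dominated by a narrow energy window. A toy numerical sketch (illustrative 1/v cross-section and flux shapes, not evaluated nuclear data):

```python
import numpy as np

# Reaction rate R = integral of sigma(E) * phi(E) dE over the response window.
E = np.logspace(-3, 7, 2000)          # energy grid, eV
sigma = 1.0 / np.sqrt(E)              # toy 1/v cross-section shape
phi = E * np.exp(-E / 1.0e6)          # toy flux spectrum per unit energy
f = sigma * phi
rate = float(np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(E)))   # trapezoid rule
print(f"reaction rate (arbitrary units): {rate:.3e}")
```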

  17. Computer-supported resolution of measurement conflicts: a case-study in materials science

    NARCIS (Netherlands)

    de Jong, Hidde; Mars, Nicolaas; van der Vet, P.E.

    1999-01-01

    Resolving conflicts between different measurements of a property of a physical system may be a key step in a discovery process. With the emergence of large-scale databases and knowledge bases with property measurements, computer support for the task of conflict resolution has become highly desirable.

  18. 128 slice computed tomography dose profile measurement using thermoluminescent dosimeter

    International Nuclear Information System (INIS)

    Salehhon, N; Hashim, S; Karim, M K A; Ang, W C; Musa, Y; Bahruddin, N A

    2017-01-01

    The increasing use of computed tomography (CT) in clinical practice underlines the need to understand the dose descriptors and dose profile. The purposes of the current study were to determine the CT dose index free-in-air (CTDI_air) in a 128 slice CT scanner and to evaluate the single scan dose profile (SSDP). Thermoluminescent dosimeters (TLD-100) were used to measure the dose profile of the scanner. There were three sets of CT protocols in which the tube potential (kV) setting was varied while the rest of the parameters were kept constant. These protocols were based on routine CT examinations of the male adult abdomen. It was found that CTDI_air increased with the kV setting: when the setting was changed from 80 kV to 120 kV and from 120 kV to 140 kV, the CTDI_air values increased by 147.9% and 53.9%, respectively. The highest kV setting (140 kV) led to the highest CTDI_air value (13.585 mGy). A p-value of less than 0.05 indicated that the results were statistically different. The SSDP showed that varying the kV settings affected the peak sharpness and height of the Gaussian function profiles. The full width at half maximum (FWHM) of the dose profiles for all protocols coincided with the nominal beam width set for the measurements. The findings of the study reveal much information on the characterization and performance of a 128 slice CT scanner. (paper)

  19. COMPUTER-BASED REASONING SYSTEMS: AN OVERVIEW

    Directory of Open Access Journals (Sweden)

    CIPRIAN CUCU

    2012-12-01

    Argumentation is nowadays seen both as a skill that people use in various aspects of their lives, and as an educational technique that can support the transfer or creation of knowledge, thus aiding in the development of other skills (e.g. communication or critical thinking) or attitudes. However, teaching argumentation and teaching with argumentation is still a rare practice, mostly due to the lack of available resources such as time or expert human tutors specialized in argumentation. Intelligent computer systems (i.e. systems that implement an inner representation of particular knowledge and try to emulate the behavior of humans) could allow more people to understand the purpose, techniques and benefits of argumentation. The proposed paper investigates state-of-the-art concepts of computer-based argumentation used in education and tries to develop a conceptual map showing benefits, limitations and relations between various concepts, focusing on the duality "learning to argue - arguing to learn".

  20. Computer-based training at Sellafield

    International Nuclear Information System (INIS)

    Cartmell, A.; Evans, M.C.

    1986-01-01

    British Nuclear Fuels Limited (BNFL) operates the United Kingdom's spent-fuel receipt, storage, and reprocessing complex at Sellafield. Spent fuel from graphite-moderated CO2-cooled Magnox reactors has been reprocessed at Sellafield for 22 yr. Spent fuel from light water and advanced gas reactors is stored pending reprocessing in the Thermal Oxide Reprocessing Plant currently being constructed. The range of knowledge and skills needed for plant operation, construction, and commissioning represents a formidable training requirement. In addition, employees need to be acquainted with company practices and procedures. Computer-based training (CBT) is expected to play a significant role in this process. In this paper, current applications of CBT to the field of nuclear criticality safety are described and plans for the immediate future are outlined

  1. Computer based training for oil spill management

    International Nuclear Information System (INIS)

    Goodman, R.

    1993-01-01

    Large oil spills are infrequent occurrences, which poses a particular problem for training oil spill response staff and for maintaining a high level of response readiness. Conventional training methods involve table-top simulations to develop tactical and strategic response skills and boom-deployment exercises to maintain operational readiness. Both forms of training are quite effective, but they are very time-consuming to organize, are expensive to conduct, and tend to become repetitious. To provide a variety of response experiences, a computer-based system of oil spill response training has been developed which can supplement a table-top training program. Using a graphic interface, a realistic and challenging computerized oil spill response simulation has been produced. Integral to the system is a program editing tool which allows the teacher to develop a custom training exercise for the area of interest to the student. 1 ref

  2. A High Performance COTS Based Computer Architecture

    Science.gov (United States)

    Patte, Mathieu; Grimoldi, Raoul; Trautner, Roland

    2014-08-01

    Using Commercial Off The Shelf (COTS) electronic components for space applications is a long-standing idea. Indeed, the difference in processing performance and energy efficiency between radiation-hardened components and COTS components is so important that COTS components are very attractive for use in mass- and power-constrained systems. However, using COTS components in space is not straightforward, as one must account for the effects of the space environment on the COTS components' behavior. In the frame of the ESA funded activity called High Performance COTS Based Computer, Airbus Defense and Space and its subcontractor OHB CGS have developed and prototyped a versatile COTS based architecture for high performance processing. The rest of the paper is organized as follows: in the first section we recapitulate the interests and constraints of using COTS components for space applications; we then briefly describe existing fault mitigation architectures and present our solution for fault mitigation, based on a component called the SmartIO; in the last part of the paper we describe the prototyping activities executed during the HiP CBC project.

  3. Computer Networks as a New Data Base.

    Science.gov (United States)

    Beals, Diane E.

    1992-01-01

    Discusses the use of communication on computer networks as a data source for psychological, social, and linguistic research. Differences between computer-mediated communication and face-to-face communication are described, the Beginning Teacher Computer Network is discussed, and examples of network conversations are appended. (28 references) (LRW)

  4. Quantum computing based on semiconductor nanowires

    NARCIS (Netherlands)

    Frolov, S.M.; Plissard, S.R.; Nadj-Perge, S.; Kouwenhoven, L.P.; Bakkers, E.P.A.M.

    2013-01-01

    A quantum computer will have computational power beyond that of conventional computers, which can be exploited for solving important and complex problems, such as predicting the conformations of large biological molecules. Materials play a major role in this emerging technology, as they can enable

  5. CSNS computing environment Based on OpenStack

    Science.gov (United States)

    Li, Yakang; Qi, Fazhi; Chen, Gang; Wang, Yanming; Hong, Jianshu

    2017-10-01

    Cloud computing allows for more flexible configuration of IT resources and optimized hardware utilization; it can also provide computing services according to actual needs. We are applying this computing mode to the China Spallation Neutron Source (CSNS) computing environment. Firstly, the CSNS experiment and its computing scenarios and requirements are introduced in this paper. Secondly, the design and practice of a cloud computing platform based on OpenStack are demonstrated from the aspects of the cloud computing system framework, network, storage and so on. Thirdly, some improvements we made to OpenStack are discussed further. Finally, the current status of the CSNS cloud computing environment is summarized at the end of this paper.

  6. An Applet-based Anonymous Distributed Computing System.

    Science.gov (United States)

    Finkel, David; Wills, Craig E.; Ciaraldi, Michael J.; Amorin, Kevin; Covati, Adam; Lee, Michael

    2001-01-01

    Defines anonymous distributed computing systems and focuses on the specifics of a Java, applet-based approach for large-scale, anonymous, distributed computing on the Internet. Explains the possibility of a large number of computers participating in a single computation and describes a test of the functionality of the system. (Author/LRW)

  7. A study on measurement of scattery ray of computed tomography

    International Nuclear Information System (INIS)

    Cho, Pyong Kon; Lee, Joon Hyup; Kim, Yoon Sik; Lee, Chang Yeop

    2003-01-01

    Computed tomographic equipment is essential for diagnosis by means of radiation. With the passage of time and the development of science, computed tomography has been developed repeatedly, and examinations by means of this equipment are expected to increase in the future. In this connection, the authors measured the rate of scattered-ray generation in front of the lead glass for patients within the control room of the computed tomography equipment room and outside of the entrance door used by patients, and attempted to find a method for minimizing exposure to scattered rays. From November 2001, twenty five computed tomography units already installed and in operation at 13 general hospitals and university hospitals in Seoul were subjected to this study. As photographing conditions, those recommended by the manufacturer for measuring exposure to scattered rays were used. As objects, the DALI CT Radiation Dose Test Phantom for Head (φ 16 cm Plexiglas) and the Phantom for Stomach (φ 32 cm Plexiglas) were used. For the measurement of scattered rays, a Reader (Radiation Monitor Controller Model 2026) and a G-M Survey Meter of Radical Corporation, model 20 x 5-1800, Electrometer/Ion Chamber, S/N 21740, were used. Spots for the measurement of scattered rays included the front of the lead glass for patients within the control room, which is the place where most of the radiographic personnel's work is carried out, the outside of the entrance door used by patients and their guardians, and a spot 100 cm off from the isocenter at the time of scanning the object. The work environment within the computed tomography rooms installed and operated by each hospital showed considerable differences depending on the circumstances of the pertinent hospitals, and the status of scattered rays was as follows. 1) From the isocenter of the computed tomography equipment to the lead glass for patients within the control room the average distance was 377 cm. At that distance the scattered rays showed diverse

  8. Quantitative computed tomography for measuring bone mineral content

    International Nuclear Information System (INIS)

    Felsenberg, D.; Kalender, W.A.; Banzer, D.; Schmilinsky, G.; Heyse, M.; Fischer, E.; Schneider, U.; Siemens A.G., Erlangen; Krankenhaus Zehlendorf, Berlin

    1988-01-01

    Quantitative computed tomography (QCT) for measuring the bone mineral content of lumbar vertebrae is increasingly used internationally. The effect on reproducibility of using conventional CT (single energy CT, SE-CT) and dual energy CT (DE-CT) has been examined. We defined a standard measurement protocol, which automatically evaluates a calibration phantom. This should ensure an in vivo reproducibility of 1 to 2%. Reference data obtained with this protocol from 113 normal subjects, using SE-CT and DE-CT, are presented. (orig.) [de]

  9. Computer controlled scanning systems for quantitative track measurements

    International Nuclear Information System (INIS)

    Gold, R.; Roberts, J.H.; Preston, C.C.; Ruddy, F.H.

    1982-01-01

    The status of three computer controlled systems for quantitative track measurements is described. Two systems, an automated optical track scanner (AOTS) and an automated scanning electron microscope (ASEM), are used for scanning solid state track recorders (SSTR). The third system, the emulsion scanning processor (ESP), is an interactive system used to measure the length of proton tracks in nuclear research emulsions (NRE). Recent advances achieved with these systems are presented, with emphasis placed upon the current limitations of these systems for reactor neutron dosimetry

  10. Novel computer-based endoscopic camera

    Science.gov (United States)

    Rabinovitz, R.; Hai, N.; Abraham, Martin D.; Adler, Doron; Nissani, M.; Fridental, Ron; Vitsnudel, Ilia

    1995-05-01

    We have introduced a computer-based endoscopic camera which includes (a) unique real-time digital image processing to optimize image visualization by reducing overexposed glared areas and brightening dark areas, and by accentuating sharpness and fine structures, and (b) patient data documentation and management. The image processing is based on i Sight's iSP1000™ digital video processor chip and the Adaptive Sensitivity™ patented scheme for capturing and displaying images with a wide dynamic range of light, taking into account local neighborhood image conditions and global image statistics. It provides the medical user with the ability to view images under difficult lighting conditions without losing details 'in the dark' or in completely saturated areas. The patient data documentation and management allows storage of images (approximately 1 MB per image for a full 24 bit color image) to any storage device installed in the camera, or to external host media via network. The patient data included with every image describes essential information on the patient and procedure. The operator can assign custom data descriptors and can search for the stored image/data by typing any image descriptor. The camera optics has an extended zoom range of f = 20-45 mm, allowing control of the diameter of the field displayed on the monitor such that the complete field of view of the endoscope can be shown on the whole area of the screen. All these features provide a versatile endoscopic camera with excellent image quality and documentation capabilities.

  11. Activation method for measuring the neutron spectra parameters. Computer software

    International Nuclear Information System (INIS)

    Efimov, B.V.; Ionov, V.S.; Konyaev, S.I.; Marin, S.V.

    2005-01-01

    A description is given of the mathematical statement of the task of determining the spectral characteristics of neutron fields using the unified activation detectors (UKD) developed at RRC KI. The method offered by the authors for processing the results of activation measurements and for calculating the parameters used to estimate the characteristics of neutron spectra is discussed. Features of the processing of the experimental data obtained in activation measurements using UKD are considered. UKD activation detectors contain several specially selected isotopes which, under irradiation, give peaks of activity on a common activity spectrum scale. Computational processing of the measurement results is applied to determine spectrum parameters for nuclear reactor installations with a thermal or near-thermal neutron power spectrum. An example of the processing of measurement data obtained at the RRC KI research reactor F-1 is given. [ru]

  12. Comparability of Computer-based and Paper-based Versions of Writing Section of PET in Iranian EFL Context

    Directory of Open Access Journals (Sweden)

    Mohammad Mohammadi

    2010-11-01

    Computer technology has provided language testing experts with the opportunity to develop computerized versions of traditional paper-based language tests. New generations of TOEFL and Cambridge IELTS, BULATS, KET, and PET are good examples of computer-based language tests. Since this new method of testing introduces new factors into the realm of language assessment (e.g. modes of test delivery, familiarity with computers, etc.), the question may be whether the two modes of computer- and paper-based tests comparably measure the same construct, and hence, whether the scores obtained from the two modes can be used interchangeably. Accordingly, the present study aimed to investigate the comparability of the paper- and computer-based versions of a writing test. The data for this study were collected by administering the writing section of a Cambridge Preliminary English Test (PET) to eighty Iranian intermediate EFL learners through the two modes of computer- and paper-based testing. In addition, a computer familiarity questionnaire was used to divide participants into two groups with high and low computer familiarity. The results of the independent samples t-test revealed that there was no statistically significant difference between the learners' computer- and paper-based writing scores. The results of the paired samples t-test showed no statistically significant difference between the high- and low-computer-familiarity groups on computer-based writing. The researchers concluded that the two modes comparably measured the same construct.
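
    The statistical machinery used here is the ordinary t-test. A sketch with synthetic scores (the numbers and group split are invented; scipy's standard tests are used):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
paper = rng.normal(13.0, 2.5, 80)        # paper-based writing scores (synthetic)
computer = rng.normal(13.2, 2.5, 80)     # computer-based writing scores (synthetic)

# mode comparison, treated as independent samples as in the study
t_mode, p_mode = stats.ttest_ind(paper, computer)

# high- vs low-computer-familiarity groups on the computer-based scores
high, low = computer[:40], computer[40:]
t_fam, p_fam = stats.ttest_ind(high, low)
print(f"mode: p={p_mode:.3f}   familiarity: p={p_fam:.3f}")
```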

  13. Pervasive Computing Support for Hospitals: An Overview of the Activity-Based Computing Project

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak; Bardram, Jakob E

    2007-01-01

    The activity-based computing project researched pervasive computing support for clinical hospital work. Such technologies have potential for supporting the mobile, collaborative, and disruptive use of heterogeneous embedded devices in a hospital.

  14. A computationally inexpensive model for estimating dimensional measurement uncertainty due to x-ray computed tomography instrument misalignments

    Science.gov (United States)

    Ametova, Evelina; Ferrucci, Massimiliano; Chilingaryan, Suren; Dewulf, Wim

    2018-06-01

    The recent emergence of advanced manufacturing techniques such as additive manufacturing and an increased demand on the integrity of components have motivated research on the application of x-ray computed tomography (CT) for dimensional quality control. While CT has shown significant empirical potential for this purpose, there is a need for metrological research to accelerate the acceptance of CT as a measuring instrument. The accuracy in CT-based measurements is vulnerable to the instrument geometrical configuration during data acquisition, namely the relative position and orientation of x-ray source, rotation stage, and detector. Consistency between the actual instrument geometry and the corresponding parameters used in the reconstruction algorithm is critical. Currently available procedures provide users with only estimates of geometrical parameters. Quantification and propagation of uncertainty in the measured geometrical parameters must be considered to provide a complete uncertainty analysis and to establish confidence intervals for CT dimensional measurements. In this paper, we propose a computationally inexpensive model to approximate the influence of errors in CT geometrical parameters on dimensional measurement results. We use surface points extracted from a computer-aided design (CAD) model to model discrepancies in the radiographic image coordinates assigned to the projected edges between an aligned system and a system with misalignments. The efficacy of the proposed method was confirmed on simulated and experimental data in the presence of various geometrical uncertainty contributors.
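
    The core of such a model is central projection of known surface points under nominal versus perturbed geometry, and comparison of the resulting radiographic coordinates. A toy sketch with a single misalignment parameter (a horizontal detector offset; the paper's full parameterization covers source, stage and detector pose):

```python
import numpy as np

def project(points, sdd, det_offset=0.0):
    """Central (pinhole) projection of 3D points for a toy CT geometry:
    x-ray source at the origin, beam along +y, flat detector at distance sdd;
    det_offset models a horizontal detector misalignment."""
    scale = sdd / points[:, 1]                  # magnification per point
    u = points[:, 0] * scale + det_offset       # radiographic column coordinate
    v = points[:, 2] * scale                    # radiographic row coordinate
    return u, v

# surface points, e.g. sampled from a CAD model, placed near the rotation axis
rng = np.random.default_rng(3)
pts = rng.uniform(-5, 5, (200, 3)) + np.array([0.0, 500.0, 0.0])

u0, v0 = project(pts, sdd=1000.0)                    # nominal geometry
u1, v1 = project(pts, sdd=1000.0, det_offset=0.05)   # misaligned detector
print("max image-coordinate discrepancy:", np.abs(u1 - u0).max())
```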

  15. Milestones Toward Majorana-Based Quantum Computing

    Directory of Open Access Journals (Sweden)

    David Aasen

    2016-08-01

    We introduce a scheme for preparation, manipulation, and readout of Majorana zero modes in semiconducting wires with mesoscopic superconducting islands. Our approach synthesizes recent advances in materials growth with tools commonly used in quantum-dot experiments, including gate control of tunnel barriers and Coulomb effects, charge sensing, and charge pumping. We outline a sequence of milestones interpolating between zero-mode detection and quantum computing that includes (1) detection of fusion rules for non-Abelian anyons using either proximal charge sensors or pumped current, (2) validation of a prototype topological qubit, and (3) demonstration of non-Abelian statistics by braiding in a branched geometry. The first two milestones require only a single wire with two islands, and additionally enable sensitive measurements of the system's excitation gap, quasiparticle poisoning rates, residual Majorana zero-mode splittings, and topological-qubit coherence times. These pre-braiding experiments can be adapted to other manipulation and readout schemes as well.

  16. Dimensional measurement of micro-moulded parts by computed tomography

    DEFF Research Database (Denmark)

    Ontiveros, S.; Yagüe-Fabra, J.A.; Jiménez, R.

    2012-01-01

    Computed tomography (CT) is progressively assuming an important role in metrology applications and great efforts are being made in order to turn it into a reliable and standardized measuring technology. CT is typically used for non-destructive tests, but it is currently becoming very popular for dimensional metrology applications due to its strategic advantages, such as the capability of performing measurements on both the component's surface and volume, allowing inspection possibilities to otherwise non-accessible internal features. This paper focuses on the dimensional verification of two micro... and the analysis of the results provides valuable conclusions about the advantages and drawbacks of using CT metrology in comparison with other measuring systems when these techniques are employed for the quality control of micro-moulded parts.

  17. Computer-Based Cognitive Training in Aging.

    Science.gov (United States)

    Klimova, Blanka

    2016-01-01

    At present there is a rapid growth of aging population groups worldwide, which brings about serious economic and social problems. Thus, there is considerable effort to prolong the active life of these older people and keep them independent. The purpose of this mini review is to explore available clinical studies implementing computer-based cognitive training programs as intervention tools in the prevention and delay of cognitive decline in aging, with a special focus on their effectiveness. This was done by conducting a literature search in the databases Web of Science, Scopus, MEDLINE and Springer, and consequently by evaluating the findings of the relevant studies. The findings show that computerized cognitive training can lead to the improvement of cognitive functions such as working memory and reasoning skills in particular. However, this training should be performed over a longer time span since a short-term cognitive training mainly has an impact on short-term memory with temporary effects. In addition, the training must be intense to become effective. Furthermore, the results indicate that it is important to pay close attention to the methodological standards in future clinical studies.

  18. On the Use of Presence Measurements to Evaluate Computer Games

    DEFF Research Database (Denmark)

    Nordahl, Rolf; Korsgaard, Dannie

    2008-01-01

    As the game industry expresses a growing demand for effective evaluation methods, it is worth investigating whether the commonly used questionnaires can be replaced by alternative ways of measuring user experience in interactive environments. This paper describes an experiment where an existing presence measurement method is modified for use in computer game development. 39 subjects were part of the experiment, which was designed to test the applicability of the adapted presence measuring method. Besides playing a game prototype, test participants were asked to press a button when a visual signal, triggered by an in-game event, would appear on the screen in the periphery of sight. Noting how strong the signal was is assumed to indicate how strong the stimulus had to be in order to break the immersive presence. The results indicated that the adapted method with observations from the test is more useful than...

  19. Pulmonary blood flow distribution measured by radionuclide computed tomography

    International Nuclear Information System (INIS)

    Maeda, H.; Itoh, H.; Ishii, Y.

    1982-01-01

    Distributions of pulmonary blood flow per unit lung volume were measured in sitting patients with radionuclide computed tomography (RCT) using intravenously administered Tc-99m macroaggregates of human serum albumin (MAA). Four different types of distribution were distinguished, among which a group referred to as type 2 had a three-zonal blood flow distribution, as previously reported (West and co-workers, 1964). The pulmonary arterial pressure (Pa) and the venous pressure (Pv) were determined in this group. These values showed satisfactory agreement with the pulmonary artery pressure (Par) and the capillary wedge pressure (Pcw) measured by Swan-Ganz catheter in eighteen supine patients. These good correlations make it possible to establish a noninvasive methodology for the measurement of pulmonary vascular pressures

  20. Density gradients in ceramic pellets measured by computed tomography

    International Nuclear Information System (INIS)

    Sawicka, B.D.; Palmer, B.J.F.

    1986-07-01

    Density gradients are of fundamental importance in ceramic processing and computed tomography (CT) can provide accurate measurements of density profiles in sintered and unsintered ceramic parts. As a demonstration of this potential, the density gradients in an unsintered pellet pressed from an alumina powder were measured by CT scanning. To detect such small density gradients, the CT images must have good density resolution and be free from beam-hardening effects. This was achieved by measuring high-contrast (low-noise) images with the use of an Ir-192 isotopic source. A beam-hardening correction was applied. The resulting images are discussed relative to the transmission of forces through the powder mass during the pelletizing process

  1. A quantum computer based on recombination processes in microelectronic devices

    International Nuclear Information System (INIS)

    Theodoropoulos, K; Ntalaperas, D; Petras, I; Konofaos, N

    2005-01-01

    In this paper a quantum computer based on the recombination processes happening in semiconductor devices is presented. A 'data element' and a 'computational element' are derived based on Shockley-Read-Hall statistics, and they can later be used to manifest a simple and known quantum computing process. Such a paradigm is shown by the application of the proposed computer to a well known physical system involving traps in semiconductor devices

  2. Concordance-based Kendall's Correlation for Computationally-Light vs. Computationally-Heavy Centrality Metrics: Lower Bound for Correlation

    Directory of Open Access Journals (Sweden)

    Natarajan Meghanathan

    2017-01-01

    We identify three different levels of correlation (pair-wise relative ordering, network-wide ranking and linear regression) that could be assessed between a computationally-light centrality metric and a computationally-heavy centrality metric for real-world networks. The Kendall's concordance-based correlation measure could be used to quantitatively assess how well we could consider the relative ordering of two vertices vi and vj with respect to a computationally-light centrality metric as the relative ordering of the same two vertices with respect to a computationally-heavy centrality metric. We hypothesize that the pair-wise relative ordering (concordance-based) assessment of the correlation between centrality metrics is the strictest of all three levels of correlation, and claim that the Kendall's concordance-based correlation coefficient will be lower than the correlation coefficients observed with the more relaxed levels of correlation (the linear regression-based Pearson's product-moment correlation coefficient and the network-wide ranking-based Spearman's correlation coefficient). We validate our hypothesis by evaluating the three correlation coefficients between two sets of centrality metrics: the computationally-light degree and local clustering coefficient complement-based degree centrality metrics, and the computationally-heavy eigenvector centrality, betweenness centrality and closeness centrality metrics, for a diverse collection of 50 real-world networks.
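
    The three levels can be reproduced with standard tools. A sketch on a random graph (an illustrative stand-in for the paper's 50 real-world networks), using degree centrality as the light metric and betweenness as the heavy one:

```python
import networkx as nx
from scipy import stats

G = nx.erdos_renyi_graph(200, 0.05, seed=1)
light = nx.degree_centrality(G)          # computationally-light metric
heavy = nx.betweenness_centrality(G)     # computationally-heavy metric
x = [light[v] for v in G]
y = [heavy[v] for v in G]

tau, _ = stats.kendalltau(x, y)    # pair-wise relative ordering (concordance)
rho, _ = stats.spearmanr(x, y)     # network-wide ranking
r, _ = stats.pearsonr(x, y)        # linear regression level
print(f"Kendall tau={tau:.3f}  Spearman rho={rho:.3f}  Pearson r={r:.3f}")
```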

  3. Intrinsic measurement bias on computed tomography scout view is unpredictable: computed tomography pelvimetry using a phantom

    International Nuclear Information System (INIS)

    Anderson, N.G.; Fenwick, J.L.; Wells, J.E.

    2006-01-01

    Our aim was to determine the degree of bias in CT scanogram measurements. We obtained standard lateral and anteroposterior (AP) pelvimetry scanograms of a phantom pelvis after placing ball bearings or aluminium rods to mark bony landmarks. Computed tomography pelvimetry was carried out at the manufacturer-recommended table height on two commercial CT scanners and at 10-mm increments up to 50 mm above and below this height. The AP inlet, AP outlet, interspinous distance and transverse diameters were each measured three times for each scanogram. The true measurements were obtained directly from the disassembled phantom. Bias was defined as the difference between the CT measurement and the true measurement. Observer error was negligible. The transverse diameter was overestimated at high table positions and underestimated at low table positions on both scanners (+6 to -10 mm). After correcting for geometric distortion, up to 6 mm bias was still present. The point at which no bias occurred was different for each scanner and did not correspond to the manufacturers' recommended table height. The outlet was overestimated on both scanners by up to 5 mm. The true inlet measurement was overestimated by 1.2 mm. The interspinous distance was minimally underestimated on both scanners. The measurements on CT scanogram were underestimated or overestimated in an inconsistent and unpredictable fashion, varying from one type of measurement to another and from CT scanner to CT scanner. This has implications for the accuracy and clinical utility of measurements obtained from a CT scanogram. Copyright (2006) Blackwell Science Pty Ltd

  4. Computer Assisted Instructional Design for Computer-Based Instruction. Final Report. Working Papers.

    Science.gov (United States)

    Russell, Daniel M.; Pirolli, Peter

    Recent advances in artificial intelligence and the cognitive sciences have made it possible to develop successful intelligent computer-aided instructional systems for technical and scientific training. In addition, computer-aided design (CAD) environments that support the rapid development of such computer-based instruction have also been recently…

  5. Solid-state nuclear-spin quantum computer based on magnetic resonance force microscopy

    International Nuclear Information System (INIS)

    Berman, G. P.; Doolen, G. D.; Hammel, P. C.; Tsifrinovich, V. I.

    2000-01-01

    We propose a nuclear-spin quantum computer based on magnetic resonance force microscopy (MRFM). It is shown that an MRFM single-electron spin measurement provides three essential requirements for quantum computation in solids: (a) preparation of the ground state, (b) one- and two-qubit quantum logic gates, and (c) a measurement of the final state. The proposed quantum computer can operate at temperatures up to 1 K. (c) 2000 The American Physical Society

  6. A specialized plug-in software module for computer-aided quantitative measurement of medical images.

    Science.gov (United States)

    Wang, Q; Zeng, Y J; Huo, P; Hu, J L; Zhang, J H

    2003-12-01

    This paper presents a specialized system for the quantitative measurement of medical images. Using Visual C++, we developed computer-aided software based on Image-Pro Plus (IPP), a software development platform. When transferred to the hard disk of a computer by an MVPCI-V3A frame grabber, medical images can be automatically processed by our own IPP plug-in for immunohistochemical analysis, cytomorphological measurement and blood vessel segmentation. In 34 clinical studies, the system has shown high stability, reliability and ease of use.

  7. Uncertainty analysis of NDA waste measurements using computer simulations

    International Nuclear Information System (INIS)

    Blackwood, L.G.; Harker, Y.D.; Yoon, W.Y.; Meachum, T.R.

    2000-01-01

    plutonium in a variety of waste types contained in 208-liter drums measured by the passive active neutron (PAN) radioassay system at the Idaho National Engineering and Environmental Laboratory (INEEL). Computer simulation of the PAN system performance uses the Monte Carlo N-Particle (MCNP) code to produce a neutron transport calculation for a simulated waste drum. A follow-up program was written to combine the MCNP output with other parameters generated by the modeling process to yield simulated measured plutonium mass values. The accuracy of the simulations is verified using surrogate waste drums with known contents

  8. Information Security Scheme Based on Computational Temporal Ghost Imaging.

    Science.gov (United States)

    Jiang, Shan; Wang, Yurong; Long, Tao; Meng, Xiangfeng; Yang, Xiulun; Shu, Rong; Sun, Baoqing

    2017-08-09

    An information security scheme based on computational temporal ghost imaging is proposed. A sequence of independent 2D random binary patterns is used as the encryption key to multiply with the 1D data stream. The ciphertext is obtained by summing the weighted encryption key. The decryption process can be realized by a correlation measurement between the encrypted information and the encryption key. Due to the intrinsic high-level randomness of the key, the security of this method is greatly guaranteed. The feasibility of this method and its robustness against both occlusion and additional noise attacks are discussed with simulations, respectively.
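
    The scheme as described maps directly onto a few lines of NumPy: weight each 2D key pattern by a data value, sum to get the ciphertext, and recover each value by correlating the ciphertext with the corresponding pattern. Sizes and the threshold readout below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(42)
data = rng.integers(0, 2, 16).astype(float)           # 1D data stream to protect
key = rng.integers(0, 2, (16, 32, 32)).astype(float)  # independent 2D binary patterns

# encryption: weight each key pattern by its data value and sum
cipher = np.tensordot(data, key, axes=1)              # 32x32 encrypted image

# decryption: correlate the cipher with each key pattern (ghost-imaging style);
# more patterns / larger pixels reduce the crosstalk noise between values
recovered = np.array([np.mean(cipher * k) - np.mean(cipher) * np.mean(k)
                      for k in key])
print((recovered > recovered.mean()).astype(int))     # thresholded estimate of data
```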

  9. Property-Based Anonymous Attestation in Trusted Cloud Computing

    Directory of Open Access Journals (Sweden)

    Zhen-Hu Ning

    2014-01-01

    In the remote attestation of the Trusted Computer (TC) computing mode TCCP, the trusted computer TC has an excessive burden, and the anonymity and platform configuration information security of computing nodes cannot be guaranteed. To overcome these defects, based on research on and analysis of current schemes, we propose an anonymous proof protocol based on a property certificate. The platform configuration information is converted by a matrix algorithm into the property certificate, and the remote attestation is implemented by a trusted ring signature scheme based on the Strong RSA Assumption. Through the trusted ring signature scheme based on the property certificate, we achieve the anonymity of computing nodes and prevent the leakage of platform configuration information. By simulation, we obtain the computational efficiency of the scheme. We also extend the protocol and obtain an anonymous attestation based on ECC. By scenario comparison, we show that the trusted ring signature scheme based on RSA has advantages as the number of ring members grows.

  10. An Emotional Agent Model Based on Granular Computing

    Directory of Open Access Journals (Sweden)

    Jun Hu

    2012-01-01

    Affective computing has very important significance for fulfilling intelligent information processing and harmonious communication between human beings and computers. A new model for an emotional agent is proposed in this paper to give agents the ability to handle emotions, based on granular computing theory and the traditional BDI agent model. Firstly, a new emotion knowledge base based on granular computing for emotion expression is presented in the model. Secondly, a new emotional reasoning algorithm based on granular computing is proposed. Thirdly, a new emotional agent model based on granular computing is presented. Finally, based on the model, an emotional agent for patient assistance in hospitals is realized; experimental results show that it is efficient in handling simple emotions.

  11. Small Computer Applications for Base Supply.

    Science.gov (United States)

    1984-03-01

    research on small computer utilization at base level organizations. This research effort studies whether small computers and commercial software can assist...

  12. Detailed comparison between computed and measured FBR core seismic responses

    International Nuclear Information System (INIS)

    Forni, M.; Martelli, A.; Melloni, R.; Bonacina, G.

    1988-01-01

    This paper presents a detailed comparison between seismic calculations and measurements performed for various mock-ups consisting of groups of seven and nineteen simplified elements of the Italian PEC fast reactor core. Experimental tests had been performed on shaking tables in air and water (simulating sodium) with excitations increasing up to above Safe Shutdown Earthquake. The PEC core-restraint ring had been simulated in some tests. All the experimental tests have been analysed by use of both the one-dimensional computer program CORALIE and the two-dimensional program CLASH. Comparisons have been made for all the instrumented elements, in both the time and the frequency domains. The good agreement between calculations and measurements has confirmed adequacy of the fluid-structure interaction model used for PEC core seismic design verification

  13. Sex estimation from sternal measurements using multidetector computed tomography.

    Science.gov (United States)

    Ekizoglu, Oguzhan; Hocaoglu, Elif; Inci, Ercan; Bilgili, Mustafa Gokhan; Solmaz, Dilek; Erdil, Irem; Can, Ismail Ozgur

    2014-12-01

    We aimed to show the utility and reliability of sternal morphometric analysis for sex estimation. Sex estimation is a very important step in forensic identification, and skeletal surveys are the main methods used in sex estimation studies. Morphometric analysis of the sternum may provide highly accurate data for sex discrimination. In this study, morphometric analysis of the sternum was performed on 1 mm chest computed tomography scans for sex estimation. Four hundred forty-three subjects (202 female, 241 male; mean age: 44 ± 8.1 years [range: 30-60 years]) were included in the study. Manubrium length (ML), mesosternum length (MSL), sternebra 1 width (S1W), and sternebra 3 width (S3W) were measured, and the sternal index (SI) was also calculated. Differences between the sexes were evaluated by Student's t-test. Predictive factors of sex were determined by discriminant analysis and receiver operating characteristic (ROC) analysis. Male sternal measurement values were significantly higher than female ones. In discriminant analysis, MSL had a high accuracy rate, 80.2% in females and 80.9% in males; MSL also had the best sensitivity (75.9%) and specificity (87.6%). Accuracy rates were above 80% in all three stepwise discriminant analyses for both sexes. Stepwise 1 (ML, MSL, S1W, S3W) had the highest accuracy rate in stepwise discriminant analysis, 86.1% in females and 83.8% in males. Our study showed that morphometric computed tomography analysis of the sternum may provide important information for sex estimation.
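
    The stepwise discriminant analysis reported above can be prototyped in a few lines. The sketch below is illustrative only: it trains a linear discriminant classifier on synthetic measurements whose variable names (ML, MSL, S1W, S3W) follow the abstract, while the means, standard deviations, and cross-validation setup are assumptions, not the study's data.

```python
# Illustrative discriminant analysis for sex estimation on synthetic data.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_f, n_m = 202, 241  # group sizes from the abstract

def sample(n, means, sds):
    # one column per measurement: ML, MSL, S1W, S3W (mm)
    return np.column_stack([rng.normal(m, s, n) for m, s in zip(means, sds)])

X_f = sample(n_f, means=[48, 88, 28, 24], sds=[5, 9, 3, 3])   # females (assumed values)
X_m = sample(n_m, means=[53, 101, 31, 27], sds=[5, 9, 3, 3])  # males (assumed values)
X = np.vstack([X_f, X_m])
y = np.array([0] * n_f + [1] * n_m)  # 0 = female, 1 = male

lda = LinearDiscriminantAnalysis()
acc = cross_val_score(lda, X, y, cv=5).mean()
print(f"cross-validated classification accuracy: {acc:.1%}")
```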

  14. 26 CFR 1.809-10 - Computation of equity base.

    Science.gov (United States)

    2010-04-01

    § 1.809-10 Computation of equity base. (a) In general. For purposes of section 809, the equity base of a life insurance company includes the amount of any...

  15. Analysis of Computer Network Information Based on "Big Data"

    Science.gov (United States)

    Li, Tianli

    2017-11-01

    With the development of the current era, computer networks and big data have gradually become part of people's lives. People use computers to bring convenience to their own lives, but at the same time there are many network information problems that demand attention. This paper analyzes the information security of computer networks based on "big data" and puts forward some solutions.

  16. Inverse kinetics equations for on line measurement of reactivity using personal computer

    International Nuclear Information System (INIS)

    Ratemi, Wajdi; El Gadamsi, Walied; Beleid, Abdul Kariem

    1993-01-01

    Computers, with their astonishing calculation speed and their easy connection to real systems, are very appropriate for digital measurement of real system variables. In the nuclear industry, such computer applications will produce compact control rooms for real power plants, where information and results can be displayed at the push of a button. In our study, we use two personal computers for the purposes of simulation and measurement. One of them is used as a digital simulator of a real reactor, where we effectively simulate the reactor power through a cross-talk network. The computed power is passed at a chosen sampling time to the other computer, which uses the inverse kinetics equations to calculate the reactivity parameter based on the received power, and then performs on-line display of the power curve and the reactivity curve using color graphics. In this study, we use the one-group version of the inverse kinetics algorithm, which can easily be extended to a larger-group version. The programming language used is Turbo BASIC, which is comparable in efficiency to FORTRAN and has effective graphics routines. With the extended version of the inverse kinetics algorithm, we can apply this measurement technique to the on-line display of the reactivity of the Tajoura Research Reactor. (author)
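
    For illustration, the one-group inverse kinetics computation can be sketched as follows (in Python rather than Turbo BASIC): reactivity is recovered from the sampled power history by differentiating the power and integrating the one-group precursor balance. The kinetic parameters below are generic illustrative values, not those of the Tajoura Research Reactor.

```python
# One-delayed-group inverse point kinetics: rho(t) from a sampled power history.
import numpy as np

beta, lam, Lam = 0.0065, 0.08, 1e-4  # delayed fraction, decay constant (1/s), generation time (s)

def reactivity(P, dt):
    """Return the reactivity history (in dollars) for sampled power P."""
    C = beta * P[0] / (lam * Lam)    # precursor concentration at initial equilibrium
    rho = np.zeros_like(P)
    for k in range(1, len(P)):
        dPdt = (P[k] - P[k - 1]) / dt
        rho[k] = (beta + Lam * dPdt / P[k] - lam * Lam * C / P[k]) / beta
        C += dt * (beta / Lam * P[k] - lam * C)  # Euler step of the precursor balance
    return rho

P = np.ones(1000)                    # steady power -> reactivity should stay near zero
print(reactivity(P, dt=0.01)[-1])
```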

  17. Evaluation of computer-based ultrasonic inservice inspection systems

    International Nuclear Information System (INIS)

    Harris, R.V. Jr.; Angel, L.J.; Doctor, S.R.; Park, W.R.; Schuster, G.J.; Taylor, T.T.

    1994-03-01

    This report presents the principles, practices, terminology, and technology of computer-based ultrasonic testing for inservice inspection (UT/ISI) of nuclear power plants, with extensive use of drawings, diagrams, and UT images. The presentation is technical but assumes limited specific knowledge of ultrasonics or computers. The report is divided into 9 sections covering conventional UT, computer-based UT, and evaluation methodology. Conventional UT topics include coordinate axes, scanning, instrument operation, RF and video signals, and A-, B-, and C-scans. Computer-based topics include sampling, digitization, signal analysis, image presentation, SAFT, ultrasonic holography, transducer arrays, and data interpretation. An evaluation methodology for computer-based UT/ISI systems is presented, including questions, detailed procedures, and test block designs. Brief evaluations of several computer-based UT/ISI systems are given; supplementary volumes will provide detailed evaluations of selected systems

  18. Projection computation based on pixel in simultaneous algebraic reconstruction technique

    International Nuclear Information System (INIS)

    Wang Xu; Chen Zhiqiang; Xiong Hua; Zhang Li

    2005-01-01

    SART is an important algorithm for image reconstruction, in which the projection computation takes over half of the reconstruction time. An efficient way to compute the projection coefficient matrix, together with a memory optimization, is presented in this paper. Unlike the usual method, projection lines are located based on every pixel, and the subsequent projection coefficient computation can reuse the results. The correlation between projection lines and pixels can be exploited to optimize the computation. (authors)
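
    The pixel-based idea can be illustrated with a small pixel-driven projector; this is a generic sketch of the technique, not the authors' exact algorithm. Each pixel centre is located on the detector for a given view angle, and the two interpolation weights it produces are the nonzero entries of that pixel's column of the projection coefficient matrix.

```python
# Pixel-driven projection: locate the projection line per pixel, then split the
# pixel value between the two nearest detector bins by linear interpolation.
import numpy as np

def pixel_driven_projection(image, theta, n_bins):
    ny, nx = image.shape
    proj = np.zeros(n_bins)
    c, s = np.cos(theta), np.sin(theta)
    ys, xs = np.mgrid[0:ny, 0:nx]
    # signed detector coordinate of each pixel centre
    t = (xs - nx / 2 + 0.5) * c + (ys - ny / 2 + 0.5) * s + n_bins / 2
    i0 = np.floor(t).astype(int)
    w = t - i0                           # linear-interpolation weight
    for i, wi, v in zip(i0.ravel(), w.ravel(), image.ravel()):
        if 0 <= i < n_bins:
            proj[i] += (1.0 - wi) * v    # coefficient for the left bin
        if 0 <= i + 1 < n_bins:
            proj[i + 1] += wi * v        # coefficient for the right bin
    return proj

img = np.zeros((64, 64)); img[24:40, 24:40] = 1.0
print(pixel_driven_projection(img, np.pi / 4, 96).sum())  # mass preserved: 256.0
```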

  19. Computational anatomy based on whole body imaging basic principles of computer-assisted diagnosis and therapy

    CERN Document Server

    Masutani, Yoshitaka

    2017-01-01

    This book deals with computational anatomy, an emerging discipline recognized in medical science as a derivative of conventional anatomy. It is also a completely new research area on the boundaries of several sciences and technologies, such as medical imaging, computer vision, and applied mathematics. Computational Anatomy Based on Whole Body Imaging highlights the underlying principles, basic theories, and fundamental techniques in computational anatomy, which are derived from conventional anatomy, medical imaging, computer vision, and applied mathematics, in addition to various examples of applications to clinical data. The book covers topics on the basics and applications of the new discipline. Drawing from areas in multidisciplinary fields, it provides comprehensive, integrated coverage of innovative approaches to computational anatomy. As well, Computational Anatomy Based on Whole Body Imaging serves as a valuable resource for researchers, including graduate students in the field, and a connection with ...

  20. A Compute Environment of ABC95 Array Computer Based on Multi-FPGA Chip

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    ABC95 is an array computer built on a multi-function network based on FPGA technology. The multi-function network supports conflict-free access by processors to data in memory, and supports processor-to-processor data access based on an enhanced MESH network. The ABC95 instruction system includes control instructions, scalar instructions, and vector instructions; the network instructions are introduced in particular. A programming environment for ABC95 array computer assembly language is designed, and a programming environment for the ABC95 array computer under VC++ is advanced. It includes functions to load ABC95 array computer programs and data, to store results, to run programs, and so on. In particular, the data type for ABC95 array computer conflict-free access is defined. The results show that these technologies allow programs for the ABC95 array computer to be developed effectively.

  1. Simulation-based artifact correction (SBAC) for metrological computed tomography

    Science.gov (United States)

    Maier, Joscha; Leinweber, Carsten; Sawall, Stefan; Stoschus, Henning; Ballach, Frederic; Müller, Tobias; Hammer, Michael; Christoph, Ralf; Kachelrieß, Marc

    2017-06-01

    Computed tomography (CT) is a valuable tool for the metrological assessment of industrial components. However, the application of CT to the investigation of highly attenuating objects or multi-material components is often restricted by the presence of CT artifacts caused by beam hardening, x-ray scatter, off-focal radiation, partial volume effects or the cone-beam reconstruction itself. In order to overcome this limitation, this paper proposes an approach to calculate a correction term that compensates for the contribution of artifacts and thus enables an appropriate assessment of these components using CT. To this end, we make use of computer simulations of the CT measurement process. Based on an appropriate model of the object, e.g. an initial reconstruction or a CAD model, two simulations are carried out. One simulation considers all physical effects that cause artifacts, using dedicated analytic methods as well as Monte Carlo-based models. The other one represents an ideal CT measurement, i.e. a measurement in parallel beam geometry with a monochromatic, point-like x-ray source and no x-ray scattering. Thus, the difference between these simulations is an estimate of the present artifacts and can be used to correct the acquired projection data or the corresponding CT reconstruction, respectively. The performance of the proposed approach is evaluated using simulated as well as measured data of single and multi-material components. Our approach yields CT reconstructions that are nearly free of artifacts and thereby clearly outperforms commonly used artifact reduction algorithms in terms of image quality. A comparison against tactile reference measurements demonstrates the ability of the proposed approach to increase the accuracy of the metrological assessment significantly.
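
    Once the two simulations are available, the correction itself reduces to a subtraction per projection. A minimal sketch of that step follows; the two simulate_* callables are placeholders for the dedicated analytic/Monte Carlo and ideal simulators described in the abstract.

```python
# Simulation-based artifact correction: subtract the simulated artifact estimate.
import numpy as np

def sbac_correct(p_measured, object_model, simulate_physical, simulate_ideal):
    """Return artifact-corrected projection data."""
    p_phys = simulate_physical(object_model)   # beam hardening, scatter, off-focal, ...
    p_ideal = simulate_ideal(object_model)     # monochromatic, parallel, scatter-free
    return p_measured - (p_phys - p_ideal)     # measured minus artifact estimate

# Toy usage with stand-in simulators:
model = np.ones((4, 4))
corrected = sbac_correct(np.full((4, 4), 1.2), model,
                         simulate_physical=lambda m: m * 1.1,
                         simulate_ideal=lambda m: m)
print(corrected)
```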

  2. Computer-based visual communication in aphasia.

    Science.gov (United States)

    Steele, R D; Weinrich, M; Wertz, R T; Kleczewska, M K; Carlson, G S

    1989-01-01

    The authors describe their recently developed Computer-aided VIsual Communication (C-VIC) system, and report results of single-subject experimental designs probing its use with five chronic, severely impaired aphasic individuals. Studies replicate earlier results obtained with a non-computerized system, demonstrate patient competence with the computer implementation, extend the system's utility, and identify promising areas of application. Results of the single-subject experimental designs clarify patients' learning, generalization, and retention patterns, and highlight areas of performance difficulties. Future directions for the project are indicated.

  3. A brain computer interface-based explorer.

    Science.gov (United States)

    Bai, Lijuan; Yu, Tianyou; Li, Yuanqing

    2015-04-15

    In recent years, various applications of brain computer interfaces (BCIs) have been studied. In this paper, we present a hybrid BCI combining P300 and motor imagery to operate an explorer. Our system is mainly composed of a BCI mouse, a BCI speller and an explorer. Through this system, the user can access his computer and manipulate (open, close, copy, paste, and delete) files such as documents, pictures, music, movies and so on. The system has been tested with five subjects, and the experimental results show that the explorer can be successfully operated according to subjects' intentions. Copyright © 2014 Elsevier B.V. All rights reserved.

  4. Fail-safe computer-based plant protection systems

    International Nuclear Information System (INIS)

    Keats, A.B.

    1983-01-01

    A fail-safe mode of operation for computers used in nuclear reactor protection systems was first evolved in the UK for application to a sodium-cooled fast reactor. The fail-safe properties of both the hardware and the software were achieved by permanently connecting test signals to some of the multiplexed inputs. This results in an unambiguous data pattern each time the inputs are sequentially scanned by the multiplexer. The ''test inputs'' simulate transient excursions beyond defined safe limits. The alternating response of the trip algorithms to the ''out-of-limits'' test signals and the normal plant measurements is recognised by hardwired pattern recognition logic external to the computer system. For more general application to plant protection systems, a ''Test Signal Generator'' (TSG) is used to compute and generate test signals derived from prevailing operational conditions. The TSG, from its knowledge of the sensitivity of the trip algorithm to each of the input variables, generates a ''test disturbance'' which is superimposed upon each variable in turn, to simulate a transient excursion beyond the safe limits. The ''tripped'' status yielded by the trip algorithm when using data from a ''disturbed'' input forms part of a pattern determined by the order in which the disturbances are applied to the multiplexer inputs. The data pattern formed by the interleaved test disturbances is again recognised by logic external to the protection system's computers. This fail-safe mode of operation of computer-based protection systems provides a powerful defence against common-mode failure. It also reduces the importance of software verification in the licensing procedure. (author)
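
    As a toy illustration of the interleaving idea (an illustrative sketch, not the actual plant logic), the following scans plant inputs with a simulated out-of-limits test signal after each one and checks that the trip algorithm's response pattern alternates as expected; a stuck computer or multiplexer breaks the alternation and is detected.

```python
# Fail-safe pattern check: test signals interleaved with plant measurements.
SAFE_LIMIT = 100.0             # illustrative trip limit

def trip_algorithm(value):
    return value > SAFE_LIMIT  # True = tripped

def scan(plant_inputs, test_value=150.0):
    """Sequentially scan the inputs, injecting a test excursion after each one."""
    pattern = []
    for v in plant_inputs:
        pattern.append(trip_algorithm(v))           # normal plant measurement
        pattern.append(trip_algorithm(test_value))  # simulated excursion
    return pattern

def pattern_ok(pattern):
    # External hardwired logic expects every second response to be 'tripped'
    # while the plant itself stays within limits.
    return all(pattern[1::2]) and not any(pattern[0::2])

print(pattern_ok(scan([42.0, 57.0, 73.0])))  # True for a healthy system
```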

  5. Moment matrices, border bases and radical computation

    NARCIS (Netherlands)

    B. Mourrain; J.B. Lasserre; M. Laurent (Monique); P. Rostalski; P. Trebuchet (Philippe)

    2013-01-01

    In this paper, we describe new methods to compute the radical (resp. real radical) of an ideal, assuming its complex (resp. real) variety is finite. The aim is to combine approaches for solving a system of polynomial equations with dual methods which involve moment matrices and

  6. Moment matrices, border bases and radical computation

    NARCIS (Netherlands)

    Lasserre, J.B.; Laurent, M.; Mourrain, B.; Rostalski, P.; Trébuchet, P.

    2013-01-01

    In this paper, we describe new methods to compute the radical (resp. real radical) of an ideal, assuming its complex (resp. real) variety is finite. The aim is to combine approaches for solving a system of polynomial equations with dual methods which involve moment matrices and semi-definite

  7. Moment matrices, border bases and radical computation

    NARCIS (Netherlands)

    B. Mourrain; J.B. Lasserre; M. Laurent (Monique); P. Rostalski; P. Trebuchet (Philippe)

    2011-01-01

    In this paper, we describe new methods to compute the radical (resp. real radical) of an ideal, assuming its complex (resp. real) variety is finite. The aim is to combine approaches for solving a system of polynomial equations with dual methods which involve moment matrices and

  8. Cloud Computing Based E-Learning System

    Science.gov (United States)

    Al-Zoube, Mohammed; El-Seoud, Samir Abou; Wyne, Mudasser F.

    2010-01-01

    Cloud computing technologies, although in their early stages, have managed to change the way applications are going to be developed and accessed. These technologies are aimed at running applications as services over the internet on a flexible infrastructure. Microsoft Office applications, such as word processing, Excel spreadsheets, Access databases…

  9. Efficient GPU-based skyline computation

    DEFF Research Database (Denmark)

    Bøgh, Kenneth Sejdenfaden; Assent, Ira; Magnani, Matteo

    2013-01-01

    The skyline operator for multi-criteria search returns the most interesting points of a data set with respect to any monotone preference function. Existing work has almost exclusively focused on efficiently computing skylines on one or more CPUs, ignoring the high parallelism possible in GPUs. In...
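
    For background, the skyline can be computed naively in O(n^2) time by pairwise dominance tests. The sketch below is this baseline CPU version (illustrative only, not the paper's GPU algorithm); smaller values are assumed better in every dimension.

```python
# Naive skyline: keep every point that no other point dominates.
def dominates(p, q):
    """p dominates q if p is at least as good everywhere and better somewhere."""
    return all(a <= b for a, b in zip(p, q)) and any(a < b for a, b in zip(p, q))

def skyline(points):
    return [p for p in points if not any(dominates(q, p) for q in points)]

# e.g. hotels as (price, distance_to_beach):
hotels = [(50, 8), (80, 2), (60, 5), (90, 1), (70, 6)]
print(skyline(hotels))  # (70, 6) is dominated by (60, 5) and drops out
```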

  10. RECENT THREATS TO CLOUD COMPUTING DATA AND ITS PREVENTION MEASURES

    OpenAIRE

    Rahul Neware*

    2017-01-01

    As cloud computing is expanding day by day due to its benefits, like cost, speed, global scale, productivity, performance, and reliability, everyone, from business vendors to governments, is using cloud computing to grow fast. Although cloud computing has the above-mentioned and other benefits, the security of the cloud is a problem, and due to this security problem the adoption of cloud computing is not growing. This paper gives information about recent threats to cloud computing data and its p...

  11. Adaptive phase measurements in linear optical quantum computation

    International Nuclear Information System (INIS)

    Ralph, T C; Lund, A P; Wiseman, H M

    2005-01-01

    Photon counting induces an effective non-linear optical phase shift in certain states derived by linear optics from single photons. Although this non-linearity is non-deterministic, it is sufficient in principle to allow scalable linear optics quantum computation (LOQC). The most obvious way to encode a qubit optically is as a superposition of the vacuum and a single photon in one mode, so-called 'single-rail' logic. Until now this approach was thought to be prohibitively expensive (in resources) compared to 'dual-rail' logic, where a qubit is stored by a photon across two modes. Here we attack this problem with real-time feedback control, which can realize a quantum-limited phase measurement on a single mode, as has been recently demonstrated experimentally. We show that with this added measurement resource, the resource requirements for single-rail LOQC are not substantially different from those of dual-rail LOQC. In particular, with adaptive phase measurements an arbitrary qubit state α|0⟩ + β|1⟩ can be prepared deterministically

  12. A computational technique to measure fracture callus in radiographs.

    Science.gov (United States)

    Lujan, Trevor J; Madey, Steven M; Fitzpatrick, Dan C; Byrd, Gregory D; Sanderson, Jason M; Bottlang, Michael

    2010-03-03

    Callus formation occurs in the presence of secondary bone healing and has relevance to the fracture's mechanical environment. An objective image processing algorithm was developed to standardize the quantitative measurement of periosteal callus area in plain radiographs of long bone fractures. Algorithm accuracy and sensitivity were evaluated using surrogate models. For algorithm validation, callus formation on clinical radiographs was measured manually by orthopaedic surgeons and compared to non-clinicians using the algorithm. The algorithm measured the projected area of surrogate calluses with less than 5% error. However, error will increase when analyzing very small areas of callus and when using radiographs with low image resolution (i.e. 100 pixels per inch). The callus size extracted by the algorithm correlated well to the callus size outlined by the surgeons (R² = 0.94, p < 0.001). Furthermore, compared to clinician results, the algorithm yielded results with five times less inter-observer variance. This computational technique provides a reliable and efficient method to quantify secondary bone healing response. Copyright 2009 Elsevier Ltd. All rights reserved.
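
    The core of such a measurement can be sketched as segmentation followed by pixel counting. The toy version below (an assumption-laden sketch, not the published algorithm) thresholds a synthetic radiograph patch and converts the segmented pixel count into a projected area.

```python
# Threshold-based area measurement on a synthetic radiograph patch.
import numpy as np

def callus_area(image, threshold, pixel_size_mm=0.1):
    """Projected area (mm^2) of pixels brighter than the threshold."""
    return (image > threshold).sum() * pixel_size_mm ** 2

rng = np.random.default_rng(1)
img = rng.normal(0.2, 0.02, (200, 200))                   # dim background
yy, xx = np.mgrid[0:200, 0:200]
img[(yy - 100) ** 2 + (xx - 100) ** 2 < 40 ** 2] += 0.3   # brighter 'callus' disc

print(f"callus area: {callus_area(img, threshold=0.35):.1f} mm^2")
# disc of radius 40 px at 0.1 mm/px -> about 50.3 mm^2 expected
```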

  13. Clinical significance of measurement of hepatic volume by computed tomography

    International Nuclear Information System (INIS)

    Sato, Hiroyuki; Matsuda, Yoshiro; Takada, Akira

    1984-01-01

    Hepatic volumes were measured by computed tomography (CT) in 91 patients with chronic liver diseases. Mean hepatic volume in alcoholic liver disease was significantly larger than that in non-alcoholic liver disease. Hepatic volumes in the majority of decompensated liver cirrhosis were significantly smaller than those of compensated liver cirrhosis. In liver cirrhosis, significant correlations between hepatic volume and various hepatic tests which reflect the total functioning hepatic cell masses were found. Combinations of hepatic volume with ICG maximum removal rate and with serum cholinesterase activity were most useful for the assessment of prognosis in liver cirrhosis. These results indicated that estimation of hepatic volume by CT is useful for analysis of pathophysiology and prognosis of chronic liver diseases, and for diagnosis of alcoholic liver diseases. (author)

  14. A Web-based Distributed Voluntary Computing Platform for Large Scale Hydrological Computations

    Science.gov (United States)

    Demir, I.; Agliamzanov, R.

    2014-12-01

    Distributed volunteer computing can enable researchers and scientists to form large parallel computing environments that utilize the computing power of the millions of computers on the Internet, and use them to run large-scale environmental simulations and models that serve the common good of local communities and the world. Recent developments in web technologies and standards allow client-side scripting languages to run at speeds close to native applications and to utilize the power of Graphics Processing Units (GPU). Using a client-side scripting language like JavaScript, we have developed an open distributed computing framework that makes it easy for researchers to write their own hydrologic models and run them on volunteer computers. Users can easily enable their websites so that visitors can volunteer their computer resources to help run advanced hydrological models and simulations. Using a web-based system allows users to start volunteering their computational resources within seconds, without installing any software. The framework distributes the model simulation to thousands of nodes in small spatial and computational pieces. A relational database system is utilized for managing data connections and queue management for the distributed computing nodes. In this paper, we present a web-based distributed volunteer computing platform to enable large-scale hydrological simulations and model runs in an open and integrated environment.

  15. Computer-based control systems of nuclear power plants

    International Nuclear Information System (INIS)

    Kalashnikov, V.K.; Shugam, R.A.; Ol'shevsky, Yu.N.

    1975-01-01

    Computer-based control systems of nuclear power plants may be classified into those using computers for data acquisition only, those using computers for data acquisition and data processing, and those using computers for process control. In the present paper a brief review is given of the functions the above-mentioned systems perform, their applications in different nuclear power plants, and some of their characteristics. The trend towards hierarchic systems using control computers with backup reserves already becomes clear when one considers the control systems applied in the Canadian nuclear power plants, which were among the first equipped with process computers. The control system now under development for the large Soviet reactors of WWER type will also be based on the use of control computers. That part of the system concerned with controlling the reactor assembly is described in detail

  16. Basicities of Strong Bases in Water: A Computational Study

    OpenAIRE

    Kaupmees, Karl; Trummal, Aleksander; Leito, Ivo

    2014-01-01

    Aqueous pKa values of strong organic bases – DBU, TBD, MTBD, different phosphazene bases, etc – were computed with CPCM, SMD and COSMO-RS approaches. Explicit solvent molecules were not used. Direct computations and computations with reference pKa values were used. The latter were of two types: (1) reliable experimental aqueous pKa value of a reference base with structure similar to the investigated base or (2) reliable experimental pKa value in acetonitrile of the investigated base itself. ...

  17. Computer-based learning for the enhancement of breastfeeding ...

    African Journals Online (AJOL)

    In this study, computer-based learning (CBL) was explored in the context of breastfeeding training for undergraduate Dietetic students. Aim: To adapt and validate an Indian computer-based undergraduate breastfeeding training module for use by South African undergraduate Dietetic students. Methods and materials: The ...

  18. Women and Computer Based Technologies: A Feminist Perspective.

    Science.gov (United States)

    Morritt, Hope

    The use of computer based technologies by professional women in education is examined through a feminist standpoint theory in this paper. The theory is grounded in eight claims which form the basis of the conceptual framework for the study. The experiences of nine women participants with computer based technologies were categorized using three…

  19. A novel polar-based human face recognition computational model

    Directory of Open Access Journals (Sweden)

    Y. Zana

    2009-07-01

    Full Text Available Motivated by a recently proposed biologically inspired face recognition approach, we investigated the relation between human behavior and a computational model based on Fourier-Bessel (FB) spatial patterns. We measured human recognition performance of FB filtered face images using an 8-alternative forced-choice method. Test stimuli were generated by converting the images from the spatial to the FB domain, filtering the resulting coefficients with a band-pass filter, and finally taking the inverse FB transformation of the filtered coefficients. The performance of the computational models was tested using a simulation of the psychophysical experiment. In the FB model, face images were first filtered by simulated V1-type neurons and later analyzed globally for their content of FB components. In general, there was a higher human contrast sensitivity to radially than to angularly filtered images, but both functions peaked at the 11.3-16 frequency interval. The FB-based model presented similar behavior with regard to peak position and relative sensitivity, but had a wider frequency bandwidth and a narrower response range. The response patterns of two alternative models, based on local FB analysis and on raw luminance, strongly diverged from the human behavior patterns. These results suggest that human performance can be constrained by the type of information conveyed by polar patterns, and consequently that humans might use FB-like spatial patterns in face processing.

  20. AI tools in computer based problem solving

    Science.gov (United States)

    Beane, Arthur J.

    1988-01-01

    The use of computers to solve value-oriented, deterministic, algorithmic problems has evolved a structured life-cycle model of the software process. The symbolic processing techniques used, primarily in research, for solving nondeterministic problems, and those for which an algorithmic solution is unknown, have evolved a different, much less structured model. Traditionally, the two approaches have been used completely independently. With the advent of low-cost, high-performance 32-bit workstations executing identical software with large minicomputers and mainframes, it became possible to begin to merge both models into a single extended model of computer problem solving. The implementation of such an extended model on a VAX family of micro/mini/mainframe systems is described. Examples in both development and deployment of applications involving a blending of AI and traditional techniques are given.

  1. Personal computer based home automation system

    OpenAIRE

    Hellmuth, George F.

    1993-01-01

    The systems engineering process is applied in the development of the preliminary design of a home automation communication protocol. The objective of the communication protocol is to provide a means for a personal computer to communicate with adapted appliances in the home. A needs analysis is used to ascertain that a need exists for a home automation system. Numerous design alternatives are suggested and evaluated to determine the best possible protocol design. Coaxial cable...

  2. Status of radiation-based measurement technology

    International Nuclear Information System (INIS)

    Moon, B. S.; Lee, J. W.; Chung, C. E.; Hong, S. B.; Kim, J. T.; Park, W. M.; Kim, J. Y.

    1999-03-01

    This report describes the status of measurement equipment using radiation sources and of new technologies in this field. This report includes the development status in Korea together with a brief description of the technology development and application status in ten countries including France, America, and Japan. Also this report describes technical factors related to radiation-based measurement and trends in new technologies. Measurement principles are also described for the equipment that is widely used in radiation-based measurement, such as level measurement, density measurement, basis weight measurement, moisture measurement, and thickness measurement. (author). 7 refs., 2 tabs., 21 figs

  3. Development of a Computer Program for the Integrated Control of the Fuel Homogeneity Measurement System

    International Nuclear Information System (INIS)

    Shin, H. S.; Jang, J. W.; Lee, Y. H.; Oh, S. J.; Park, H. D.; Kim, C. K.

    2005-11-01

    The computer program is developed based on Visual C++, and is equipped with a user-friendly input/output (I/O) interface and a display function for the measuring conditions. This program consists of three parts: the port communication, PLC (Programmable Logic Controller) and MCA (Multi Channel Analyzer) control parts. The communication between the CPU of the PLC module box and the computer is selected to be of the RS-232 asynchronous type, and the thread method is adopted in the development of the first part of the program. The PLC-related program has been developed so that the data communication between the PLC CPU and the computer is harmonized with the unique commands already defined in the PLC. The measuring space and time intervals, the start and end ROI (region of interest) values, and the allowable error limit are input for each measurement in this program. Finally, the MCA control program has been developed using Canberra's programming library, which contains several files, including the header files in which the C++ variables and functions are declared according to the MCA functions. A performance test has been carried out by applying the developed computer program to the homogeneity measurement system. The gamma counts at 28 measuring points along a fuel rod of 700 mm in length are measured for 50 sec at each point. The measurement results are better than the previous ones in respect of measurement accuracy, and a saving in measurement time was achieved. It was concluded that the gamma measurement system can be improved by equipping it with the developed control program
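
    A homogeneity check of the kind such a system performs can be sketched briefly: compare the ROI count at each of the 28 axial measuring points with the rod mean and flag deviations beyond the allowable error limit. The limit and the data below are illustrative assumptions, not values from the report.

```python
# ROI-count homogeneity check along a fuel rod.
import numpy as np

def check_homogeneity(roi_counts, allowable_error=0.05):
    """Return (relative deviation per point, within-limit flag per point)."""
    counts = np.asarray(roi_counts, dtype=float)
    deviation = (counts - counts.mean()) / counts.mean()
    return deviation, np.abs(deviation) <= allowable_error

rng = np.random.default_rng(2)
counts = rng.normal(1.0e5, 2.0e3, 28)   # 28 measuring points, 50 s each
dev, ok = check_homogeneity(counts)
print(f"max deviation {np.abs(dev).max():.2%}, all within limit: {ok.all()}")
```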

  4. Computer-based literature search in medical institutions in India

    Directory of Open Access Journals (Sweden)

    Kalita Jayantee

    2007-01-01

    Full Text Available Aim: To study the use of computer-based literature search and its application in clinical training and patient care, as a surrogate marker of evidence-based medicine. Materials and Methods: A questionnaire comprising questions on purpose (presentation, patient management, research), realm (site accessed), nature and frequency of search, effect, infrastructure, formal training in computer-based literature search, and suggestions for further improvement was sent to residents and faculty of a Postgraduate Medical Institute (PGI) and a Medical College. The responses were compared among different subgroups of respondents. Results: Out of 300 subjects approached, 194 responded, of whom 103 were from the PGI and 91 from the Medical College. There were 97 specialty residents, 58 super-specialty residents and 39 faculty members. Computer-based literature search was done at least once a month by 89%, though there was marked variability in frequency and extent. The motivation for computer-based literature search was presentation in 90%, research in 65% and patient management in 60.3%. The benefit of the search was acknowledged in learning and teaching by 80%, in research by 65% and in patient care by 64.4% of respondents. Formal training in computer-based literature search had been received by 41%, of whom 80% were residents. Residents from the PGI did more frequent and more extensive computer-based literature searches, which was attributed to better infrastructure and training. Conclusion: Training and infrastructure are both crucial for computer-based literature search, which may translate into evidence-based medicine.

  5. SQUID-based measuring systems

    Indian Academy of Sciences (India)

    field produced by a given two-dimensional current density distribution is inverted using the Fourier transform technique. Keywords ... Superconducting quantum interference devices (SQUIDs) are the most sensitive detectors for measurement of ... geomagnetic prospecting, detection of gravity waves etc. Judging the importance ...

  6. Computer processing of the Δlambda/lambda measured results

    International Nuclear Information System (INIS)

    Draguniene, V.J.; Makariuniene, E.K.

    1979-01-01

    For the processing of the experimental data on the influence of the chemical environment on the radioactive decay constants, five programs have been written in the Fortran language, in the version for the DUBNA monitoring system on the BESM-6 computer. Each program corresponds to a definite stage of data processing and yields a definite answer. The first and second programs calculate the ratio of the pulse numbers measured with different sources and the mean value of the dispersions. The third program averages the ratios of the pulse numbers. The fourth and fifth determine the change of the radioactive decay constant. The created programs permit the processing of the experimental data beginning from the values of the pulse numbers obtained directly in the experiments. The programs allow one to treat a file of experimental results and to calculate the various errors at all stages of the calculation. The printout of the obtained results is convenient for use
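
    The first processing stage can be illustrated with a short sketch: the ratio of pulse numbers from two sources with Poisson counting errors, followed by a weighted mean over repeated runs. This shows the kind of computation involved, not the original Fortran programs.

```python
# Pulse-number ratios with Poisson errors and their weighted mean.
import numpy as np

def ratio_with_error(n1, n2):
    """Ratio of two pulse counts and its Poisson-statistics error."""
    r = n1 / n2
    return r, r * np.sqrt(1.0 / n1 + 1.0 / n2)

def weighted_mean(values, sigmas):
    w = 1.0 / np.asarray(sigmas) ** 2
    return np.sum(w * values) / np.sum(w), 1.0 / np.sqrt(np.sum(w))

runs = [(1_000_123, 999_877), (1_000_456, 999_612), (999_998, 1_000_205)]
ratios, errs = zip(*(ratio_with_error(a, b) for a, b in runs))
print(weighted_mean(ratios, errs))  # mean ratio and its uncertainty
```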

  7. Interactive computer-assisted instruction in acid-base physiology for mobile computer platforms.

    Science.gov (United States)

    Longmuir, Kenneth J

    2014-03-01

    In this project, the traditional lecture hall presentation of acid-base physiology in the first-year medical school curriculum was replaced by interactive, computer-assisted instruction designed primarily for the iPad and other mobile computer platforms. Three learning modules were developed, each with ∼20 screens of information, on the subjects of the CO2-bicarbonate buffer system, other body buffer systems, and acid-base disorders. Five clinical case modules were also developed. For the learning modules, the interactive, active learning activities were primarily step-by-step learner control of explanations of complex physiological concepts, usually presented graphically. For the clinical cases, the active learning activities were primarily question-and-answer exercises that related clinical findings to the relevant basic science concepts. The student response was remarkably positive, with the interactive, active learning aspect of the instruction cited as the most important feature. Also, students cited the self-paced instruction, extensive use of interactive graphics, and side-by-side presentation of text and graphics as positive features. Most students reported that it took less time to study the subject matter with this online instruction compared with subject matter presented in the lecture hall. However, the approach to learning was highly examination driven, with most students delaying the study of the subject matter until a few days before the scheduled examination. Wider implementation of active learning computer-assisted instruction will require that instructors present subject matter interactively, that students fully embrace the responsibilities of independent learning, and that institutional administrations measure instructional effort by criteria other than scheduled hours of instruction.
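
    The CO2-bicarbonate buffer module lends itself to a small worked example: the Henderson-Hasselbalch equation for plasma, pH = pKa + log10([HCO3-] / (0.03 x pCO2)), with pKa = 6.1 and pCO2 in mmHg.

```python
# Henderson-Hasselbalch calculation for the CO2-bicarbonate buffer system.
import math

def plasma_ph(hco3_mmol_per_l, pco2_mmhg, pka=6.1, co2_solubility=0.03):
    return pka + math.log10(hco3_mmol_per_l / (co2_solubility * pco2_mmhg))

print(round(plasma_ph(24, 40), 2))  # normal values -> ~7.40
print(round(plasma_ph(24, 60), 2))  # acute respiratory acidosis -> ~7.22
```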

  8. European wet deposition maps based on measurements

    NARCIS (Netherlands)

    Leeuwen EP van; Erisman JW; Draaijers GPJ; Potma CJM; Pul WAJ van; LLO

    1995-01-01

    To date, wet deposition maps on a European scale have been based on long-range transport model results. For most components wet deposition maps based on measurements are only available on national scales. Wet deposition maps of acidifying components and base cations based on measurements are needed

  9. Computation and measurement of air temperature distribution of an industrial melt blowing die

    Directory of Open Access Journals (Sweden)

    Wu Li-Li

    2014-01-01

    Full Text Available The air flow field of the dual slot die on HDF-6D melt blowing nonwoven equipment is computed numerically. A temperature measurement system is built to measure the air temperatures. The computed results tally with the measured results, proving the correctness of the computation. The results are of great value for actual melt blowing production.

  10. Computed tomography measurement of rib cage morphometry in emphysema.

    Directory of Open Access Journals (Sweden)

    Nicola Sverzellati

    Full Text Available BACKGROUND: Factors determining the shape of the human rib cage are not completely understood. We aimed to quantify the contribution of anthropometric and COPD-related changes to rib cage variability in adult cigarette smokers. METHODS: Rib cage diameters and areas (calculated from the inner surface of the rib cage in 816 smokers with or without COPD, were evaluated at three anatomical levels using computed tomography (CT. CTs were analyzed with software, which allows quantification of total emphysema (emphysema%. The relationship between rib cage measurements and anthropometric factors, lung function indices, and %emphysema were tested using linear regression models. RESULTS: A model that included gender, age, BMI, emphysema%, forced expiratory volume in one second (FEV1%, and forced vital capacity (FVC% fit best with the rib cage measurements (R(2 = 64% for the rib cage area variation at the lower anatomical level. Gender had the biggest impact on rib cage diameter and area (105.3 cm(2; 95% CI: 111.7 to 98.8 for male lower area. Emphysema% was responsible for an increase in size of upper and middle CT areas (up to 5.4 cm(2; 95% CI: 3.0 to 7.8 for an emphysema increase of 5%. Lower rib cage areas decreased as FVC% decreased (5.1 cm(2; 95% CI: 2.5 to 7.6 for 10 percentage points of FVC variation. CONCLUSIONS: This study demonstrates that simple CT measurements can predict rib cage morphometric variability and also highlight relationships between rib cage morphometry and emphysema.

  11. Interactive Computer-Assisted Instruction in Acid-Base Physiology for Mobile Computer Platforms

    Science.gov (United States)

    Longmuir, Kenneth J.

    2014-01-01

    In this project, the traditional lecture hall presentation of acid-base physiology in the first-year medical school curriculum was replaced by interactive, computer-assisted instruction designed primarily for the iPad and other mobile computer platforms. Three learning modules were developed, each with ~20 screens of information, on the subjects…

  12. Computer Assisted Project-Based Instruction: The Effects on Science Achievement, Computer Achievement and Portfolio Assessment

    Science.gov (United States)

    Erdogan, Yavuz; Dede, Dinçer

    2015-01-01

    The purpose of this study is to compare the effects of computer assisted project-based instruction on learners' achievement in a science and technology course, in a computer course and in portfolio development. With this aim in mind, a quasi-experimental design was used and a sample of 70 seventh grade secondary school students from Org. Esref…

  13. From computing with numbers to computing with words-from manipulation of measurements to manipulation of perceptions

    Science.gov (United States)

    Zadeh, Lotfi A.

    2001-06-01

    Computing, in its usual sense, is centered on manipulation of numbers and symbols. In contrast, computing with words, or CW for short, is a methodology in which the objects of computation are words and propositions drawn from a natural language, e.g., small, large, far, heavy, not very likely, the price of gas is low and declining, Berkeley is near San Francisco, it is very unlikely that there will be a significant increase in the price of oil in the near future, etc. Computing with words is inspired by the remarkable human capability to perform a wide variety of physical and mental tasks without any measurements and any computations. Familiar examples of such tasks are parking a car, driving in heavy traffic, playing golf, riding a bicycle, understanding speech and summarizing a story. Underlying this remarkable capability is the brain's crucial ability to manipulate perceptions-perceptions of distance, size, weight, color, speed, time, direction, force, number, truth, likelihood and other characteristics of physical and mental objects. Manipulation of perceptions plays a key role in human recognition, decision and execution processes. As a methodology, computing with words provides a foundation for a computational theory of perceptions-a theory which may have an important bearing on how humans make-and machines might make-perception-based rational decisions in an environment of imprecision, uncertainty and partial truth. A basic difference between perceptions and measurements is that, in general, measurements are crisp whereas perceptions are fuzzy. One of the fundamental aims of science has been and continues to be that of progressing from perceptions to measurements. Pursuit of this aim has led to brilliant successes. We have sent men to the moon; we can build computers that are capable of performing billions of computations per second; we have constructed telescopes that can explore the far reaches of the universe; and we can date the age of rocks that are

  14. A Computer-Based Simulation of an Acid-Base Titration

    Science.gov (United States)

    Boblick, John M.

    1971-01-01

    Reviews the advantages of computer simulated environments for experiments, referring in particular to acid-base titrations. Includes pre-lab instructions and a sample computer printout of a student's use of an acid-base simulation. Ten references. (PR)

  15. The development and application of a coincidence measurement apparatus with micro-computer system

    International Nuclear Information System (INIS)

    Du Hongshan; Zhou Youpu; Gao Junlin; Qin Deming; Cao Yunzheng; Zhao Shiping

    1987-01-01

    A coincidence measurement apparatus with micro-computer system is developed. Automatic data acquisition and processing are achieved. Results of its application for radioactive measurement are satisfactory

  16. Measurement of bone mineral density in the tunnel regions for anterior cruciate ligament reconstruction by dual-energy X-ray absorptiometry, computed tomography scan, and the immersion technique based on Archimedes' principle.

    Science.gov (United States)

    Tie, Kai; Wang, Hua; Wang, Xin; Chen, Liaobin

    2012-10-01

    To determine, for anterior cruciate ligament (ACL) reconstruction, whether the bone mineral density (BMD) of the femoral tunnel is higher than that of the tibial tunnel, to provide objective evidence for choosing the appropriate diameter of interference screws. Two groups were enrolled. One group comprised 30 normal volunteers, and the other comprised 9 patients with ACL rupture. Dual-energy X-ray absorptiometry was used to measure the BMD of the femoral and tibial tunnel regions of the volunteers' right knees by choosing a circular area covering the screw fixation region. The knees were also scanned by spiral computed tomography (CT), and the 3-dimensional reconstruction technique was used to determine the circular sections passing through the longitudinal axes of the femoral and tibial tunnels. Grayscale CT values of the cross-sectional area were measured. Cylindrical cancellous bone blocks were removed from the femoral and tibial tunnels during ACL reconstruction in the patients. The volumetric BMD of the bone blocks was measured using a standardized immersion technique based on Archimedes' principle. As measured by dual-energy X-ray absorptiometry, the BMD of the femoral and tibial tunnel regions was 1.162 ± 0.034 g/cm² and 0.814 ± 0.038 g/cm², respectively, a significant difference, and the other measurement methods likewise showed a significant difference between the femoral and tibial tunnel regions. For ACL reconstruction, the BMD of the femoral tunnel is higher than that of the tibial tunnel. This implies that a proportionally larger-diameter interference screw should be used for fixation in the proximal tibia than that used for fixation in the distal femur. Level IV, therapeutic case series. Copyright © 2012 Arthroscopy Association of North America. Published by Elsevier Inc. All rights reserved.
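
    The immersion measurement rests on a one-line application of Archimedes' principle: the block's volume follows from its buoyant mass loss in water, and density is dry mass over volume. The sketch below uses illustrative values, not data from the study.

```python
# Volumetric density of a bone block by Archimedes' principle.
WATER_DENSITY = 0.9982  # g/cm^3 near 20 degrees C

def volumetric_density(mass_in_air_g, mass_submerged_g):
    """Density (g/cm^3) from dry mass and apparent mass in water."""
    volume_cm3 = (mass_in_air_g - mass_submerged_g) / WATER_DENSITY
    return mass_in_air_g / volume_cm3

# a 2.50 g cancellous block that 'weighs' 0.55 g when submerged:
print(f"{volumetric_density(2.50, 0.55):.3f} g/cm^3")  # ~1.280
```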

  17. Spin-based quantum computation in multielectron quantum dots

    OpenAIRE

    Hu, Xuedong; Sarma, S. Das

    2001-01-01

    In a quantum computer the hardware and software are intrinsically connected because the quantum Hamiltonian (or more precisely its time development) is the code that runs the computer. We demonstrate this subtle and crucial relationship by considering the example of an electron-spin-based solid-state quantum computer in semiconductor quantum dots. We show that multielectron quantum dots with one valence electron in the outermost shell do not behave simply as an effective single spin system unles...

  18. High Available COTS Based Computer for Space

    Science.gov (United States)

    Hartmann, J.; Magistrati, Giorgio

    2015-09-01

    The availability and reliability factors of a system are central requirements of a target application. From a simple fuel injection system used in cars up to the flight control system of an autonomously navigating spacecraft, each application defines its specific availability factor under the target application's boundary conditions. Increasing quality requirements on data processing systems used in space flight applications call for new architectures to fulfill the availability and reliability requirements as well as the increase in required data processing power. Contrary to the increased quality requirements, simplification and the use of COTS components to decrease costs, while keeping interface compatibility with currently used system standards, are clear customer needs. Data processing system design is mostly dominated by the strict fulfillment of customer requirements, and reuse of available computer systems was not always possible, owing to the obsolescence of EEE parts, insufficient I/O capabilities, or the fact that available data processing systems did not provide the required scalability and performance.

  19. Many-core computing for space-based stereoscopic imaging

    Science.gov (United States)

    McCall, Paul; Torres, Gildo; LeGrand, Keith; Adjouadi, Malek; Liu, Chen; Darling, Jacob; Pernicka, Henry

    The potential benefits of using parallel computing in real-time visual-based satellite proximity operations missions are investigated. Improvements in performance and relative navigation solutions over single thread systems can be achieved through multi- and many-core computing. Stochastic relative orbit determination methods benefit from the higher measurement frequencies, allowing them to more accurately determine the associated statistical properties of the relative orbital elements. More accurate orbit determination can lead to reduced fuel consumption and extended mission capabilities and duration. Inherent to the process of stereoscopic image processing is the difficulty of loading, managing, parsing, and evaluating large amounts of data efficiently, which may result in delays or highly time consuming processes for single (or few) processor systems or platforms. In this research we utilize the Single-Chip Cloud Computer (SCC), a fully programmable 48-core experimental processor, created by Intel Labs as a platform for many-core software research, provided with a high-speed on-chip network for sharing information along with advanced power management technologies and support for message-passing. The results from utilizing the SCC platform for the stereoscopic image processing application are presented in the form of Performance, Power, Energy, and Energy-Delay-Product (EDP) metrics. Also, a comparison between the SCC results and those obtained from executing the same application on a commercial PC are presented, showing the potential benefits of utilizing the SCC in particular, and any many-core platforms in general for real-time processing of visual-based satellite proximity operations missions.

  20. An introduction to statistical computing a simulation-based approach

    CERN Document Server

    Voss, Jochen

    2014-01-01

    A comprehensive introduction to sampling-based methods in statistical computing The use of computers in mathematics and statistics has opened up a wide range of techniques for studying otherwise intractable problems.  Sampling-based simulation techniques are now an invaluable tool for exploring statistical models.  This book gives a comprehensive introduction to the exciting area of sampling-based methods. An Introduction to Statistical Computing introduces the classical topics of random number generation and Monte Carlo methods.  It also includes some advanced met

  1. Measurement of meat color using a computer vision system.

    Science.gov (United States)

    Girolami, Antonio; Napolitano, Fabio; Faraone, Daniela; Braghieri, Ada

    2013-01-01

    The limits of the colorimeter and of a technique of image analysis in evaluating the color of beef, pork, and chicken were investigated. The Minolta CR-400 colorimeter and a computer vision system (CVS) were employed to measure colorimetric characteristics. To evaluate the chromatic fidelity of the image of the sample displayed on the monitor, a similarity test was carried out using a trained panel. The panelists compared the meat sample and the sample image on the monitor in order to evaluate the similarity between them (test A). Moreover, the panelists were asked to evaluate the similarity between two colors, both generated by the software Adobe Photoshop CS3, one using the L, a and b values read by the colorimeter and the other obtained using the CVS (test B); which of the two colors was more similar to the sample visualized on the monitor was also assessed (test C). The panelists found the digital images very similar to the actual samples, whereas when comparing the two generated colors they found significant differences between them, and the color of the sample on the monitor was more similar to the CVS-generated color than to the colorimeter-generated color. The differences between the values of L, a, b, hue angle and chroma obtained with the CVS and the colorimeter were statistically significant; thus, the colorimeter did not generate colors similar to the color of meat. Instead, the CVS method seemed to give valid measurements that reproduced a color very similar to the real one. Copyright © 2012 Elsevier Ltd. All rights reserved.
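
    Readings from the two instruments can be compared with a standard color difference. The sketch below computes the CIE76 difference Delta E*ab between two L*a*b* triplets; the sample values are illustrative, not data from the study.

```python
# CIE76 color difference between a colorimeter and a CVS reading.
import math

def delta_e_cie76(lab1, lab2):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

colorimeter = (38.2, 19.5, 8.7)   # L*, a*, b* (assumed beef sample values)
cvs         = (41.0, 23.1, 11.9)
print(f"Delta E*ab = {delta_e_cie76(colorimeter, cvs):.2f}")
# differences above roughly 3 are generally noticeable to human observers
```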

  2. Computation of atmospheric dispersion coefficients from measurements of turbulence parameters

    International Nuclear Information System (INIS)

    Asculai, E.

    1975-04-01

    Some of the spectra of turbulence found in the literature are theoretical and some are experimental. The present work investigates the dependence of the dispersion coefficients (σ_y especially) on the shape of the spectrum, using the theoretical and experimental data found in the literature. It seems that, contrary to accepted concepts, the value of P (in the relation σ ∝ T^P) is larger under stable than under unstable conditions. These values are of order 1, which does not agree with Taylor's asymptotic value of 1/2. The influence of the characteristics of the instrument, especially the time constant, on the estimation of σ_y is discussed. An inaccurate estimate of σ_y may result in underestimating concentrations by an order of magnitude (or even more). The results of the computations of σ_y for various release times given here enable a more accurate estimate of those concentrations. The results of a series of measurements demonstrating the principles discussed are presented, indicating a practical way of estimating the dispersion coefficients. (author)
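
    The practical impact of the exponent can be seen in a small worked example of the power law σ_y = a·T^P: the same release, evaluated with P = 1 instead of Taylor's asymptotic P = 1/2, gives plume widths that diverge by nearly an order of magnitude over an hour, which is how concentration estimates can be off by the amount stated above. The scale factor a is arbitrary here.

```python
# Growth of the lateral dispersion coefficient with release time, sigma_y = a * T**P.
def sigma_y(T_seconds, P, a=1.0):
    return a * T_seconds ** P

for T in (60, 600, 3600):
    print(T, round(sigma_y(T, P=0.5), 1), round(sigma_y(T, P=1.0), 1))
# From 60 s to 3600 s the P = 1 width grows 60-fold, the P = 1/2 width only
# ~7.7-fold, so assuming the wrong exponent changes sigma_y by almost 10x.
```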

  3. Computer-Based Self-Instructional Modules. Final Technical Report.

    Science.gov (United States)

    Weinstock, Harold

    Reported is a project involving seven chemists, six mathematicians, and six physicists in the production of computer-based, self-study modules for use in introductory college courses in chemistry, physics, and mathematics. These modules were designed to be used by students and instructors with little or no computer backgrounds, in institutions…

  4. Touch-based Brain Computer Interfaces: State of the art

    NARCIS (Netherlands)

    Erp, J.B.F. van; Brouwer, A.M.

    2014-01-01

    Brain Computer Interfaces (BCIs) rely on the user's brain activity to control equipment or computer devices. Many BCIs are based on imagined movement (called active BCIs) or the fact that brain patterns differ in reaction to relevant or attended stimuli in comparison to irrelevant or unattended

  5. Strategic Planning for Computer-Based Educational Technology.

    Science.gov (United States)

    Bozeman, William C.

    1984-01-01

    Offers educational practitioners direction for the development of a master plan for the implementation and application of computer-based educational technology by briefly examining computers in education, discussing organizational change from a theoretical perspective, and presenting an overview of the planning strategy known as the planning and…

  6. Developing Educational Computer Animation Based on Human Personality Types

    Science.gov (United States)

    Musa, Sajid; Ziatdinov, Rushan; Sozcu, Omer Faruk; Griffiths, Carol

    2015-01-01

    Computer animation in the past decade has become one of the most noticeable features of technology-based learning environments. By its definition, it refers to simulated motion pictures showing movement of drawn objects, and is often defined as the art in movement. Its educational application known as educational computer animation is considered…

  7. Computer-Based Interaction Analysis with DEGREE Revisited

    Science.gov (United States)

    Barros, B.; Verdejo, M. F.

    2016-01-01

    We review our research with "DEGREE" and analyse how our work has impacted the collaborative learning community since 2000. Our research is framed within the context of computer-based interaction analysis and the development of computer-supported collaborative learning (CSCL) tools. We identify some aspects of our work which have been…

  8. Development of Computer-Based Resources for Textile Education.

    Science.gov (United States)

    Hopkins, Teresa; Thomas, Andrew; Bailey, Mike

    1998-01-01

    Describes the production of computer-based resources for students of textiles and engineering in the United Kingdom. Highlights include funding by the Teaching and Learning Technology Programme (TLTP), courseware author/subject expert interaction, usage test and evaluation, authoring software, graphics, computer-aided design simulation, self-test…

  9. Personal Computer (PC) based image processing applied to fluid mechanics

    Science.gov (United States)

    Cho, Y.-C.; Mclachlan, B. G.

    1987-01-01

    A PC based image processing system was employed to determine the instantaneous velocity field of a two-dimensional unsteady flow. The flow was visualized using a suspension of seeding particles in water, and a laser sheet for illumination. With a finite time exposure, the particle motion was captured on a photograph as a pattern of streaks. The streak pattern was digitized and processed using various imaging operations, including contrast manipulation, noise cleaning, filtering, statistical differencing, and thresholding. Information concerning the velocity was extracted from the enhanced image by measuring the length and orientation of the individual streaks. The fluid velocities deduced from the randomly distributed particle streaks were interpolated to obtain velocities at uniform grid points. For the interpolation a simple convolution technique with an adaptive Gaussian window was used. The results are compared with a numerical prediction by a Navier-Stokes computation.
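
    The interpolation step can be sketched briefly, under the simplifying assumption of a fixed (rather than adaptive) Gaussian window width: each grid point receives a Gaussian-weighted average of the scattered streak velocities.

```python
# Gaussian-window interpolation of scattered velocities onto a regular grid.
import numpy as np

def gaussian_window_interp(xy, v, grid_x, grid_y, sigma=0.1):
    gx, gy = np.meshgrid(grid_x, grid_y)
    num = np.zeros_like(gx)
    den = np.zeros_like(gx)
    for (px, py), val in zip(xy, v):
        w = np.exp(-((gx - px) ** 2 + (gy - py) ** 2) / (2 * sigma ** 2))
        num += w * val
        den += w
    return num / np.maximum(den, 1e-12)  # avoid division by zero far from data

pts = np.random.default_rng(3).random((50, 2))  # random streak midpoints in [0,1]^2
vel = np.sin(2 * np.pi * pts[:, 0])             # synthetic velocity component
grid = np.linspace(0.0, 1.0, 20)
print(gaussian_window_interp(pts, vel, grid, grid).shape)  # (20, 20)
```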

  10. The relationship between pharyngeal morphology measured with cone-beam computed tomography and maxillary morphology measured by lateral cephalogram

    International Nuclear Information System (INIS)

    Yamaguchi, Fumie; Yamaguchi, Tetsutaro; Miyamoto, Asami; Maki, Koutaro

    2007-01-01

    This study examined the relationship between pharyngeal morphology measured with cone-beam computed tomography (CBCT) and maxillary morphology measured from lateral cephalograms. The subjects comprised 45 women, with a mean age of 27.9 years (range, 16-50 years), who attended the Department of Orthodontics at Showa University. The evaluation of pharyngeal morphology was based on 9 variables measured by CBCT: pharyngeal space volume, pharyngeal vertical length, pharyngeal sagittal length, pharyngeal coronal length, epiglottis length, epiglottis width, the distance from the genion to the hyoidale, the distance from the hyoidale to the aditus larynges base, and the distance from the aditus larynges base to the genion. Maxillary morphology was evaluated from 5 measured sites: SNA, S'-Ptm', A'-Ptm', the occlusal plane angle, and the palatal plane angle. Pearson's correlation coefficient was used to detect associations between pharyngeal and maxillary morphological variables. There were significant correlations between pharyngeal coronal length and SNA, the distance from the genion to the hyoidale and the occlusal plane angle, pharyngeal coronal length and A'-Ptm', pharyngeal vertical length and the palatal plane angle, as well as the aditus larynges base to the genion and the occlusal plane. This information has potential clinical value for better understanding obstructive sleep apnea in adult patients, and for structurally based treatments such as surgical orthodontics. (author)

  11. Quantified measurement of brain blood volume: comparative evaluations between single photon emission computed tomography and positron computed tomography

    International Nuclear Information System (INIS)

    Bouvard, G.; Fernandez, Y.; Petit-Taboue, M.C.; Derlon, J.M.; Travere, J.M.; Le Poec, C.

    1991-01-01

    The quantified measurement of cerebral blood volume is of interest for studies of brain blood circulation. This measurement is often performed with positron computed tomography. It is more difficult in single photon emission computed tomography, owing to physical problems associated with the limited resolution of the detector, the Compton effect, and photon attenuation. The objective of this study is to compare the results of these two techniques. The quantified measurement of brain blood volume is possible with single photon emission computed tomography; however, there is a loss of contrast

  12. Computer code structure for evaluation of fire protection measures and fighting capability at nuclear plants

    International Nuclear Information System (INIS)

    Anton, V.

    1997-01-01

    In this work a computer code structure for Fire Protection Measures (FPM) and Fire Fighting Capability (FFC) at Nuclear Power Plants (NPP) is presented. It allows evaluation of the category (satisfactory (s), needs further evaluation (n), unsatisfactory (u)) to which a given NPP belongs, as a self-check in view of an IAEA inspection. This possibility of self-assessment follows from IAEA documents. Our approach is based on international experience gained in this field and stated in IAEA recommendations. As an illustration, we used FORTRAN programming language statements to make clear the structure of the computer code for the problem at hand. The program can be written so that literal messages in English and Romanian are displayed beside the percentage assessments. (author)

  13. Measuring the impact of computer resource quality on the software development process and product

    Science.gov (United States)

    Mcgarry, Frank; Valett, Jon; Hall, Dana

    1985-01-01

    The availability and quality of computer resources during the software development process was speculated to have measurable, significant impact on the efficiency of the development process and the quality of the resulting product. Environment components such as the types of tools, machine responsiveness, and quantity of direct access storage may play a major role in the effort to produce the product and in its subsequent quality as measured by factors such as reliability and ease of maintenance. During the past six years, the NASA Goddard Space Flight Center has conducted experiments with software projects in an attempt to better understand the impact of software development methodologies, environments, and general technologies on the software process and product. Data was extracted and examined from nearly 50 software development projects. All were related to support of satellite flight dynamics ground-based computations. The relationship between computer resources and the software development process and product as exemplified by the subject NASA data was examined. Based upon the results, a number of computer resource-related implications are provided.

  14. A low cost computer-controlled electrochemical measurement system for education and research

    International Nuclear Information System (INIS)

    Cottis, R.A.

    1989-01-01

    With the advent of low cost computers of significant processing power, it has become economically attractive, as well as offering practical advantages, to replace conventional electrochemical instrumentation with computer-based equipment. For example, the equipment to be described can perform all of the functions required for the measurement of a potentiodynamic polarization curve, replacing the conventional arrangement of sweep generator, potentiostat and chart recorder at a cost (based on the purchase cost of parts) which is less than that of most chart recorders alone. Additionally, the use of computer control at a relatively low level provides a versatility (assuming the development of suitable software) which cannot easily be matched by conventional instruments. As a result of these considerations a simple computer-controlled electrochemical measurement system has been developed, with a primary aim being its use in teaching an MSc class in corrosion science and engineering, with additional applications in MSc and PhD research. For educational reasons the design of the user interface has tried to make the internal operation of the unit as obvious as possible, and thereby minimize the tendency for students to treat the unit as a 'black box' with incomprehensible inner workings. This has resulted in a unit in which the three main components of function generator, potentiostat and recorder are presented as independent areas on the front panel, and can be configured by the user in exactly the same way as conventional instruments. (author) 11 figs

  15. A low cost computer-controlled electrochemical measurement system for education and research

    Energy Technology Data Exchange (ETDEWEB)

    Cottis, R A [Manchester Univ. (UK). Inst. of Science and Technology

    1989-01-01

    With the advent of low cost computers of significant processing power, it has become economically attractive, as well as offering practical advantages, to replace conventional electrochemical instrumentation with computer-based equipment. For example, the equipment to be described can perform all of the functions required for the measurement of a potentiodynamic polarization curve, replacing the conventional arrangement of sweep generator, potentiostat and chart recorder at a cost (based on the purchase cost of parts) which is less than that of most chart recorders alone. Additionally, the use of computer control at a relatively low level provides a versatility (assuming the development of suitable software) which cannot easily be matched by conventional instruments. As a result of these considerations a simple computer-controlled electrochemical measurement system has been developed, with a primary aim being its use in teaching an MSc class in corrosion science and engineering, with additional applications in MSc and PhD research. For educational reasons the design of the user interface has tried to make the internal operation of the unit as obvious as possible, and thereby minimize the tendency for students to treat the unit as a 'black box' with incomprehensible inner workings. This has resulted in a unit in which the three main components of function generator, potentiostat and recorder are presented as independent areas on the front panel, and can be configured by the user in exactly the same way as conventional instruments. (author) 11 figs.

  16. Measuring Human Performance within Computer Security Incident Response Teams

    Energy Technology Data Exchange (ETDEWEB)

    McClain, Jonathan T. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Silva, Austin Ray [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Avina, Glory Emmanuel [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Forsythe, James C. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-09-01

    Human performance has become a pertinent issue within cyber security. However, this research has been stymied by the limited availability of expert cyber security professionals. This is partly attributable to the ongoing workload faced by cyber security professionals, which is compounded by the limited number of qualified personnel and turnover of personnel across organizations. Additionally, it is difficult to conduct research, and particularly, openly published research, due to the sensitivity inherent to cyber operations at most organizations. As an alternative, the current research has focused on data collection during cyber security training exercises. These events draw individuals with a range of knowledge and experience extending from seasoned professionals to recent college graduates to college students. The current paper describes research involving data collection at two separate cyber security exercises. This data collection involved multiple measures which included behavioral performance based on human-machine transactions and questionnaire-based assessments of cyber security experience.

  17. Application of data base management systems for developing experimental data base using ES computers

    International Nuclear Information System (INIS)

    Vasil'ev, V.I.; Karpov, V.V.; Mikhajlyuk, D.N.; Ostroumov, Yu.A.; Rumyantsev, A.N.

    1987-01-01

    Modern data base management systems (DBMS) are widely used for the development and operation of various data bases in data processing systems for economics, planning and management. Up to now, however, the development and operation of masses of experimental physical data on ES computers has been based mainly on the traditional technology of sequential or index-sequential files. The principal conditions for the applicability of DBMS technology to compiling and operating data bases of physical experiment data are formulated on the basis of an analysis of DBMS capabilities. It is shown that application of DBMS makes it possible to essentially reduce the overall computational resources needed for development and operation of data bases and to decrease the amount of stored experimental data when analyzing the information content of data

  18. Computer Aided Design Parameters for Forward Basing

    Science.gov (United States)

    1988-12-01

    This is a professional drawing package capable of the manipulation required for this project. With the AutoLISP programming language (a variation on...Table 2). Data Conversion Package II: GWN System's Digital Terrain Modeling (DTM) package was used. This AutoLISP-based third party software is...Base Module of GWN System's GWN-DTM software. A simple AutoLISP conversion program (TA2DXF, TA2DXB) within the software converts the TA2 format into an

  19. An Empirical Measure of Computer Security Strength for Vulnerability Remediation

    Science.gov (United States)

    Villegas, Rafael

    2010-01-01

    Remediating all vulnerabilities on computer systems in a timely and cost effective manner is difficult given that the window of time between the announcement of a new vulnerability and an automated attack has decreased. Hence, organizations need to prioritize the vulnerability remediation process on their computer systems. The goal of this…

  20. All-optical reservoir computer based on saturation of absorption.

    Science.gov (United States)

    Dejonckheere, Antoine; Duport, François; Smerieri, Anteo; Fang, Li; Oudar, Jean-Louis; Haelterman, Marc; Massar, Serge

    2014-05-05

    Reservoir computing is a new bio-inspired computation paradigm. It exploits a dynamical system driven by a time-dependent input to carry out computation. For efficient information processing, only a few parameters of the reservoir need to be tuned, which makes it a promising framework for hardware implementation. Recently, electronic, opto-electronic and all-optical experimental reservoir computers were reported. In those implementations, the nonlinear response of the reservoir is provided by active devices such as optoelectronic modulators or optical amplifiers. By contrast, we propose here the first reservoir computer based on a fully passive nonlinearity, namely the saturable absorption of a semiconductor mirror. Our experimental setup constitutes an important step towards the development of ultrafast low-consumption analog computers.
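
    Although the implementation above is optical, the reservoir paradigm is easy to demonstrate in software. Below is a minimal, hypothetical Python/NumPy echo-state sketch: a fixed random nonlinear dynamical system is driven by the input, and only a linear readout is trained (here by ridge regression) on a toy delayed-recall task. It illustrates the "few tuned parameters" property, not the authors' saturable-absorber setup.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    N, T = 100, 2000                       # reservoir size, number of samples
    Win = rng.uniform(-0.5, 0.5, (N, 1))   # fixed random input weights
    W = rng.uniform(-0.5, 0.5, (N, N))     # fixed random reservoir weights
    W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius < 1

    u = rng.uniform(-1, 1, T)              # input signal
    target = np.roll(u, 3)                 # task: recall the input 3 steps back

    x = np.zeros(N)
    states = np.empty((T, N))
    for t in range(T):
        x = np.tanh(W @ x + Win[:, 0] * u[t])   # input-driven nonlinear dynamics
        states[t] = x

    # linear readout trained by ridge regression (the only trained part)
    lam = 1e-6
    Wout = np.linalg.solve(states.T @ states + lam * np.eye(N), states.T @ target)
    print("train NMSE:", np.mean((states @ Wout - target)**2) / np.var(target))
    ```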

  1. An Evaluation of Windows-Based Computer Forensics Application Software Running on a Macintosh

    Directory of Open Access Journals (Sweden)

    Gregory H. Carlton

    2008-09-01

    Full Text Available The two most common computer forensics applications run exclusively on Microsoft Windows Operating Systems, yet contemporary computer forensics examinations frequently encounter one or more of the three most common operating system environments, namely Windows, OS-X, or some form of UNIX or Linux. Additionally, government and private computer forensics laboratories frequently encounter budget constraints that limit their access to computer hardware. Currently, Macintosh computer systems are marketed with the ability to accommodate these three common operating system environments, including Windows XP in native and virtual environments. We performed a series of experiments to measure the functionality and performance of the two most commonly used Windows-based computer forensics applications on a Macintosh running Windows XP in native mode and in two virtual environments relative to a similarly configured Dell personal computer. The research results are directly beneficial to practitioners, and the process illustrates effective pedagogy whereby students were engaged in applied research.

  2. Precision of lumbar intervertebral measurements: does a computer-assisted technique improve reliability?

    Science.gov (United States)

    Pearson, Adam M; Spratt, Kevin F; Genuario, James; McGough, William; Kosman, Katherine; Lurie, Jon; Sengupta, Dilip K

    2011-04-01

    Comparison of intra- and interobserver reliability of digitized manual and computer-assisted intervertebral motion measurements and classification of "instability." To determine if computer-assisted measurement of lumbar intervertebral motion on flexion-extension radiographs improves reliability compared with digitized manual measurements. Many studies have questioned the reliability of manual intervertebral measurements, although few have compared the reliability of computer-assisted and manual measurements on lumbar flexion-extension radiographs. Intervertebral rotation, anterior-posterior (AP) translation, and change in anterior and posterior disc height were measured with a digitized manual technique by three physicians and by three other observers using computer-assisted quantitative motion analysis (QMA) software. Each observer measured 30 sets of digital flexion-extension radiographs (L1-S1) twice. Shrout-Fleiss intraclass correlation coefficients for intra- and interobserver reliabilities were computed. The stability of each level was also classified (instability defined as >4 mm AP translation or 10° rotation), and the intra- and interobserver reliabilities of the two methods were compared using adjusted percent agreement (APA). Intraobserver reliability intraclass correlation coefficients were substantially higher for the QMA technique than the digitized manual technique across all measurements: rotation 0.997 versus 0.870, AP translation 0.959 versus 0.557, change in anterior disc height 0.962 versus 0.770, and change in posterior disc height 0.951 versus 0.283. The same pattern was observed for interobserver reliability (rotation 0.962 vs. 0.693, AP translation 0.862 vs. 0.151, change in anterior disc height 0.862 vs. 0.373, and change in posterior disc height 0.730 vs. 0.300). The QMA technique was also more reliable for the classification of "instability." Intraobserver APAs ranged from 87 to 97% for QMA versus 60% to 73% for digitized manual…
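
    The study's working definition of instability reduces to a two-threshold rule, sketched below in Python (the thresholds are taken directly from the abstract; the agreement helper is a plain percent-agreement stand-in for the adjusted statistic used in the study):

    ```python
    def is_unstable(rotation_deg, ap_translation_mm):
        """Instability as defined in the study: >4 mm anterior-posterior
        translation or >10 degrees of intervertebral rotation."""
        return ap_translation_mm > 4.0 or rotation_deg > 10.0

    def percent_agreement(labels_a, labels_b):
        """Share of levels on which two observers agree, as a percentage."""
        assert len(labels_a) == len(labels_b)
        return 100.0 * sum(a == b for a, b in zip(labels_a, labels_b)) / len(labels_a)

    print(is_unstable(rotation_deg=11.0, ap_translation_mm=2.0))  # True
    ```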

  3. Experimental and computational studies on a gasifier based stove

    International Nuclear Information System (INIS)

    Varunkumar, S.; Rajan, N.K.S.; Mukunda, H.S.

    2012-01-01

    Highlights: ► A simple method to calculate the fraction of HHC was devised. ► ηg for the stove is the same as that of a downdraft gasifier. ► Gas from the stove contains 5.5% of CH4 equivalent of HHC. ► Effect of vessel size on utilization efficiency brought out clearly. ► Contribution of radiative heat transfer from char bed to efficiency is 6%. - Abstract: The work reported here is concerned with a detailed thermochemical evaluation of the flaming mode behaviour of a gasifier based stove. Determination of the gas composition over the fuel bed, and of surface and gas temperatures in the gasification process, constitutes the principal experimental features. A simple atomic balance for the gasification reaction combined with the gas composition from the experiments is used to determine the CH4 equivalent of higher hydrocarbons and the gasification efficiency (ηg). The components of utilization efficiency, namely, gasification-combustion and heat transfer, are explored. Reactive flow computational studies using the measured gas composition over the fuel bed are used to simulate the thermochemical flow field and heat transfer to the vessel; hitherto-ignored vessel size effects in the extraction of heat from the stove are established clearly. The overall flaming mode efficiency of the stove is 50-54%; the convective and radiative components of heat transfer are established to be 45-47 and 5-7% respectively. The efficiency estimates from reacting computational fluid dynamics (RCFD) compare well with experiments.

  4. Near infrared spectroscopy based brain-computer interface

    Science.gov (United States)

    Ranganatha, Sitaram; Hoshi, Yoko; Guan, Cuntai

    2005-04-01

    A brain-computer interface (BCI) provides users with an alternative output channel other than the normal output path of the brain. BCI has recently received much attention as an alternative mode of communication and control for the disabled, such as patients suffering from Amyotrophic Lateral Sclerosis (ALS) or "locked-in" syndrome. BCI may also find applications in military, education and entertainment. Most of the existing BCI systems which rely on the brain's electrical activity use scalp EEG signals. The scalp EEG is an inherently noisy and non-linear signal. The signal is detrimentally affected by various artifacts such as the EOG, EMG, ECG and so forth. EEG is cumbersome to use in practice, because of the need to apply conductive gel and for the subject to remain immobile. There is an urgent need for a more accessible interface that uses a more direct measure of cognitive function to control an output device. The optical response of Near Infrared Spectroscopy (NIRS) denoting brain activation can be used as an alternative to electrical signals, with the intention of developing a more practical and user-friendly BCI. In this paper, a new method of brain-computer interface (BCI) based on NIRS is proposed. Preliminary results of our experiments towards developing this system are reported.

  5. Big data mining analysis method based on cloud computing

    Science.gov (United States)

    Cai, Qing Qiu; Cui, Hong Gang; Tang, Hao

    2017-08-01

    In the era of information explosion, the super-large scale, discreteness, and non- or semi-structured nature of big data have gone far beyond what traditional data management can handle. With the arrival of the cloud computing era, cloud computing provides a new technical way to analyze massive data, which can effectively solve the problem that traditional data mining methods cannot adapt to massive data mining. This paper introduces the meaning and characteristics of cloud computing, analyzes the advantages of using cloud computing technology to realize data mining, designs an association rule mining algorithm based on the MapReduce parallel processing architecture, and carries out experimental verification. The parallel association rule mining algorithm based on a cloud computing platform can greatly improve the execution speed of data mining.
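
    As a concrete illustration of the MapReduce formulation, here is a hypothetical pure-Python sketch of the support-counting core of association-rule mining, with the map, shuffle, and reduce phases written out explicitly (in production these phases would run distributed across cloud nodes):

    ```python
    from collections import defaultdict
    from itertools import combinations

    transactions = [{"milk", "bread"}, {"milk", "eggs"},
                    {"milk", "bread", "eggs"}, {"bread", "eggs"}]

    # map: emit (pair, 1) for every item pair in each transaction
    mapped = [(pair, 1) for t in transactions
              for pair in combinations(sorted(t), 2)]

    # shuffle: group emitted values by key
    groups = defaultdict(list)
    for key, val in mapped:
        groups[key].append(val)

    # reduce: sum counts; keep pairs meeting a minimum support
    min_support = 2
    frequent = {k: sum(v) for k, v in groups.items() if sum(v) >= min_support}
    print(frequent)   # {('bread', 'milk'): 2, ('bread', 'eggs'): 2, ('eggs', 'milk'): 2}
    ```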

  6. An overview of computer-based natural language processing

    Science.gov (United States)

    Gevarter, W. B.

    1983-01-01

    Computer based Natural Language Processing (NLP) is the key to enabling humans and their computer based creations to interact with machines in natural language (like English, Japanese, German, etc., in contrast to formal computer languages). The doors that such an achievement can open have made this a major research area in Artificial Intelligence and Computational Linguistics. Commercial natural language interfaces to computers have recently entered the market and the future looks bright for other applications as well. This report reviews the basic approaches to such systems, the techniques utilized, applications, the state of the art of the technology, issues and research requirements, the major participants and finally, future trends and expectations. It is anticipated that this report will prove useful to engineering and research managers, potential users, and others who will be affected by this field as it unfolds.

  7. Application of CT-PSF-based computer-simulated lung nodules for evaluating the accuracy of computer-aided volumetry.

    Science.gov (United States)

    Funaki, Ayumu; Ohkubo, Masaki; Wada, Shinichi; Murao, Kohei; Matsumoto, Toru; Niizuma, Shinji

    2012-07-01

    With the wide dissemination of computed tomography (CT) screening for lung cancer, measuring the nodule volume accurately with computer-aided volumetry software is increasingly important. Many studies for determining the accuracy of volumetry software have been performed using a phantom with artificial nodules. These phantom studies are limited, however, in their ability to reproduce the nodules both accurately and in the variety of sizes and densities required. Therefore, we propose a new approach of using computer-simulated nodules based on the point spread function measured in a CT system. The validity of the proposed method was confirmed by the excellent agreement obtained between computer-simulated nodules and phantom nodules regarding the volume measurements. A practical clinical evaluation of the accuracy of volumetry software was achieved by adding simulated nodules onto clinical lung images, including noise and artifacts. The tested volumetry software was revealed to be accurate to within an error of 20% for nodules >5 mm when the difference in CT value between nodule density and background (lung) was 400-600 HU. Such a detailed analysis can provide clinically useful information on the use of volumetry software in CT screening for lung cancer. We concluded that the proposed method is effective for evaluating the performance of computer-aided volumetry software.
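
    A simplified version of the proposed simulation can be sketched in a few lines: blur an ideal digital nodule with a PSF (here an isotropic Gaussian stands in for the PSF measured on the CT system) and run a naive threshold volumetry on the result. All sizes and CT values below are illustrative, not the paper's.

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter

    # ideal sphere (the "true" nodule) on a 0.5 mm voxel grid -- toy values
    voxel = 0.5
    grid = np.arange(-16, 16) * voxel + voxel / 2
    X, Y, Z = np.meshgrid(grid, grid, grid, indexing="ij")
    radius = 4.0                                               # mm
    nodule = np.where(X**2 + Y**2 + Z**2 <= radius**2, 500.0, -900.0)  # HU

    # blur by the scanner PSF (isotropic Gaussian as a stand-in)
    image = gaussian_filter(nodule, sigma=0.8)                 # sigma in voxels

    # simple half-maximum threshold volumetry
    threshold = (500.0 + (-900.0)) / 2.0
    measured = np.count_nonzero(image >= threshold) * voxel**3
    true_vol = 4.0 / 3.0 * np.pi * radius**3
    print(f"true {true_vol:.0f} mm^3, measured {measured:.0f} mm^3")
    ```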

  8. Bluetooth-based distributed measurement system

    International Nuclear Information System (INIS)

    Tang Baoping; Chen Zhuo; Wei Yuguo; Qin Xiaofeng

    2007-01-01

    A novel distributed wireless measurement system, consisting of a base station, wireless intelligent sensors, relay nodes, etc., is established by combining Bluetooth-based wireless transmission, virtual instrumentation, intelligent sensors, and networking. The intelligent sensors mounted on the equipment to be measured acquire various parameters, and the Bluetooth relay nodes modulate the acquired data and send them to the base station, where data analysis and processing are done so that the operational condition of the equipment can be evaluated. The establishment of the distributed measurement system is discussed with a measurement flow chart for the distributed measurement system based on Bluetooth technology, and the advantages and disadvantages of the system are analyzed at the end of the paper. The measurement system has been used successfully in the Daqing oilfield, China, for the measurement of parameters such as temperature, flow rate and oil pressure at an electromotor-pump unit

  9. Bluetooth-based distributed measurement system

    Science.gov (United States)

    Tang, Baoping; Chen, Zhuo; Wei, Yuguo; Qin, Xiaofeng

    2007-07-01

    A novel distributed wireless measurement system, consisting of a base station, wireless intelligent sensors, relay nodes, etc., is established by combining Bluetooth-based wireless transmission, virtual instrumentation, intelligent sensors, and networking. The intelligent sensors mounted on the equipment to be measured acquire various parameters, and the Bluetooth relay nodes modulate the acquired data and send them to the base station, where data analysis and processing are done so that the operational condition of the equipment can be evaluated. The establishment of the distributed measurement system is discussed with a measurement flow chart for the distributed measurement system based on Bluetooth technology, and the advantages and disadvantages of the system are analyzed at the end of the paper. The measurement system has been used successfully in the Daqing oilfield, China, for the measurement of parameters such as temperature, flow rate and oil pressure at an electromotor-pump unit.

  10. Bluetooth-based distributed measurement system

    Energy Technology Data Exchange (ETDEWEB)

    Tang Baoping; Chen Zhuo; Wei Yuguo; Qin Xiaofeng [Department of Mechatronics, College of Mechanical Engineering, Chongqing University, Chongqing, 400030 (China)

    2007-07-15

    A novel distributed wireless measurement system, consisting of a base station, wireless intelligent sensors, relay nodes, etc., is established by combining Bluetooth-based wireless transmission, virtual instrumentation, intelligent sensors, and networking. The intelligent sensors mounted on the equipment to be measured acquire various parameters, and the Bluetooth relay nodes modulate the acquired data and send them to the base station, where data analysis and processing are done so that the operational condition of the equipment can be evaluated. The establishment of the distributed measurement system is discussed with a measurement flow chart for the distributed measurement system based on Bluetooth technology, and the advantages and disadvantages of the system are analyzed at the end of the paper. The measurement system has been used successfully in the Daqing oilfield, China, for the measurement of parameters such as temperature, flow rate and oil pressure at an electromotor-pump unit.

  11. Graphics processing unit based computation for NDE applications

    Science.gov (United States)

    Nahas, C. A.; Rajagopal, Prabhu; Balasubramaniam, Krishnan; Krishnamurthy, C. V.

    2012-05-01

    Advances in parallel processing in recent years are helping to reduce the cost of numerical simulation. Breakthroughs in Graphical Processing Unit (GPU) based computation now offer the prospect of further drastic improvements. The introduction of 'compute unified device architecture' (CUDA) by NVIDIA (the global technology company based in Santa Clara, California, USA) has made programming GPUs for general purpose computing accessible to the average programmer. Here we use CUDA to develop parallel finite difference schemes as applicable to two problems of interest to the NDE community, namely heat diffusion and elastic wave propagation. The implementations are two-dimensional. Performance improvement of the GPU implementation against serial CPU implementation is then discussed.
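
    For reference, the serial form of one of the two schemes is tiny. The sketch below (Python/NumPy rather than the paper's CUDA, with assumed parameters and periodic boundaries via np.roll) shows the explicit finite-difference update for 2D heat diffusion that each GPU thread would apply to its own grid point:

    ```python
    import numpy as np

    # explicit FTCS update for 2D heat diffusion, u_t = alpha * (u_xx + u_yy)
    nx = ny = 128
    alpha, dx, dt = 1.0, 1.0, 0.2        # dt <= dx**2 / (4 * alpha) for stability
    u = np.zeros((ny, nx))
    u[ny // 2, nx // 2] = 100.0          # hot-spot initial condition

    for _ in range(500):
        lap = (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
               np.roll(u, 1, 1) + np.roll(u, -1, 1) - 4.0 * u)
        u = u + alpha * dt / dx**2 * lap  # the per-point stencil a GPU thread runs

    print("peak temperature:", u.max())
    ```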

  12. Modeling soft factors in computer-based wargames

    Science.gov (United States)

    Alexander, Steven M.; Ross, David O.; Vinarskai, Jonathan S.; Farr, Steven D.

    2002-07-01

    Computer-based wargames have seen much improvement in recent years due to rapid increases in computing power. Because these games have been developed for the entertainment industry, most of these advances have centered on the graphics, sound, and user interfaces integrated into these wargames with less attention paid to the game's fidelity. However, for a wargame to be useful to the military, it must closely approximate as many of the elements of war as possible. Among the elements that are typically not modeled or are poorly modeled in nearly all military computer-based wargames are systematic effects, command and control, intelligence, morale, training, and other human and political factors. These aspects of war, with the possible exception of systematic effects, are individually modeled quite well in many board-based commercial wargames. The work described in this paper focuses on incorporating these elements from the board-based games into a computer-based wargame. This paper will also address the modeling and simulation of the systemic paralysis of an adversary that is implied by the concept of Effects Based Operations (EBO). Combining the fidelity of current commercial board wargames with the speed, ease of use, and advanced visualization of the computer can significantly improve the effectiveness of military decision making and education. Once in place, the process of converting board wargames concepts to computer wargames will allow the infusion of soft factors into military training and planning.

  13. Sensitivity Measurement of Transmission Computed Tomography: the Preliminary Experimental Study

    International Nuclear Information System (INIS)

    Widodo, Chomsin-S; Sudjatmoko; Kusminarto; Agung-BS Utomo; Suparta, Gede B

    2000-01-01

    This paper reports the results of a preliminary experimental study on a measurement method for the sensitivity of a computed tomography (CT) scanner. A CT scanner has been built at the Department of Physics, FMIPA UGM, and its performance based on its sensitivity was measured. The results confirmed that this measurement method for sensitivity may be developed further as a measurement standard. Although the CT scanner developed has a number of shortcomings, the analytical results from the sensitivity measurement suggest a number of repairs and improvements for the system so that improved reconstructed CT images can be obtained. (author)

  14. Safeguards instrumentation: a computer-based catalog

    International Nuclear Information System (INIS)

    Fishbone, L.G.; Keisch, B.

    1981-08-01

    The information contained in this catalog is needed to provide a data base for safeguards studies and to help establish criteria and procedures for international safeguards for nuclear materials and facilities. The catalog primarily presents information on new safeguards equipment. It also describes entire safeguards systems for certain facilities, but it does not describe the inspection procedures. Because IAEA safeguards do not include physical security, devices for physical protection (as opposed to containment and surveillance) are not included. An attempt has been made to list capital costs, annual maintenance costs, replacement costs, and useful lifetime for the equipment. For equipment which is commercially available, representative sources have been listed whenever available

  15. Safeguards instrumentation: a computer-based catalog

    Energy Technology Data Exchange (ETDEWEB)

    Fishbone, L.G.; Keisch, B.

    1981-08-01

    The information contained in this catalog is needed to provide a data base for safeguards studies and to help establish criteria and procedures for international safeguards for nuclear materials and facilities. The catalog primarily presents information on new safeguards equipment. It also describes entire safeguards systems for certain facilities, but it does not describe the inspection procedures. Because IAEA safeguards do not include physical security, devices for physical protection (as opposed to containment and surveillance) are not included. An attempt has been made to list capital costs, annual maintenance costs, replacement costs, and useful lifetime for the equipment. For equipment which is commercially available, representative sources have been listed whenever available.

  16. Computer Aided Measurement Laser (CAML): technique to quantify post-mastectomy lymphoedema

    International Nuclear Information System (INIS)

    Trombetta, Chiara; Abundo, Paolo; Felici, Antonella; Ljoka, Concetta; Foti, Calogero; Cori, Sandro Di; Rosato, Nicola

    2012-01-01

    Lymphoedema can be a side effect of cancer treatment. Even though several methods for assessing lymphoedema are used in clinical practice, an objective quantification of lymphoedema has been problematic. The aim of the study was to determine the objectivity, reliability and repeatability of the computer aided measurement laser (CAML) technique. The CAML technique is based on computer aided design (CAD) methods and requires an infrared laser scanner. Measurements are scanned and the information describing the size and shape of the limb allows the model to be designed using the CAD software. The objectivity and repeatability were established initially using a phantom. Subsequently, a group of subjects presenting post-breast cancer lymphoedema was evaluated using the contralateral limb as a control. Results confirmed that in clinical settings the CAML technique is easy to perform, rapid and provides meaningful data for assessing lymphoedema. Future research will include a comparison of the upper limb CAML technique between healthy subjects and patients with known lymphoedema.

  17. Towards a fullerene-based quantum computer

    International Nuclear Information System (INIS)

    Benjamin, Simon C; Ardavan, Arzhang; Briggs, G Andrew D; Britz, David A; Gunlycke, Daniel; Jefferson, John; Jones, Mark A G; Leigh, David F; Lovett, Brendon W; Khlobystov, Andrei N; Lyon, S A; Morton, John J L; Porfyrakis, Kyriakos; Sambrook, Mark R; Tyryshkin, Alexei M

    2006-01-01

    Molecular structures appear to be natural candidates for a quantum technology: individual atoms can support quantum superpositions for long periods, and such atoms can in principle be embedded in a permanent molecular scaffolding to form an array. This would be true nanotechnology, with dimensions of the order of a nanometre. However, the challenges of realizing such a vision are immense. One must identify a suitable elementary unit and demonstrate its merits for qubit storage and manipulation, including input/output. These units must then be formed into large arrays corresponding to a functional quantum architecture, including a mechanism for gate operations. Here we report our efforts, both experimental and theoretical, to create such a technology based on endohedral fullerenes or 'buckyballs'. We describe our successes with respect to these criteria, along with the obstacles we are currently facing and the questions that remain to be addressed

  18. Soil Erosion Estimation Using Grid-based Computation

    Directory of Open Access Journals (Sweden)

    Josef Vlasák

    2005-06-01

    Full Text Available Soil erosion estimation is an important part of a land consolidation process. The universal soil loss equation (USLE) was presented by Wischmeier and Smith. USLE computation uses several factors, namely R – rainfall factor, K – soil erodibility, L – slope length factor, S – slope gradient factor, C – cropping management factor, and P – erosion control management factor. L and S factors are usually combined into one LS factor – the topographic factor. The single factors are determined from several sources, such as the DTM (Digital Terrain Model), the BPEJ soil type map, aerial and satellite images, etc. A conventional approach to the USLE computation, which is widely used in the Czech Republic, is based on the selection of characteristic profiles for which all above-mentioned factors must be determined. The result (G – annual soil loss) of such a computation is then applied to a whole area (slope) of interest. Another approach to the USLE computation uses grids as the main data structure; a cell-wise sketch is given after this paragraph. A prerequisite for a grid-based USLE computation is that each of the above-mentioned factors exists as a separate grid layer. The crucial step in this computation is the selection of an appropriate grid resolution (grid cell size). A large cell size can cause an undesirable precision degradation. Too small a cell size can noticeably slow down the whole computation. Provided that the cell size is derived from the source’s precision, the appropriate cell size for the Czech Republic varies from 30m to 50m. In some cases, especially when new surveying was done, grid computations can be performed with higher accuracy, i.e. with a smaller grid cell size. In such a case, we have proposed a new method using a two-step computation. The first step computation uses a bigger cell size and is designed to identify higher erosion spots. The second step then uses a smaller cell size but performs the computation only over the area identified in the previous step. This decomposition allows a…
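
    In the grid-based approach each USLE factor is just a co-registered raster and the soil loss is a cell-wise product, as the hypothetical NumPy sketch below shows. The factor values are placeholders; a two-step run would simply repeat this on a finer grid restricted to the high-erosion cells.

    ```python
    import numpy as np

    # grid-based USLE: per-cell soil loss G = R * K * LS * C * P,
    # with each factor supplied as a raster layer of the same shape
    shape = (4, 4)                        # toy 4x4 grid (e.g., 40 m cells)
    R  = np.full(shape, 40.0)             # rainfall erosivity
    K  = np.full(shape, 0.3)              # soil erodibility
    LS = np.random.default_rng(1).uniform(0.5, 3.0, shape)  # topographic factor
    C  = np.full(shape, 0.12)             # cropping management
    P  = np.full(shape, 1.0)              # erosion control practice

    G = R * K * LS * C * P                # annual soil loss per cell
    high_erosion = G > G.mean()           # cells to refine in a two-step run
    print("mean annual soil loss:", G.mean())
    ```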

  19. Automatic computation of 2D cardiac measurements from B-mode echocardiography

    Science.gov (United States)

    Park, JinHyeong; Feng, Shaolei; Zhou, S. Kevin

    2012-03-01

    We propose a robust and fully automatic algorithm which computes the 2D echocardiography measurements recommended by the American Society of Echocardiography. The algorithm employs knowledge-based imaging technologies which can learn the expert's knowledge from training images and the expert's annotation. Based on the models constructed in the learning stage, the algorithm searches for the initial locations of the landmark points for the measurements by utilizing the structure of the left ventricle, including the mitral valve and aortic valve. It employs a pseudo-anatomic M-mode image, generated by accumulating the line images in the 2D parasternal long axis view over time, to refine the measurement landmark points. Experimental results with a large volume of data show that the algorithm runs fast and is robust, comparable to an expert.

  20. Comparison of computed tomography scout based reference point localization to conventional film and axial computed tomography.

    Science.gov (United States)

    Jiang, Lan; Templeton, Alistair; Turian, Julius; Kirk, Michael; Zusag, Thomas; Chu, James C H

    2011-01-01

    Identification of source positions after implantation is an important step in brachytherapy planning. Reconstruction is traditionally performed from films taken by conventional simulators, but these are gradually being replaced in the clinic by computed tomography (CT) simulators. The present study explored the use of a scout image-based reconstruction algorithm that replaces the use of traditional film, while exhibiting low sensitivity to metal-induced artifacts that can appear in 3D CT methods. In addition, the accuracy of an in-house graphical software implementation of scout-based reconstruction was compared with seed location reconstructions for 2 phantoms by conventional simulator and CT measurements. One phantom was constructed using a planar fixed grid of 1.5-mm diameter ball bearings (BBs) with 40-mm spacing. The second was a Fletcher-Suit applicator embedded in Styrofoam (Dow Chemical Co., Midland, MI) with one 3.2-mm-diameter BB inserted into each of 6 surrounding holes. Conventional simulator, kilovoltage CT (kVCT), megavoltage CT, and scout-based methods were evaluated by their ability to calculate the distance between seeds (40 mm for the fixed grid, 30-120 mm in Fletcher-Suit). All methods were able to reconstruct the fixed grid distances with an average deviation of <1%. The worst single deviations (approximately 6%) were exhibited in the 2 volumetric CT methods. In the Fletcher-Suit phantom, the intermodality agreement was within approximately 3%, with the conventional simulator measuring marginally larger distances and kVCT the smallest. All of the established reconstruction methods exhibited similar abilities to detect the distances between BBs. The 3D CT-based methods, with lower axial resolution, showed more variation, particularly with the smaller BBs. With a software implementation, scout-based reconstruction is an appealing approach because it simplifies data acquisition over film-based reconstruction without requiring any specialized equipment.

  1. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and reinforcing shift and operational procedures for data production and transfer, MC production and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity to discuss the impact of, and to address issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  2. Computational aeroelasticity using a pressure-based solver

    Science.gov (United States)

    Kamakoti, Ramji

    A computational methodology for performing fluid-structure interaction computations for three-dimensional elastic wing geometries is presented. The flow solver used is based on an unsteady Reynolds-Averaged Navier-Stokes (RANS) model. A well validated k-ε turbulence model with wall function treatment for the near wall region was used to perform turbulent flow calculations. Relative merits of alternative flow solvers were investigated. The predictor-corrector-based Pressure Implicit Splitting of Operators (PISO) algorithm was found to be computationally economic for unsteady flow computations. The wing structure was modeled using Bernoulli-Euler beam theory. A fully implicit time-marching scheme (using the Newmark integration method) was used to integrate the equations of motion for the structure. Bilinear interpolation and linear extrapolation techniques were used to transfer necessary information between fluid and structure solvers. Geometry deformation was accounted for by using a moving boundary module. The moving grid capability was based on a master/slave concept and transfinite interpolation techniques. Since computations were performed on a moving mesh system, the geometric conservation law must be preserved. This is achieved by appropriately evaluating the Jacobian values associated with each cell. Accurate computation of contravariant velocities for unsteady flows using the momentum interpolation method on collocated, curvilinear grids was also addressed. Flutter computations were performed for the AGARD 445.6 wing at subsonic, transonic and supersonic Mach numbers. Unsteady computations were performed at various dynamic pressures to predict the flutter boundary. Results showed favorable agreement with experiment and previous numerical results. The computational methodology exhibited capabilities to predict both qualitative and quantitative features of aeroelasticity.

  3. Sex Prediction using Foramen Magnum and Occipital Condyles Computed Tomography Measurements in Sudanese Population

    Directory of Open Access Journals (Sweden)

    Usama Mohamed El-Barrany

    2016-12-01

    Full Text Available Sex determination is important in establishing the identity of an individual. The foramen magnum is an important landmark of the skull base. The present research aimed to study the value of foramen magnum measurements for determining sex using computed tomography (CT) among Sudanese individuals. Foramen magnum CT scans of 400 Sudanese individuals (200 males and 200 females) aged 18-83 years were included in this study. Foramen magnum length and width, right occipital condyle length and width, left occipital condyle length and width, minimum intercondylar distance, maximum bicondylar distance and maximum medial intercondylar distance were measured. All data were subjected to discriminant function analysis. All nine measurements were significantly higher in males than in females. Among these measurements, the right condyle length, minimum intercondylar distance, and foramen magnum width were able to determine sex in Sudanese individuals with an accuracy rate of 83%.
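
    The discriminant-function analysis used in the study corresponds to standard linear discriminant analysis. The sketch below shows the workflow on synthetic stand-in data (all means and spreads are invented for illustration; they are not the Sudanese sample statistics):

    ```python
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    rng = np.random.default_rng(0)
    # synthetic stand-ins for three CT measurements (mm): right condyle length,
    # minimum intercondylar distance, foramen magnum width
    males   = rng.normal([24.0, 20.0, 30.5], 1.5, (200, 3))
    females = rng.normal([22.5, 18.5, 29.0], 1.5, (200, 3))
    X = np.vstack([males, females])
    y = np.array([1] * 200 + [0] * 200)   # 1 = male, 0 = female

    lda = LinearDiscriminantAnalysis().fit(X, y)
    print("classification accuracy:", lda.score(X, y))
    ```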

  4. Simple adaptive sparse representation based classification schemes for EEG based brain-computer interface applications.

    Science.gov (United States)

    Shin, Younghak; Lee, Seungchan; Ahn, Minkyu; Cho, Hohyun; Jun, Sung Chan; Lee, Heung-No

    2015-11-01

    One of the main problems related to electroencephalogram (EEG) based brain-computer interface (BCI) systems is the non-stationarity of the underlying EEG signals. This results in the deterioration of the classification performance during experimental sessions. Therefore, adaptive classification techniques are required for EEG based BCI applications. In this paper, we propose simple adaptive sparse representation based classification (SRC) schemes. Supervised and unsupervised dictionary update techniques for new test data and a dictionary modification method by using the incoherence measure of the training data are investigated. The proposed methods are very simple and additional computation for the re-training of the classifier is not needed. The proposed adaptive SRC schemes are evaluated using two BCI experimental datasets. The proposed methods are assessed by comparing classification results with the conventional SRC and other adaptive classification methods. On the basis of the results, we find that the proposed adaptive schemes show relatively improved classification accuracy as compared to conventional methods without requiring additional computation.
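
    A minimal sketch of the SRC core and of an unsupervised dictionary update in the spirit described above (Python with scikit-learn's Lasso as the sparse coder; function names and the update rule are illustrative assumptions, not the authors' exact scheme):

    ```python
    import numpy as np
    from sklearn.linear_model import Lasso

    def src_classify(D, labels, x, alpha=0.01):
        """Sparse representation classification: code x over dictionary D
        (one column per training trial, columns unit-normalized) and assign
        the class whose atoms give the smallest reconstruction residual."""
        coef = Lasso(alpha=alpha, max_iter=10000).fit(D, x).coef_
        classes = np.unique(labels)
        res = [np.linalg.norm(x - D[:, labels == c] @ coef[labels == c])
               for c in classes]
        return classes[int(np.argmin(res))]

    def adapt(D, labels, x, pred):
        """Unsupervised update: append the newly classified test trial to the
        dictionary; no classifier retraining is needed."""
        return (np.column_stack([D, x / np.linalg.norm(x)]),
                np.append(labels, pred))

    # usage sketch: labels is a NumPy array with one class label per column of D
    ```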

  5. Fog computing job scheduling optimization based on bees swarm

    Science.gov (United States)

    Bitam, Salim; Zeadally, Sherali; Mellouk, Abdelhamid

    2018-04-01

    Fog computing is a new computing architecture, composed of a set of near-user edge devices called fog nodes, which collaborate in order to perform computational services such as running applications, storing large amounts of data, and transmitting messages. Fog computing extends cloud computing by deploying digital resources at the premises of mobile users. In this new paradigm, management and operating functions, such as job scheduling, aim at providing high-performance, cost-effective services requested by mobile users and executed by fog nodes. We propose a new bio-inspired optimization approach called the Bees Life Algorithm (BLA) aimed at addressing the job scheduling problem in the fog computing environment. Our proposed approach is based on the optimized distribution of a set of tasks among all the fog computing nodes. The objective is to find an optimal tradeoff between the CPU execution time and the allocated memory required by fog computing services established by mobile users. Our empirical performance evaluation results demonstrate that the proposal outperforms traditional particle swarm optimization and genetic algorithms in terms of CPU execution time and allocated memory.
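
    To make the scheduling objective concrete, here is a toy Python sketch: jobs are assigned to fog nodes so as to minimize a weighted combination of CPU makespan and peak allocated memory. The search below is a plain random-neighborhood hill climb standing in for the authors' Bees Life Algorithm; all weights and workloads are invented.

    ```python
    import random

    random.seed(0)
    jobs = [{"cpu": random.uniform(1, 5), "mem": random.uniform(0.5, 2)}
            for _ in range(20)]
    nodes = 4

    def cost(assign):
        # tradeoff between CPU execution time (makespan) and allocated memory
        cpu = [0.0] * nodes
        mem = [0.0] * nodes
        for job, n in zip(jobs, assign):
            cpu[n] += job["cpu"]
            mem[n] += job["mem"]
        return max(cpu) + 0.5 * max(mem)

    # local search: repeatedly try moving one job to another node
    best = [random.randrange(nodes) for _ in jobs]
    for _ in range(2000):
        cand = best[:]
        cand[random.randrange(len(jobs))] = random.randrange(nodes)
        if cost(cand) < cost(best):
            best = cand
    print("best cost:", round(cost(best), 2))
    ```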

  6. Performance measurements in 3D ideal magnetohydrodynamic stability computations

    International Nuclear Information System (INIS)

    Anderson, D.V.; Cooper, W.A.; Gruber, R.; Schwenn, U.

    1989-10-01

    The 3D ideal magnetohydrodynamic stability code TERPSICHORE has been designed to take advantage of the vector and microtasking capabilities of the latest CRAY computers. To keep the number of operations small, the most efficient algorithms have been applied in each computational step. The program investigates the stability properties of fusion-reactor-relevant plasma configurations confined by magnetic fields. For a typical 3D HELIAS configuration that has been considered, we obtain an overall performance in excess of 1 Gflops on an eight-processor CRAY-YMP machine. (author) 3 figs., 1 tab., 11 refs

  7. Spintronic Circuits: The Building Blocks of Spin-Based Computation

    Directory of Open Access Journals (Sweden)

    Roshan Warman

    2016-10-01

    Full Text Available In the most general situation, binary computation is implemented by means of microscopic logical gates known as transistors. According to Moore's Law, the size of transistors will halve every two years, and as these transistors reach their fundamental size limit, the quantum effects of the electrons passing through the transistors will be observed. Due to the inherent randomness of these quantum fluctuations, the basic binary logic will become uncontrollable. This project describes the basic principle governing quantum spin-based computing devices, which may provide an alternative to the conventional solid-state computing devices and circumvent the technological limitations of the current implementation of binary logic.

  8. Nanophotonic quantum computer based on atomic quantum transistor

    International Nuclear Information System (INIS)

    Andrianov, S N; Moiseev, S A

    2015-01-01

    We propose a scheme of a quantum computer based on nanophotonic elements: two buses in the form of nanowaveguide resonators, two nanosized units of multiatom multiqubit quantum memory and a set of nanoprocessors in the form of photonic quantum transistors, each containing a pair of nanowaveguide ring resonators coupled via a quantum dot. The operation modes of nanoprocessor photonic quantum transistors are theoretically studied and the execution of main logical operations by means of them is demonstrated. We also discuss the prospects of the proposed nanophotonic quantum computer for operating in high-speed optical fibre networks. (quantum computations)

  9. Morphing-Based Shape Optimization in Computational Fluid Dynamics

    Science.gov (United States)

    Rousseau, Yannick; Men'Shov, Igor; Nakamura, Yoshiaki

    In this paper, a Morphing-based Shape Optimization (MbSO) technique is presented for solving Optimum-Shape Design (OSD) problems in Computational Fluid Dynamics (CFD). The proposed method couples Free-Form Deformation (FFD) and Evolutionary Computation, and, as its name suggests, relies on the morphing of shape and computational domain, rather than direct shape parameterization. Advantages of the FFD approach compared to traditional parameterization are first discussed. Then, examples of shape and grid deformations by FFD are presented. Finally, the MbSO approach is illustrated and applied through an example: the design of an airfoil for a future Mars exploration airplane.

  10. Nanophotonic quantum computer based on atomic quantum transistor

    Energy Technology Data Exchange (ETDEWEB)

    Andrianov, S N [Institute of Advanced Research, Academy of Sciences of the Republic of Tatarstan, Kazan (Russian Federation); Moiseev, S A [Kazan E. K. Zavoisky Physical-Technical Institute, Kazan Scientific Center, Russian Academy of Sciences, Kazan (Russian Federation)

    2015-10-31

    We propose a scheme of a quantum computer based on nanophotonic elements: two buses in the form of nanowaveguide resonators, two nanosized units of multiatom multiqubit quantum memory and a set of nanoprocessors in the form of photonic quantum transistors, each containing a pair of nanowaveguide ring resonators coupled via a quantum dot. The operation modes of nanoprocessor photonic quantum transistors are theoretically studied and the execution of main logical operations by means of them is demonstrated. We also discuss the prospects of the proposed nanophotonic quantum computer for operating in high-speed optical fibre networks. (quantum computations)

  11. ISAT promises fail-safe computer-based reactor protection

    International Nuclear Information System (INIS)

    Anon.

    1989-01-01

    AEA Technology's ISAT system is a multiplexed microprocessor-based reactor protection system which has very extensive self-monitoring capabilities and is inherently fail safe. It provides a way of addressing software reliability problems that have tended to hamper widespread introduction of computer-based reactor protection. (author)

  12. Computer-Aided Test Flow in Core-Based Design

    NARCIS (Netherlands)

    Zivkovic, V.; Tangelder, R.J.W.T.; Kerkhoff, Hans G.

    2000-01-01

    This paper addresses efficient test-pattern generation in a core-based design. A consistent Computer-Aided Test (CAT) flow is proposed based on the required core-test strategy. It generates a test-pattern set for the embedded cores with high fault coverage and low DfT area overhead. The CAT

  13. Computer-Aided Test Flow in Core-Based Design

    NARCIS (Netherlands)

    Zivkovic, V.; Tangelder, R.J.W.T.; Kerkhoff, Hans G.

    2000-01-01

    This paper addresses test-pattern generation and fault coverage determination in core-based design. The basic core-test strategy that one has to apply in core-based design is stated in this work. A Computer-Aided Test (CAT) flow is proposed resulting in accurate fault coverage of

  14. Evolution of a Computer-Based Testing Laboratory

    Science.gov (United States)

    Moskal, Patrick; Caldwell, Richard; Ellis, Taylor

    2009-01-01

    In 2003, faced with increasing growth in technology-based and large-enrollment courses, the College of Business Administration at the University of Central Florida opened a computer-based testing lab to facilitate administration of course examinations. Patrick Moskal, Richard Caldwell, and Taylor Ellis describe the development and evolution of the…

  15. On computation of Groebner bases for linear difference systems

    Energy Technology Data Exchange (ETDEWEB)

    Gerdt, Vladimir P. [Laboratory of Information Technologies, Joint Institute for Nuclear Research, 141980 Dubna (Russian Federation)]. E-mail: gerdt@jinr.ru

    2006-04-01

    In this paper, we present an algorithm for computing Groebner bases of linear ideals in a difference polynomial ring over a ground difference field. The input difference polynomials generating the ideal are also assumed to be linear. The algorithm is an adaptation to difference ideals of our polynomial algorithm based on Janet-like reductions.

  16. On computation of Groebner bases for linear difference systems

    International Nuclear Information System (INIS)

    Gerdt, Vladimir P.

    2006-01-01

    In this paper, we present an algorithm for computing Groebner bases of linear ideals in a difference polynomial ring over a ground difference field. The input difference polynomials generating the ideal are also assumed to be linear. The algorithm is an adaptation to difference ideals of our polynomial algorithm based on Janet-like reductions
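
    For the ordinary commutative case, Groebner bases can be computed directly with SymPy, as below. The difference-ring analogue with Janet-like reductions that the paper develops is not available there, so this serves only as a reference point for the notion:

    ```python
    from sympy import symbols, groebner

    x, y = symbols("x y")
    # lexicographic Groebner basis of the (commutative) ideal <x**2 + y, x*y + 1>
    G = groebner([x**2 + y, x*y + 1], x, y, order="lex")
    print(G.exprs)
    ```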

  17. Issues in Text Design and Layout for Computer Based Communications.

    Science.gov (United States)

    Andresen, Lee W.

    1991-01-01

    Discussion of computer-based communications (CBC) focuses on issues involved with screen design and layout for electronic text, based on experiences with electronic messaging, conferencing, and publishing within the Australian Open Learning Information Network (AOLIN). Recommendations for research on design and layout for printed text are also…

  18. Data Mining Based on Cloud-Computing Technology

    Directory of Open Access Journals (Sweden)

    Ren Ying

    2016-01-01

    There are performance bottlenecks and scalability problems when a traditional data-mining system is used in cloud computing. In this paper, we present a data-mining platform based on cloud computing. Compared with a traditional data-mining system, this platform is highly scalable, has massive data processing capacity, is service-oriented, and has low hardware cost. This platform can support the design and application of a wide range of distributed data-mining systems.

  19. Computer-Based Simulation Games in Public Administration Education

    OpenAIRE

    Kutergina Evgeniia

    2017-01-01

    Computer simulation, an active learning technique, is now one of the advanced pedagogical technologies. The use of simulation games in the educational process allows students to gain a firsthand understanding of the processes of real life. Public-administration, public-policy and political-science courses increasingly adopt simulation games in universities worldwide. Besides person-to-person simulation games, there are computer-based simulations in public-administration education. Currently...

  20. Development of a computer writing system based on EOG

    OpenAIRE

    López, A.; Ferrero, F.; Yangüela, D.; Álvarez, C.; Postolache, O.

    2017-01-01

    WOS:000407517600044 (Web of Science accession number) The development of a novel computer writing system based on eye movements is introduced herein. A system of these characteristics requires the consideration of three subsystems: (1) A hardware device for the acquisition and transmission of the signals generated by eye movement to the computer; (2) A software application that allows, among other functions, data processing in order to minimize noise and classify signals; and (3) A graphical i...

  1. Comparative study of cranial anthropometric measurement by traditional calipers to computed tomography and three-dimensional photogrammetry.

    Science.gov (United States)

    Mendonca, Derick A; Naidoo, Sybill D; Skolnick, Gary; Skladman, Rachel; Woo, Albert S

    2013-07-01

    Craniofacial anthropometry by direct caliper measurements is a common method of quantifying the morphology of the cranial vault. New digital imaging modalities including computed tomography and three-dimensional photogrammetry are similarly being used to obtain craniofacial surface measurements. This study sought to compare the accuracy of anthropometric measurements obtained by calipers versus 2 methods of digital imaging. Standard anterior-posterior, biparietal, and cranial index measurements were directly obtained on 19 participants with an age range of 1 to 20 months. Computed tomographic scans and three-dimensional photographs were both obtained on each child within 2 weeks of the clinical examination. Two analysts measured the anterior-posterior and biparietal distances on the digital images. Measures of reliability and bias between the modalities were calculated and compared. Caliper measurements were found to underestimate the anterior-posterior and biparietal distances as compared with those of the computed tomography and the three-dimensional photogrammetry (P = 0.002). The coefficients of variation for repeated measures based on the computed tomography and the three-dimensional photogrammetry were 0.008 and 0.007, respectively. In conclusion, measurements based on digital modalities are generally reliable and interchangeable. Caliper measurements lead to underestimation of anterior-posterior and biparietal values compared with digital imaging.

  2. [Personal computer-based computer monitoring system of the anesthesiologist (2-year experience in development and use)].

    Science.gov (United States)

    Buniatian, A A; Sablin, I N; Flerov, E V; Mierbekov, E M; Broĭtman, O G; Shevchenko, V V; Shitikov, I I

    1995-01-01

    Creation of computer monitoring systems (CMS) for operating rooms is one of the most important spheres of personal computer employment in anesthesiology. The authors developed a PC RS/AT-based CMS and used it effectively for more than 2 years. This system permits comprehensive monitoring in cardiosurgical operations by real-time processing of the values of arterial and central venous pressure, pressure in the pulmonary artery, bioelectrical activity of the brain, and two temperature values. Use of this CMS helped appreciably improve patients' safety during surgery. The possibility to assess brain function by computer monitoring of the EEG simultaneously with central hemodynamics and body temperature permits the anesthesiologist to objectively assess the depth of anesthesia and to diagnose cerebral hypoxia. The automated anesthesiological chart issued by the CMS after surgery reliably reflects the patient's status and the measures taken by the anesthesiologist.

  3. Situation awareness and trust in computer-based procedures in nuclear power plant operations

    Energy Technology Data Exchange (ETDEWEB)

    Throneburg, E. B.; Jones, J. M. [AREVA NP Inc., 7207 IBM Drive, Charlotte, NC 28262 (United States)

    2006-07-01

    Situation awareness and trust are two issues that need to be addressed in the design of computer-based procedures for nuclear power plants. Situation awareness, in relation to computer-based procedures, concerns the operators' knowledge of the plant's state while following the procedures. Trust concerns the amount of faith that the operators put into the automated procedures, which can affect situation awareness. This paper first discusses the advantages and disadvantages of computer-based procedures. It then discusses the known aspects of situation awareness and trust as applied to computer-based procedures in nuclear power plants. An outline of a proposed experiment is then presented that includes methods of measuring situation awareness and trust so that these aspects can be analyzed for further study. (authors)

  4. Situation awareness and trust in computer-based procedures in nuclear power plant operations

    International Nuclear Information System (INIS)

    Throneburg, E. B.; Jones, J. M.

    2006-01-01

    Situation awareness and trust are two issues that need to be addressed in the design of computer-based procedures for nuclear power plants. Situation awareness, in relation to computer-based procedures, concerns the operators' knowledge of the plant's state while following the procedures. Trust concerns the amount of faith that the operators put into the automated procedures, which can affect situation awareness. This paper first discusses the advantages and disadvantages of computer-based procedures. It then discusses the known aspects of situation awareness and trust as applied to computer-based procedures in nuclear power plants. An outline of a proposed experiment is then presented that includes methods of measuring situation awareness and trust so that these aspects can be analyzed for further study. (authors)

  5. Link-Based Similarity Measures Using Reachability Vectors

    Directory of Open Access Journals (Sweden)

    Seok-Ho Yoon

    2014-01-01

    We present a novel approach for computing link-based similarities among objects accurately by utilizing the link information pertaining to the objects involved. We discuss the problems with previous link-based similarity measures and propose a novel approach for computing link-based similarities that does not suffer from these problems. In the proposed approach each target object is represented by a vector. Each element of the vector corresponds to one of the objects in the given data, and the value of each element denotes the weight for the corresponding object. As for this weight value, we propose to utilize the probability of reaching from the target object to the specific object, computed using the "Random Walk with Restart" strategy. Then, we define the similarity between two objects as the cosine similarity of the two vectors. In this paper, we provide examples to show that our approach does not suffer from the aforementioned problems. We also evaluate the performance of the proposed methods in comparison with existing link-based measures, qualitatively and quantitatively, with respect to two kinds of data sets, scientific papers and Web documents. Our experimental results indicate that the proposed methods significantly outperform the existing measures.
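
    A minimal sketch of the reachability-vector construction described above, assuming a column-stochastic transition matrix for the link graph; the restart probability and iteration count are illustrative assumptions, not values from the record.

        import numpy as np

        def rwr_vector(P, seed, restart=0.15, iters=100):
            # P: column-stochastic transition matrix (n x n) of the link graph;
            # returns the Random-Walk-with-Restart distribution from `seed`.
            n = P.shape[0]
            e = np.zeros(n)
            e[seed] = 1.0
            r = e.copy()
            for _ in range(iters):
                r = (1.0 - restart) * P @ r + restart * e
            return r

        def link_similarity(P, a, b):
            # similarity of two objects = cosine of their reachability vectors
            ra, rb = rwr_vector(P, a), rwr_vector(P, b)
            return float(ra @ rb / (np.linalg.norm(ra) * np.linalg.norm(rb)))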

  6. Using Noninvasive Brain Measurement to Explore the Psychological Effects of Computer Malfunctions on Users during Human-Computer Interactions

    Directory of Open Access Journals (Sweden)

    Leanne M. Hirshfield

    2014-01-01

    Full Text Available In today’s technologically driven world, there is a need to better understand the ways that common computer malfunctions affect computer users. These malfunctions may have measurable influences on computer user’s cognitive, emotional, and behavioral responses. An experiment was conducted where participants conducted a series of web search tasks while wearing functional near-infrared spectroscopy (fNIRS and galvanic skin response sensors. Two computer malfunctions were introduced during the sessions which had the potential to influence correlates of user trust and suspicion. Surveys were given after each session to measure user’s perceived emotional state, cognitive load, and perceived trust. Results suggest that fNIRS can be used to measure the different cognitive and emotional responses associated with computer malfunctions. These cognitive and emotional changes were correlated with users’ self-report levels of suspicion and trust, and they in turn suggest future work that further explores the capability of fNIRS for the measurement of user experience during human-computer interactions.

  7. Measuring Impact of EPAs Computational Toxicology Research (BOSC)

    Science.gov (United States)

    Computational Toxicology (CompTox) research at the EPA was initiated in 2005. Since 2005, CompTox research efforts have made tremendous advances in developing new approaches to evaluate thousands of chemicals for potential health effects. The purpose of this case study is to trac...

  8. Measuring Multimodal Synchrony for Human-Computer Interaction

    NARCIS (Netherlands)

    Reidsma, Dennis; Nijholt, Antinus; Tschacher, Wolfgang; Ramseyer, Fabian; Sourin, A.

    2010-01-01

    Nonverbal synchrony is an important and natural element in human-human interaction. It can also play various roles in human-computer interaction. In particular this is the case in the interaction between humans and the virtual humans that inhabit our cyberworlds. Virtual humans need to adapt their…

  9. Computing the Expected Value and Variance of Geometric Measures

    DEFF Research Database (Denmark)

    Staals, Frank; Tsirogiannis, Constantinos

    2017-01-01

    …distance (MPD), the squared Euclidean distance from the centroid, and the diameter of the minimum enclosing disk. We also describe an efficient (1-ε)-approximation algorithm for computing the mean and variance of the mean pairwise distance. We implemented three of our algorithms and we show that our…
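
    For reference, the mean pairwise distance (MPD) named above can be computed by brute force as below; the record's contribution is computing the expected value and variance of such measures much faster than this quadratic baseline.

        import numpy as np
        from itertools import combinations

        def mean_pairwise_distance(points):
            # brute-force O(n^2) MPD over all point pairs
            pts = [np.asarray(p, dtype=float) for p in points]
            d = [np.linalg.norm(p - q) for p, q in combinations(pts, 2)]
            return float(np.mean(d))

        print(mean_pairwise_distance([(0, 0), (3, 0), (0, 4)]))  # 4.0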

  10. Computer programs for optical dendrometer measurements of standing tree profiles

    Science.gov (United States)

    Jacob R. Beard; Thomas G. Matney; Emily B. Schultz

    2015-01-01

    Tree profile equations are effective volume predictors. Diameter data for building these equations are collected from felled trees using diameter tapes and calipers or from standing trees using optical dendrometers. Developing and implementing a profile function from the collected data is a tedious and error-prone task. This study created a computer program, Profile...

  11. Residual stresses in multilayer ceramic capacitors: measurement and computation

    NARCIS (Netherlands)

    Toonder, den J.M.J.; Rademaker, C.W.; Hu, C.L.

    2003-01-01

    In this paper, we present a combined experimental and computational study of the thermomechanical reliability of multilayer ceramic capacitors (MLCC's). We focus on residual stresses introduced into the components during the cooling down step of the sintering process. The technique of…

  12. Variable screening and ranking using sampling-based sensitivity measures

    International Nuclear Information System (INIS)

    Wu, Y-T.; Mohanty, Sitakanta

    2006-01-01

    This paper presents a methodology for screening out insignificant random variables and ranking the significant ones using sensitivity measures, including two cumulative distribution function (CDF)-based and two mean-response-based measures. The methodology features (1) using random samples to compute sensitivities and (2) using acceptance limits, derived from a test of hypothesis, to classify significant and insignificant random variables. Because no approximation is needed in either the form of the performance functions or the type of continuous distribution functions representing input variables, the sampling-based approach can handle highly nonlinear functions with non-normal variables. The main characteristics and effectiveness of the sampling-based sensitivity measures are investigated using both simple and complex examples. Because the number of samples needed does not depend on the number of variables, the methodology appears to be particularly suitable for problems with large, complex models that have large numbers of random variables but relatively few significant ones.
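
    A rough sketch of one sampling-based screening-and-ranking step. The absolute input-output sample correlation stands in for the paper's CDF-based and mean-response-based measures, and the fixed limit stands in for the acceptance limits derived from the test of hypothesis.

        import numpy as np

        def screen_and_rank(X, y, limit=0.1):
            # X: (n_samples, n_vars) random input samples; y: model responses
            scores = np.array([abs(np.corrcoef(X[:, j], y)[0, 1])
                               for j in range(X.shape[1])])
            ranking = np.argsort(-scores)   # most sensitive variables first
            significant = scores > limit    # screening decision per variable
            return ranking, significant, scores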

  13. Errors in measuring absorbed radiation and computing crop radiation use efficiency

    International Nuclear Information System (INIS)

    Gallo, K.P.; Daughtry, C.S.T.; Wiegand, C.L.

    1993-01-01

    Radiation use efficiency (RUE) is often a crucial component of crop growth models that relate dry matter production to energy received by the crop. RUE is a ratio that has units g J^-1 if defined as phytomass per unit of energy received, and units J J^-1 if defined as the energy content of phytomass per unit of energy received. Both the numerator and denominator in the computation of RUE can vary with experimental assumptions and methodologies. The objectives of this study were to examine the effect that different methods of measuring the numerator and denominator have on the RUE of corn (Zea mays L.) and to illustrate this variation with experimental data. Computational methods examined included (i) direct measurements of the fraction of photosynthetically active radiation absorbed (fA), (ii) estimates of fA derived from leaf area index (LAI), and (iii) estimates of fA derived from spectral vegetation indices. Direct measurements of absorbed PAR from planting to physiological maturity of corn were consistently greater than the indirect estimates based on green LAI or the spectral vegetation indices. Consequently, the RUE calculated using directly measured absorbed PAR was lower than the RUE calculated using the indirect measures of absorbed PAR. For crops that contain senesced vegetation, green LAI and the spectral vegetation indices provide appropriate estimates of the fraction of PAR absorbed by a crop canopy and, thus, accurate estimates of crop radiation use efficiency.
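
    A small sketch of the quantities involved, assuming the common Beer's-law relation fA = 1 - exp(-k*LAI) for method (ii); the extinction coefficient and the MJ-based units are illustrative assumptions, not values from the record.

        import numpy as np

        def fapar_from_lai(lai, k=0.5):
            # Beer's-law estimate of the fraction of absorbed PAR; k = 0.5
            # is a common but crop-dependent extinction coefficient
            return 1.0 - np.exp(-k * lai)

        def radiation_use_efficiency(dry_matter_g_m2, incident_par_mj_m2, fapar):
            # RUE in g MJ^-1 of absorbed PAR
            return dry_matter_g_m2 / (incident_par_mj_m2 * fapar)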

  14. Measurement and computation for sag of calandria tube due to irradiation creep in PHWR

    International Nuclear Information System (INIS)

    Son, S. M.; Lee, W. R.; Lee, S. K.; Lee, J. S.; Kim, T. R.; Na, B. K.; Namgung I.

    2003-01-01

    Calandria tubes and Liquid Injection Shutdown System (LISS) tubes in a Pressurized Heavy Water Reactor (PHWR) tend to sag due to irradiation creep and growth during plant operation. As the sag of a calandria tube increases, the tube may come into contact with the LISS tube crossing beneath it. Such contact may damage the calandria tube, resulting in an unplanned outage of the plant. It is therefore necessary to check the gap between the two tubes with a proper measuring technique in order to periodically confirm, throughout the plant life, that no contact occurs. An ultrasonic gap-measuring probe assembly which can be inserted into two viewing ports of the calandria was developed in Korea and utilized to measure the sags of both tubes in the PHWR. It was found that the centerlines of calandria tubes and LISS tubes can be precisely detected by ultrasonic waves. The gaps between the two tubes were easily obtained from the relative distance of the measured centerline elevations of the tubes. Based on the irradiation creep equation and the measurement data, a computer program to calculate the sags was also developed. With the computer program, the sag at the end of plant life was predicted.

  15. Measurement system for nitrous oxide based on amperometric gas sensor

    Science.gov (United States)

    Siswoyo, S.; Persaud, K. C.; Phillips, V. R.; Sneath, R.

    2017-03-01

    It is well known that nitrous oxide is an important greenhouse gas, so monitoring and control of its concentration and emission are very important. In this work a nitrous oxide measurement system has been developed, consisting of an amperometric sensor and an appropriate lab-made potentiostat capable of measuring currents in the picoampere range. The sensor was constructed using a gold microelectrode as the working electrode surrounded by a silver wire as a quasi-reference electrode, with tetraethyl ammonium perchlorate and dimethylsulphoxide as the supporting electrolyte and solvent, respectively. The lab-made potentiostat was built around a transimpedance amplifier capable of picoampere measurements and incorporated a microcontroller-based data acquisition system, controlled by a host personal computer running a dedicated program. The system was capable of detecting N2O concentrations down to 0.07% v/v.

  16. Standardized computer-based organized reporting of EEG:SCORE

    DEFF Research Database (Denmark)

    Beniczky, Sandor; H, Aurlien,; JC, Brøgger,

    2013-01-01

    …process, organized by the European Chapter of the International Federation of Clinical Neurophysiology. The Standardised Computer-based Organised Reporting of EEG (SCORE) software was constructed based on the terms and features of the consensus statement and it was tested in clinical practice… …in free-text format. The purpose of our endeavor was to create a computer-based system for EEG assessment and reporting, where the physicians would construct the reports by choosing from predefined elements for each relevant EEG feature, as well as the clinical phenomena (for video-EEG recordings)… SCORE can potentially improve the quality of EEG assessment and reporting; it will help incorporate the results of computer-assisted analysis into the report, it will make possible the build-up of a multinational database, and it will help in training young neurophysiologists.

  17. An expert fitness diagnosis system based on elastic cloud computing.

    Science.gov (United States)

    Tseng, Kevin C; Wu, Chia-Chuan

    2014-01-01

    This paper presents an expert diagnosis system based on cloud computing. It classifies a user's fitness level based on supervised machine learning techniques. This system is able to learn and make customized diagnoses according to the user's physiological data, such as age, gender, and body mass index (BMI). In addition, an elastic algorithm based on Poisson distribution is presented to allocate computation resources dynamically. It predicts the required resources in the future according to the exponential moving average of past observations. The experimental results show that Naïve Bayes is the best classifier with the highest accuracy (90.8%) and that the elastic algorithm is able to capture tightly the trend of requests generated from the Internet and thus assign corresponding computation resources to ensure the quality of service.
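
    A minimal sketch of the exponential-moving-average prediction step described above; the smoothing factor, per-instance capacity, and headroom are illustrative assumptions, and the record's Poisson-based allocation model is not reproduced here.

        import math

        def predict_requests(history, alpha=0.3):
            # exponential moving average of past request counts
            ema = float(history[0])
            for x in history[1:]:
                ema = alpha * x + (1.0 - alpha) * ema
            return ema

        def instances_needed(expected_requests, per_instance=50, headroom=1.2):
            # provision enough instances for the predicted load plus headroom
            return max(1, math.ceil(expected_requests * headroom / per_instance))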

  18. An Expert Fitness Diagnosis System Based on Elastic Cloud Computing

    Directory of Open Access Journals (Sweden)

    Kevin C. Tseng

    2014-01-01

    This paper presents an expert diagnosis system based on cloud computing. It classifies a user's fitness level based on supervised machine learning techniques. This system is able to learn and make customized diagnoses according to the user's physiological data, such as age, gender, and body mass index (BMI). In addition, an elastic algorithm based on Poisson distribution is presented to allocate computation resources dynamically. It predicts the required resources in the future according to the exponential moving average of past observations. The experimental results show that Naïve Bayes is the best classifier with the highest accuracy (90.8%) and that the elastic algorithm is able to capture tightly the trend of requests generated from the Internet and thus assign corresponding computation resources to ensure the quality of service.

  19. Measuring the impact of different brands of computer systems on the clinical consultation: a pilot study

    Directory of Open Access Journals (Sweden)

    Charlotte Refsum

    2008-07-01

    Conclusion: This methodological development improves the reliability of our method for measuring the impact of different computer systems on the GP consultation. UAR added more objectivity to the observation of doctor-computer interactions. If larger studies were to reproduce the differences between computer systems demonstrated in this pilot, it might be possible to make objective comparisons between systems.

  20. Computer-based quantitative computed tomography image analysis in idiopathic pulmonary fibrosis: A mini review.

    Science.gov (United States)

    Ohkubo, Hirotsugu; Nakagawa, Hiroaki; Niimi, Akio

    2018-01-01

    Idiopathic pulmonary fibrosis (IPF) is the most common type of progressive idiopathic interstitial pneumonia in adults. Many computer-based image analysis methods of chest computed tomography (CT) used in patients with IPF include the mean CT value of the whole lungs, density histogram analysis, density mask technique, and texture classification methods. Most of these methods offer good assessment of pulmonary functions, disease progression, and mortality. Each method has merits that can be used in clinical practice. One of the texture classification methods is reported to be superior to visual CT scoring by radiologist for correlation with pulmonary function and prediction of mortality. In this mini review, we summarize the current literature on computer-based CT image analysis of IPF and discuss its limitations and several future directions. Copyright © 2017 The Japanese Respiratory Society. Published by Elsevier B.V. All rights reserved.
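
    As an illustration of the density-mask technique mentioned above, the sketch below computes the fraction of lung voxels under a threshold; -950 HU is a commonly used emphysema cutoff assumed here for illustration, not a value taken from the review.

        import numpy as np

        def low_attenuation_fraction(lung_hu, threshold=-950):
            # density mask: share of segmented lung voxels whose CT number
            # (in Hounsfield units) falls below the chosen threshold
            lung_hu = np.asarray(lung_hu)
            return float((lung_hu < threshold).mean())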

  1. The Activity-Based Computing Project - A Software Architecture for Pervasive Computing Final Report

    DEFF Research Database (Denmark)

    Bardram, Jakob Eyvind

    …Special attention should be drawn to publication [25], which gives an overview of the ABC project to the IEEE Pervasive Computing community; the ACM CHI 2006 [19] paper that documents the implementation of the ABC technology; and the ACM ToCHI paper [12], which is the main publication of the project, documenting all of the project's four objectives. All of these publication venues are top-tier journals and conferences within computer science. From a business perspective, the project had the objective of incorporating relevant parts of the ABC technology into the products of Medical Insight, which has been done. Moreover, partly based on the research done in the ABC project, the company Cetrea A/S has been founded, which incorporates ABC concepts and technologies in its products. The concepts of activity-based computing have also been researched in cooperation with IBM Research, and the ABC project has…

  2. Pulmonary tumor measurements from x-ray computed tomography in one, two, and three dimensions.

    Science.gov (United States)

    Villemaire, Lauren; Owrangi, Amir M; Etemad-Rezai, Roya; Wilson, Laura; O'Riordan, Elaine; Keller, Harry; Driscoll, Brandon; Bauman, Glenn; Fenster, Aaron; Parraga, Grace

    2011-11-01

    We evaluated the accuracy and reproducibility of three-dimensional (3D) measurements of lung phantoms and patient tumors from x-ray computed tomography (CT) and compared these to one-dimensional (1D) and two-dimensional (2D) measurements. CT images of three spherical and three irregularly shaped tumor phantoms were evaluated by three observers who performed five repeated measurements. Additionally, three observers manually segmented 29 patient lung tumors five times each. Follow-up imaging was performed for 23 tumors and response criteria were compared. For a single subject, imaging was performed on nine occasions over 2 years to evaluate multidimensional tumor response. To evaluate measurement accuracy, we compared imaging measurements to ground truth using analysis of variance. For estimates of precision, intraobserver and interobserver coefficients of variation and intraclass correlations (ICC) were used. Linear regression and Pearson correlations were used to evaluate agreement and tumor response was descriptively compared. For spherical shaped phantoms, all measurements were highly accurate, but for irregularly shaped phantoms, only 3D measurements were in high agreement with ground truth measurements. All phantom and patient measurements showed high intra- and interobserver reproducibility (ICC >0.900). Over a 2-year period for a single patient, there was disagreement between tumor response classifications based on 3D measurements and those generated using 1D and 2D measurements. Tumor volume measurements were highly reproducible and accurate for irregular, spherical phantoms and patient tumors with nonuniform dimensions. Response classifications obtained from multidimensional measurements suggest that 3D measurements provide higher sensitivity to tumor response. Copyright © 2011 AUR. Published by Elsevier Inc. All rights reserved.
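
    A minimal sketch of the intraobserver coefficient of variation used above as a reproducibility measure, assuming a matrix of repeated volume measurements; this is one common estimator, not necessarily the study's exact computation.

        import numpy as np

        def intraobserver_cov(repeats):
            # repeats: (n_tumors, n_repeats) volumes from a single observer;
            # returns the mean per-tumor coefficient of variation
            repeats = np.asarray(repeats, dtype=float)
            cov = repeats.std(axis=1, ddof=1) / repeats.mean(axis=1)
            return float(cov.mean())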

  3. Korean Clinic Based Outcome Measure Studies

    OpenAIRE

    Jongbae Park

    2003-01-01

    Background: Evidence-based medicine has become a main tool for medical practice. However, conducting a study that ranks highly in the evidence hierarchy pyramid is not easy or feasible at all times and places. There remains room for descriptive clinical outcome measure studies, while admitting the limits of their interpretation. Aims: To present three Korean clinic-based outcome measure studies with a view to encouraging Korean clinicians to conduct similar studies. Methods: Three studies are presented...

  4. A personal computer-based nuclear magnetic resonance spectrometer

    Science.gov (United States)

    Job, Constantin; Pearson, Robert M.; Brown, Michael F.

    1994-11-01

    Nuclear magnetic resonance (NMR) spectroscopy using personal computer-based hardware has the potential of enabling the application of NMR methods to fields where conventional state of the art equipment is either impractical or too costly. With such a strategy for data acquisition and processing, disciplines including civil engineering, agriculture, geology, archaeology, and others have the possibility of utilizing magnetic resonance techniques within the laboratory or conducting applications directly in the field. Another aspect is the possibility of utilizing existing NMR magnets which may be in good condition but unused because of outdated or nonrepairable electronics. Moreover, NMR applications based on personal computer technology may open up teaching possibilities at the college or even secondary school level. The goal of developing such a personal computer (PC)-based NMR standard is facilitated by existing technologies including logic cell arrays, direct digital frequency synthesis, use of PC-based electrical engineering software tools to fabricate electronic circuits, and the use of permanent magnets based on neodymium-iron-boron alloy. Utilizing such an approach, we have been able to place essentially an entire NMR spectrometer console on two printed circuit boards, with the exception of the receiver and radio frequency power amplifier. Future upgrades to include the deuterium lock and the decoupler unit are readily envisioned. The continued development of such PC-based NMR spectrometers is expected to benefit from the fast growing, practical, and low cost personal computer market.

  5. Accuracy of measurement of pulmonary emphysema with computed tomography: relevant points

    International Nuclear Information System (INIS)

    Hochhegger, Bruno; Marchiori, Edson; Oliveira, Hugo

    2010-01-01

    Some technical aspects should be taken into consideration in order to guarantee the reliability of the assessment of pulmonary emphysema with lung computed tomography densitometry. Changes in lung density associated with variations in lungs inspiratory and expiratory levels, computed tomography slice thickness, reconstruction algorithm and type of computed tomography apparatus make tomographic comparisons more difficult in follow-up studies of pulmonary emphysema. Nevertheless, quantitative computed tomography has replaced the visual assessment competing with pulmonary function tests as a sensitive method to measure pulmonary emphysema. The present review discusses technical variables of lung computed tomography and their influence on measurements of pulmonary emphysema. (author)

  6. Accuracy of measurement of pulmonary emphysema with computed tomography: relevant points

    Energy Technology Data Exchange (ETDEWEB)

    Hochhegger, Bruno, E-mail: brunohochhegger@googlemail.co [Hospital Moinhos de Vento, Porto Alegre, RS (Brazil); Marchiori, Edson [Universidade Federal Fluminense (UFF), Niteroi, RJ (Brazil). Dept. de Radiologia; Irion, Klaus L. [Liverpool Heart and Chest Hospital, Liverpool (United Kingdom); Oliveira, Hugo [Universidade Federal do Rio Grande do Sul (UFRGS), Porto Alegre, RS (Brazil)

    2010-07-15

    Some technical aspects should be taken into consideration in order to guarantee the reliability of the assessment of pulmonary emphysema with lung computed tomography densitometry. Changes in lung density associated with variations in lungs inspiratory and expiratory levels, computed tomography slice thickness, reconstruction algorithm and type of computed tomography apparatus make tomographic comparisons more difficult in follow-up studies of pulmonary emphysema. Nevertheless, quantitative computed tomography has replaced the visual assessment competing with pulmonary function tests as a sensitive method to measure pulmonary emphysema. The present review discusses technical variables of lung computed tomography and their influence on measurements of pulmonary emphysema. (author)

  7. Computer-Based Simulation Games in Public Administration Education

    Directory of Open Access Journals (Sweden)

    Kutergina Evgeniia

    2017-12-01

    Computer simulation, an active learning technique, is now one of the advanced pedagogical technologies. The use of simulation games in the educational process allows students to gain a firsthand understanding of the processes of real life. Public-administration, public-policy and political-science courses increasingly adopt simulation games in universities worldwide. Besides person-to-person simulation games, there are computer-based simulations in public-administration education. Currently in Russia the use of computer-based simulation games in Master of Public Administration (MPA) curricula is quite limited. This paper focuses on computer-based simulation games for students of MPA programmes. Our aim was to analyze outcomes of implementing such games in MPA curricula. We have done so by (1) developing three computer-based simulation games about allocating public finances, (2) testing the games in the learning process, and (3) conducting a post-test examination to evaluate the effect of simulation games on students' knowledge of municipal finances. This study was conducted in the National Research University Higher School of Economics (HSE) and in the Russian Presidential Academy of National Economy and Public Administration (RANEPA) during the period of September to December 2015, in Saint Petersburg, Russia. Two groups of students were randomly selected in each university and then randomly allocated either to the experimental or the control group. In control groups (n=12 in HSE, n=13 in RANEPA) students had traditional lectures. In experimental groups (n=12 in HSE, n=13 in RANEPA) students played three simulation games apart from traditional lectures. This exploratory research shows that the use of computer-based simulation games in MPA curricula can improve students' outcomes by 38%. In general, the experimental groups had better performances on the post-test examination (Figure 2). Students in the HSE experimental group had 27.5% better…

  8. Remote media vision-based computer input device

    Science.gov (United States)

    Arabnia, Hamid R.; Chen, Ching-Yi

    1991-11-01

    In this paper, we introduce a vision-based computer input device which has been built at the University of Georgia. The user of this system gives commands to the computer without touching any physical device. The system receives input through a CCD camera; it is PC-based and is built on top of the DOS operating system. The major components of the input device are: a monitor, an image capturing board, a CCD camera, and some software (developed by us). These are interfaced with a standard PC running under the DOS operating system.

  9. A security mechanism based on evolutionary game in fog computing.

    Science.gov (United States)

    Sun, Yan; Lin, Fuhong; Zhang, Nan

    2018-02-01

    Fog computing is a distributed computing paradigm at the edge of the network and requires cooperation of users and sharing of resources. When users in fog computing open their resources, their devices are easily intercepted and attacked because they are accessed through wireless networks and are spread over a wide geographical area. In this study, a credible third party was introduced to supervise the behavior of users and protect the security of user cooperation. A fog computing security mechanism based on the human nervous system is proposed, and the strategy for a stable system evolution is calculated. The MATLAB simulation results show that the proposed mechanism can reduce the number of attack behaviors effectively and stimulate users to cooperate in application tasks positively.
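
    As a generic illustration of the evolutionary-game dynamics invoked above, the sketch below iterates the replicator equation for a two-strategy population; the strategies and payoff values are assumptions for the example and do not reproduce the paper's supervised game.

        import numpy as np

        def replicator_step(x, payoff, dt=0.01):
            # Euler step of dx_i/dt = x_i * (f_i - f_avg), with f = payoff @ x
            f = payoff @ x
            return x + dt * x * (f - x @ f)

        # illustrative "cooperate" vs "attack" payoffs (anti-coordination game)
        A = np.array([[3.0, 1.0],
                      [4.0, 0.0]])
        x = np.array([0.9, 0.1])
        for _ in range(20000):
            x = replicator_step(x, A)
        print(x)  # converges toward the stable mixed strategy [0.5, 0.5]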

  10. Machine learning based Intelligent cognitive network using fog computing

    Science.gov (United States)

    Lu, Jingyang; Li, Lun; Chen, Genshe; Shen, Dan; Pham, Khanh; Blasch, Erik

    2017-05-01

    In this paper, a Cognitive Radio Network (CRN) based on artificial intelligence is proposed to distribute the limited radio spectrum resources more efficiently. The CRN framework can analyze the time-sensitive signal data close to the signal source using fog computing with different types of machine learning techniques. Depending on the computational capabilities of the fog nodes, different features and machine learning techniques are chosen to optimize spectrum allocation. Also, the computing nodes send a periodic signal summary, which is much smaller than the original signal, to the cloud so that the overall system's spectrum allocation strategies are dynamically updated. Applying fog computing, the system is more adaptive to the local environment and robust to spectrum changes. As most of the signal data is processed at the fog level, this further strengthens system security by reducing the communication burden on the communications network.

  11. A security mechanism based on evolutionary game in fog computing

    Directory of Open Access Journals (Sweden)

    Yan Sun

    2018-02-01

    Fog computing is a distributed computing paradigm at the edge of the network and requires cooperation of users and sharing of resources. When users in fog computing open their resources, their devices are easily intercepted and attacked because they are accessed through wireless networks and are spread over a wide geographical area. In this study, a credible third party was introduced to supervise the behavior of users and protect the security of user cooperation. A fog computing security mechanism based on the human nervous system is proposed, and the strategy for a stable system evolution is calculated. The MATLAB simulation results show that the proposed mechanism can reduce the number of attack behaviors effectively and stimulate users to cooperate in application tasks positively.

  12. Development of a Computer Writing System Based on EOG.

    Science.gov (United States)

    López, Alberto; Ferrero, Francisco; Yangüela, David; Álvarez, Constantina; Postolache, Octavian

    2017-06-26

    The development of a novel computer writing system based on eye movements is introduced herein. A system of these characteristics requires the consideration of three subsystems: (1) A hardware device for the acquisition and transmission of the signals generated by eye movement to the computer; (2) A software application that allows, among other functions, data processing in order to minimize noise and classify signals; and (3) A graphical interface that allows the user to write text easily on the computer screen using eye movements only. This work analyzes these three subsystems and proposes innovative and low cost solutions for each one of them. This computer writing system was tested with 20 users and its efficiency was compared to a traditional virtual keyboard. The results have shown an important reduction in the time spent on writing, which can be very useful, especially for people with severe motor disorders.

  13. Development of a Computer Writing System Based on EOG

    Directory of Open Access Journals (Sweden)

    Alberto López

    2017-06-01

    The development of a novel computer writing system based on eye movements is introduced herein. A system of these characteristics requires the consideration of three subsystems: (1) A hardware device for the acquisition and transmission of the signals generated by eye movement to the computer; (2) A software application that allows, among other functions, data processing in order to minimize noise and classify signals; and (3) A graphical interface that allows the user to write text easily on the computer screen using eye movements only. This work analyzes these three subsystems and proposes innovative and low-cost solutions for each one of them. This computer writing system was tested with 20 users and its efficiency was compared to a traditional virtual keyboard. The results have shown an important reduction in the time spent on writing, which can be very useful, especially for people with severe motor disorders.
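
    A toy sketch of subsystem (2) above, the signal-processing stage, assuming a fixed-length EOG sample window, moving-average smoothing, and an amplitude threshold; the record's system uses more elaborate filtering and classification than this.

        import numpy as np

        def classify_window(samples, thresh=150.0):
            # smooth the window, then decide fixation vs. left/right saccade;
            # the threshold (in raw ADC units) is an illustrative assumption
            smooth = np.convolve(samples, np.ones(5) / 5.0, mode="same")
            if smooth.max() - smooth.min() < thresh:
                return "fixation"
            return "left" if np.argmax(smooth) < np.argmin(smooth) else "right"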

  14. GPU-based high-performance computing for radiation therapy

    International Nuclear Information System (INIS)

    Jia, Xun; Jiang, Steve B; Ziegenhein, Peter

    2014-01-01

    Recent developments in radiation therapy demand high computational power to solve challenging problems in a timely fashion in a clinical environment. The graphics processing unit (GPU), as an emerging high-performance computing platform, has been introduced to radiotherapy. It is particularly attractive due to its high computational power, small size, and low cost for facility deployment and maintenance. Over the past few years, GPU-based high-performance computing in radiotherapy has experienced rapid development. A tremendous number of studies have been conducted, in which large acceleration factors compared with the conventional CPU platform have been observed. In this paper, we will first give a brief introduction to the GPU hardware structure and programming model. We will then review the current applications of GPU in major imaging-related and therapy-related problems encountered in radiotherapy. A comparison of GPU with other platforms will also be presented. (topical review)

  15. Essential Means for Urban Computing : Specification of Web-Based Computing Platforms for Urban Planning, a Hitchhiker’s Guide

    NARCIS (Netherlands)

    Nourian, P.; Martinez-Ortiz, Carlos; Arroyo Ohori, G.A.K.

    2018-01-01

    This article provides an overview of the specifications of web-based computing platforms for urban data analytics and computational urban planning practice. There are currently a variety of tools and platforms that can be used in urban computing practices, including scientific computing languages,…

  16. Measurement of Patient Dose from Computed Tomography Using Physical Anthropomorphic Phantom

    International Nuclear Information System (INIS)

    Jang, Ki Won; Lee, Jae Ki; Kim, Jong Kyung

    2005-01-01

    Computed tomography (CT) provides high-quality images of the human body but delivers a relatively high patient dose compared with conventional X-ray examination. Furthermore, the frequency of CT examination has been increasing in Korea for the last decade owing to national health insurance benefits. Increasing concern about high patient doses from CT has stimulated a great deal of research on dose assessment, much of it based on Monte Carlo simulation. In this study, however, absorbed doses and the effective dose for a patient undergoing CT examination were determined experimentally using an anthropomorphic physical phantom, and the measured results are compared with those from Monte Carlo calculation.
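
    For context, the effective dose referred to above is the ICRP tissue-weighted sum of organ equivalent doses, E = sum over tissues T of w_T * H_T. The sketch below uses a small illustrative subset of organs with ICRP 103 tissue weights; the dose values are made up for the example.

        def effective_dose(organ_dose_sv, tissue_weight):
            # weighted sum over the organs for which doses were measured
            return sum(tissue_weight[t] * h for t, h in organ_dose_sv.items())

        w = {"lung": 0.12, "stomach": 0.12, "liver": 0.04, "thyroid": 0.04}
        h = {"lung": 0.020, "stomach": 0.015, "liver": 0.010, "thyroid": 0.005}
        print(effective_dose(h, w))  # partial effective dose in Sv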

  17. Computer Network Availability at Sandia National Laboratories, Albuquerque NM: Measurement and Perception; TOPICAL

    International Nuclear Information System (INIS)

    NELSON, SPENCER D.; TOLENDINO, LAWRENCE F.

    1999-01-01

    The desire to provide a measure of computer network availability at Sandia National Laboratories has existed for a long time. Several attempts were made to build this measure by accurately recording network failures, identifying the type of network element involved, the root cause of the problem, and the time to repair the fault. Recognizing the limitations of available methods, it became obvious that another approach to determining network availability had to be defined. The chosen concept involved the periodic sampling of network services and applications from various network locations. A measure of "network" availability was then calculated based on the ratio of polling successes to failures. The effort required to gather the information and produce a useful metric is not prohibitive, and the information gained has verified long-held feelings regarding network performance with real data.
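
    A minimal polling sketch in the spirit of the concept described above, computing availability as the ratio of successful probes to attempts; a single ICMP echo (Linux ping flags assumed) stands in for the service and application probes used in the report.

        import subprocess

        def probe(host):
            # one ICMP echo with a 2-second timeout
            r = subprocess.run(["ping", "-c", "1", "-W", "2", host],
                               capture_output=True)
            return r.returncode == 0

        def availability(hosts, rounds=10):
            ok = total = 0
            for _ in range(rounds):
                for h in hosts:
                    ok += probe(h)
                    total += 1
            return ok / total  # ratio of polling successes to attempts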

  18. Improving Patient Satisfaction Through Computer-Based Questionnaires.

    Science.gov (United States)

    Smith, Matthew J; Reiter, Michael J; Crist, Brett D; Schultz, Loren G; Choma, Theodore J

    2016-01-01

    Patient-reported outcome measures are helping clinicians to use evidence-based medicine in decision making. The use of computer-based questionnaires to gather such data may offer advantages over traditional paper-based methods. These advantages include consistent presentation, prompts for missed questions, reliable scoring, and simple and accurate transfer of information into databases without manual data entry. The authors enrolled 308 patients over a 16-month period from 3 orthopedic clinics: spine, upper extremity, and trauma. Patients were randomized to complete either electronic or paper validated outcome forms during their first visit, and they completed the opposite modality at their second visit, which was approximately 7 weeks later. For patients with upper-extremity injuries, the Penn Shoulder Score (PSS) was used. For patients with lower-extremity injuries, the Foot Function Index (FFI) was used. For patients with lumbar spine symptoms, the Oswestry Disability Index (ODI) was used. All patients also were asked to complete the 36-Item Short Form Health Survey (SF-36) Health Status Survey, version 1. The authors assessed patient satisfaction with each survey modality and determined potential advantages and disadvantages for each. No statistically significant differences were found between the paper and electronic versions for patient-reported outcome data. However, patients strongly preferred the electronic surveys. Additionally, the paper forms had significantly more missed questions for the FFI (P<.0001), ODI (P<.0001), and PSS (P=.008), and patients were significantly less likely to complete these forms (P<.0001). Future research should focus on limiting the burden on responders, individualizing forms and questions as much as possible, and offering alternative environments for completion (home or mobile platforms). Copyright 2016, SLACK Incorporated.

  19. A Research Roadmap for Computation-Based Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mandelli, Diego [Idaho National Lab. (INL), Idaho Falls, ID (United States); Joe, Jeffrey [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis [Idaho National Lab. (INL), Idaho Falls, ID (United States); Groth, Katrina [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-08-01

    The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermalhydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.

  20. A Research Roadmap for Computation-Based Human Reliability Analysis

    International Nuclear Information System (INIS)

    Boring, Ronald; Mandelli, Diego; Joe, Jeffrey; Smith, Curtis; Groth, Katrina

    2015-01-01

    The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermalhydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.

  1. Internet messenger based smart virtual class learning using ubiquitous computing

    Science.gov (United States)

    Umam, K.; Mardi, S. N. S.; Hariadi, M.

    2017-06-01

    Internet messenger (IM) has become an important educational technology component in college education. IM makes it possible for students to engage in learning and collaboration in smart virtual class learning (SVCL) using ubiquitous computing. However, models of IM-based SVCL using ubiquitous computing, and empirical evidence that would favor broad application to improve engagement and behavior, are still limited. In addition, the expectation that IM-based SVCL using ubiquitous computing could improve engagement and behavior in a smart class cannot be confirmed, because the majority of the reviewed studies followed instructional paradigms. This article aims to present a model of IM-based SVCL using ubiquitous computing and to show learners' experiences of improved engagement and behavior in learner-learner and learner-lecturer interactions. The method applied in this paper includes a design process and quantitative analysis techniques, with the purpose of identifying scenarios of ubiquitous computing and capturing the impressions of learners and lecturers about engagement and behavior and their contribution to learning.

  2. Computer-based mechanical design of overhead lines

    Science.gov (United States)

    Rusinaru, D.; Bratu, C.; Dinu, R. C.; Manescu, L. G.

    2016-02-01

    Besides performance, a safety level meeting current standards is a compulsory condition for distribution grids' operation. Some of the measures leading to improvement of overhead line reliability call for modernization of installations. The constraints imposed on the new line components refer to technical aspects such as thermal stress or voltage drop, and look for economic efficiency, too. The mechanical sizing of overhead lines is, after all, an optimization problem. More precisely, the task in designing the overhead line profile is to size poles, cross-arms and stays and to locate poles along a line route so that the total cost of the line's structure is minimized and the technical and safety constraints are fulfilled. The authors present in this paper an application for the computer-based mechanical design of overhead lines and the features of the corresponding Visual Basic program, adjusted to distribution lines. The constraints of the optimization problem are adjusted to the existing weather and loading conditions of Romania. The outputs of the software application for mechanical design of overhead lines are: the list of components chosen for the line (poles, cross-arms, stays); the list of conductor tensions and forces for each pole, cross-arm and stay under different weather conditions; and the line profile drawings. The main features of the mechanical overhead line design software are interactivity, a local optimization function and a high-level user interface.

  3. A computer simulation platform for the estimation of measurement uncertainties in dimensional X-ray computed tomography

    DEFF Research Database (Denmark)

    Hiller, Jochen; Reindl, Leonard M

    2012-01-01

    …into account the main error sources for the measurement. This method has the potential to deal with all kinds of systematic and random errors that influence a dimensional CT measurement. A case study demonstrates the practical application of the VCT simulator using numerically generated CT data and statistical… The knowledge of measurement uncertainty is of great importance in conformance testing in production. The tolerance limit for production must be reduced by the amount of measurement uncertainty to ensure that the parts are in fact within the tolerance. Over the last 5 years, industrial X-ray computed tomography (CT) has become an important technology for dimensional quality control. In this paper a computer simulation platform is presented which is able to investigate error sources in dimensional CT measurements. The typical workflow in industrial CT metrology is described and methods…
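
    A toy Monte Carlo sketch of the simulation idea described above, assuming a measurement model of a random scale error plus additive noise; a real virtual CT simulator models the full imaging chain rather than this two-term model.

        import numpy as np

        rng = np.random.default_rng(0)

        def simulate_measurements(true_len_mm, scale_sd=1e-4,
                                  noise_sd_mm=0.002, n=10000):
            # virtual measurements of a nominal length under two error sources
            scale = 1.0 + rng.normal(0.0, scale_sd, n)
            noise = rng.normal(0.0, noise_sd_mm, n)
            return true_len_mm * scale + noise

        samples = simulate_measurements(25.0)
        print(samples.std(ddof=1))  # standard uncertainty estimate in mm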

  4. Evaluation of Computer Based Testing in lieu of Regular Examinations in Computer Literacy

    Science.gov (United States)

    Murayama, Koichi

    Because computer based testing (CBT) has many advantages compared with the conventional paper and pencil testing (PPT) examination method, CBT has begun to be used in various situations in Japan, such as in qualifying examinations and in the TOEFL. This paper describes the usefulness and the problems of CBT applied to a regular college examination. The regular computer literacy examinations for first year students were held using CBT, and the results were analyzed. Responses to a questionnaire indicated many students accepted CBT with no unpleasantness and considered CBT a positive factor, improving their motivation to study. CBT also decreased the work of faculty in terms of marking tests and reducing data.

  5. An E-learning System based on Affective Computing

    Science.gov (United States)

    Duo, Sun; Song, Lu Xue

    In recent years, e-learning has become very popular as a learning system. However, current e-learning systems cannot instruct students effectively, since they do not consider the student's emotional state in the context of instruction. The emergence of the theory of "affective computing" can address this problem: it means the computer's intelligence need no longer be purely cognitive. In this paper, we construct an emotionally intelligent e-learning system based on affective computing. A dimensional model is put forward to recognize and analyze the student's emotional state, and a virtual teacher's avatar is offered to regulate the student's learning psychology, with the teaching style chosen according to the student's personality traits. A "man-to-man" learning environment is built to simulate traditional classroom pedagogy in the system.
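
    As a toy illustration of a dimensional emotion model of the kind mentioned above, the lookup below maps valence and arousal to coarse labels; the labels and zero thresholds are assumptions, and the paper's model is richer than four quadrants.

        def emotion_quadrant(valence, arousal):
            # two-dimensional (valence-arousal) quadrant lookup
            if valence >= 0.0:
                return "engaged" if arousal >= 0.0 else "content"
            return "frustrated" if arousal >= 0.0 else "bored"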

  6. Quantitative computed tomography measures of emphysema and airway wall thickness are related to respiratory symptoms

    DEFF Research Database (Denmark)

    Grydeland, Thomas B; Dirksen, Asger; Coxson, Harvey O

    2010-01-01

    There is limited knowledge about the relationship between respiratory symptoms and quantitative high-resolution computed tomography measures of emphysema and airway wall thickness.

  7. Measurement and reproduction accuracy of computer-controlled grand pianos

    Science.gov (United States)

    Goebl, Werner; Bresin, Roberto

    2003-10-01

    The recording and reproducing capabilities of a Yamaha Disklavier grand piano and a Bösendorfer SE290 computer-controlled grand piano were tested, with the goal of examining their reliability for performance research. An experimental setup consisting of accelerometers and a calibrated microphone was used to capture key and hammer movements, as well as the acoustic signal. Five selected keys were played by pianists with two types of touch ("staccato" and "legato"). Timing and dynamic differences between the original performance, the corresponding MIDI file recorded by the computer-controlled pianos, and its reproduction were analyzed. The two devices performed quite differently with respect to timing and dynamic accuracy. The Disklavier's onset capturing was slightly more precise (+/-10 ms) than its reproduction (-20 to +30 ms); the Bösendorfer performed generally better, but its timing accuracy was slightly less precise for recording (-10 to 3 ms) than for reproduction (+/-2 ms). Both devices exhibited a systematic (linear) error in recording over time. In the dynamic dimension, the Bösendorfer showed higher consistency over the whole dynamic range, while the Disklavier performed well only in a wide middle range. Neither device was able to capture or reproduce different types of touch.

  8. VMEbus based computer and real-time UNIX as infrastructure of DAQ

    International Nuclear Information System (INIS)

    Yasu, Y.; Fujii, H.; Nomachi, M.; Kodama, H.; Inoue, E.; Tajima, Y.; Takeuchi, Y.; Shimizu, Y.

    1994-01-01

    This paper describes what the authors have constructed as the infrastructure of a data acquisition system (DAQ). The paper reports recent developments concerning the HP VME board computer with LynxOS (HP742rt/HP-RT) and Alpha/OSF1 with a VMEbus adapter. The paper also reports the current status of development of a Benchmark Suite for Data Acquisition (DAQBENCH) for measuring not only the performance of VME/CAMAC access, but also that of context switching, inter-process communication, and so on, for various computers, including workstation-based systems and VME board computers.

  9. VARIABILITY OF MANUAL AND COMPUTERIZED METHODS FOR MEASURING CORONAL VERTEBRAL INCLINATION IN COMPUTED TOMOGRAPHY IMAGES

    Directory of Open Access Journals (Sweden)

    Tomaž Vrtovec

    2015-06-01

    Objective measurement of coronal vertebral inclination (CVI) is of significant importance for evaluating spinal deformities in the coronal plane. The purpose of this study is to systematically analyze and compare manual and computerized measurements of CVI in cross-sectional and volumetric computed tomography (CT) images. Three observers independently measured CVI in 14 CT images of normal and 14 CT images of scoliotic vertebrae by using six manual and two computerized measurements. Manual measurements were obtained in coronal cross-sections by manually identifying the vertebral body corners, which served to measure CVI according to the superior and inferior tangents, left and right tangents, and mid-endplate and mid-wall lines. Computerized measurements were obtained in two dimensions (2D) and in three dimensions (3D) by manually initializing an automated method in vertebral centroids and then searching for the planes of maximal symmetry of vertebral anatomical structures. The mid-endplate lines were the most reproducible and reliable manual measurements (intra- and inter-observer variability of 0.7° and 1.2° standard deviation (SD), respectively). The computerized measurements in 3D were more reproducible and reliable (intra- and inter-observer variability of 0.5° and 0.7° SD, respectively), but were most consistent with the mid-wall lines (2.0° SD and 1.4° mean absolute difference). The manual CVI measurements based on mid-endplate lines and the computerized CVI measurements in 3D resulted in the lowest intra-observer and inter-observer variability; the computerized measurements, moreover, reduce observer interaction.
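
    For illustration only: each manual measurement reduces to the angle between a line through two manually identified landmarks (for example, the mid-endplate points) and the horizontal. A minimal sketch, assuming the landmark coordinates have already been picked; the coordinates below are hypothetical.

```python
import math

def inclination_deg(p_left, p_right):
    """Angle (degrees) between the line p_left -> p_right and the
    horizontal; points are (x, y) pixel coordinates in a coronal slice."""
    dx = p_right[0] - p_left[0]
    dy = p_right[1] - p_left[1]
    return math.degrees(math.atan2(dy, dx))

# Hypothetical mid-endplate landmark coordinates (pixels):
print(inclination_deg((112.0, 240.0), (188.0, 231.5)))  # ~ -6.4 degrees
```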

  10. Flux wire measurements in Cavalier for verifying computer code applications

    International Nuclear Information System (INIS)

    Fehr, M.; Stubbs, J.; Hosticka, B.

    1988-01-01

    The Cavalier and UVAR research reactors are to be converted from high-enrichment uranium (HEU) to low-enrichment uranium (LEU) fuel. As a first step, an extensive set of gold wire activation measurements has been taken on the Cavalier reactor. Axial traverses show internal consistency to the order of ±5%, while horizontal traverses show somewhat larger deviations. The activation measurements will be converted to flux measurements via the Thermos code and will then be used to verify the Leopard-2DB codes. The codes will ultimately be used to design an upgraded LEU core for the UVAR

  11. Centralized computer-based controls of the Nova Laser Facility

    International Nuclear Information System (INIS)

    Krammen, J.

    1985-01-01

    This article introduces the overall architecture of the computer-based Nova Laser Control System and describes its basic components. Use of standard hardware and software components ensures that the system, while specialized and distributed throughout the facility, is adaptable. 9 references, 6 figures

  12. An Intelligent Computer-Based System for Sign Language Tutoring

    Science.gov (United States)

    Ritchings, Tim; Khadragi, Ahmed; Saeb, Magdy

    2012-01-01

    A computer-based system for sign language tutoring has been developed using a low-cost data glove and a software application that processes the movement signals for signs in real-time and uses Pattern Matching techniques to decide if a trainee has closely replicated a teacher's recorded movements. The data glove provides 17 movement signals from…
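
    The abstract is truncated and does not specify the matching algorithm beyond "pattern matching"; one simple possibility, shown purely as a sketch, is normalized cross-correlation between a trainee's movement signal and the teacher's recorded template. The signal shapes and the 0.9 acceptance threshold below are hypothetical.

```python
import numpy as np

def ncc(trainee: np.ndarray, template: np.ndarray) -> float:
    """Normalized cross-correlation of two equal-length 1-D signals;
    1.0 means identical shape, values near 0 mean unrelated."""
    a = (trainee - trainee.mean()) / (trainee.std() + 1e-12)
    b = (template - template.mean()) / (template.std() + 1e-12)
    return float(np.dot(a, b) / len(a))

# Hypothetical traces for one of the glove's 17 movement channels:
t = np.linspace(0, 1, 100)
teacher = np.sin(2 * np.pi * t)
trainee = np.sin(2 * np.pi * t + 0.1)          # slightly out of phase
print(ncc(trainee, teacher) > 0.9)              # True -> accept the sign
```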

  13. Activity-based computing for medical work in hospitals

    DEFF Research Database (Denmark)

    Bardram, Jakob Eyvind

    2009-01-01

    principles, the Java-based implementation of the ABC Framework, and an experimental evaluation together with a group of hospital clinicians. The article contributes to the growing research on support for human activities, mobility, collaboration, and context-aware computing. The ABC Framework presents...

  14. Students' Motivation toward Computer-Based Language Learning

    Science.gov (United States)

    Genc, Gulten; Aydin, Selami

    2011-01-01

    The present article examined some factors affecting the motivation level of the preparatory school students in using a web-based computer-assisted language-learning course. The sample group of the study consisted of 126 English-as-a-foreign-language learners at a preparatory school of a state university. After performing statistical analyses…

  15. Simulation of quantum computation : A deterministic event-based approach

    NARCIS (Netherlands)

    Michielsen, K; De Raedt, K; De Raedt, H

    We demonstrate that locally connected networks of machines that have primitive learning capabilities can be used to perform a deterministic, event-based simulation of quantum computation. We present simulation results for basic quantum operations such as the Hadamard and the controlled-NOT gate, and

  16. Simulation of Quantum Computation : A Deterministic Event-Based Approach

    NARCIS (Netherlands)

    Michielsen, K.; Raedt, K. De; Raedt, H. De

    2005-01-01

    We demonstrate that locally connected networks of machines that have primitive learning capabilities can be used to perform a deterministic, event-based simulation of quantum computation. We present simulation results for basic quantum operations such as the Hadamard and the controlled-NOT gate, and
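
    The two records above describe a deterministic event-based simulation, which differs from standard state-vector simulation; the event-based machinery itself is not reproduced here. As a point of reference only, a conventional state-vector application of the two gates named in the abstracts (Hadamard and controlled-NOT) looks like this:

```python
import numpy as np

# Conventional state-vector simulation (NOT the authors' event-based
# method): apply H to qubit 0 of |00>, then CNOT with qubit 0 as control.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
I2 = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

state = np.array([1, 0, 0, 0], dtype=complex)   # |00>
state = np.kron(H, I2) @ state                  # H on qubit 0
state = CNOT @ state                            # entangle the qubits
print(np.round(state, 3))   # Bell state (|00> + |11>) / sqrt(2)
```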

  17. Content Analysis of a Computer-Based Faculty Activity Repository

    Science.gov (United States)

    Baker-Eveleth, Lori; Stone, Robert W.

    2013-01-01

    The research presents an analysis of faculty opinions regarding the introduction of a new computer-based faculty activity repository (FAR) in a university setting. The qualitative study employs content analysis to better understand the phenomenon underlying these faculty opinions and to augment the findings from a quantitative study. A web-based…

  18. Computer Game-based Learning: Applied Game Development Made Simpler

    NARCIS (Netherlands)

    Nyamsuren, Enkhbold

    2018-01-01

    The RAGE project (Realising an Applied Gaming Ecosystem, http://rageproject.eu/) is an ongoing initiative that aims to offer an ecosystem to support serious games’ development and use. Its two main objectives are to provide technologies for computer game-based pedagogy and learning and to establish

  19. The Use of Audio and Animation in Computer Based Instruction.

    Science.gov (United States)

    Koroghlanian, Carol; Klein, James D.

    This study investigated the effects of audio, animation, and spatial ability in a computer-based instructional program for biology. The program presented instructional material via text or audio with lean text and included eight instructional sequences presented either via static illustrations or animations. High school students enrolled in a…

  20. Novel Ethernet Based Optical Local Area Networks for Computer Interconnection

    NARCIS (Netherlands)

    Radovanovic, Igor; van Etten, Wim; Taniman, R.O.; Kleinkiskamp, Ronny

    2003-01-01

    In this paper we present new optical local area networks for fiber-to-the-desk application. The presented networks are expected to provide a solution for bringing optical fibers all the way to computers. To bring the overall implementation costs down, we have based our networks on short-wavelength optical

  1. The Accuracy of Cognitive Monitoring during Computer-Based Instruction.

    Science.gov (United States)

    Garhart, Casey; Hannafin, Michael J.

    This study was conducted to determine the accuracy of learners' comprehension monitoring during computer-based instruction and to assess the relationship between enroute monitoring and different levels of learning. Participants were 50 university undergraduate students enrolled in an introductory educational psychology class. All students received…

  2. Optimal Sequential Rules for Computer-Based Instruction.

    Science.gov (United States)

    Vos, Hans J.

    1998-01-01

    Formulates sequential rules for adapting the appropriate amount of instruction to learning needs in the context of computer-based instruction. Topics include Bayesian decision theory, threshold and linear-utility structure, psychometric model, optimal sequential number of test questions, and an empirical example of sequential instructional…
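
    As a hedged illustration of the kind of sequential rule described (the paper's actual utility structure and psychometric model are not reproduced here), the sketch below keeps a standard Beta-Binomial posterior over a student's proficiency and decides after each item whether to stop, continue testing, or give more instruction. The mastery level and stopping bounds are illustrative assumptions.

```python
# Sketch of a Bayesian sequential decision rule for computer-based
# instruction. The Beta-Binomial posterior is textbook material; the
# mastery threshold and stopping bounds are illustrative assumptions.
from scipy.stats import beta

def decide(correct: int, attempted: int, mastery=0.8,
           stop_hi=0.95, stop_lo=0.05, a0=1.0, b0=1.0) -> str:
    """Posterior P(true proficiency > mastery) under a Beta(a0, b0) prior."""
    p_master = 1.0 - beta.cdf(mastery, a0 + correct,
                              b0 + attempted - correct)
    if p_master > stop_hi:
        return "declare mastery, stop testing"
    if p_master < stop_lo:
        return "route to additional instruction"
    return "administer another item"

print(decide(correct=9, attempted=10))   # -> "administer another item"
```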

  3. The use of computer based instructions to enhance Rwandan ...

    African Journals Online (AJOL)

    Annestar

    (2) To what extent do the newly acquired ICT skills impact teachers' competency? (3) How suitable is computer-based instruction for enhancing teachers' continuous professional development? Literature review: ICT competency for teachers. Regardless of the quantity and quality of technology available in classrooms, the key ...

  4. ORGANIZATION OF CLOUD COMPUTING INFRASTRUCTURE BASED ON SDN NETWORK

    Directory of Open Access Journals (Sweden)

    Alexey A. Efimenko

    2013-01-01

    The article presents the main approaches to cloud computing infrastructure based on SDN networks in present-day data processing centers (DPC). The main indicators of management effectiveness of the DPC network infrastructure are determined. Examples of solutions for the creation of virtual network devices are provided.

  5. Discovery of technical methanation catalysts based on computational screening

    DEFF Research Database (Denmark)

    Sehested, Jens; Larsen, Kasper Emil; Kustov, Arkadii

    2007-01-01

    Methanation is a classical reaction in heterogeneous catalysis and significant effort has been put into improving the industrially preferred nickel-based catalysts. Recently, a computational screening study showed that nickel-iron alloys should be more active than the pure nickel catalyst and at ...

  6. A computer-based teaching programme (CBTP) developed for ...

    African Journals Online (AJOL)

    The nursing profession, like other professions, is focused on preparing students for practice, and particular attention must be paid to the ability of student nurses to extend their knowledge and to solve nursing care problems effectively. A computer-based teaching programme (CBTP) for clinical practice to achieve these ...

  7. Evaluation of computer-based library services at Kenneth Dike ...

    African Journals Online (AJOL)

    This study evaluated computer-based library services/routines at Kenneth Dike Library, University of Ibadan. Four research questions were developed and answered. A survey research design was adopted; using questionnaire as the instrument for data collection. A total of 200 respondents randomly selected from 10 ...

  8. A Computer-Based Instrument That Identifies Common Science Misconceptions

    Science.gov (United States)

    Larrabee, Timothy G.; Stein, Mary; Barman, Charles

    2006-01-01

    This article describes the rationale for and development of a computer-based instrument that helps identify commonly held science misconceptions. The instrument, known as the Science Beliefs Test, is a 47-item instrument that targets topics in chemistry, physics, biology, earth science, and astronomy. The use of an online data collection system…

  9. Computer-Based Technologies in Dentistry: Types and Applications

    Directory of Open Access Journals (Sweden)

    Rajaa Mahdi Musawi

    2016-10-01

    During dental education, dental students learn how to examine patients, make diagnoses, plan treatment, and perform dental procedures perfectly and efficiently. However, progress in computer-based technologies, including virtual reality (VR) simulators, augmented reality (AR), and computer-aided design/computer-aided manufacturing (CAD/CAM) systems, has resulted in new modalities for instruction and practice of dentistry. Virtual reality dental simulators enable repeated, objective, and assessable practice in various controlled situations. Superimposition of three-dimensional (3D) virtual images on actual images in AR allows surgeons to simultaneously visualize the surgical site and superimpose informative 3D images of invisible regions on the surgical site to serve as a guide. The use of CAD/CAM systems for designing and manufacturing dental appliances and prostheses has been well established. This article reviews computer-based technologies, their application in dentistry, and their potentials and limitations in promoting dental education, training, and practice. Practitioners will be able to choose from a broader spectrum of options in their field of practice by becoming familiar with new modalities of training and practice. Keywords: Virtual Reality Exposure Therapy; Immersion; Computer-Aided Design; Dentistry; Education

  10. Computer Based Asset Management System For Commercial Banks

    Directory of Open Access Journals (Sweden)

    Amanze

    2015-08-01

    The Computer-based Asset Management System is a web-based system that allows commercial banks to keep track of their assets. The main advantages of this system are effective asset management through record keeping and efficient retrieval of information. In this research, information was gathered to define the requirements of the new application and to examine how commercial banks manage their assets.

  11. Computer-Aided Test Flow in Core-Based Design

    OpenAIRE

    Zivkovic, V.; Tangelder, R.J.W.T.; Kerkhoff, Hans G.

    2000-01-01

    This paper addresses test-pattern generation and fault-coverage determination in core-based design. The basic core-test strategy that one has to apply in core-based design is stated in this work. A Computer-Aided Test (CAT) flow is proposed, resulting in accurate fault coverage of embedded cores. The CAT flow is applied to a few cores within the Philips Core Test Pilot IC project.

  12. Essential Means for Urban Computing: Specification of Web-Based Computing Platforms for Urban Planning, a Hitchhiker’s Guide

    OpenAIRE

    Pirouz Nourian; Carlos Martinez-Ortiz; Ken Arroyo Ohori

    2018-01-01

    This article provides an overview of the specifications of web-based computing platforms for urban data analytics and computational urban planning practice. There are currently a variety of tools and platforms that can be used in urban computing practices, including scientific computing languages, interactive web languages, data sharing platforms and still many desktop computing environments, e.g., GIS software applications. We have reviewed a list of technologies considering their potential ...

  13. Automated egg grading system using computer vision: Investigation on weight measure versus shape parameters

    Science.gov (United States)

    Nasir, Ahmad Fakhri Ab; Suhaila Sabarudin, Siti; Majeed, Anwar P. P. Abdul; Ghani, Ahmad Shahrizan Abdul

    2018-04-01

    Chicken eggs are a source of food in high demand by humans. Human operators cannot work perfectly and continuously when conducting egg grading. Instead of an egg grading system using a weight measure, an automatic system for egg grading using computer vision (using egg shape parameters) can be used to improve the productivity of egg grading. However, an early hypothesis indicated that the class assignments of more eggs would change when grading by shape parameters than by weight. This paper presents a comparison of egg classification by the two above-mentioned methods. Firstly, 120 images of chicken eggs of various grades (A–D) produced in Malaysia are captured. Then, the egg images are processed using image pre-processing techniques, such as image cropping, smoothing and segmentation. Thereafter, eight egg shape features, including area, major axis length, minor axis length, volume, diameter and perimeter, are extracted. Lastly, feature selection (information gain ratio) and feature extraction (principal component analysis) are performed with a k-nearest neighbour classifier in the classification process. Two methods, namely supervised learning (using the weight measure as graded by the egg supplier) and unsupervised learning (using egg shape parameters as graded by ourselves), are used in the experiment. Clustering results reveal many changes in egg classes after performing shape-based grading. On average, the best recognition result using shape-based grading labels is 94.16%, while that using weight-based labels is 44.17%. In conclusion, an automated egg grading system using computer vision is better implemented with shape-based features, since it works from images, whereas the weight parameter is more suitable for a weight-based grading system.
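
    A minimal sketch of the classification step, assuming the shape features have already been extracted; the feature values and grade labels below are made up, and the paper's information-gain feature selection and PCA steps are omitted.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

# Hypothetical training data: rows = eggs, columns = shape features
# (area, major axis, minor axis, volume) in arbitrary units.
X = np.array([[520, 58, 44, 61], [480, 55, 42, 54],
              [430, 52, 40, 47], [390, 49, 38, 41]], dtype=float)
y = np.array(["A", "B", "C", "D"])                 # grade labels

clf = KNeighborsClassifier(n_neighbors=1).fit(X, y)
print(clf.predict([[470.0, 54.0, 41.0, 52.0]]))    # -> ['B']
```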

  14. An Examination of Unsteady Airloads on a UH-60A Rotor: Computation Versus Measurement

    Science.gov (United States)

    Biedron, Robert T.; Lee-Rausch, Elizabeth

    2012-01-01

    An unsteady Reynolds-averaged Navier-Stokes solver for unstructured grids is used to simulate the flow over a UH-60A rotor. Traditionally, the computed pressure and shear stresses are integrated on the computational mesh at selected radial stations and compared to measured airloads. However, the corresponding integration of experimental data uses only the pressure contribution, and the set of integration points (pressure taps) is modest compared to the computational mesh resolution. This paper examines the difference between the traditional integration of computed airloads and an integration consistent with that used for the experimental data. In addition, a comparison of chordwise pressure distributions between computation and measurement is made. Examination of this unsteady pressure data provides new opportunities to understand differences between computation and flight measurement.

  15. Measurement-based reliability/performability models

    Science.gov (United States)

    Hsueh, Mei-Chen

    1987-01-01

    Measurement-based models based on real error-data collected on a multiprocessor system are described. Model development from the raw error-data to the estimation of cumulative reward is also described. A workload/reliability model is developed based on low-level error and resource usage data collected on an IBM 3081 system during its normal operation in order to evaluate the resource usage/error/recovery process in a large mainframe system. Thus, both normal and erroneous behavior of the system are modeled. The results provide an understanding of the different types of errors and recovery processes. The measured data show that the holding times in key operational and error states are not simple exponentials and that a semi-Markov process is necessary to model the system behavior. A sensitivity analysis is performed to investigate the significance of using a semi-Markov process, as opposed to a Markov process, to model the measured system.
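
    To make the semi-Markov point concrete: in a semi-Markov model the holding time in a state may follow any distribution, not just an exponential (the Markov special case). A toy two-state (up/error) availability simulation with Weibull holding times is sketched below; all parameters are illustrative, not values from the measured IBM 3081 data.

```python
import random

# Toy semi-Markov availability model: holding times are Weibull
# rather than exponential (shape == 1 recovers the Markov case).
random.seed(0)

def simulate(hours=10_000.0, up_shape=0.7, up_scale=120.0,
             err_shape=1.5, err_scale=0.5) -> float:
    t, up_time, state = 0.0, 0.0, "up"
    while t < hours:
        if state == "up":
            dwell = random.weibullvariate(up_scale, up_shape)
            up_time += dwell
            state = "error"
        else:
            dwell = random.weibullvariate(err_scale, err_shape)
            state = "up"
        t += dwell
    return up_time / t

print(f"estimated availability: {simulate():.4f}")
```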

  16. Quantum computing based on space states without charge transfer

    International Nuclear Information System (INIS)

    Vyurkov, V.; Filippov, S.; Gorelik, L.

    2010-01-01

    An implementation of a quantum computer based on space states in double quantum dots is discussed. There is no charge transfer in the qubits during a calculation; therefore, uncontrolled entanglement between qubits due to long-range Coulomb interaction is suppressed. Encoding and processing of quantum information are performed solely on symmetric and antisymmetric states of the electron in double quantum dots. Other plausible sources of decoherence, caused by interaction with phonons and gates, could be substantially suppressed in the structure as well. We also demonstrate how all necessary quantum logic operations, initialization, writing, and read-out could be carried out in the computer.

  17. A review of computer-based simulators for ultrasound training.

    Science.gov (United States)

    Blum, Tobias; Rieger, Andreas; Navab, Nassir; Friess, Helmut; Martignoni, Marc

    2013-04-01

    Computer-based simulators for ultrasound training are a topic of recent interest. During the last 15 years, many different systems and methods have been proposed. This article provides an overview and classification of systems in this domain and a discussion of their advantages. Systems are classified and discussed according to the image simulation method, user interactions and medical applications. Computer simulation of ultrasound has one key advantage over traditional training. It enables novel training concepts, for example, through advanced visualization, case databases, and automatically generated feedback. Qualitative evaluations have mainly shown positive learning effects. However, few quantitative evaluations have been performed and long-term effects have to be examined.

  18. Environmental sciences and computations: a modular data based systems approach

    International Nuclear Information System (INIS)

    Crawford, T.V.; Bailey, C.E.

    1975-07-01

    A major computer code for environmental calculations is under development at the Savannah River Laboratory. The primary aim is to develop a flexible, efficient capability to calculate, for all significant pathways, the dose to man resulting from releases of radionuclides from the Savannah River Plant and from other existing and potential radioactive sources in the southeastern United States. The environmental sciences programs at SRP are described, with emphasis on the development of the calculational system. It is being developed as a modular data-based system within the framework of the larger JOSHUA Computer System, which provides data management, terminal, and job execution facilities. (U.S.)

  19. An Interactive Computer-Based Circulation System: Design and Development

    Directory of Open Access Journals (Sweden)

    James S. Aagaard

    1972-03-01

    An on-line computer-based circulation control system has been installed at the Northwestern University library. Features of the system include self-service book charge, remote terminal inquiry and update, and automatic production of notices for call-ins and books available. Fine notices are also prepared daily and overdue notices weekly. Important considerations in the design of the system were to minimize costs of operation and to include technical services functions eventually. The system operates on a relatively small computer in a multiprogrammed mode.

  20. Connection machine: a computer architecture based on cellular automata

    Energy Technology Data Exchange (ETDEWEB)

    Hillis, W D

    1984-01-01

    This paper describes the connection machine, a programmable computer based on cellular automata. The essential idea behind the connection machine is that a regular locally-connected cellular array can be made to behave as if the processing cells are connected into any desired topology. When the topology of the machine is chosen to match the topology of the application program, the result is a fast, powerful computing engine. The connection machine was originally designed to implement knowledge retrieval operations in artificial intelligence programs, but the hardware and the programming techniques are apparently applicable to a much larger class of problems. A machine with 100000 processing cells is currently being constructed. 27 references.

  1. Component-based software for high-performance scientific computing

    Energy Technology Data Exchange (ETDEWEB)

    Alexeev, Yuri; Allan, Benjamin A; Armstrong, Robert C; Bernholdt, David E; Dahlgren, Tamara L; Gannon, Dennis; Janssen, Curtis L; Kenny, Joseph P; Krishnan, Manojkumar; Kohl, James A; Kumfert, Gary; McInnes, Lois Curfman; Nieplocha, Jarek; Parker, Steven G; Rasmussen, Craig; Windus, Theresa L

    2005-01-01

    Recent advances in both computational hardware and multidisciplinary science have given rise to an unprecedented level of complexity in scientific simulation software. This paper describes an ongoing grass roots effort aimed at addressing complexity in high-performance computing through the use of Component-Based Software Engineering (CBSE). Highlights of the benefits and accomplishments of the Common Component Architecture (CCA) Forum and SciDAC ISIC are given, followed by an illustrative example of how the CCA has been applied to drive scientific discovery in quantum chemistry. Thrusts for future research are also described briefly.

  2. Component-based software for high-performance scientific computing

    International Nuclear Information System (INIS)

    Alexeev, Yuri; Allan, Benjamin A; Armstrong, Robert C; Bernholdt, David E; Dahlgren, Tamara L; Gannon, Dennis; Janssen, Curtis L; Kenny, Joseph P; Krishnan, Manojkumar; Kohl, James A; Kumfert, Gary; McInnes, Lois Curfman; Nieplocha, Jarek; Parker, Steven G; Rasmussen, Craig; Windus, Theresa L

    2005-01-01

    Recent advances in both computational hardware and multidisciplinary science have given rise to an unprecedented level of complexity in scientific simulation software. This paper describes an ongoing grass roots effort aimed at addressing complexity in high-performance computing through the use of Component-Based Software Engineering (CBSE). Highlights of the benefits and accomplishments of the Common Component Architecture (CCA) Forum and SciDAC ISIC are given, followed by an illustrative example of how the CCA has been applied to drive scientific discovery in quantum chemistry. Thrusts for future research are also described briefly

  3. A Novel Wearable Forehead EOG Measurement System for Human Computer Interfaces.

    Science.gov (United States)

    Heo, Jeong; Yoon, Heenam; Park, Kwang Suk

    2017-06-23

    Amyotrophic lateral sclerosis (ALS) patients whose voluntary muscles are paralyzed commonly communicate with the outside world using eye movement. There have been many efforts to support this method of communication by tracking or detecting eye movement. An electrooculogram (EOG), an electro-physiological signal, is generated by eye movements and can be measured with electrodes placed around the eye. In this study, we proposed a new practical electrode position on the forehead to measure EOG signals, and we developed a wearable forehead EOG measurement system for use in Human Computer/Machine interfaces (HCIs/HMIs). Four electrodes, including the ground electrode, were placed on the forehead. The two channels were arranged vertically and horizontally, sharing a positive electrode. Additionally, a real-time eye movement classification algorithm was developed based on the characteristics of the forehead EOG. Three applications were employed to evaluate the proposed system: a virtual keyboard using a modified Bremen BCI speller and an automatic sequential row-column scanner, and a drivable power wheelchair. The mean typing speeds of the modified Bremen brain-computer interface (BCI) speller and automatic row-column scanner were 10.81 and 7.74 letters per minute, and the mean classification accuracies were 91.25% and 95.12%, respectively. In the power wheelchair demonstration, the user drove the wheelchair through an 8-shape course without collision with obstacles.
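
    The paper's real-time classification algorithm is not detailed in the abstract; as a rough sketch, a threshold rule on the two forehead channels (one vertical, one horizontal) can already separate four eye-movement directions. The amplitudes and the 50 uV threshold below are hypothetical.

```python
def classify_eog(horizontal_uv: float, vertical_uv: float,
                 threshold_uv: float = 50.0) -> str:
    """Crude direction classifier for a two-channel forehead EOG.
    The 50 uV threshold is an illustrative assumption."""
    if abs(horizontal_uv) >= abs(vertical_uv):
        if horizontal_uv > threshold_uv:
            return "right"
        if horizontal_uv < -threshold_uv:
            return "left"
    else:
        if vertical_uv > threshold_uv:
            return "up"
        if vertical_uv < -threshold_uv:
            return "down"
    return "rest"

print(classify_eog(horizontal_uv=-80.0, vertical_uv=12.0))  # -> "left"
```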

  4. Automated Detection of Heuristics and Biases among Pathologists in a Computer-Based System

    Science.gov (United States)

    Crowley, Rebecca S.; Legowski, Elizabeth; Medvedeva, Olga; Reitmeyer, Kayse; Tseytlin, Eugene; Castine, Melissa; Jukic, Drazen; Mello-Thoms, Claudia

    2013-01-01

    The purpose of this study is threefold: (1) to develop an automated, computer-based method to detect heuristics and biases as pathologists examine virtual slide cases, (2) to measure the frequency and distribution of heuristics and errors across three levels of training, and (3) to examine relationships of heuristics to biases, and biases to…

  5. The effect of dynamic workstations on the performance of various computer and office-based tasks

    NARCIS (Netherlands)

    Burford, E.M.; Botter, J.; Commissaris, D.; Könemann, R.; Hiemstra-Van Mastrigt, S.; Ellegast, R.P.

    2013-01-01

    The effect of different workstations, conventional and dynamic, on different types of performance measures for several different office and computer-based tasks was investigated in this research paper. The two dynamic workstations assessed were the Lifespan Treadmill Desk and the RightAngle

  6. Item Difficulty in the Evaluation of Computer-Based Instruction: An Example from Neuroanatomy

    Science.gov (United States)

    Chariker, Julia H.; Naaz, Farah; Pani, John R.

    2012-01-01

    This article reports large item effects in a study of computer-based learning of neuroanatomy. Outcome measures of the efficiency of learning, transfer of learning, and generalization of knowledge diverged by a wide margin across test items, with certain sets of items emerging as particularly difficult to master. In addition, the outcomes of…

  7. Computer-Assisted English Learning System Based on Free Conversation by Topic

    Science.gov (United States)

    Choi, Sung-Kwon; Kwon, Oh-Woog; Kim, Young-Kil

    2017-01-01

    This paper aims to describe a computer-assisted English learning system using chatbots and dialogue systems, which allow free conversation outside the topic without limiting the learner's flow of conversation. The evaluation was conducted by 20 experimenters. The performance of the system based on a free conversation by topic was measured by the…

  8. An USB-based time measurement system

    International Nuclear Information System (INIS)

    Qin Xi; Liu Shubin; An Qi

    2010-01-01

    In this paper, we report the electronics of a timing measurement system, the PTB (portable TDC board), a handy USB-based tool customized for high-precision time measurements without any crates. The time digitization is based on the High Performance TDC Chip (HPTDC). The real-time compensation for HPTDC outputs and the USB master logic are implemented in an Altera Cyclone FPGA. The architecture design and logic design are described in detail. Tests of the system showed a time resolution of 13.3 ps. (authors)

  9. Measurement of normal ocular volume by the use of computed ...

    African Journals Online (AJOL)

    2012-09-03

    ... ocular axial length measurements from which ocular volume can be calculated. ... The CT scans were performed in axial planes at a thickness of 3 mm and a ... dimension-assessing capability, it gives anatomic details ... have larger skeletal size and bone mass than females, despite comparable body size.

  10. Security Framework for Agent-Based Cloud Computing

    Directory of Open Access Journals (Sweden)

    K Venkateshwaran

    2015-06-01

    Agents can play a key role in bringing suitable cloud services to the customer based on their requirements. In agent-based cloud computing, agents perform negotiation, coordination, cooperation, and collaboration on behalf of the customer to make decisions in an efficient manner. However, agent-based cloud computing has some security issues, such as (a) the addition of a malicious agent to the cloud environment, which could demolish the process by attacking other agents; (b) denial of service through flooding attacks on other involved agents; and (c) misuse of exceptions in the agent interaction protocol, such as Not-Understood and Cancel_Meta, which may lead to terminating the connections of all the other agents participating in the negotiated services. This paper proposes algorithms to solve these issues and to ensure that there will be no intervention of malicious activities during agent interaction.

  11. Developing a personal computer based expert system for radionuclide identification

    International Nuclear Information System (INIS)

    Aarnio, P.A.; Hakulinen, T.T.

    1990-01-01

    Several expert system development tools are available for personal computers today. We have used one of the LISP-based high-end tools for nearly two years in developing an expert system for the identification of gamma sources. The system contains a radionuclide database of 2055 nuclides and 48000 gamma transitions, with a knowledge base of about sixty rules. This application combines a LISP-based inference engine with database management and relatively heavy numerical calculations performed using the C language. The most important feature needed has been the ability to use LISP and C together with the more advanced object-oriented features of the development tool. The main difficulties have been long response times and the large amount (10-16 MB) of computer memory required.

  12. Design Of Computer Based Test Using The Unified Modeling Language

    Science.gov (United States)

    Tedyyana, Agus; Danuri; Lidyawati

    2017-12-01

    Admission selection at Politeknik Negeri Bengkalis, through interest and talent search (PMDK), the joint admission test for state polytechnics (SB-UMPN), and the independent route (UM-Polbeng), was conducted using paper-based tests (PBT). The paper-based test model has several weaknesses: it wastes paper, questions can leak to the public, and test results can be manipulated. This research aimed to create a computer-based test (CBT) model using the Unified Modeling Language (UML), consisting of use case, activity, and sequence diagrams. During the design of the application, attention must be paid to protecting the test questions through encryption and decryption before they are displayed; the RSA cryptography algorithm was used for this purpose. The questions drawn from the question bank were then randomized using the Fisher-Yates shuffle method. The network architecture used in the computer-based test application was a client-server model over a local area network (LAN). The result of the design was the computer-based test application for admission selection at Politeknik Negeri Bengkalis.
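
    The Fisher-Yates shuffle named in the abstract is a standard algorithm; a straightforward Python version for drawing a randomized question order from a bank is sketched below. The question IDs are placeholders.

```python
import random

def fisher_yates(items: list) -> list:
    """In-place Fisher-Yates shuffle: every permutation is equally likely."""
    for i in range(len(items) - 1, 0, -1):
        j = random.randint(0, i)          # uniform index in [0, i]
        items[i], items[j] = items[j], items[i]
    return items

question_ids = list(range(1, 11))         # placeholder question bank IDs
print(fisher_yates(question_ids))         # e.g. [3, 7, 1, 10, ...]
```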

  13. Toward Measuring Network Aesthetics Based on Symmetry

    Directory of Open Access Journals (Sweden)

    Zengqiang Chen

    2017-05-01

    In this exploratory paper, we discuss quantitative graph-theoretical measures of network aesthetics. Related work in this area has typically focused on geometrical features (e.g., line crossings or edge bendiness) of drawings or visual representations of graphs, which purportedly affect an observer's perception. Here we take a very different approach, abandoning reliance on geometrical properties, and apply information-theoretic measures to abstract graphs and networks directly (rather than to their visual representations) as a means of capturing classical appreciation of structural symmetry. Examples are used solely to motivate the approach to measurement, and to elucidate our symmetry-based mathematical theory of network aesthetics.

  14. Template based parallel checkpointing in a massively parallel computer system

    Science.gov (United States)

    Archer, Charles Jens [Rochester, MN]; Inglett, Todd Alan [Rochester, MN]

    2009-01-13

    A method and apparatus for a template based parallel checkpoint save for a massively parallel super computer system using a parallel variation of the rsync protocol, and network broadcast. In preferred embodiments, the checkpoint data for each node is compared to a template checkpoint file that resides in the storage and that was previously produced. Embodiments herein greatly decrease the amount of data that must be transmitted and stored for faster checkpointing and increased efficiency of the computer system. Embodiments are directed to a parallel computer system with nodes arranged in a cluster with a high speed interconnect that can perform broadcast communication. The checkpoint contains a set of actual small data blocks with their corresponding checksums from all nodes in the system. The data blocks may be compressed using conventional non-lossy data compression algorithms to further reduce the overall checkpoint size.
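
    A hedged sketch of the core idea, block checksums compared against a template as in rsync; this is an illustration of the general technique, not the patented protocol. Only blocks whose checksum differs from the template need to be stored.

```python
import hashlib

BLOCK = 4096  # bytes per block; illustrative choice

def block_checksums(data: bytes) -> list:
    """Checksum each fixed-size block of a byte string."""
    return [hashlib.md5(data[i:i + BLOCK]).digest()
            for i in range(0, len(data), BLOCK)]

def delta_checkpoint(node_state: bytes, template_sums: list) -> dict:
    """Return only the blocks that differ from the template checkpoint."""
    delta = {}
    for idx, csum in enumerate(block_checksums(node_state)):
        if idx >= len(template_sums) or csum != template_sums[idx]:
            delta[idx] = node_state[idx * BLOCK:(idx + 1) * BLOCK]
    return delta

template = b"\x00" * (4 * BLOCK)
state = bytearray(template)
state[BLOCK:BLOCK + 4] = b"live"                  # one block changed
print(sorted(delta_checkpoint(bytes(state), block_checksums(template))))
# -> [1]: only the modified block needs to be stored
```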

  15. Cluster-state quantum computing enhanced by high-fidelity generalized measurements.

    Science.gov (United States)

    Biggerstaff, D N; Kaltenbaek, R; Hamel, D R; Weihs, G; Rudolph, T; Resch, K J

    2009-12-11

    We introduce and implement a technique to extend the quantum computational power of cluster states by replacing some projective measurements with generalized quantum measurements (POVMs). As an experimental demonstration we fully realize an arbitrary three-qubit cluster computation by implementing a tunable linear-optical POVM, as well as fast active feedforward, on a two-qubit photonic cluster state. Over 206 different computations, the average output fidelity is 0.9832+/-0.0002; furthermore the error contribution from our POVM device and feedforward is only of order 10^(-3), less than some recent thresholds for fault-tolerant cluster computing.

  16. Computer Based Test for Admission Selection at Politeknik Negeri Bengkalis

    Directory of Open Access Journals (Sweden)

    Agus Tedyyana

    2017-11-01

    Abstract: The selection of new student candidates can be carried out with a Computer Based Test (CBT) application. The methods used include data collection techniques, system analysis, design modeling, implementation, and testing. This research produced a CBT application in which the questions drawn from the question bank are randomized using the Fisher-Yates shuffle method so that the same question does not appear twice. To secure the question information while connected to the network, a message encoding technique is required so that each question passes through an encryption and decryption process before being displayed; the RSA cryptography algorithm is used for this purpose. The software design method uses the waterfall model, the database design uses entity-relationship diagrams, the interface design uses Hypertext Markup Language (HTML), Cascading Style Sheets (CSS), and jQuery, and the system is implemented as a web application using the PHP programming language and a MySQL database. The network architecture used by the Computer Based Test application is a client-server model over a Local Area Network (LAN). Keywords: Computer Based Test, Fisher-Yates Shuffle, Cryptography, Local Area Network. Abstract: Selection of new student candidates can be done with a Computer Based Test (CBT) application. The methods used include data collection techniques, system analysis, design modeling, implementation and testing. This study produces a CBT application where the questions drawn from the question bank through a randomization process will not bring up the same question, using the Fisher-Yates shuffle method. In the process of securing question information when connected to the network, techniques are necessary for encoding the messages so that each question passes through a process of encryption and decryption of data first; the RSA cryptography algorithm is used. Software design method using waterfall model, database design

  17. Accuracy of magnetic resonance based susceptibility measurements

    Science.gov (United States)

    Erdevig, Hannah E.; Russek, Stephen E.; Carnicka, Slavka; Stupic, Karl F.; Keenan, Kathryn E.

    2017-05-01

    Magnetic Resonance Imaging (MRI) is increasingly used to map the magnetic susceptibility of tissue to identify cerebral microbleeds associated with traumatic brain injury and pathological iron deposits associated with neurodegenerative diseases such as Parkinson's and Alzheimer's disease. Accurate measurements of susceptibility are important for determining oxygen and iron content in blood vessels and brain tissue for use in noninvasive clinical diagnosis and treatment assessments. Induced magnetic fields with amplitude on the order of 100 nT can be detected using MRI phase images. The induced field distributions can then be inverted to obtain quantitative susceptibility maps. The focus of this research was to determine the accuracy of MRI-based susceptibility measurements using simple phantom geometries and to compare the susceptibility measurements with magnetometry measurements where SI-traceable standards are available. The susceptibilities of paramagnetic salt solutions in cylindrical containers were measured as a function of orientation relative to the static MRI field. The observed induced fields as a function of orientation of the cylinder were in good agreement with simple models. The MRI susceptibility measurements were compared with SQUID magnetometry using NIST-traceable standards. MRI can accurately measure relative magnetic susceptibilities, while SQUID magnetometry measures absolute magnetic susceptibility. Given the accuracy of moment measurements of tissue-mimicking samples, and the need to look at small differences in tissue properties, the use of existing NIST standard reference materials to calibrate MRI reference structures is problematic, and better reference materials are required.

  18. Statistical x-ray computed tomography imaging from photon-starved measurements

    Science.gov (United States)

    Chang, Zhiqian; Zhang, Ruoqiao; Thibault, Jean-Baptiste; Sauer, Ken; Bouman, Charles

    2013-03-01

    Dose reduction in clinical X-ray computed tomography (CT) causes a low signal-to-noise ratio (SNR) in photon-sparse situations. Statistical iterative reconstruction algorithms have the advantage of retaining image quality while reducing input dosage, but they meet their limits of practicality when significant portions of the sinogram approach photon starvation. Corruption by electronic noise leads to measured photon counts taking on negative values, posing a problem for the log() operation in the preprocessing of data. In this paper, we propose two categories of projection correction methods: an adaptive denoising filter and Bayesian inference. The denoising filter is easy to implement and preserves local statistics, but it introduces correlation between channels and may affect image resolution. Bayesian inference is a point-wise estimation based on measurements and prior information. Both approaches help improve diagnostic image quality at dramatically reduced dosage.
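
    For context, the preprocessing problem is that the log transform p = -log(I/I0) is undefined for non-positive measured counts. A simplistic guard is sketched below purely as an illustration; the paper's adaptive filter and Bayesian estimators are more sophisticated, and the floor value here is an arbitrary choice.

```python
import numpy as np

def safe_line_integrals(counts, i0, floor=0.5):
    """Convert photon counts to line integrals p = -log(I/I0).
    Non-positive counts (corrupted by electronic noise) are clamped to
    a small positive floor; the floor is an illustrative choice, not
    the paper's adaptive-filter or Bayesian correction."""
    counts = np.maximum(np.asarray(counts, dtype=float), floor)
    return -np.log(counts / i0)

print(safe_line_integrals([1200.0, 3.0, -2.0], i0=1e4))
# the third channel would be undefined without the clamp
```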

  19. Computer Generated Hologram System for Wavefront Measurement System Calibration

    Science.gov (United States)

    Olczak, Gene

    2011-01-01

    Computer Generated Holograms (CGHs) have been used for some time to calibrate interferometers that require nulling optics. A typical scenario is the testing of aspheric surfaces with an interferometer placed near the paraxial center of curvature. Existing CGH technology suffers from a reduced capacity to calibrate middle and high spatial frequencies. The root cause of this shortcoming is as follows: the CGH is not placed at an image conjugate of the asphere due to limitations imposed by the geometry of the test and the allowable size of the CGH. This innovation provides a calibration system where the imaging properties in calibration can be made comparable to the test configuration. Thus, if the test is designed to have good imaging properties, then middle and high spatial frequency errors in the test system can be well calibrated. The improved imaging properties are provided by a rudimentary auxiliary optic as part of the calibration system. The auxiliary optic is simple to characterize and align to the CGH. Use of the auxiliary optic also reduces the size of the CGH required for calibration and the density of the lines required for the CGH. The resulting CGH is less expensive than the existing technology and has reduced write error and alignment error sensitivities. This CGH system is suitable for any kind of calibration using an interferometer when high spatial resolution is required. It is especially well suited for tests that include segmented optical components or large apertures.

  20. Detection of Mild Emphysema by Computed Tomography Density Measurements

    International Nuclear Information System (INIS)

    Vikgren, J.; Friman, O.; Borga, M.; Boijsen, M.; Gustavsson, S.; Bake, B.; Tylen, U.; Ekberg-Jansson, A.

    2005-01-01

    Purpose: To assess the ability of a conventional density mask method to detect mild emphysema by high-resolution computed tomography (HRCT); to analyze factors influencing quantification of mild emphysema; and to validate a new algorithm for detection of mild emphysema. Material and Methods: Fifty-five healthy male smokers and 34 never-smokers, 61-62 years of age, were examined. Emphysema was evaluated visually, by the conventional density mask method, and by a new algorithm compensating for the effects of gravity and artifacts due to motion and the reconstruction algorithm. Effects of the reconstruction algorithm, slice thickness, and various threshold levels on the outcome of the density mask area were evaluated. Results: Forty-nine percent of the smokers had mild emphysema. The density mask area was higher the thinner the slice irrespective of the reconstruction algorithm and threshold level. The sharp algorithm resulted in increased density mask area. The new reconstruction algorithm could discriminate between smokers with and those without mild emphysema, whereas the density mask method could not. The diagnostic ability of the new algorithm was dependent on lung level. At about 90% specificity, sensitivity was 65-100% in the apical levels, but low in the rest of the lung. Conclusion: The conventional density mask method is inadequate for detecting mild emphysema, while the new algorithm improves the diagnostic ability but is nevertheless still imperfect
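
    The density mask itself is a simple thresholding operation; a sketch follows, assuming the lung voxels have already been segmented. The -950 HU threshold is one commonly used value, given as an example; the study's exact thresholds are not restated here.

```python
import numpy as np

def density_mask_percent(lung_hu: np.ndarray,
                         threshold_hu: float = -950.0) -> float:
    """Percent of lung voxels below the emphysema threshold (density mask).
    -950 HU is a commonly used threshold, used here as an example."""
    lung_hu = np.asarray(lung_hu, dtype=float)
    return 100.0 * np.count_nonzero(lung_hu < threshold_hu) / lung_hu.size

# Hypothetical HU values for a handful of lung voxels:
print(density_mask_percent(np.array([-980.0, -870.0, -955.0, -600.0])))
# -> 50.0 (two of four voxels below -950 HU)
```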

  1. Depth Measurement Based on Infrared Coded Structured Light

    Directory of Open Access Journals (Sweden)

    Tong Jia

    2014-01-01

    Depth measurement is a challenging problem in computer vision research. In this study, we first design a new grid pattern and develop a sequence coding and decoding algorithm to process the pattern. Second, we propose a linear fitting algorithm to derive the linear relationship between object depth and pixel shift. Third, we obtain depth information on an object based on this linear relationship. Moreover, 3D reconstruction is implemented based on the Delaunay triangulation algorithm. Finally, we utilize the regularity of the error curves to correct the systematic errors and improve the measurement accuracy. The experimental results show that the accuracy of depth measurement is related to the step length of the moving object.
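
    The linear depth-versus-pixel-shift relationship can be recovered with an ordinary least-squares fit; a minimal sketch with made-up calibration points follows.

```python
import numpy as np

# Hypothetical calibration: measured pixel shifts at known depths (cm).
shift_px = np.array([4.0, 8.1, 11.9, 16.2, 20.0])
depth_cm = np.array([50.0, 60.0, 70.0, 80.0, 90.0])

slope, intercept = np.polyfit(shift_px, depth_cm, deg=1)  # linear fit
print(f"depth ~ {slope:.2f} * shift + {intercept:.2f}")

def depth_from_shift(shift: float) -> float:
    """Estimate depth for an observed pixel shift via the fitted line."""
    return slope * shift + intercept

print(depth_from_shift(10.0))   # interpolated depth for a 10-px shift
```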

  2. Visual Peoplemeter: A Vision-based Television Audience Measurement System

    Directory of Open Access Journals (Sweden)

    SKELIN, A. K.

    2014-11-01

    The visual peoplemeter is a vision-based measurement system that objectively evaluates attentive behavior for TV audience rating, thus offering a solution to some of the drawbacks of current manual-logging peoplemeters. In this paper, some limitations of current audience measurement systems are reviewed, and a novel vision-based system aimed at passive metering of viewers is prototyped. The system uses a camera mounted on a television as a sensing modality and applies advanced computer vision algorithms to detect and track a person, and to recognize attentional states. The feasibility of the system is evaluated on a secondary dataset. The results show that the proposed system can analyze a viewer's attentive behavior, therefore enabling passive estimates of relevant audience measurement categories.

  3. Parallel processing using an optical delay-based reservoir computer

    Science.gov (United States)

    Van der Sande, Guy; Nguimdo, Romain Modeste; Verschaffelt, Guy

    2016-04-01

    Delay systems subject to delayed optical feedback have recently shown great potential in solving computationally hard tasks. By implementing a neuro-inspired computational scheme relying on the transient response to optical data injection, high processing speeds have been demonstrated. However, reservoir computing systems based on delay dynamics discussed in the literature are designed by coupling many different stand-alone components, which leads to bulky, non-monolithic systems that lack long-term stability. Here we numerically investigate the possibility of implementing reservoir computing schemes based on semiconductor ring lasers (SRLs). Semiconductor ring lasers are semiconductor lasers where the laser cavity consists of a ring-shaped waveguide. SRLs are highly integrable and scalable, making them ideal candidates for key components in photonic integrated circuits. SRLs can generate light in two counterpropagating directions, between which bistability has been demonstrated. We demonstrate that two independent machine learning tasks, even with input data signals of a different nature, can be computed simultaneously using a single photonic nonlinear node, relying on the parallelism offered by photonics. We illustrate the performance on simultaneous chaotic time series prediction and classification for nonlinear channel equalization. We take advantage of the different directional modes to process the individual tasks: each directional mode processes one task, to mitigate possible crosstalk between the tasks. Our results indicate that prediction/classification with errors comparable to state-of-the-art performance can be obtained, even with noise, despite the two tasks being computed simultaneously. We also find that good performance is obtained for both tasks over a broad range of parameters. The results are discussed in detail in [Nguimdo et al., IEEE Trans. Neural Netw. Learn. Syst. 26, pp. 3301-3307, 2015].

  4. Evaluation of the Relationship between Literacy and Mathematics Skills as Assessed by Curriculum-Based Measures

    Science.gov (United States)

    Rutherford-Becker, Kristy J.; Vanderwood, Michael L.

    2009-01-01

    The purpose of this study was to evaluate the extent that reading performance (as measured by curriculum-based measures [CBM] of oral reading fluency [ORF] and Maze reading comprehension), is related to math performance (as measured by CBM math computation and applied math). Additionally, this study examined which of the two reading measures was a…

  5. Development of a computer-based pulsed NMR thermometer

    International Nuclear Information System (INIS)

    Hobeika, Alexandre; Haard, T.M.; Hoskinson, E.M.; Packard, R.E.

    2003-01-01

    We have designed a fully computer-controlled pulsed NMR system, using the National Instruments PCI-6115 data acquisition board. We use it for millikelvin thermometry and have developed a special control program, written in LabVIEW, for this purpose. It can perform measurements of temperature via the susceptibility or the τ1 dependence. This system requires little hardware, which makes it very versatile, easily reproducible and customizable

  6. Trend of computer-based console for nuclear power plants

    International Nuclear Information System (INIS)

    Wajima, Tsunetaka; Serizawa, Michiya

    1975-01-01

    The amount of information to be watched by operators in the central operation room increased with the capacity of nuclear power generation plants, and the necessity of computer-based consoles, in which the information is compiled and the rationalization of the interface between the operators and the plants is achieved by introducing CRT displays and process computers, came to be recognized. The integrated monitoring and controlling system is explained briefly, taking Dungeness B Nuclear Power Station in Britain as a typical example. This power station comprises two AGRs, and these two plants can be controlled from one central control room, each by a single operator. Three computers, including a stand-by one, are installed. Each computer has a core memory of 16 K words (24 bits/word), and 4 magnetic drums of 256 K words are installed as external memory. The peripheral equipment comprises 12 CRT displays, 6 typewriters, and a high-speed tape reader and tape punch for each plant. The display and recording of plant data, the analysis, display, and recording of alarms, the control of the plants including the reactors, and post-incident recording are assigned to the computers. At Hitachi Ltd. in Japan, the introduction of color CRTs and the development of operating consoles, new data-access methods, and consoles for maintenance management are in progress. (Kako, I.)

  7. Measurement of maxillary sinus volume using Computed Tomography

    International Nuclear Information System (INIS)

    Park, Chang Hee; Kim, Kee Deog; Park, Chang Seo

    2000-01-01

    To propose a standard value for the maxillary sinus volume of a normal Korean adult by measuring the width and height of the sinus and analyzing their correlation and the differences in sinus size between sexes and between the right and left sides. Fifty-two patients (95 maxillary sinuses) aged 20 years or over, who had undergone CT in the Department of Dental Radiology, Yonsei University Dental Hospital, between February 1997 and July 1999, and who had no specific symptoms, prominent bony septa, pathosis, clinical asymmetry, or history of surgery in the maxillary sinus, were retrospectively analyzed. The mean transverse width, antero-posterior width, height, and volume of the normal Korean adult's maxillary sinuses were 28.33 mm, 39.69 mm, 46.60 mm, and 21.90 cm³, respectively. There was a significant sex difference in sinus volume (p<0.05). In the mean antero-posterior width, height, and volume of the sinus, no significant difference was observed between the two sides. All four measurements showed a significant correlation between the two sides (p<0.0001). The widths and height of the sinus all showed a significant correlation with sinus volume (p<0.0001). In normal Korean adults, the maxillary sinuses of males tended to be larger than those of females. Except for the transverse width, none of the measurements showed a significant difference between the right and left sides, but significant correlations between the two sides were observed in all four measurements. Thus, overgrowth or undergrowth of a unilateral maxillary sinus may suggest a pathosis or developmental abnormality in the maxillary sinus.

  8. Measurement of breast tissue composition with dual energy cone-beam computed tomography: A postmortem study

    Energy Technology Data Exchange (ETDEWEB)

    Ding Huanjun; Ducote, Justin L.; Molloi, Sabee [Department of Radiological Sciences, University of California, Irvine, California 92697 (United States)]

    2013-06-15

    Purpose: To investigate the feasibility of a three-material compositional measurement of water, lipid, and protein content of breast tissue with dual kVp cone-beam computed tomography (CT) for diagnostic purposes. Methods: Simulations were performed on a flat panel-based computed tomography system with a dual kVp technique in order to guide the selection of experimental acquisition parameters. The expected errors induced by using the proposed calibration materials were also estimated by simulation. Twenty pairs of postmortem breast samples were imaged with a flat-panel based dual kVp cone-beam CT system, followed by image-based material decomposition using calibration data obtained from a three-material phantom consisting of water, vegetable oil, and polyoxymethylene plastic. The tissue samples were then chemically decomposed into their respective water, lipid, and protein contents after imaging to allow direct comparison with data from dual energy decomposition. Results: Guided by results from simulation, the beam energies for the dual kVp cone-beam CT system were selected to be 50 and 120 kVp with the mean glandular dose divided equally between each exposure. The simulation also suggested that the use of polyoxymethylene as the calibration material for the measurement of pure protein may introduce an error of -11.0%. However, the tissue decomposition experiments, which employed a calibration phantom made out of water, oil, and polyoxymethylene, exhibited strong correlation with data from the chemical analysis. The average root-mean-square percentage error for water, lipid, and protein contents was 3.58% as compared with chemical analysis. Conclusions: The results of this study suggest that the water, lipid, and protein contents can be accurately measured using dual kVp cone-beam CT. The tissue compositional information may improve the sensitivity and specificity for breast cancer diagnosis.

  9. Computer-animated stimuli to measure motion sensitivity: constraints on signal design in the Jacky dragon.

    Science.gov (United States)

    Woo, Kevin L; Rieucau, Guillaume; Burke, Darren

    2017-02-01

    Identifying perceptual thresholds is critical for understanding the mechanisms that underlie signal evolution. Using computer-animated stimuli, we examined visual speed sensitivity in the Jacky dragon Amphibolurus muricatus, a species that makes extensive use of rapid motor patterns in social communication. First, focal lizards were tested in discrimination trials using random-dot kinematograms displaying combinations of speed, coherence, and direction. Second, we measured subject lizards' ability to predict the appearance of a secondary reinforcer (one of three computer-generated invertebrate animations: cricket, spider, or mite) from the direction of movement of a field of drifting dots, scoring a set of behavioural responses (e.g., orienting response, latency to respond) to our virtual stimuli. We found an effect of both speed and coherence, as well as an interaction between these two factors, on the perception of moving stimuli. Overall, our results showed that Jacky dragons have acute sensitivity to high speeds. We then employed an optic-flow analysis to match this performance to ecologically relevant motion. Our results suggest that the Jacky dragon visual system may have been shaped to detect fast motion. This pre-existing sensitivity may have constrained the evolution of conspecific displays. In contrast, Jacky dragons may have difficulty detecting the movement of ambush predators, such as snakes, and of some invertebrate prey. Our study also demonstrates the potential of computer-animated stimuli for conducting non-intrusive tests of motion range and sensitivity in a visually mediated species.
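
    A random-dot kinematogram of the kind used in these discrimination trials can be sketched as a per-frame dot update: a fraction of dots steps in the signal direction, the rest move randomly. Dot count, field size, and speeds below are illustrative, not the study's parameters:

        # One RDK frame update: `coherence` of the dots move in `direction`
        # at `speed` (pixels/frame); the remainder take random headings.
        import numpy as np

        def step_dots(xy, coherence=0.5, speed=5.0, direction=0.0, rng=None):
            rng = rng or np.random.default_rng()
            n = len(xy)
            signal = rng.random(n) < coherence              # coherently moving dots
            angles = np.where(signal, direction,
                              rng.uniform(0, 2 * np.pi, n)) # noise dots: random headings
            xy = xy + speed * np.c_[np.cos(angles), np.sin(angles)]
            return xy % 800                                 # wrap within an 800-px field

        dots = np.random.default_rng(1).uniform(0, 800, (100, 2))
        dots = step_dots(dots, coherence=0.8, direction=np.pi / 2)  # 80% coherent, upward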

  10. A reliable and valid questionnaire was developed to measure computer vision syndrome at the workplace.

    Science.gov (United States)

    Seguí, María del Mar; Cabrero-García, Julio; Crespo, Ana; Verdú, José; Ronda, Elena

    2015-06-01

    To design and validate a questionnaire to measure visual symptoms related to exposure to computers in the workplace. Our computer vision syndrome questionnaire (CVS-Q) was based on a literature review and validated through discussion with experts and performance of a pretest, pilot test, and retest. Content validity was evaluated by occupational health, optometry, and ophthalmology experts. Rasch analysis was used in the psychometric evaluation of the questionnaire. Criterion validity was determined by calculating the sensitivity and specificity, the receiver operating characteristic curve, and the cut-off point. Test-retest repeatability was tested using the intraclass correlation coefficient (ICC) and concordance by Cohen's kappa (κ). The CVS-Q was developed with wide consensus among experts and was well accepted by the target group. It assesses the frequency and intensity of 16 symptoms using a single rating scale (symptom severity) that fits the Rasch rating scale model well. The questionnaire has sensitivity and specificity over 70% and achieved good test-retest repeatability both for the scores obtained [ICC = 0.802; 95% confidence interval (CI): 0.673, 0.884] and for CVS classification (κ = 0.612; 95% CI: 0.384, 0.839). The CVS-Q has acceptable psychometric properties, making it a valid and reliable tool to monitor the visual health of computer workers, and can potentially be used in clinical trials and outcome research. Copyright © 2015 Elsevier Inc. All rights reserved.
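
    The criterion-validity and concordance statistics reported here (sensitivity, specificity, test-retest kappa) can be computed along the following lines; the labels are made up for illustration, since the CVS-Q data are not reproduced in this record:

        # Sensitivity/specificity against a clinical reference, plus
        # test-retest concordance by Cohen's kappa, on toy labels.
        import numpy as np
        from sklearn.metrics import cohen_kappa_score, confusion_matrix

        truth  = np.array([1, 1, 0, 0, 1, 0, 1, 0, 0, 1])  # clinical reference
        screen = np.array([1, 0, 0, 0, 1, 0, 1, 1, 0, 1])  # CVS-Q classification

        tn, fp, fn, tp = confusion_matrix(truth, screen).ravel()
        print("sensitivity:", tp / (tp + fn))
        print("specificity:", tn / (tn + fp))

        retest = np.array([1, 0, 0, 0, 1, 1, 1, 1, 0, 1])  # second administration
        print("test-retest kappa:", cohen_kappa_score(screen, retest))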

  11. Micro-computer based control system and its software for carrying out the sequential acceleration on SMCAMS

    International Nuclear Information System (INIS)

    Li Deming

    2001-01-01

    A microcomputer-based control system, and its software, for carrying out sequential acceleration on SMCAMS is described, along with the establishment of the 14C particle measuring device and the improvement of the original power supply system.

  12. Application of mobile computers in a measuring system supporting examination of posture diseases

    Science.gov (United States)

    Piekarski, Jacek; Klimiec, Ewa; Zaraska, Wiesław

    2013-07-01

    A measuring system designed and manufactured by the authors, based on mobile computers (smartphones and tablets) working as data recorders, supports the diagnosis of orthopaedic diseases, especially those of the feet. The basic idea is to examine a patient in his natural environment, during usual activities such as walking or running. The paper describes the proposed system, whose sensors are manufactured from piezoelectric (PVDF) film and placed in the shoe insole. The mechanical reliability of PVDF film is excellent, though elimination of the pyroelectric effect is required; a possible solution to this problem and the test results are presented in the paper. Data recording is based on wireless transmission to a mobile device used as a data logger.
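
    One generic way to suppress the slow pyroelectric drift of a PVDF insole signal while preserving gait-frequency content is digital high-pass filtering. This is a plausible sketch only, not necessarily the compensation the authors implemented; the sampling rate and cut-off are assumptions:

        # High-pass filtering removes the slow thermal (pyroelectric) drift
        # while keeping the ~1-2 Hz walking-cadence force signal.
        import numpy as np
        from scipy.signal import butter, filtfilt

        fs = 200.0                                          # assumed sampling rate, Hz
        b, a = butter(2, 0.3 / (fs / 2), btype="highpass")  # 0.3 Hz cut-off

        t = np.arange(0, 10, 1 / fs)
        drift = 0.5 * t                              # slow pyroelectric drift
        steps = np.sin(2 * np.pi * 1.5 * t)          # ~1.5 Hz walking cadence
        clean = filtfilt(b, a, drift + steps)        # drift removed, steps preserved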

  13. Replacement of traditional lectures with computer-based tutorials: a case study

    Directory of Open Access Journals (Sweden)

    Derek Lavelle

    1996-12-01

    Full Text Available This paper reports on a pilot project with a group of 60 second-year undergraduates studying the use of standard forms of contract in the construction industry. The project entailed the replacement of two of a series of nine scheduled lectures with a computer-based tutorial. The two main aims of the project were to test the viability of converting existing lecture material into computer-based material on an in-house production basis, and to obtain feedback from the student cohort on their behavioural response to the change in media. The effect on student performance was not measured at this stage of development.

  14. Obstetrical ultrasound data-base management system by using personal computer

    International Nuclear Information System (INIS)

    Jeon, Hae Jeong; Park, Jeong Hee; Kim, Soo Nyung

    1993-01-01

    A computer program that performs obstetric calculations using data from ultrasonography, written in the Clipper language, was developed for personal computers. It was designed for fast assessment of fetal development and prediction of gestational age and weight from ultrasonographic measurements, including biparietal diameter, femur length, gestational sac, occipito-frontal diameter, and abdominal diameter. The Obstetrical-Ultrasound Data-Base Management System was tested for its performance and proved very useful in patient management, offering convenient data filing, easy retrieval of previous reports, prompt yet accurate estimation of fetal growth and skeletal anomaly, and production of equations and growth curves for pregnant women.
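
    The core of such obstetric calculations is a regression formula mapping biometric measurements to an estimate. The sketch below shows the typical log-linear shape of such formulas; the coefficients are deliberately illustrative placeholders, not validated clinical values:

        # Estimated fetal weight as a log-linear function of ultrasound biometry.
        # Coefficients are illustrative placeholders, NOT validated clinical values.
        def estimated_fetal_weight(bpd_cm, fl_cm, ac_cm,
                                   a=1.00, b=0.040, c=0.120, d=0.035):
            """Illustrative estimate in grams: log10(W) = a + b*AC + c*FL + d*BPD."""
            return 10 ** (a + b * ac_cm + c * fl_cm + d * bpd_cm)

        print(f"{estimated_fetal_weight(bpd_cm=9.0, fl_cm=7.0, ac_cm=33.0):.0f} g")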

  15. The physics of teams: Interdependence, measurable entropy and computational emotion

    Science.gov (United States)

    Lawless, William F.

    2017-08-01

    …quantum-like models capture some of the essential aspects of interdependence, a tool for the metrics of hybrid teams; as an example, we find additional support for our model of the solution to the open problem of team size. We also report on progress with the theory of computational emotion for hybrid teams, linking it qualitatively to the second law of thermodynamics. We conclude that the science of interdependence advances the science of hybrid teams.

  16. The Physics of Teams: Interdependence, Measurable Entropy, and Computational Emotion

    Directory of Open Access Journals (Sweden)

    William F. Lawless

    2017-08-01

    …how our quantum-like models capture some of the essential aspects of interdependence, a tool for the metrics of hybrid teams; as an example, we find additional support for our model of the solution to the open problem of team size. We also report on progress with the theory of computational emotion for hybrid teams, linking it qualitatively to the second law of thermodynamics. We conclude that the science of interdependence advances the science of hybrid teams.

  17. Industrial application of a graphics computer-based training system

    International Nuclear Information System (INIS)

    Klemm, R.W.

    1985-01-01

    Graphics Computer Based Training (GCBT) roles include drilling, tutoring, simulation, and problem solving. Of these, Commonwealth Edison uses mainly tutoring, simulation, and problem solving. These roles are not separate in any particular program; they are integrated to provide tutoring and part-task simulation, part-task simulation and problem solving, or problem-solving tutoring. Commonwealth's Graphics Computer Based Training program was the result of over a year's worth of research and planning. The keys to the program are its flexibility and control. Flexibility is maintained through stand-alone units capable of program authoring and modification for plant/site-specific users. Yet the system has the capability to support up to 31 terminals with a 40 MB hard disk drive. Control of the GCBT program is accomplished through the establishment of development priorities and a central development facility (Commonwealth Edison's Production Training Center).

  18. A Reputation-Based Identity Management Model for Cloud Computing

    Directory of Open Access Journals (Sweden)

    Lifa Wu

    2015-01-01

    Full Text Available In the field of cloud computing, most research on identity management has concentrated on protecting user data. However, users typically leave a trail when they access cloud services, and the resulting user traceability can potentially lead to the leakage of sensitive user information. Meanwhile, malicious users can harm cloud providers by hiding behind pseudonyms. To solve these problems, we introduce a reputation mechanism and design a reputation-based identity management model for cloud computing. In the model, pseudonyms are generated from a reputation signature so as to guarantee their untraceability, and a mechanism that calculates user reputation is proposed, which helps cloud service providers identify malicious users. Analysis verifies that the model ensures that users access cloud services anonymously and that cloud providers can assess the credibility of users effectively without violating user privacy.
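
    As a toy illustration of the general idea (emphatically not the paper's construction), a pseudonym can be derived from a keyed tag over the user's reputation plus a fresh nonce, so that sessions stay unlinkable while the provider can still vouch for reputation; every name in this sketch is hypothetical:

        # Toy pseudonym derivation: a fresh nonce per session makes pseudonyms
        # unlinkable, while the provider's key binds them to a reputation value.
        import hmac, hashlib, os

        provider_key = os.urandom(32)                # held by the identity provider

        def issue_pseudonym(user_id: str, reputation: int) -> tuple[str, str]:
            nonce = os.urandom(16).hex()             # fresh per session -> unlinkable
            tag = hmac.new(provider_key,
                           f"{user_id}|{reputation}|{nonce}".encode(),
                           hashlib.sha256).hexdigest()
            return f"pseudo-{tag[:16]}", nonce       # pseudonym, plus nonce for audit

        print(issue_pseudonym("alice", reputation=87))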

  19. Security personnel training using a computer-based game

    International Nuclear Information System (INIS)

    Ralph, J.; Bickner, L.

    1987-01-01

    Security personnel training is an integral part of a total physical security program, and is essential in enabling security personnel to perform their function effectively. Several training tools are currently available for use by security supervisors, including: textbook study, classroom instruction, and live simulations. However, due to shortcomings inherent in each of these tools, a need exists for the development of low-cost alternative training methods. This paper discusses one such alternative: a computer-based, game-type security training system. This system would be based on a personal computer with high-resolution graphics. Key features of this system include: a high degree of realism; flexibility in use and maintenance; high trainee motivation; and low cost

  20. Could one make a diamond-based quantum computer?

    International Nuclear Information System (INIS)

    Stoneham, A Marshall; Harker, A H; Morley, Gavin W

    2009-01-01

    We assess routes to a diamond-based quantum computer, where we specifically look towards scalable devices with at least 10 linked quantum gates. Such a computer should satisfy the DiVincenzo criteria and might be used at convenient temperatures. The specific examples that we examine are based on the optical control of electron spins. For some such devices, nuclear spins give additional advantages. Since there have already been demonstrations of basic initialization and readout, our emphasis is on routes to two-qubit quantum gate operations and the linking of perhaps 10-20 such gates. We analyse the dopant properties necessary, especially for centres containing N and P, and give results using simple scoping calculations for the key interactions determining gate performance. Our conclusions are cautiously optimistic: it may be possible to develop a useful quantum information processor that works above cryogenic temperatures.

  1. Package for the BESM-6 computer for particles momenta measuring in nuclei emulsions by semiautomatic microscope

    International Nuclear Information System (INIS)

    Leskin, V.A.; Saltykov, A.I.; Shabratova, G.S.

    1980-01-01

    Computer codes to be run on the BESM-6 computer have been developed. The information obtained by semiautomatic measurement in nuclear emulsions is processed; the information from paper tape is then checked, and diagnostics are printed if errors are found. Data input to the BESM-6 computer is written to magnetic tape as direct-access files. Data free of errors are used to calculate particle momenta by the multiple-scattering method.

  2. A computational approach to measuring the correlation between expertise and social media influence for celebrities on microblogs

    OpenAIRE

    Zhao, Wayne Xin; Liu, Jing; He, Yulan; Lin, Chin Yew; Wen, Ji-Rong

    2016-01-01

    Social media influence analysis, sometimes also called authority detection, aims to rank users based on their influence scores in social media. Existing approaches to social influence analysis usually focus on how to develop effective algorithms to quantify users' influence scores. They rarely consider a person's expertise levels, which are arguably important to influence measures. In this paper, we propose a computational approach to measuring the correlation between expertise and social media influence...

  3. A novel method to measure femoral component migration by computed tomography: a cadaver study.

    Science.gov (United States)

    Boettner, Friedrich; Sculco, Peter; Lipman, Joseph; Renner, Lisa; Faschingbauer, Martin

    2016-06-01

    Radiostereometric analysis (RSA) is the most accurate technique to measure implant migration. However, it requires special equipment, technical expertise, and analysis software, and has not gained wide acceptance. The current paper analyzes a novel method to measure implant migration utilizing widely available computed tomography (CT). Three uncemented total hip replacements were performed in three human cadavers, and six tantalum beads were inserted into the femoral bone, similar to RSA. Six different 28 mm heads (-3, 0, 2.5, 5.0, 7.5 and 10 mm) were added to simulate five reproducible translations (maximum total point migration) of the center of the head. Implant migration was measured in a 3-D analysis software (Geomagic Studio 7). Repeat manual reconstructions of the center of the head were performed by two investigators to determine repeatability and accuracy. The accuracy of measurements between the centers of two head sizes was 0.11 mm with a 95% CI of 0.22 mm. The intra-observer repeatability was 0.13 mm (95% CI 0.25 mm). The inter-rater reliability was 0.943. CT-based measurements of head displacement in a cadaver model were highly accurate and reproducible.
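
    The geometric core of the method is reconstructing the femoral head centre at two time points and reporting the distance between centres as migration. In the sketch below a least-squares sphere fit stands in for the manual reconstruction; all data are synthetic:

        # Fit a sphere to points sampled on the femoral head at two time points
        # and report the distance between fitted centres as migration.
        import numpy as np

        def fit_sphere_center(pts):
            """Linear least-squares sphere fit; returns the centre (x0, y0, z0)."""
            A = np.c_[2 * pts, np.ones(len(pts))]   # |p|^2 = 2 c.p + (r^2 - |c|^2)
            b = (pts ** 2).sum(axis=1)
            sol, *_ = np.linalg.lstsq(A, b, rcond=None)
            return sol[:3]

        rng = np.random.default_rng(2)
        u = rng.normal(size=(200, 3))
        surface = 14.0 * u / np.linalg.norm(u, axis=1, keepdims=True)  # 28 mm head

        c0 = fit_sphere_center(surface)
        c1 = fit_sphere_center(surface + np.array([0.0, 2.5, 0.0]))  # simulated 2.5 mm shift
        print("migration (mm):", np.linalg.norm(c1 - c0))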

  4. Screening for cognitive impairment in older individuals. Validation study of a computer-based test.

    Science.gov (United States)

    Green, R C; Green, J; Harrison, J M; Kutner, M H

    1994-08-01

    This study examined the validity of a computer-based cognitive test that was recently designed to screen the elderly for cognitive impairment. Criterion-related validity was examined by comparing test scores of impaired patients and normal control subjects. Construct-related validity was computed through correlations between computer-based subtests and related conventional neuropsychological subtests. The setting was a university center for memory disorders. Subjects were 52 patients with mild cognitive impairment by strict clinical criteria and 50 unimpaired, age- and education-matched control subjects. Control subjects were rigorously screened by neurological, neuropsychological, imaging, and electrophysiological criteria to identify and exclude individuals with occult abnormalities. Using a cut-off total score of 126, this computer-based instrument had a sensitivity of 0.83 and a specificity of 0.96. Using a prevalence estimate of 10%, the positive and negative predictive values were 0.70 and 0.96, respectively. Computer-based subtests correlated significantly with conventional neuropsychological tests measuring similar cognitive domains. Thirteen (17.8%) of 73 volunteers with normal medical histories were excluded from the control group because of unsuspected abnormalities on standard neuropsychological tests, electroencephalograms, or magnetic resonance imaging scans. Computer-based testing is a valid screening methodology for the detection of mild cognitive impairment in the elderly, although this particular test has important limitations. Broader applications of computer-based testing will require extensive population-based validation. Future studies should recognize that normal control subjects without a history of disease who are typically used in validation studies may have a high incidence of unsuspected abnormalities on neurodiagnostic studies.
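
    The predictive values follow from the reported sensitivity, specificity, and the assumed 10% prevalence via Bayes' rule; the quick check below reproduces the positive predictive value of 0.70 (the negative predictive value computed this way comes out nearer 0.98 than the quoted 0.96, depending on rounding):

        # Predictive values from sensitivity, specificity, and prevalence.
        def predictive_values(sens, spec, prev):
            ppv = sens * prev / (sens * prev + (1 - spec) * (1 - prev))
            npv = spec * (1 - prev) / (spec * (1 - prev) + (1 - sens) * prev)
            return ppv, npv

        ppv, npv = predictive_values(sens=0.83, spec=0.96, prev=0.10)
        print(f"PPV = {ppv:.2f}, NPV = {npv:.2f}")   # ~0.70 and ~0.98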

  5. MCPLOTS: a particle physics resource based on volunteer computing

    CERN Document Server

    Karneyeu, A; Prestel, S; Skands, P Z

    2014-01-01

    The mcplots.cern.ch web site (MCPLOTS) provides a simple online repository of plots made with high-energy-physics event generators, comparing them to a wide variety of experimental data. The repository is based on the HEPDATA online database of experimental results and on the RIVET Monte Carlo analysis tool. The repository is continually updated and relies on computing power donated by volunteers, via the LHC@HOME platform.

  6. Computer based approach to fatigue analysis and design

    International Nuclear Information System (INIS)

    Comstock, T.R.; Bernard, T.; Nieb, J.

    1979-01-01

    An approach is presented which uses a mini-computer-based system for data acquisition, analysis, and graphic display in fatigue-life estimation and design. Procedures are developed for identifying and eliminating damaging events due to the overall duty cycle, forced vibration, and structural dynamic characteristics. Two case histories, weld failures in heavy vehicles and low-cycle fan blade failures, are discussed to illustrate the overall approach. (orig.)

  7. A scalable PC-based parallel computer for lattice QCD

    International Nuclear Information System (INIS)

    Fodor, Z.; Katz, S.D.; Pappa, G.

    2003-01-01

    A PC-based parallel computer for medium/large scale lattice QCD simulations is suggested. The Eötvös University, Institute for Theoretical Physics cluster consists of 137 Intel P4-1.7 GHz nodes. Gigabit Ethernet cards are used for nearest-neighbour communication in a two-dimensional mesh. The sustained performance for dynamical staggered (Wilson) quarks on large lattices is around 70 (110) GFlops. The exceptional price/performance ratio is below $1/Mflop.
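
    For context, "nearest-neighbour communication in a two-dimensional mesh" means each node exchanges lattice boundary data with four neighbours. The rank arithmetic for a periodic mesh looks like the sketch below; the mesh dimensions are illustrative, not the cluster's actual layout:

        # Neighbour ranks on a periodic Px x Py process mesh: each node talks
        # to its left/right/up/down neighbours when exchanging lattice boundaries.
        def mesh_neighbors(rank, px, py):
            x, y = rank % px, rank // px
            return {
                "left":  (x - 1) % px + y * px,
                "right": (x + 1) % px + y * px,
                "down":  x + ((y - 1) % py) * px,
                "up":    x + ((y + 1) % py) * px,
            }

        print(mesh_neighbors(rank=0, px=16, py=8))   # e.g. a 128-node 16x8 mesh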

  8. A scalable PC-based parallel computer for lattice QCD

    International Nuclear Information System (INIS)

    Fodor, Z.; Papp, G.

    2002-09-01

    A PC-based parallel computer for medium/large scale lattice QCD simulations is suggested. The Eötvös University, Institute for Theoretical Physics cluster consists of 137 Intel P4-1.7 GHz nodes. Gigabit Ethernet cards are used for nearest-neighbour communication in a two-dimensional mesh. The sustained performance for dynamical staggered (Wilson) quarks on large lattices is around 70 (110) GFlops. The exceptional price/performance ratio is below $1/Mflop. (orig.)

  9. Arcade: A Web-Java Based Framework for Distributed Computing

    Science.gov (United States)

    Chen, Zhikai; Maly, Kurt; Mehrotra, Piyush; Zubair, Mohammad; Bushnell, Dennis M. (Technical Monitor)

    2000-01-01

    Distributed heterogeneous environments are being increasingly used to execute a variety of large size simulations and computational problems. We are developing Arcade, a web-based environment to design, execute, monitor, and control distributed applications. These targeted applications consist of independent heterogeneous modules which can be executed on a distributed heterogeneous environment. In this paper we describe the overall design of the system and discuss the prototype implementation of the core functionalities required to support such a framework.

  10. +Cloud: An Agent-Based Cloud Computing Platform

    OpenAIRE

    González, Roberto; Hernández de la Iglesia, Daniel; de la Prieta Pintado, Fernando; Gil González, Ana Belén

    2017-01-01

    Cloud computing is revolutionizing the services provided through the Internet, and is continually adapting itself in order to maintain the quality of its services. This study presents the platform +Cloud, which proposes a cloud environment for storing information and files by following the cloud paradigm. This study also presents Warehouse 3.0, a cloud-based application that has been developed to validate the services provided by +Cloud.

  11. MCPLOTS. A particle physics resource based on volunteer computing

    Energy Technology Data Exchange (ETDEWEB)

    Karneyeu, A. [Joint Inst. for Nuclear Research, Moscow (Russian Federation); Mijovic, L. [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); Irfu/SPP, CEA-Saclay, Gif-sur-Yvette (France); Prestel, S. [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); Lund Univ. (Sweden). Dept. of Astronomy and Theoretical Physics; Skands, P.Z. [European Organization for Nuclear Research (CERN), Geneva (Switzerland)

    2013-07-15

    The mcplots.cern.ch web site (MCPLOTS) provides a simple online repository of plots made with high-energy-physics event generators, comparing them to a wide variety of experimental data. The repository is based on the HEPDATA online database of experimental results and on the RIVET Monte Carlo analysis tool. The repository is continually updated and relies on computing power donated by volunteers, via the LHC@HOME 2.0 platform.

  12. MCPLOTS: a particle physics resource based on volunteer computing

    International Nuclear Information System (INIS)

    Karneyeu, A.; Mijovic, L.; Prestel, S.; Skands, P.Z.

    2014-01-01

    The mcplots.cern.ch web site (MCPLOTS) provides a simple online repository of plots made with high-energy-physics event generators, comparing them to a wide variety of experimental data. The repository is based on the HEPDATA online database of experimental results and on the RIVET Monte Carlo analysis tool. The repository is continually updated and relies on computing power donated by volunteers, via the LHC@HOME 2.0 platform. (orig.)

  13. Gamma spectrometric system based on the personal computer Pravetz-83

    International Nuclear Information System (INIS)

    Yanakiev, K; Grigorov, T.; Vuchkov, M.

    1985-01-01

    A gamma spectrometric system based on a Pravets personal microcomputer is described. The analog modules follow the NIM standard. ADC data are stored in the computer memory via a DMA channel, and real-time data processing is possible. Results from a series of tests indicate that the performance of the system is comparable with that of commercially available computerized spectrometers from Ortec and Canberra.

  14. ARGOS-NT: A computer based emergency management system

    International Nuclear Information System (INIS)

    Hoe, S.; Thykier-Nielsen, S.; Steffensen, L.B.

    2000-01-01

    In case of a nuclear accident or a threat of a release, the Danish Emergency Management Agency is responsible for actions to minimize the consequences in Danish territory. To provide an overview of the situation, a computer-based system called ARGOS-NT was developed in 1993/94. This paper gives an overview of the system, with emphasis on its prognostic part. An example calculation shows the importance of correct landscape modelling. (author)

  15. MCPLOTS. A particle physics resource based on volunteer computing

    International Nuclear Information System (INIS)

    Karneyeu, A.; Mijovic, L.; Prestel, S.

    2013-07-01

    The mcplots.cern.ch web site (MCPLOTS) provides a simple online repository of plots made with high-energy-physics event generators, comparing them to a wide variety of experimental data. The repository is based on the HEPDATA online database of experimental results and on the RIVET Monte Carlo analysis tool. The repository is continually updated and relies on computing power donated by volunteers, via the LHC rate at HOME 2.0 platform.

  16. USE OF ONTOLOGIES FOR KNOWLEDGE BASES CREATION TUTORING COMPUTER SYSTEMS

    OpenAIRE

    Cheremisina Lyubov

    2014-01-01

    This paper deals with the use of ontologies in the development of intelligent tutoring systems. We consider the shortcomings of educational software and distance-learning systems and the advantages of using ontologies in their design; the creation of educational computer systems based on systematic knowledge is a topical problem. We consider the classification, properties, use, and benefits of ontologies, and characterize approaches to the problem of ontology mapping, the first of which – manual mapping, the s...

  17. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing Readiness Challenges (CCRC’08), the CMS computing team achieved very good results. The link between the detector site and the Tier-0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier-0 by processing a massive number of very large files and writing to tape at high speed.  Other tests covered the links between the different tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier-1s: response time, data transfer rate, and success rate for tape-to-buffer staging of files kept exclusively on tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations laid the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  18. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  19. Teaching advance care planning to medical students with a computer-based decision aid.

    Science.gov (United States)

    Green, Michael J; Levi, Benjamin H

    2011-03-01

    Discussing end-of-life decisions with cancer patients is a crucial skill for physicians. This article reports findings from a pilot study evaluating the effectiveness of a computer-based decision aid for teaching medical students about advance care planning. Second-year medical students at a single medical school were randomized to use a standard advance directive or a computer-based decision aid to help patients with advance care planning. Students' knowledge, skills, and satisfaction were measured by self-report; their performance was rated by patients. 121/133 (91%) of students participated. The Decision-Aid Group (n = 60) outperformed the Standard Group (n = 61) in terms of students' knowledge, satisfaction with their learning experience, and patient-rated student performance. Use of a computer-based decision aid may be an effective way to teach medical students how to discuss advance care planning with cancer patients.

  20. Analysing Test-Takers’ Views on a Computer-Based Speaking Test

    Directory of Open Access Journals (Sweden)

    Marian Amengual-Pizarro

    2017-11-01

    Full Text Available This study examines test-takers’ views on a computer-delivered speaking test in order to investigate the aspects they consider most relevant in technology-based oral assessment, and to explore the main advantages and disadvantages computer-based tests may offer as compared to face-to-face speaking tests. A small-scale open questionnaire was administered to 80 test-takers who took the APTIS speaking test at the Universidad de Alcalá in April 2016. Results reveal that examinees believe computer-based tests provide a valid measure of oral competence in English and are considered to be an adequate method for the assessment of speaking. Interestingly, the data suggest that personal characteristics of test-takers seem to play a key role in deciding upon the most suitable and reliable delivery mode.