WorldWideScience

Sample records for computing spatial impulse responses

  1. Moral Responsibility and Computer Technology.

    Science.gov (United States)

    Friedman, Batya

    Noting a recent increase in the number of cases of computer crime and computer piracy, this paper takes up the question, "How can understanding the social context of computing help us--as parents, educators, and members of government and industry--to educate young people to become morally responsible members of an electronic information…

  2. A rigorous computational approach to linear response

    Science.gov (United States)

    Bahsoun, Wael; Galatolo, Stefano; Nisoli, Isaia; Niu, Xiaolong

    2018-03-01

    We present a general setting in which the formula describing the linear response of the physical measure of a perturbed system can be obtained. In this general setting we obtain an algorithm to rigorously compute the linear response. We apply our results to expanding circle maps. In particular, we present examples where we compute, up to a pre-specified error in the L∞-norm, the response of expanding circle maps under stochastic and deterministic perturbations. Moreover, we present an example where we compute, up to a pre-specified error in the L1-norm, the response of the intermittent family at the boundary; i.e. when the unperturbed system is the doubling map. This work was mainly conducted during a visit of SG to Loughborough University. WB and SG would like to thank The Leverhulme Trust for supporting mutual research visits through the Network Grant IN-2014-021. SG thanks the Department of Mathematical Sciences at Loughborough University for hospitality. WB thanks Dipartimento di Matematica, Università di Pisa. The research of SG and IN is partially supported by EU Marie-Curie IRSES ‘Brazilian-European partnership in Dynamical Systems’ (FP7-PEOPLE-2012-IRSES 318999 BREUDS). IN was partially supported by CNPq and FAPERJ. IN would like to thank the Department of Mathematics at Uppsala University and the support of the KAW grant 2013.0315.
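
    For readers who want to experiment with the idea (without the validated error bounds that are the record's point), a rough sketch of a linear response estimate for the doubling map can be built from Ulam discretizations of the transfer operator. The grid size, quadrature points, perturbation, and finite-difference step below are arbitrary choices of ours, not the authors' algorithm, and carry no rigor.

```python
# Non-rigorous illustration: linear response of the doubling map
# T(x) = 2x mod 1 under T_eps(x) = 2x + eps*sin(2*pi*x) mod 1,
# estimated by finite-differencing Ulam approximations of the
# invariant density. The paper computes this rigorously; this doesn't.
import numpy as np

def ulam_matrix(T, n_bins=400, samples_per_bin=200):
    """Ulam estimate of the transfer operator: P[i, j] ~ P(bin j -> bin i)."""
    P = np.zeros((n_bins, n_bins))
    offs = (np.arange(samples_per_bin) + 0.5) / samples_per_bin
    for j in range(n_bins):
        x = (j + offs) / n_bins                  # deterministic points in bin j
        idx = np.minimum((T(x) * n_bins).astype(int), n_bins - 1)
        np.add.at(P, (idx, j), 1.0 / samples_per_bin)
    return P

def invariant_density(P):
    """Leading eigenvector of the (column-stochastic) Ulam matrix."""
    w, v = np.linalg.eig(P)
    h = np.real(v[:, np.argmax(np.real(w))])
    return h / h.mean()                          # normalize as a density

eps = 1e-2
T0 = lambda x: (2 * x) % 1.0
T1 = lambda x: (2 * x + eps * np.sin(2 * np.pi * x)) % 1.0

response = (invariant_density(ulam_matrix(T1))
            - invariant_density(ulam_matrix(T0))) / eps
print("max |estimated response of the density|:", np.abs(response).max())
```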

  3. Ethical Responsibility Key to Computer Security.

    Science.gov (United States)

    Lynn, M. Stuart

    1989-01-01

    The pervasiveness of powerful computers and computer networks has raised the specter of new forms of abuse and of concomitant ethical issues. Blurred boundaries, hackers, the Computer Worm, ethical issues, and implications for academic institutions are discussed. (MLW)

  4. Response time accuracy in Apple Macintosh computers.

    Science.gov (United States)

    Neath, Ian; Earle, Avery; Hallett, Darcy; Surprenant, Aimée M

    2011-06-01

    The accuracy and variability of response times (RTs) collected on stock Apple Macintosh computers using USB keyboards were assessed. A photodiode detected a change in the screen's luminosity and triggered a solenoid that pressed a key on the keyboard. The RTs collected in this way were reliable, but could be as much as 100 ms too long. The standard deviation of the measured RTs varied between 2.5 and 10 ms, and the distributions approximated a normal distribution. Surprisingly, two recent Apple-branded USB keyboards differed in their accuracy by as much as 20 ms. The most accurate RTs were collected when an external CRT was used to display the stimuli and Psychtoolbox was able to synchronize presentation with the screen refresh. We conclude that RTs collected on stock iMacs can detect a difference as small as 5-10 ms under realistic conditions, and this dictates which types of research should or should not use these systems.
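
    The closing claim, that a 5-10 ms effect is detectable, is at heart a statistical power question. Below is a minimal simulation sketch under assumed values: the trial counts and the 400 ms mean are invented, human RT variability (here ~50 ms SD) is a typical textbook magnitude that dominates the 2.5-10 ms apparatus SDs reported above.

```python
# Rough power simulation (assumptions, not from the paper): how often is a
# 10 ms mean RT difference detected at alpha = 0.05 when human variability
# dominates the apparatus timing noise?
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_trials, n_sims, alpha = 200, 2000, 0.05
human_sd, timer_sd, effect_ms = 50.0, 10.0, 10.0
sd = np.hypot(human_sd, timer_sd)   # independent noise adds in quadrature

hits = 0
for _ in range(n_sims):
    a = rng.normal(400.0, sd, n_trials)               # condition A
    b = rng.normal(400.0 + effect_ms, sd, n_trials)   # condition B, +10 ms
    if stats.ttest_ind(a, b).pvalue < alpha:
        hits += 1
print(f"power ~ {hits / n_sims:.2f}")
```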

  5. Computer Security Incident Response Planning at Nuclear Facilities

    International Nuclear Information System (INIS)

    2016-06-01

    The purpose of this publication is to assist Member States in developing comprehensive contingency plans for computer security incidents with the potential to impact nuclear security and/or nuclear safety. It provides an outline and recommendations for establishing a computer security incident response capability as part of a computer security programme, and considers the roles and responsibilities of the system owner, operator, competent authority, and national technical authority in responding to a computer security incident with possible nuclear security repercussions.

  6. Ethics and computing living responsibly in a computerized world

    CERN Document Server

    2001-01-01

    "Ethics and Computing, Second Edition promotes awareness of major issues and accepted procedures and policies in the area of ethics and computing using real-world companies, incidents, products and people." "Ethics and Computing, Second Edition is for topical undergraduate courses with chapters and assignments designed to encourage critical thinking and informed ethical decisions. Furthermore, this book will keep abreast computer science, computer engineering, and information systems professionals and their colleagues of current ethical issues and responsibilities."--Jacket.

  7. Prerequisites for building a computer security incident response capability

    CSIR Research Space (South Africa)

    Mooi, M

    2015-08-01

    There are a number of considerations before one can commence with establishing a Computer Security Incident Response Team (CSIRT). This paper presents the results of a structured literature review investigating the business requirements...

  8. Computer security incident response team effectiveness : A needs assessment

    NARCIS (Netherlands)

    Kleij, R. van der; Kleinhuis, G.; Young, H.J.

    2017-01-01

    Computer security incident response teams (CSIRTs) respond to a computer security incident when the need arises. Failure of these teams can have far-reaching effects for the economy and national security. CSIRTs often have to work on an ad-hoc basis, in close cooperation with other teams, and in time-constrained environments.

  9. Response Surface Model Building Using Orthogonal Arrays for Computer Experiments

    Science.gov (United States)

    Unal, Resit; Braun, Robert D.; Moore, Arlene A.; Lepsch, Roger A.

    1997-01-01

    This study investigates response surface methods for computer experiments and discusses some of the approaches available. Orthogonal arrays constructed for computer experiments are studied and an example application to a technology selection and optimization study for a reusable launch vehicle is presented.
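
    The second half of that pipeline, fitting a quadratic response surface to observations taken at design points, can be sketched generically. The 3-level factorial design and synthetic response below are stand-ins of ours, not the paper's orthogonal arrays or launch-vehicle data.

```python
# Fit a quadratic response surface
#   y ~ b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
# to responses observed at design points (here a 3-level full factorial
# standing in for the orthogonal arrays discussed in the record).
import numpy as np

levels = [-1.0, 0.0, 1.0]
X = np.array([(a, b) for a in levels for b in levels])   # 9 design points
y = (5 + 2*X[:, 0] - X[:, 1] + 0.5*X[:, 0]*X[:, 1] - 1.5*X[:, 0]**2
     + np.random.default_rng(1).normal(0, 0.05, len(X)))  # synthetic responses

# Quadratic model matrix, solved by least squares.
A = np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1],
                     X[:, 0]**2, X[:, 1]**2, X[:, 0]*X[:, 1]])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print("fitted coefficients:", np.round(coef, 3))
```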

  10. Computer incident response and forensics team management conducting a successful incident response

    CERN Document Server

    Johnson, Leighton

    2013-01-01

    Computer Incident Response and Forensics Team Management provides security professionals with a complete handbook of computer incident response from the perspective of forensics team management. This unique approach teaches readers the concepts and principles they need to conduct a successful incident response investigation, ensuring that proven policies and procedures are established and followed by all team members. Leighton R. Johnson III describes the processes within an incident response event and shows the crucial importance of skillful forensics team management, including when and where the transition to forensics investigation should occur during an incident response event. The book also provides discussions of key incident response components. Provides readers with a complete handbook on computer incident response from the perspective of forensics team management Identifies the key steps to completing a successful computer incident response investigation Defines the qualities necessary to become a succ...

  11. Computational method for discovery of estrogen responsive genes

    DEFF Research Database (Denmark)

    Tang, Suisheng; Tan, Sin Lam; Ramadoss, Suresh Kumar

    2004-01-01

    of human genes are functionally well characterized. It is still unclear how many and which human genes respond to estrogen treatment. We propose a simple, economic, yet effective computational method to predict a subclass of estrogen responsive genes. Our method relies on the similarity of ERE frames...

  12. Computational methods for coupling microstructural and micromechanical materials response simulations

    Energy Technology Data Exchange (ETDEWEB)

    HOLM,ELIZABETH A.; BATTAILE,CORBETT C.; BUCHHEIT,THOMAS E.; FANG,HUEI ELIOT; RINTOUL,MARK DANIEL; VEDULA,VENKATA R.; GLASS,S. JILL; KNOROVSKY,GERALD A.; NEILSEN,MICHAEL K.; WELLMAN,GERALD W.; SULSKY,DEBORAH; SHEN,YU-LIN; SCHREYER,H. BUCK

    2000-04-01

    Computational materials simulations have traditionally focused on individual phenomena: grain growth, crack propagation, plastic flow, etc. However, real materials behavior results from a complex interplay between phenomena. In this project, the authors explored methods for coupling mesoscale simulations of microstructural evolution and micromechanical response. In one case, massively parallel (MP) simulations for grain evolution and microcracking in alumina stronglink materials were dynamically coupled. In the other, codes for domain coarsening and plastic deformation in CuSi braze alloys were iteratively linked. This program provided the first comparison of two promising ways to integrate mesoscale computer codes. Coupled microstructural/micromechanical codes were applied to experimentally observed microstructures for the first time. In addition to the coupled codes, this project developed a suite of new computational capabilities (PARGRAIN, GLAD, OOF, MPM, polycrystal plasticity, front tracking). The problem of plasticity length scale in continuum calculations was recognized and a solution strategy was developed. The simulations were experimentally validated on stockpile materials.

  13. Computations of nuclear response functions with MACK-IV

    Energy Technology Data Exchange (ETDEWEB)

    Abdou, M A; Gohar, Y

    1978-01-01

    The MACK computer program calculates energy pointwise and multigroup nuclear response functions from basic nuclear data in ENDF/B format. The new version of the program, MACK-IV, incorporates major developments and improvements aimed at maximizing the utilization of available nuclear data and ensuring energy conservation in nuclear heating calculations. A new library, MACKLIB-IV, of nuclear response functions was generated in the CTR energy group structure of 171 neutron groups and 36 gamma groups. The library was prepared using MACK-IV and ENDF/B-IV and is suitable for fusion, fusion-fission hybrids, and fission applications.
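
    Schematically, a multigroup response function is a flux-weighted average of pointwise data over each group interval, R_g = ∫_g R(E) φ(E) dE / ∫_g φ(E) dE. A toy collapse under an assumed 1/E weighting spectrum follows; the response function, spectrum, and group bounds are illustrative only, not MACK-IV's actual processing.

```python
# Collapse a pointwise response function R(E) to multigroup values
# R_g = int_g R(E) phi(E) dE / int_g phi(E) dE with a 1/E weighting
# spectrum. All data here are invented placeholders.
import numpy as np

E = np.logspace(-5, 7.2, 4000)            # energy grid, eV
R = 1.0 / np.sqrt(E) + 0.1                # toy response function
phi = 1.0 / E                             # assumed weighting spectrum

bounds = np.logspace(-5, 7.2, 172)        # 171 groups, as in MACKLIB-IV
R_g = np.empty(len(bounds) - 1)
for g in range(len(R_g)):
    m = (E >= bounds[g]) & (E < bounds[g + 1])
    R_g[g] = np.trapz(R[m] * phi[m], E[m]) / np.trapz(phi[m], E[m])
print("first five group values:", np.round(R_g[:5], 4))
```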

  14. Computations of nuclear response functions with MACK-IV

    International Nuclear Information System (INIS)

    Abdou, M.A.; Gohar, Y.

    1978-01-01

    The MACK computer program calculates energy pointwise and multigroup nuclear response functions from basic nuclear data in ENDF/B format. The new version of the program, MACK-IV, incorporates major developments and improvements aimed at maximizing the utilization of available nuclear data and ensuring energy conservation in nuclear heating calculations. A new library, MACKLIB-IV, of nuclear response functions was generated in the CTR energy group structure of 171 neutron groups and 36 gamma groups. The library was prepared using MACK-IV and ENDF/B-IV and is suitable for fusion, fusion-fission hybrids, and fission applications.

  15. Computer model of cardiovascular control system responses to exercise

    Science.gov (United States)

    Croston, R. C.; Rummel, J. A.; Kay, F. J.

    1973-01-01

    Approaches of systems analysis and mathematical modeling together with computer simulation techniques are applied to the cardiovascular system in order to simulate dynamic responses of the system to a range of exercise work loads. A block diagram of the circulatory model is presented, taking into account arterial segments, venous segments, arterio-venous circulation branches, and the heart. A cardiovascular control system model is also discussed together with model test results.

  16. Therapy response evaluation with positron emission tomography-computed tomography.

    Science.gov (United States)

    Segall, George M

    2010-12-01

    Positron emission tomography-computed tomography with F-18-fluorodeoxyglucose is widely used for evaluation of therapy response in patients with solid tumors but has not been as readily adopted in clinical trials because of the variability of acquisition and processing protocols and the absence of universal response criteria. Criteria proposed for clinical trials are difficult to apply in clinical practice, and gestalt impression is probably accurate in individual patients, especially with respect to the presence of progressive disease and complete response. Semiquantitative methods of determining tissue glucose metabolism, such as standard uptake value, can be a useful descriptor for levels of tissue glucose metabolism and changes in response to therapy if technical quality control measures are carefully maintained. The terms partial response, complete response, and progressive disease are best used in clinical trials in which the terms have specific meanings and precise definitions. In clinical practice, it may be better to use descriptive terminology agreed upon by imaging physicians and clinicians in their own practice. Copyright © 2010. Published by Elsevier Inc.
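
    The standardized uptake value mentioned above is a simple normalization. Here is a minimal helper under the common body-weight convention (one of several conventions in use; the record does not specify which):

```python
def suv_bw(activity_bq_per_ml: float, injected_dose_bq: float,
           body_weight_g: float) -> float:
    """Body-weight SUV: tissue activity concentration divided by injected
    dose per gram of body weight. Assumes the activity is decay-corrected
    to injection time and that 1 g of tissue occupies ~1 ml."""
    return activity_bq_per_ml / (injected_dose_bq / body_weight_g)

# Example: 8 kBq/ml lesion, 370 MBq injected, 70 kg patient -> SUV ~ 1.5
print(round(suv_bw(8_000.0, 370e6, 70_000.0), 2))
```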

  17. Computer Security Incident Response Team Effectiveness: A Needs Assessment

    Directory of Open Access Journals (Sweden)

    Rick Van der Kleij

    2017-12-01

    Computer security incident response teams (CSIRTs) respond to a computer security incident when the need arises. Failure of these teams can have far-reaching effects for the economy and national security. CSIRTs often have to work on an ad hoc basis, in close cooperation with other teams, and in time-constrained environments. It could be argued that under these working conditions CSIRTs would be likely to encounter problems. A needs assessment was done to see to what extent this argument holds true. We constructed an incident response needs model to assist in identifying areas that require improvement. We envisioned a model consisting of four assessment categories: Organization, Team, Individual and Instrumental. Central to this is the idea that both problems and needs can have an organizational, team, individual, or technical origin or a combination of these levels. To gather data we conducted a literature review. This resulted in a comprehensive list of challenges and needs that could hinder or improve, respectively, the performance of CSIRTs. Then, semi-structured in-depth interviews were held with team coordinators and team members of five public and private sector Dutch CSIRTs to ground these findings in practice and to identify gaps between current and desired incident handling practices. This paper presents the findings of our needs assessment and ends with a discussion of potential solutions to problems with performance in incident response.

  18. Splitting method for computing coupled hydrodynamic and structural response

    International Nuclear Information System (INIS)

    Ash, J.E.

    1977-01-01

    A numerical method is developed for application to unsteady fluid dynamics problems, in particular to the mechanics following a sudden release of high energy. Solution of the initial compressible flow phase provides input to a power-series method for the incompressible fluid motions. The system is split into spatial and time domains leading to the convergent computation of a sequence of elliptic equations. Two sample problems are solved, the first involving an underwater explosion and the second the response of a nuclear reactor containment shell structure to a hypothetical core accident. The solutions are correlated with experimental data

  19. A discrete ordinate response matrix method for massively parallel computers

    International Nuclear Information System (INIS)

    Hanebutte, U.R.; Lewis, E.E.

    1991-01-01

    A discrete ordinate response matrix method is formulated for the solution of neutron transport problems on massively parallel computers. The response matrix formulation eliminates iteration on the scattering source. The nodal matrices which result from the diamond-differenced equations are utilized in a factored form which minimizes memory requirements and significantly reduces the required number of arithmetic operations. The algorithm utilizes massive parallelism by assigning each spatial node to a processor. The algorithm is accelerated effectively by a synthetic method in which the low-order diffusion equations are also solved by massively parallel red/black iterations. The method has been implemented on a 16k Connection Machine-2, and S8 and S16 solutions have been obtained for fixed-source benchmark problems in X-Y geometry.
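
    The red/black ordering mentioned at the end is what makes the synthetic acceleration parallel: grid points are colored like a checkerboard, so all points of one color can be updated simultaneously from the other color's values. A serial sketch of that coloring on a model diffusion (Poisson) problem, not the paper's response-matrix solver:

```python
# Red/black Gauss-Seidel for -Laplace(u) = f on a unit-square grid.
# Every "red" update depends only on "black" neighbors and vice versa,
# so each half-sweep is embarrassingly parallel -- the property
# exploited on the Connection Machine.
import numpy as np

n, h = 64, 1.0 / 65
u = np.zeros((n + 2, n + 2))         # interior plus boundary layer
f = np.ones((n + 2, n + 2))
i, j = np.meshgrid(np.arange(1, n + 1), np.arange(1, n + 1), indexing="ij")

for sweep in range(500):
    for color in (0, 1):             # 0 = red, 1 = black
        m = ((i + j) % 2) == color
        nbr = u[:-2, 1:-1] + u[2:, 1:-1] + u[1:-1, :-2] + u[1:-1, 2:]
        interior = u[1:-1, 1:-1]     # view into u
        interior[m] = 0.25 * (nbr[m] + h * h * f[1:-1, 1:-1][m])
print("u at grid center:", u[n // 2, n // 2])
```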

  20. Seismic response computations for a long span bridge

    International Nuclear Information System (INIS)

    McCallen, D.B.

    1994-01-01

    The authors are performing large-scale numerical computations to simulate the earthquake response of a major long-span bridge that crosses the San Francisco Bay. The overall objective of the study is to estimate the response of the bridge to potential large-magnitude earthquakes generated on the nearby San Andreas and Hayward earthquake faults. Generation of a realistic model of the bridge system is complicated by the existence of large pile group foundations that extend deep into soft, saturated clay soils, and by the numerous expansion joints that segment the overall bridge structure. In the current study, advanced, nonlinear, finite element technology is being applied to rigorously model the detailed behavior of the bridge system and to shed light on the influence of the foundations and joints of the bridge

  1. Computational optimization of biodiesel combustion using response surface methodology

    Directory of Open Access Journals (Sweden)

    Ganji Prabhakara Rao

    2017-01-01

    The present work focuses on optimization of biodiesel combustion phenomena through a parametric approach using response surface methodology. Physical properties of biodiesel play a vital role in accurate simulations of the fuel spray, atomization, combustion, and emission formation processes. Typically, methyl-based biodiesel consists of five main types of esters in its composition: methyl palmitate, methyl oleate, methyl stearate, methyl linoleate, and methyl linolenate. Based on the amounts of these methyl esters present, the properties of pongamia biodiesel and its blends were estimated. CONVERGE™ computational fluid dynamics software was used to simulate the fuel spray, turbulence and combustion phenomena. The simulation responses such as indicated specific fuel consumption, NOx, and soot were analyzed using design of experiments. Regression equations were developed for each of these responses. The optimum parameters were found to be compression ratio – 16.75, start of injection – 21.9° before top dead center, and exhaust gas re-circulation – 10.94%. Results have been compared with the baseline case.
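
    The workflow in this record, fitting regression equations to designed-experiment responses and then locating the optimum, can be sketched with an invented quadratic surrogate. The coefficients below are placeholders, not the paper's fitted equations; only the optimization step is shown.

```python
# Optimize a fitted quadratic response surface over box constraints,
# mirroring the record's workflow (regression equation -> optimum).
# The surrogate's coefficients are invented placeholders.
import numpy as np
from scipy.optimize import minimize

def isfc(x):
    cr, soi, egr = x  # compression ratio, injection timing (deg bTDC), EGR %
    return (200.0 - 4.0*(cr - 16.0) - 1.2*(soi - 20.0) + 0.5*(egr - 10.0)
            + 0.8*(cr - 16.0)**2 + 0.15*(soi - 20.0)**2
            + 0.1*(egr - 10.0)**2)

res = minimize(isfc, x0=[16.0, 20.0, 10.0],
               bounds=[(14.0, 18.0), (15.0, 25.0), (0.0, 20.0)])
print("optimum (CR, SOI, EGR):", np.round(res.x, 2),
      " surrogate ISFC:", round(res.fun, 1))
```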

  2. Computational models for predicting drug responses in cancer research.

    Science.gov (United States)

    Azuaje, Francisco

    2017-09-01

    The computational prediction of drug responses based on the analysis of multiple types of genome-wide molecular data is vital for accomplishing the promise of precision medicine in oncology. This will benefit cancer patients by matching their tumor characteristics to the most effective therapy available. As larger and more diverse layers of patient-related data become available, further demands for new bioinformatics approaches and expertise will arise. This article reviews key strategies, resources and techniques for the prediction of drug sensitivity in cell lines and patient-derived samples. It discusses major advances and challenges associated with the different model development steps. This review highlights major trends in this area, and will assist researchers in the assessment of recent progress and in the selection of approaches to emerging applications in oncology. © The Author 2016. Published by Oxford University Press.

  3. On computing the geoelastic response to a disk load

    Science.gov (United States)

    Bevis, M.; Melini, D.; Spada, G.

    2016-06-01

    We review the theory of the Earth's elastic and gravitational response to a surface disk load. The solutions for displacement of the surface and the geoid are developed using expansions of Legendre polynomials, their derivatives and the load Love numbers. We provide a MATLAB function called diskload that computes the solutions for both uncompensated and compensated disk loads. In order to numerically implement the Legendre expansions, it is necessary to choose a harmonic degree, nmax, at which to truncate the series used to construct the solutions. We present a rule of thumb (ROT) for choosing an appropriate value of nmax, describe the consequences of truncating the expansions prematurely and provide a means to judiciously violate the ROT when that becomes a practical necessity.
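
    A miniature of the truncation issue the authors discuss: evaluating a truncated Legendre synthesis f(θ) = Σ_{n=0}^{nmax} c_n P_n(cos θ) for several values of nmax. The coefficient spectrum below is a placeholder; the real disk-load solutions weight each term with the load Love numbers, which are omitted here.

```python
# Truncated Legendre synthesis f(theta) = sum_n c_n P_n(cos theta).
# c_n is a toy 1/(2n+1) decay, so only the truncation behavior the
# record discusses is illustrated, not an actual geoelastic response.
import numpy as np
from scipy.special import eval_legendre

def synth(theta_deg, nmax):
    x = np.cos(np.radians(theta_deg))
    n = np.arange(nmax + 1)
    c = 1.0 / (2 * n + 1)                    # placeholder spectrum
    return sum(c[k] * eval_legendre(k, x) for k in n)

theta = np.linspace(0.0, 10.0, 201)          # degrees from the disk center
for nmax in (90, 360, 1440):                 # premature vs. adequate truncation
    print(nmax, float(np.round(synth(theta, nmax).max(), 4)))
```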

  4. Computing emotion awareness through galvanic skin response and facial electromyography

    NARCIS (Netherlands)

    Westerink, Joyce H.D.M.; van den Broek, Egon; Schut, Marleen H.; van Herk, Jan; Tuinenbreijer, Kees; Westerink, Joyce H.D.M.; Ouwerkerk, Martin; Overbeek, Thérése; Pasveer, W. Frank; de Ruyter, Boris

    2008-01-01

    To improve human-computer interaction (HCI), computers need to recognize and respond properly to their user’s emotional state. This is a fundamental application of affective computing, which relates to, arises from, or deliberately influences emotion. As a first step to a system that recognizes

  5. Effective Response to Attacks On Department of Defense Computer Networks

    National Research Council Canada - National Science Library

    Shaha, Patrick

    2001-01-01

    .... For the Commanders-in-Chief (CINCs), computer networking has proven especially useful in maintaining contact and sharing data with elements forward deployed as well as with host nation governments and agencies...

  6. Low-complexity computer simulation of multichannel room impulse responses

    NARCIS (Netherlands)

    Martínez Castañeda, J.A.

    2013-01-01

    The "telephone'' model has been, for the last one hundred thirty years, the base of modern telecommunications with virtually no changes in its fundamental concept. The arise of smaller and more powerful computing devices have opened new possibilities. For example, to build systems able to give to

  7. Understanding Computation of Impulse Response in Microwave Software Tools

    Science.gov (United States)

    Potrebic, Milka M.; Tosic, Dejan V.; Pejovic, Predrag V.

    2010-01-01

    In modern microwave engineering curricula, the introduction of the many new topics in microwave industrial development, or of software tools for design and simulation, sometimes results in students having an inadequate understanding of the fundamental theory. The terminology for and the explanation of algorithms for calculating impulse response in…

  8. Peer-Allocated Instant Response (PAIR): Computional allocation of peer tutors in learning communities

    NARCIS (Netherlands)

    Westera, Wim

    2009-01-01

    Westera, W. (2007). Peer-Allocated Instant Response (PAIR): Computational allocation of peer tutors in learning communities. Journal of Artificial Societies and Social Simulation, http://jasss.soc.surrey.ac.uk/10/2/5.html

  9. The Potential of Older Adults for Response to Computer-Assisted Instruction.

    Science.gov (United States)

    Flynn, Marilyn L.

    1989-01-01

    Describes study that examined the general patterns of computer use and response to computer-assisted instruction by older adults age 45 to 70 in programs for jobless men and women to learn skills for labor market re-entry. Socioeconomic characteristics are examined, and the instructor's role in determining student attitudes is explored. (23…

  10. COMPUTATIONAL MODELING OF SIGNALING PATHWAYS MEDIATING CELL CYCLE AND APOPTOTIC RESPONSES TO IONIZING RADIATION MEDIATED DNA DAMAGE

    Science.gov (United States)

    Demonstrated the use of a computational systems biology approach to model dose-response relationships. Also discussed how biologically motivated dose-response models have only limited reference to the underlying molecular level. Discussed the integration of Computational S...

  11. GATO Code Modification to Compute Plasma Response to External Perturbations

    Science.gov (United States)

    Turnbull, A. D.; Chu, M. S.; Ng, E.; Li, X. S.; James, A.

    2006-10-01

    It has become increasingly clear that the plasma response to an external nonaxisymmetric magnetic perturbation cannot be neglected in many situations of interest. This response can be described as a linear combination of the eigenmodes of the ideal MHD operator. The eigenmodes of the system can be obtained numerically with the GATO ideal MHD stability code, which has been modified for this purpose. A key requirement is the removal of inadmissible continuum modes. For Finite Hybrid Element codes such as GATO, a prerequisite for this is their numerical restabilization by the addition of small numerical terms to δW, to cancel the analytic numerical destabilization. In addition, robustness of the code was improved and the solution method speeded up by use of the SuperLU package to facilitate calculation of the full set of eigenmodes in a reasonable time. To treat resonant plasma responses, the finite element basis has been extended to include eigenfunctions with finite jumps at rational surfaces. Some preliminary numerical results for DIII-D equilibria will be given.

  12. A Computational Model of Cellular Response to Modulated Radiation Fields

    Energy Technology Data Exchange (ETDEWEB)

    McMahon, Stephen J., E-mail: stephen.mcmahon@qub.ac.uk [Centre for Cancer Research and Cell Biology, Queen's University Belfast, Belfast, Northern Ireland (United Kingdom); Butterworth, Karl T. [Centre for Cancer Research and Cell Biology, Queen's University Belfast, Belfast, Northern Ireland (United Kingdom); McGarry, Conor K. [Centre for Cancer Research and Cell Biology, Queen's University Belfast, Belfast, Northern Ireland (United Kingdom); Radiotherapy Physics, Northern Ireland Cancer Centre, Belfast Health and Social Care Trust, Northern Ireland (United Kingdom); Trainor, Colman [Centre for Cancer Research and Cell Biology, Queen's University Belfast, Belfast, Northern Ireland (United Kingdom); O'Sullivan, Joe M. [Centre for Cancer Research and Cell Biology, Queen's University Belfast, Belfast, Northern Ireland (United Kingdom); Clinical Oncology, Northern Ireland Cancer Centre, Belfast Health and Social Care Trust, Belfast, Northern Ireland (United Kingdom); Hounsell, Alan R. [Centre for Cancer Research and Cell Biology, Queen's University Belfast, Belfast, Northern Ireland (United Kingdom); Radiotherapy Physics, Northern Ireland Cancer Centre, Belfast Health and Social Care Trust, Northern Ireland (United Kingdom); Prise, Kevin M. [Centre for Cancer Research and Cell Biology, Queen's University Belfast, Belfast, Northern Ireland (United Kingdom)

    2012-09-01

    Purpose: To develop a model to describe the response of cell populations to spatially modulated radiation exposures of relevance to advanced radiotherapies. Materials and Methods: A Monte Carlo model of cellular radiation response was developed. This model incorporated damage from both direct radiation and intercellular communication including bystander signaling. The predictions of this model were compared to previously measured survival curves for a normal human fibroblast line (AGO1522) and prostate tumor cells (DU145) exposed to spatially modulated fields. Results: The model was found to be able to accurately reproduce cell survival both in populations which were directly exposed to radiation and those which were outside the primary treatment field. The model predicts that the bystander effect makes a significant contribution to cell killing even in uniformly irradiated cells. The bystander effect contribution varies strongly with dose, falling from a high of 80% at low doses to 25% and 50% at 4 Gy for AGO1522 and DU145 cells, respectively. This was verified using the inducible nitric oxide synthase inhibitor aminoguanidine to inhibit the bystander effect in cells exposed to different doses, which showed significantly larger reductions in cell killing at lower doses. Conclusions: The model presented in this work accurately reproduces cell survival following modulated radiation exposures, both in and out of the primary treatment field, by incorporating a bystander component. In addition, the model suggests that the bystander effect is responsible for a significant portion of cell killing in uniformly irradiated cells, 50% and 70% at doses of 2 Gy in AGO1522 and DU145 cells, respectively. This description is a significant departure from accepted radiobiological models and may have a significant impact on optimization of treatment planning approaches if proven to be applicable in vivo.
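
    The dose-dependent split between direct and bystander killing reported above can be mimicked with a toy analytic model. The functional forms and constants below are assumptions chosen only to echo the trend; the paper's actual model is a Monte Carlo simulation of intercellular signaling, not these equations.

```python
# Toy direct + bystander survival model (illustrative forms only).
# Direct kill: linear-quadratic. Bystander kill: saturating in dose, so
# its *share* of total killing is largest at low doses, echoing the
# abstract's high-at-low-dose, lower-at-4-Gy trend.
import numpy as np

alpha, beta = 0.15, 0.05        # LQ parameters (assumed)
k_by, d_half = 0.35, 0.5        # bystander saturation parameters (assumed)

def neg_log_survival(dose):
    direct = alpha * dose + beta * dose**2
    bystander = k_by * dose / (d_half + dose)   # saturates at high dose
    return direct, bystander

for d in (0.25, 1.0, 2.0, 4.0):
    direct, by = neg_log_survival(d)
    print(f"{d:4.2f} Gy: bystander share of killing = {by / (direct + by):.0%}")
```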

  13. A Computational Model of Cellular Response to Modulated Radiation Fields

    International Nuclear Information System (INIS)

    McMahon, Stephen J.; Butterworth, Karl T.; McGarry, Conor K.; Trainor, Colman; O’Sullivan, Joe M.; Hounsell, Alan R.; Prise, Kevin M.

    2012-01-01

    Purpose: To develop a model to describe the response of cell populations to spatially modulated radiation exposures of relevance to advanced radiotherapies. Materials and Methods: A Monte Carlo model of cellular radiation response was developed. This model incorporated damage from both direct radiation and intercellular communication including bystander signaling. The predictions of this model were compared to previously measured survival curves for a normal human fibroblast line (AGO1522) and prostate tumor cells (DU145) exposed to spatially modulated fields. Results: The model was found to be able to accurately reproduce cell survival both in populations which were directly exposed to radiation and those which were outside the primary treatment field. The model predicts that the bystander effect makes a significant contribution to cell killing even in uniformly irradiated cells. The bystander effect contribution varies strongly with dose, falling from a high of 80% at low doses to 25% and 50% at 4 Gy for AGO1522 and DU145 cells, respectively. This was verified using the inducible nitric oxide synthase inhibitor aminoguanidine to inhibit the bystander effect in cells exposed to different doses, which showed significantly larger reductions in cell killing at lower doses. Conclusions: The model presented in this work accurately reproduces cell survival following modulated radiation exposures, both in and out of the primary treatment field, by incorporating a bystander component. In addition, the model suggests that the bystander effect is responsible for a significant portion of cell killing in uniformly irradiated cells, 50% and 70% at doses of 2 Gy in AGO1522 and DU145 cells, respectively. This description is a significant departure from accepted radiobiological models and may have a significant impact on optimization of treatment planning approaches if proven to be applicable in vivo.

  14. Computational modeling of cardiovascular response to orthostatic stress

    Science.gov (United States)

    Heldt, Thomas; Shim, Eun B.; Kamm, Roger D.; Mark, Roger G.

    2002-01-01

    The objective of this study is to develop a model of the cardiovascular system capable of simulating the short-term (≤5 min) transient hemodynamic responses to head-up tilt and lower body negative pressure. The model consists of a closed-loop lumped-parameter representation of the circulation connected to set-point models of the arterial and cardiopulmonary baroreflexes. Model parameters are largely based on literature values. Model verification was performed by comparing the simulation output under baseline conditions and at different levels of orthostatic stress to sets of population-averaged hemodynamic data reported in the literature. On the basis of experimental evidence, we adjusted some model parameters to simulate experimental data. Orthostatic stress simulations are not statistically different from experimental data (two-sided test of significance with Bonferroni adjustment for multiple comparisons). Transient response characteristics of heart rate to tilt also compare well with reported data. A case study is presented on how the model is intended to be used in the future to investigate the effects of post-spaceflight orthostatic intolerance.
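
    One compartment of such a lumped-parameter circulation behaves like a Windkessel. A minimal two-element sketch follows, with generic textbook parameter magnitudes, no baroreflex control, and nothing taken from the paper itself.

```python
# Two-element Windkessel: C dP/dt = Q_in(t) - P/R, a minimal stand-in
# for one compartment of the record's closed-loop circulation model.
import numpy as np
from scipy.integrate import solve_ivp

R = 1.0      # peripheral resistance, mmHg*s/ml (assumed)
C = 1.5      # arterial compliance, ml/mmHg (assumed)
T = 0.8      # cardiac period, s

def q_in(t):
    """Pulsatile inflow: ejection during the first 0.3 s of each beat."""
    phase = t % T
    return 300.0 * np.sin(np.pi * phase / 0.3) if phase < 0.3 else 0.0

def rhs(t, p):
    return [(q_in(t) - p[0] / R) / C]

sol = solve_ivp(rhs, (0.0, 8.0), [80.0], max_step=1e-3)
p = sol.y[0][sol.t > 6.0]        # discard transient beats
print(f"systolic ~ {p.max():.0f} mmHg, diastolic ~ {p.min():.0f} mmHg")
```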

  15. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing readiness challenges (CCRC’08), the CMS computing team had achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files and with a high writing speed to tapes.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1’s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  16. Computation of Schenberg response function by using finite element modelling

    International Nuclear Information System (INIS)

    Frajuca, C; Bortoli, F S; Magalhaes, N S

    2016-01-01

    Schenberg is a resonant-mass gravitational wave detector with a central operating frequency of 3200 Hz. Transducers located on the surface of the resonant sphere, arranged in a half-dodecahedron distribution, are used to monitor its strain amplitude. The development of mechanical impedance matchers, which act by increasing the coupling of the transducers to the sphere, is a major challenge because of their high frequency and small size. The objective of this work is to study the Schenberg response function obtained by finite element modeling (FEM). Finally, the result is compared with that of the simplified mass-spring model to verify whether the latter is suitable for determining the detector sensitivity; both models give the same results. (paper)

  17. COMPUTING

    CERN Document Server

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  18. The position of a standard optical computer mouse affects cardiorespiratory responses during the operation of a computer under time constraints

    Directory of Open Access Journals (Sweden)

    Shunji Sako

    2014-08-01

    Objectives: This study investigated the association between task-induced stress and fatigue by examining the cardiovascular responses of subjects using different mouse positions while operating a computer under time constraints. Material and Methods: The study included 16 young, healthy men and examined the use of optical mouse devices affixed to laptop computers. Two mouse positions were investigated: (1) the distal position (DP), in which the subjects place their forearms on the desk accompanied by the abduction and flexion of their shoulder joints, and (2) the proximal position (PP), in which the subjects place only their wrists on the desk without using an armrest. The subjects continued each task for 16 min. We assessed differences in several characteristics according to mouse position, including expired gas values, autonomic nerve activities (based on cardiorespiratory responses), operating efficiencies (based on word counts), and fatigue levels (based on the visual analog scale, VAS). Results: Oxygen consumption (VO2), the ratio of inspiration time to respiration time (Ti/Ttotal), respiratory rate (RR), minute ventilation (VE), and the ratio of expiration to inspiration (Te/Ti) were significantly lower when the participants were performing the task in the DP than in the PP. Tidal volume (VT), carbon dioxide output rates (VCO2/VE), and oxygen extraction fractions (VO2/VE) were significantly higher for the DP than for the PP. No significant difference in VAS was observed between the positions; however, as the task progressed, autonomic nerve activities were lower and operating efficiencies were significantly higher for the DP than for the PP. Conclusions: Our results suggest that the DP has fewer effects on cardiorespiratory functions, causes lower levels of sympathetic nerve activity and mental stress, and produces a higher total workload than the PP. This suggests that the DP is preferable to the PP when

  19. Evaluating and tuning system response in the MFTF-B control and diagnostics computers

    International Nuclear Information System (INIS)

    Palasek, R.L.; Butner, D.N.; Minor, E.G.

    1983-01-01

    The software system running on the Supervisory Control and Diagnostics System (SCDS) of MFTF-B is, for the major part, an event-driven one. Regular, periodic polling of sensors' outputs takes place only at the local level, in the sensors' corresponding local control microcomputers (LCC's). An LCC reports a sensor's value to the supervisory computer only if there was a significant change. This report is passed as a message, routed among and acted upon by a network of applications and systems tasks within the supervisory computer (SCDS). Commands from the operator's console are similarly routed through a network of tasks, but in the opposite direction to the experiment's hardware. In a network such as this, response time is partially determined by system traffic. Because the hardware of MFTF-B will not be connected to the computer system for another two years, we are using the local control computers to simulate the event-driven traffic that we expect to see during MFTF-B operation. In this paper we show how we are using the simulator to measure and evaluate response, loading, throughput, and utilization of components within the computer system. Measurement of the system under simulation allows us to identify bottlenecks and verify their unloosening. We also use the traffic simulators to evaluate prototypes of different algorithms for selected tasks, comparing their responses under the spectrum of traffic intensities.

  20. Children's Responses to Computer-Synthesized Speech in Educational Media: Gender Consistency and Gender Similarity Effects

    Science.gov (United States)

    Lee, Kwan Min; Liao, Katharine; Ryu, Seoungho

    2007-01-01

    This study examines children's social responses to gender cues in synthesized speech in a computer-based instruction setting. Eighty 5th-grade elementary school children were randomly assigned to one of the conditions in a full-factorial 2 (participant gender) x 2 (voice gender) x 2 (content gender) experiment. Results show that children apply…

  1. Ark of Inquiry: Responsible Research and Innovation through Computer-Based Inquiry Learning

    NARCIS (Netherlands)

    Margus Pedaste; Leo Siiman; Bregje de Vries; Mirjam Burget; Tomi Jaakkola; Emanuele Bardone; Meelis Brikker; Mario Mäeots; Marianne Lind; Koen Veermans

    2015-01-01

    Ark of Inquiry is a learning platform that uses a computer-based inquiry learning approach to raise young people's awareness of Responsible Research and Innovation (RRI). It is developed in the context of a large-scale European project (http://www.arkofinquiry.eu) and provides young European citizens

  2. Implementation of distributed computing system for emergency response and contaminant spill monitoring

    International Nuclear Information System (INIS)

    Ojo, T.O.; Sterling, M.C.Jr.; Bonner, J.S.; Fuller, C.B.; Kelly, F.; Page, C.A.

    2003-01-01

    The availability and use of real-time environmental data greatly enhances emergency response and spill monitoring in coastal and near shore environments. The data would include surface currents, wind speed, wind direction, and temperature. Model predictions (fate and transport) or forensics can also be included. In order to achieve an integrated system suitable for application in spill or emergency response situations, a link is required because this information exists on many different computing platforms. When real-time measurements are needed to monitor a spill, the use of a wide array of sensors and ship-based post-processing methods help reduce the latency in data transfer between field sampling stations and the Incident Command Centre. The common thread linking all these modules is the Transmission Control Protocol/Internet Protocol (TCP/IP), and the result is an integrated distributed computing system (DCS). The in-situ sensors are linked to an onboard computer through the use of a ship-based local area network (LAN) using a submersible device server. The onboard computer serves as both the data post-processor and communications server. It links the field sampling station with other modules, and is responsible for transferring data to the Incident Command Centre. This link is facilitated by a wide area network (WAN) based on wireless broadband communications facilities. This paper described the implementation of the DCS. The test results for the communications link and system readiness were also included. 6 refs., 2 tabs., 3 figs
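
    The glue this record describes, a shipboard computer accepting sensor readings on the LAN and forwarding them over the WAN to the Incident Command Centre, reduces to TCP relaying. A single-connection sketch follows; the hosts, ports, and message framing are all hypothetical, not taken from the paper.

```python
# Minimal TCP relay: accept sensor data on the ship LAN and forward it
# to the command centre over the WAN. Hosts and ports are hypothetical.
import socket

LAN_BIND = ("0.0.0.0", 5000)                  # device server connects here
WAN_PEER = ("command-centre.example", 6000)   # hypothetical upstream host

with socket.create_server(LAN_BIND) as server, \
     socket.create_connection(WAN_PEER) as upstream:
    conn, addr = server.accept()
    with conn:
        while True:
            data = conn.recv(4096)            # one batch of sensor readings
            if not data:
                break                         # sensor closed the connection
            upstream.sendall(data)            # forward unmodified
```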

  3. Psychophysiological Assessment Of Fear Experience In Response To Sound During Computer Video Gameplay

    DEFF Research Database (Denmark)

    Garner, Tom Alexander; Grimshaw, Mark

    2013-01-01

    The potential value of a looping biometric feedback system as a key component of adaptive computer video games is significant. Psychophysiological measures are essential to the development of an automated emotion recognition program, capable of interpreting physiological data into models of affect and systematically altering the game environment in response. This article presents empirical data the analysis of which advocates electrodermal activity and electromyography as suitable physiological measures to work effectively within a computer video game-based biometric feedback loop, within which sound

  4. Computed tomography assessment of early response to neoadjuvant therapy in colon cancer

    DEFF Research Database (Denmark)

    Dam, Claus; Lund-Rasmussen, Vera; Pløen, John

    2015-01-01

    INTRODUCTION: Using multidetector computed tomography, we aimed to assess the early response to neoadjuvant drug therapy for locally advanced colon cancer. METHODS: Computed tomography with IV contrast was acquired from 67 patients before and after up to three cycles of preoperative treatment. All patients had histologically confirmed colon cancer, a T4 or T3 tumour with extramural invasion ≥ 5 mm and no distant metastases or peritoneal nodules. The patients were treated with oxaliplatin and capecitabine. In addition, those with no mutations in the KRAS, BRAF and PIK3CA genes were also treated

  5. Computational Fluid Dynamics Simulation of Combustion Instability in Solid Rocket Motor : Implementation of Pressure Coupled Response Function

    OpenAIRE

    S. Saha; D. Chakraborty

    2016-01-01

    Combustion instability in a solid propellant rocket motor is numerically simulated by implementing the propellant response function with a quasi-steady, homogeneous, one-dimensional formulation. The convolution integral of propellant response with pressure history is implemented through a user-defined function in commercial computational fluid dynamics software. The methodology is validated against a motor test reported in the literature and other simulation results. Computed amplitude of pressure fluctuations ...

  6. Biomaterials and computation: a strategic alliance to investigate emergent responses of neural cells.

    Science.gov (United States)

    Sergi, Pier Nicola; Cavalcanti-Adam, Elisabetta Ada

    2017-03-28

    Topographical and chemical cues drive migration, outgrowth and regeneration of neurons in different and crucial biological conditions. In the natural extracellular matrix, their influences are so closely coupled that they result in complex cellular responses. As a consequence, engineered biomaterials are widely used to simplify in vitro conditions, disentangling intricate in vivo behaviours, and narrowing the investigation on particular emergent responses. Nevertheless, how topographical and chemical cues affect the emergent response of neural cells is still unclear, thus in silico models are used as additional tools to reproduce and investigate the interactions between cells and engineered biomaterials. This work aims at presenting the synergistic use of biomaterials-based experiments and computation as a strategic way to promote the discovering of complex neural responses as well as to allow the interactions between cells and biomaterials to be quantitatively investigated, fostering a rational design of experiments.

  7. Towards an integrative computational model for simulating tumor growth and response to radiation therapy

    Science.gov (United States)

    Marrero, Carlos Sosa; Aubert, Vivien; Ciferri, Nicolas; Hernández, Alfredo; de Crevoisier, Renaud; Acosta, Oscar

    2017-11-01

    Understanding the response to irradiation in cancer radiotherapy (RT) may help devise new strategies with improved local tumor control. Computational models may help unravel the underlying radiosensitivity mechanisms intervening in the dose-response relationship. By using extensive simulations, a wide range of parameters may be evaluated, providing insights on tumor response and thus generating useful data to plan modified treatments. We propose in this paper a computational model of tumor growth and radiation response which allows a whole RT protocol to be simulated. Proliferation of tumor cells, the cell life-cycle, oxygen diffusion, radiosensitivity, RT response and resorption of killed cells were implemented in a multiscale framework. The model was developed in C++, using the Multi-formalism Modeling and Simulation Library (M2SL). Radiosensitivity parameters extracted from the literature enabled us to simulate prostate tissue on a regular (voxel-wise) grid. Histopathological specimens with different aggressiveness levels, extracted from patients after prostatectomy, were used to initialize the in silico simulations. Results on tumor growth exhibit good agreement with data from in vitro studies. Moreover, a standard fractionation of 2 Gy/fraction, with a total dose of 80 Gy as in a real RT treatment, was applied with varying radiosensitivity and oxygen diffusion parameters. As expected, the strong influence of these parameters was observed by measuring the percentage of surviving tumor cells after RT. This work paves the way for further models that simulate increased doses in modified hypofractionated schemes and for the development of new patient-specific combined therapies.
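
    At the protocol level, the simulated treatment can be caricatured as a linear-quadratic kill per 2 Gy fraction with regrowth between fractions. The parameters below are assumptions for illustration; the paper's model is multiscale (cell cycle, oxygen diffusion, resorption), not this two-line update.

```python
# Caricature of a fractionated RT course: each 2 Gy fraction kills via
# the linear-quadratic model, and survivors regrow logistically between
# fractions. All parameter values are assumptions.
import numpy as np

alpha, beta = 0.15, 0.05          # Gy^-1, Gy^-2 (assumed radiosensitivity)
d, n_fractions = 2.0, 40          # 2 Gy x 40 = 80 Gy, as in the record
growth, capacity = 0.3, 1.0       # per-day logistic regrowth (assumed)

N = 1.0                           # tumor burden, normalized to capacity
for fx in range(n_fractions):
    N *= np.exp(-(alpha * d + beta * d**2))      # fraction kill
    N += growth * N * (1 - N / capacity)         # one day of regrowth
print(f"surviving fraction after 80 Gy: {N:.3e}")
```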

  8. Predictive computational modeling of the mucosal immune responses during Helicobacter pylori infection.

    Directory of Open Access Journals (Sweden)

    Adria Carbo

    T helper (Th) cells play a major role in the immune response and pathology at the gastric mucosa during Helicobacter pylori infection. There is a limited mechanistic understanding regarding the contributions of CD4+ T cell subsets to gastritis development during H. pylori colonization. We used two computational approaches: ordinary differential equation (ODE)-based and agent-based modeling (ABM) to study the mechanisms underlying cellular immune responses to H. pylori and how CD4+ T cell subsets influenced initiation, progression and outcome of disease. To calibrate the model, in vivo experimentation was performed by infecting C57BL/6 mice intragastrically with H. pylori and assaying immune cell subsets in the stomach and gastric lymph nodes (GLN) on days 0, 7, 14, 30 and 60 post-infection. Our computational model reproduced the dynamics of effector and regulatory pathways in the gastric lamina propria (LP) in silico. Simulation results show the induction of a Th17 response and a dominant Th1 response, together with a regulatory response characterized by high levels of mucosal Treg cells. We also investigated the potential role of peroxisome proliferator-activated receptor γ (PPARγ) activation on the modulation of host responses to H. pylori by using loss-of-function approaches. Specifically, in silico results showed a predominance of Th1 and Th17 cells in the stomach of the cell-specific PPARγ knockout system when compared to the wild-type simulation. Spatio-temporal, object-oriented ABM approaches suggested similar dynamics in induction of host responses showing analogous T cell distributions to ODE modeling and facilitated tracking lesion formation. In addition, sensitivity analysis predicted a crucial contribution of Th1 and Th17 effector responses as mediators of histopathological changes in the gastric mucosa during chronic stages of infection, which were experimentally validated in mice. These integrated immunoinformatics approaches
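
    The ODE half of the two approaches can be caricatured as coupled pools of Th1, Th17 and Treg cells with Treg-mediated suppression of the effector arms. The structure and rates below are invented for illustration and are not the calibrated model from the record.

```python
# Toy ODE caricature of Th1/Th17/Treg dynamics with Treg suppression,
# in the spirit of the record's ODE-based approach. All rates invented.
import numpy as np
from scipy.integrate import solve_ivp

def rhs(t, y):
    th1, th17, treg = y
    s = 1.0 / (1.0 + treg)                        # Treg suppression factor
    return [0.5 * s - 0.1 * th1,                  # Th1 induction minus decay
            0.3 * s - 0.1 * th17,                 # Th17 induction minus decay
            0.05 * (th1 + th17) - 0.05 * treg]    # Tregs track effectors

sol = solve_ivp(rhs, (0.0, 60.0), [0.0, 0.0, 0.0], t_eval=[7, 14, 30, 60])
for day, (a, b, c) in zip(sol.t, sol.y.T):
    print(f"day {day:4.0f}: Th1={a:.2f} Th17={b:.2f} Treg={c:.2f}")
```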

  9. COMPUTING

    CERN Document Server

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  10. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  11. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and re-enforcing shift and operational procedures for data production and transfer, MC production and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity for discussing the impact of, and addressing issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  12. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies ramping to 50 M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS specific tests to be included in Service Availa...

  13. Computer-aided breast MR image feature analysis for prediction of tumor response to chemotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Aghaei, Faranak; Tan, Maxine; Liu, Hong; Zheng, Bin, E-mail: Bin.Zheng-1@ou.edu [School of Electrical and Computer Engineering, University of Oklahoma, Norman, Oklahoma 73019 (United States); Hollingsworth, Alan B. [Mercy Women’s Center, Mercy Health Center, Oklahoma City, Oklahoma 73120 (United States); Qian, Wei [Department of Electrical and Computer Engineering, University of Texas, El Paso, Texas 79968 (United States)

    2015-11-15

    Purpose: To identify a new clinical marker based on quantitative kinetic image features analysis and assess its feasibility to predict tumor response to neoadjuvant chemotherapy. Methods: The authors assembled a dataset involving breast MR images acquired from 68 cancer patients before undergoing neoadjuvant chemotherapy. Among them, 25 patients had complete response (CR) and 43 had partial and nonresponse (NR) to chemotherapy based on the response evaluation criteria in solid tumors. The authors developed a computer-aided detection scheme to segment breast areas and tumors depicted on the breast MR images and computed a total of 39 kinetic image features from both tumor and background parenchymal enhancement regions. The authors then applied and tested two approaches to classify between CR and NR cases. The first one analyzed each individual feature and applied a simple feature fusion method that combines classification results from multiple features. The second approach tested an attribute selected classifier that integrates an artificial neural network (ANN) with a wrapper subset evaluator, which was optimized using a leave-one-case-out validation method. Results: In the pool of 39 features, 10 yielded relatively higher classification performance with the areas under receiver operating characteristic curves (AUCs) ranging from 0.61 to 0.78 to classify between CR and NR cases. Using a feature fusion method, the maximum AUC = 0.85 ± 0.05. Using the ANN-based classifier, AUC value significantly increased to 0.96 ± 0.03 (p < 0.01). Conclusions: This study demonstrated that quantitative analysis of kinetic image features computed from breast MR images acquired prechemotherapy has potential to generate a useful clinical marker in predicting tumor response to chemotherapy.
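
    The second approach described, a small neural network validated leave-one-case-out, is standard enough to sketch with scikit-learn. The dataset, feature count, and network size below are placeholders, not the authors' data or tuned classifier.

```python
# Leave-one-out validation of a small MLP on synthetic "kinetic feature"
# data, echoing the record's second approach. Data are random placeholders.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import LeaveOneOut
from sklearn.metrics import roc_auc_score
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X = rng.normal(size=(68, 10))     # 68 cases, 10 selected features (synthetic)
y = (X[:, :3].sum(axis=1) + rng.normal(0, 1, 68) > 0).astype(int)  # CR vs NR

scores = np.empty(len(y))
for train, test in LeaveOneOut().split(X):
    clf = make_pipeline(StandardScaler(),
                        MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000,
                                      random_state=0))
    clf.fit(X[train], y[train])
    scores[test] = clf.predict_proba(X[test])[:, 1]
print("leave-one-out AUC:", round(roc_auc_score(y, scores), 3))
```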

  14. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, notably by using opportunistic resources like the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 TB. Figure 3: Number of events per month (data) In LS1, our emphasis is to increase the efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  15. New computational method for non-LTE, the linear response matrix

    International Nuclear Information System (INIS)

    Fournier, K.B.; Graziani, F.R.; Harte, J.A.; Libby, S.B.; More, R.M.; Zimmerman, G.B.

    1998-01-01

    My coauthors have done extensive theoretical and computational calculations that lay the groundwork for a linear response matrix method to calculate non-LTE (local thermodynamic equilibrium) opacities. I will briefly review some of their work and list references. Then I will describe what has been done to utilize this theory to create a computational package to rapidly calculate mild non-LTE emission and absorption opacities suitable for use in hydrodynamic calculations. The opacities are obtained by performing table look-ups on data that have been generated with a non-LTE package. This scheme is currently under development. We can see that it offers a significant computational speed advantage. It is suitable for mild non-LTE, quasi-steady conditions. And it offers a new insertion path for high-quality non-LTE data. Currently, the linear response matrix data file is created using XSN. These data files could be generated by more detailed and rigorous calculations without changing any part of the implementation in the hydro code. The scheme is running in Lasnex and is being tested and developed.

  16. Computational Experiments on the Step and Frequency Responses of a Three-Axis Thermal Accelerometer

    Directory of Open Access Journals (Sweden)

    Yoshifumi Ogami

    2017-11-01

    The sensor response has been reported to become highly nonlinear when the acceleration applied to a thermal accelerometer is very large, so the same response can be observed for two accelerations with different magnitudes and opposite signs. Some papers have reported the frequency response for horizontal acceleration to be a first-order system, while others have reported it to be a second-order system. The response for vertical acceleration has not been studied. In this study, computational experiments were performed to examine the step and frequency responses of a three-axis thermal accelerometer. The results showed that monitoring the temperatures at two positions and making use of cross-axis sensitivity allow a unique acceleration to be determined even when the range of the vertical acceleration is very large (e.g., −10,000 g to 10,000 g). The frequency response proved to be a second-order system for horizontal acceleration and a third-order system for vertical acceleration.

  17. Computational tools for fitting the Hill equation to dose-response curves.

    Science.gov (United States)

    Gadagkar, Sudhindra R; Call, Gerald B

    2015-01-01

    Many biological response curves commonly assume a sigmoidal shape that can be approximated well by means of the 4-parameter nonlinear logistic equation, also called the Hill equation. However, estimation of the Hill equation parameters requires access to commercial software or the ability to write computer code. Here we present two user-friendly and freely available computer programs to fit the Hill equation - a Solver-based Microsoft Excel template and a stand-alone GUI-based "point and click" program, called HEPB. Both computer programs use an iterative method to estimate two of the Hill equation parameters (EC50 and the Hill slope), while constraining the values of the other two parameters (the minimum and maximum asymptotes of the response variable), to fit the Hill equation to the data. In addition, HEPB draws the prediction band at a user-defined confidence level, and determines the EC50 value for each of the limits of this band to give boundary values that help objectively delineate sensitive, normal and resistant responses to the drug being tested. Both programs were tested by analyzing twelve datasets that varied widely in data values, sample size and slope, and were found to yield estimates of the Hill equation parameters essentially identical to those provided by commercial software such as GraphPad Prism and by nls, the nonlinear least-squares routine in the statistical programming language R. The Excel template provides a means to estimate the parameters of the Hill equation and plot the regression line in a familiar Microsoft Office environment. HEPB, in addition to providing the above results, also computes the prediction band for the data at a user-defined level of confidence, and determines objective cut-off values to distinguish among response types (sensitive, normal and resistant). Both programs are found to yield estimated values that are essentially the same as those from standard software such as GraphPad Prism and the R-based nls. Furthermore, HEPB also has
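
    For readers working outside Excel or HEPB, the same constrained fit can be reproduced with a few lines of open-source code. This is a minimal sketch using Python's scipy (not one of the paper's tools) with made-up dose-response data; following the programs' approach, the asymptotes are fixed from the data and only EC50 and the Hill slope are estimated:

```python
import numpy as np
from scipy.optimize import curve_fit

# Made-up dose-response data (dose in uM, response in % of control).
dose = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0, 30.0])
resp = np.array([3.0, 6.0, 15.0, 37.0, 64.0, 85.0, 94.0, 98.0])

# Constrain the asymptotes (as HEPB does) and fit only EC50 and slope.
bottom, top = resp.min(), resp.max()

def hill(x, ec50, slope):
    """4-parameter logistic (Hill) equation with fixed asymptotes."""
    return bottom + (top - bottom) / (1.0 + (ec50 / x) ** slope)

(ec50, slope), cov = curve_fit(hill, dose, resp, p0=[1.0, 1.0])
print(f"EC50 = {ec50:.3g} uM, Hill slope = {slope:.2f}")
```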

  18. Computed tomography assessment of early response to neoadjuvant therapy in colon cancer

    DEFF Research Database (Denmark)

    Dam, Claus; Lund-Rasmussen, Vera; Pløen, John

    2015-01-01

    INTRODUCTION: Using multidetector computed tomography, we aimed to assess the early response of neoadjuvant drug therapy for locally advanced colon cancer. METHODS: Computed tomography with IV contrast was acquired from 67 patients before and after up to three cycles of preoperative treatment. All...... patients had histologically confirmed colon cancer, a T4 or T3 tumour with extramural invasion ≥ 5 mm and no distant metastases or peritoneal nodules. The patients were treated with oxaliplatin and capecitabine. In addition, those with no mutations in the KRAS, BRAF and PIK3CA genes were also treated...... with panitumumab. Before and after treatment, we measured the tumour diameter in two different planes, the extension of the extramural tumour invasion, and the number and size of enlarged lymph nodes. RESULTS: The mean tumour length was 7.8 cm (95% confidence interval (CI): 5.3-10.4) at baseline and 4.34 cm (95...

  19. COMPUTING

    CERN Multimedia

    M. Kasemann and P. McBride; edited by M-C. Sawley, with contributions from P. Kreuzer, D. Bonacorsi, S. Belforte, F. Wuerthwein, L. Bauerdick, K. Lassila-Perini and M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  20. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger, at roughly 11 MB per event of RAW. The central collisions are more complex and...

  1. Computational Analysis of Single Nucleotide Polymorphisms Associated with Altered Drug Responsiveness in Type 2 Diabetes

    Directory of Open Access Journals (Sweden)

    Valerio Costa

    2016-06-01

    Type 2 diabetes (T2D) is one of the most frequent mortality causes in western countries, with rapidly increasing prevalence. Anti-diabetic drugs are the first therapeutic approach, although many patients develop drug resistance. Most drug-responsiveness variability can be explained by genetic causes. Inter-individual variability is principally due to single nucleotide polymorphisms, and differential drug responsiveness has been correlated to alterations in genes involved in drug metabolism (CYP2C9) or insulin signaling (IRS1, ABCC8, KCNJ11 and PPARG). However, most genome-wide association studies did not provide clues about the contribution of DNA variations to impaired drug responsiveness. Thus, characterizing T2D drug-responsiveness variants is needed to guide clinicians toward tailored therapeutic approaches. Here, we extensively investigated polymorphisms associated with altered drug response in T2D, predicting their effects in silico. Combining different computational approaches, we focused on the expression pattern of genes correlated to drug resistance and inferred the evolutionary conservation of polymorphic residues, computationally predicting the biochemical properties of the polymorphic proteins. Using RNA-Sequencing followed by targeted validation, we identified and experimentally confirmed that two nucleotide variations in the CAPN10 gene—currently annotated as intronic—fall within two new transcripts in this locus. Additionally, we found that a Single Nucleotide Polymorphism (SNP), currently reported as intergenic, maps to the intron of a new transcript harboring the CAPN10 and GPR35 genes, which undergoes nonsense-mediated decay. Finally, we analyzed variants that fall into non-coding regulatory regions of yet underestimated functional significance, predicting that some of them can potentially affect gene expression and/or post-transcriptional regulation of mRNAs by affecting splicing.

  2. Spectral response model for a multibin photon-counting spectral computed tomography detector and its applications.

    Science.gov (United States)

    Liu, Xuejin; Persson, Mats; Bornefalk, Hans; Karlsson, Staffan; Xu, Cheng; Danielsson, Mats; Huber, Ben

    2015-07-01

    Variations among detector channels in computed tomography can lead to ring artifacts in the reconstructed images and biased estimates in projection-based material decomposition. Typically, the ring artifacts are corrected by compensation methods based on flat fielding, where transmission measurements are required for a number of material-thickness combinations. Phantoms used in these methods can be rather complex and require an extensive number of transmission measurements. Moreover, material decomposition needs knowledge of the individual response of each detector channel to account for the detector inhomogeneities. For this purpose, we have developed a spectral response model that binwise predicts the response of a multibin photon-counting detector individually for each detector channel. The spectral response model is performed in two steps. The first step employs a forward model to predict the expected numbers of photon counts, taking into account parameters such as the incident x-ray spectrum, absorption efficiency, and energy response of the detector. The second step utilizes a limited number of transmission measurements with a set of flat slabs of two absorber materials to fine-tune the model predictions, resulting in a good correspondence with the physical measurements. To verify the response model, we apply the model in two cases. First, the model is used in combination with a compensation method which requires an extensive number of transmission measurements to determine the necessary parameters. Our spectral response model successfully replaces these measurements by simulations, saving a significant amount of measurement time. Second, the spectral response model is used as the basis of the maximum likelihood approach for projection-based material decomposition. The reconstructed basis images show a good separation between the calcium-like material and the contrast agents, iodine and gadolinium. The contrast agent concentrations are reconstructed with more
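
    The record does not include the model's equations, but the first, forward-model step can be illustrated schematically. The sketch below assumes an idealized Gaussian energy response and Beer-Lambert attenuation; the spectrum, attenuation coefficient, thresholds and all numbers are placeholders rather than the authors' calibration:

```python
import numpy as np

E = np.linspace(20.0, 120.0, 201)               # incident energy grid (keV)
dE = E[1] - E[0]

phi = np.exp(-0.5 * ((E - 60.0) / 20.0) ** 2)   # toy incident spectrum
mu = 0.2 * (60.0 / E) ** 3                      # toy attenuation coefficient (1/cm)

def expected_bin_counts(thickness_cm, thresholds_keV, sigma_keV=6.0):
    """Binwise forward model for one photon-counting channel:
    Beer-Lambert attenuation, Gaussian energy response, then
    integration of the deposited-energy spectrum between thresholds."""
    transmitted = phi * np.exp(-mu * thickness_cm)
    # response matrix R[i, j]: photon of energy E[j] deposits near E[i];
    # normalize each column so it is a distribution over deposited energy
    R = np.exp(-0.5 * ((E[:, None] - E[None, :]) / sigma_keV) ** 2)
    R /= R.sum(axis=0, keepdims=True)
    deposited = R @ transmitted
    edges = np.asarray(thresholds_keV, dtype=float)
    return np.array([deposited[(E >= lo) & (E < hi)].sum() * dE
                     for lo, hi in zip(edges[:-1], edges[1:])])

# expected counts in four bins behind 2 cm of the toy absorber
print(expected_bin_counts(2.0, [25, 45, 65, 85, 120]))
```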

  3. Behavioral response of tilapia (Oreochromis niloticus) to acute ammonia stress monitored by computer vision.

    Science.gov (United States)

    Xu, Jian-yu; Miao, Xiang-wen; Liu, Ying; Cui, Shao-rong

    2005-08-01

    The behavioral responses of a tilapia (Oreochromis niloticus) school to low (0.13 mg/L), moderate (0.79 mg/L) and high (2.65 mg/L) levels of unionized ammonia (UIA) concentration were monitored using a computer vision system. The swimming activity and geometrical parameters such as the location of the gravity center and the distribution of the fish school were calculated continuously. These behavioral parameters of the tilapia school responded sensitively to moderate and high UIA concentrations. Under high UIA concentration the fish activity showed a significant increase (P < 0.05). Quantifying fish behavior under acute stress can thus provide important information useful in predicting the stress.
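
    Once fish pixels have been segmented from a frame, the two geometrical parameters named above reduce to simple image moments. A hypothetical numpy sketch (the record does not describe the actual vision pipeline used):

```python
import numpy as np

def school_parameters(mask):
    """Gravity centre and dispersion of a fish school from a binary
    segmentation mask (True where a fish pixel was detected)."""
    ys, xs = np.nonzero(mask)
    cx, cy = xs.mean(), ys.mean()                   # gravity centre
    dispersion = np.hypot(xs - cx, ys - cy).mean()  # mean distance to centre
    return (cx, cy), dispersion

# toy 8x8 "frame" with two detected fish blobs
frame = np.zeros((8, 8), dtype=bool)
frame[1:3, 1:3] = True
frame[5:7, 4:7] = True
print(school_parameters(frame))
```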

  4. Role of computed tomography imaging in predicting response of nasopharyngeal carcinoma to definitive radiation therapy.

    Science.gov (United States)

    Ma, Xuejun; Lu, Jiade Jay; Loh, Kwok Seng; Shakespeare, Thomas P; Thiagarajan, Anu; Goh, Boon Cher; Tan, Kim Siang Luke

    2006-12-01

    The purpose of this study was to investigate the role of posttreatment computed tomography (CT) scans in assessing response of nasopharyngeal carcinoma (NPC) to definitive radiotherapy. Between March 1999 and October 2003, a total of 132 consecutive patients with newly diagnosed NPC were studied. Sixty-one patients with AJCC stage I or II NPC were treated with radiation only; 71 patients with stage III or IV disease but no evidence of distant metastasis were treated with concurrent chemoradiotherapy. All patients received CT scans of the head and neck, nasopharyngoscopy, and biopsies of primary sites at 4 to 6 months after completion of radiotherapy. Clinical response of the primary tumor as determined by comparison of pre- and posttreatment CT scans was correlated to pathology results. The median follow-up time for all patients was 25 months (range, 9-40 months). Radiologic progression was seen in five patients, stable disease in 18 patients, and radiographic partial (rPR) and complete responses (rCR) were seen in 67 and 42 patients, respectively, at 4 to 6 months of follow up. Biopsies of the nasopharynx were positive in six patients. For patients with rCR, two patients (4.8%) had positive biopsies. Four patients with residual disease (rPR, stable, or progressive disease) after treatment had positive biopsies. The positive and negative predictive values, sensitivity, and specificity of CT scans in evaluating the NPC response to radiotherapy were 0.04, 0.95, 0.67, and 0.32, respectively. Pathologic CR for nasopharyngeal carcinoma is usually evident at 4 to 6 months after definitive radiotherapy; however, there is no correlation between pathologic and radiographic response. Although longer follow up is required to define the relationship between radiographic and pathologic responses with respect to disease control, we find CT scan at 4 to 6 months after radiotherapy to be neither sensitive nor specific in predicting the response of primary NPC to radiotherapy.

  5. Computational Model and Numerical Simulation for Submerged Mooring Monitoring Platform’s Dynamical Response

    Directory of Open Access Journals (Sweden)

    He Kongde

    2015-01-01

    A computational model and numerical simulation of a submerged mooring monitoring platform were formulated to study its dynamical response under the action of flow forces. The model is based on Hopkinson impact load theory, takes into account the catenary effect of the mooring cable, and corrects the difference between tension and the tangential action force through an equivalent modulus of elasticity. The equations were solved using hydraulics theory and the structural mechanics theory of ocean engineering, and the response of the buoy to flow forces was studied. The validity of the model was checked and the results were in good agreement: the buoy undergoes sizeable heave and sway displacements, but the sway displacement stabilizes quickly, while the heave displacement causes vibration owing to the vortex-induced action of the flow.

  6. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  7. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  8. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  9. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  10. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time, operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave a good indication of the readiness of the WLCG infrastructure, with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s were demonstrated to be fully capable of performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  11. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  12. Computational systems biology and dose-response modeling in relation to new directions in toxicity testing.

    Science.gov (United States)

    Zhang, Qiang; Bhattacharya, Sudin; Andersen, Melvin E; Conolly, Rory B

    2010-02-01

    The new paradigm envisioned for toxicity testing in the 21st century advocates shifting from the current animal-based testing process to a combination of in vitro cell-based studies, high-throughput techniques, and in silico modeling. A strategic component of the vision is the adoption of the systems biology approach to acquire, analyze, and interpret toxicity pathway data. As key toxicity pathways are identified and their wiring details elucidated using traditional and high-throughput techniques, there is a pressing need to understand their qualitative and quantitative behaviors in response to perturbation by both physiological signals and exogenous stressors. The complexity of these molecular networks makes the task of understanding cellular responses merely by human intuition challenging, if not impossible. This process can be aided by mathematical modeling and computer simulation of the networks and their dynamic behaviors. A number of theoretical frameworks were developed in the last century for understanding dynamical systems in science and engineering disciplines. These frameworks, which include metabolic control analysis, biochemical systems theory, nonlinear dynamics, and control theory, can greatly facilitate the process of organizing, analyzing, and understanding toxicity pathways. Such analysis will require a comprehensive examination of the dynamic properties of "network motifs"--the basic building blocks of molecular circuits. Network motifs like feedback and feedforward loops appear repeatedly in various molecular circuits across cell types and enable vital cellular functions like homeostasis, all-or-none response, memory, and biological rhythm. These functional motifs and associated qualitative and quantitative properties are the predominant source of nonlinearities observed in cellular dose response data. Complex response behaviors can arise from toxicity pathways built upon combinations of network motifs. While the field of computational cell
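
    As a concrete illustration of how a single motif yields a nonlinear dose response, the sketch below integrates a generic positive-feedback loop and prints its switch-like, all-or-none steady-state response to an increasing stimulus; the equations and parameter values are textbook choices, not taken from the article:

```python
import numpy as np
from scipy.integrate import odeint

def motif(x, t, dose, n=4, k=0.5, basal=0.02, decay=1.0):
    """Positive-feedback motif: x promotes its own production via
    Hill kinetics; 'dose' is an external stimulus (stressor level)."""
    production = basal + dose + x**n / (k**n + x**n)
    return production - decay * x

# steady state jumps abruptly once the dose crosses a threshold
for dose in [0.0, 0.05, 0.1, 0.2, 0.4]:
    x_end = odeint(motif, 0.0, np.linspace(0.0, 100.0, 500), args=(dose,))[-1, 0]
    print(f"dose = {dose:.2f} -> steady state = {x_end:.2f}")
```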

  13. Experimental and computational analysis of pressure response in a multiphase flow loop

    Science.gov (United States)

    Morshed, Munzarin; Amin, Al; Rahman, Mohammad Azizur; Imtiaz, Syed

    2016-07-01

    The characteristics of multiphase fluid flow in pipes are useful for understanding the fluid mechanics encountered in the oil and gas industries. Present-day oil and gas exploration increasingly involves subsea operations in deep-sea and arctic conditions. During the transport of petroleum products, understanding the fluid dynamics inside the pipe network is important for flow assurance. Information regarding the static and dynamic pressure response, pressure loss, optimum flow rate, pipe diameter, etc. comprises the important parameters for flow assurance. The principal aim of this research is to present a computational and an experimental analysis of multiphase (liquid/gas) flow in a pipe network. The computational study considers a two-phase fluid flow through a horizontal flow loop at different Reynolds numbers in order to determine the pressure distribution and frictional pressure loss profiles by the volume of fluid (VOF) method. The numerical simulations are validated with the experimental data. The experiment is conducted in a 76.20 mm ID transparent circular pipe using water and air in the flow loop. Static pressure transducers are used to measure the local pressure response in the multiphase pipeline.
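
    For the single-phase limit against which such loops are usually checked, the frictional pressure loss follows the Darcy-Weisbach relation. A sketch with hypothetical values (water at 1 m/s in the 76.20 mm ID loop); a true two-phase pressure drop would additionally require a multiplier correlation such as Lockhart-Martinelli, which is beyond this sketch:

```python
def friction_factor(Re):
    """Darcy friction factor: 64/Re in the laminar regime, otherwise the
    Blasius smooth-pipe correlation (reasonable for ~4e3 < Re < 1e5)."""
    return 64.0 / Re if Re < 2300.0 else 0.316 * Re ** -0.25

def pressure_drop(velocity, length, diameter, rho=998.0, visc=1.0e-3):
    """Frictional pressure loss (Pa) of a single-phase pipe flow."""
    Re = rho * velocity * diameter / visc
    f = friction_factor(Re)
    return f * (length / diameter) * 0.5 * rho * velocity ** 2

# water at 1 m/s over a 10 m run of the 76.20 mm ID loop
print(f"dP = {pressure_drop(1.0, 10.0, 0.0762):.0f} Pa")
```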

  14. Computational biomechanics of bone's responses to dental prostheses - osseointegration, remodeling and resorption

    Science.gov (United States)

    Li, Wei; Rungsiyakull, Chaiy; Field, Clarice; Lin, Daniel; Zhang, Leo; Li, Qing; Swain, Michael

    2010-06-01

    Clinical and experimental studies showed that human bone has the ability to remodel itself to better adapt to its biomechanical environment by changing both its material properties and geometry. As a consequence of the rapid development and extensive applications of major dental restorations such as implantation and fixed partial denture (FPD), the effect of bone remodeling on the success of a dental restorative surgery is becoming critical for prosthetic design and pre-surgical assessment. This paper aims to provide a computational biomechanics framework to address dental bone's responses as a result of dental restoration. It explored three important issues of resorption, apposition and osseointegration in terms of remodeling simulation. The published remodeling data in long bones were regulated to drive the computational remodeling prediction for the dental bones by correlating the results to clinical data. It is anticipated that the study will provide a more predictive model of dental bone response and help develop a new design methodology for patient-specific dental prosthetic restoration.

  15. Computational biomechanics of bone's responses to dental prostheses - osseointegration, remodeling and resorption

    International Nuclear Information System (INIS)

    Li Wei; Rungsiyakull, Chaiy; Field, Clarice; Lin, Daniel; Zhang Leo; Li Qing; Swain, Michael

    2010-01-01

    Clinical and experimental studies showed that human bone has the ability to remodel itself to better adapt to its biomechanical environment by changing both its material properties and geometry. As a consequence of the rapid development and extensive applications of major dental restorations such as implantation and fixed partial denture (FPD), the effect of bone remodeling on the success of a dental restorative surgery is becoming critical for prosthetic design and pre-surgical assessment. This paper aims to provide a computational biomechanics framework to address dental bone's responses as a result of dental restoration. It explored three important issues of resorption, apposition and osseointegration in terms of remodeling simulation. The published remodeling data in long bones were regulated to drive the computational remodeling prediction for the dental bones by correlating the results to clinical data. It is anticipated that the study will provide a more predictive model of dental bone response and help develop a new design methodology for patient-specific dental prosthetic restoration.
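
    Neither record states the remodeling law used, but the strain-energy-driven rules from the long-bone literature that the authors regulate typically take the following form; this is a generic sketch with placeholder constants, not the authors' model:

```python
def remodel_step(rho, stimulus, dt=1.0, B=1.0, k_ref=0.004,
                 lazy=0.5, rho_min=0.01, rho_max=1.74):
    """One explicit time step of a strain-energy-driven remodeling rule
    (Huiskes-type): density grows by apposition when the stimulus per
    unit density exceeds a reference window, resorbs below it, and is
    unchanged inside the 'lazy zone'."""
    s = stimulus / rho                    # stimulus per unit bone mass
    lo, hi = (1 - lazy) * k_ref, (1 + lazy) * k_ref
    if s > hi:
        rho += dt * B * (s - hi)          # apposition
    elif s < lo:
        rho += dt * B * (s - lo)          # resorption
    return min(max(rho, rho_min), rho_max)

rho = 0.8                                 # g/cm^3, placeholder start density
for day in range(30):                     # an overloaded element densifies
    rho = remodel_step(rho, stimulus=0.006)
print(f"density after 30 steps: {rho:.2f} g/cm^3")
```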

  16. A three-dimensional computer code for the nonlinear dynamic response of an HTGR core

    International Nuclear Information System (INIS)

    Subudhi, M.; Lasker, L.; Koplik, B.; Curreri, J.; Goradia, H.

    1979-01-01

    A three-dimensional dynamic code has been developed to determine the nonlinear response of an HTGR core. The HTGR core consists of several thousand hexagonal core blocks. These are arranged in layers stacked together. Each layer contains many core blocks surrounded on their outer periphery by reflector blocks. The entire assembly is contained within a prestressed concrete reactor vessel. Gaps exist between adjacent blocks in any horizontal plane. Each core block in a given layer is connected to the blocks directly above and below it via three dowel pins. The present analytical study is directed towards an investigation of the nonlinear response of the reactor core blocks in the event of a seismic occurrence. The computer code is developed for a specific mathematical model which represents a vertical arrangement of layers of blocks. This comprises a 'block module' of core elements which would be obtained by cutting a cylindrical portion consisting of seven fuel blocks per layer. It is anticipated that a number of such modules properly arranged could represent the entire core. Hence, the predicted response of this module would exhibit the response characteristics of the core. (orig.)

  17. COMPUTING

    CERN Multimedia

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns led by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier1 and Tier2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking, the roles of a Computing Run Coordinator and regular computing shifts monitoring the services and infrastructure as well as interfacing to the data operations tasks are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  18. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format, and running the samples through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  19. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing operations have slowed as the Run 1 samples are completing and smaller samples for upgrades and preparations are ramping up. Much of the computing activity is focusing on preparations for Run 2 and improvements in data access and flexibility of using resources. Operations Office Data processing was slow in the second half of 2013 with only the legacy re-reconstruction pass of 2011 data being processed at the sites.   Figure 1: MC production and processing was more in demand with a peak of over 750 Million GEN-SIM events in a single month.   Figure 2: The transfer system worked reliably and efficiently and transferred on average close to 520 TB per week with peaks at close to 1.2 PB.   Figure 3: The volume of data moved between CMS sites in the last six months   The tape utilisation was a focus for the operation teams with frequent deletion campaigns from deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...

  20. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

      Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users, we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently.  Operations Office Figure 2: Number of events per month, for 2012 Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  1. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort was focused during the last period on the preparation and monitoring of the February tests of the Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, involving the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads, and identify possible bottlenecks. Many tests are scheduled together with other VOs, allowing a full-scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  2. COMPUTING

    CERN Multimedia

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences.  Operations Office Figure 6: Transfers from all sites in the last 90 days For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  3. Dose-response relationships using brain-computer interface technology impact stroke rehabilitation.

    Science.gov (United States)

    Young, Brittany M; Nigogosyan, Zack; Walton, Léo M; Remsik, Alexander; Song, Jie; Nair, Veena A; Tyler, Mitchell E; Edwards, Dorothy F; Caldera, Kristin; Sattin, Justin A; Williams, Justin C; Prabhakaran, Vivek

    2015-01-01

    Brain-computer interfaces (BCIs) are an emerging novel technology for stroke rehabilitation. Little is known about how dose-response relationships for BCI therapies affect brain and behavior changes. We report preliminary results on stroke patients (n = 16, 11 M) with persistent upper extremity motor impairment who received therapy using a BCI system with functional electrical stimulation of the hand and tongue stimulation. We collected MRI scans and behavioral data using the Action Research Arm Test (ARAT), 9-Hole Peg Test (9-HPT), and Stroke Impact Scale (SIS) before, during, and after the therapy period. Using anatomical and functional MRI, we computed Laterality Index (LI) for brain activity in the motor network during impaired hand finger tapping. Changes from baseline LI and behavioral scores were assessed for relationships with dose, intensity, and frequency of BCI therapy. We found that gains in SIS Strength were directly responsive to BCI therapy: therapy dose and intensity correlated positively with increased SIS Strength (p ≤ 0.05), although no direct relationships were identified with ARAT or 9-HPT scores. We found behavioral measures that were not directly sensitive to differences in BCI therapy administration but were associated with concurrent brain changes correlated with BCI therapy administration parameters: therapy dose and intensity showed significant (p ≤ 0.05) or trending (0.05 < p < 0.1) negative correlations with LI changes, while therapy frequency did not affect LI. Reductions in LI were then correlated (p ≤ 0.05) with increased SIS Activities of Daily Living scores and improved 9-HPT performance. Therefore, some behavioral changes may be reflected by brain changes sensitive to differences in BCI therapy administration, while others such as SIS Strength may be directly responsive to BCI therapy administration. Data preliminarily suggest that when using BCI in stroke rehabilitation, therapy frequency may be less important than dose and
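
    The Laterality Index used above is a standard normalized ratio of motor-network activity between the two hemispheres; sign conventions vary across studies, and the voxel counts below are invented for illustration:

```python
def laterality_index(n_ipsilesional, n_contralesional):
    """Laterality Index for motor-network activity: +1 means all
    activity in the ipsilesional hemisphere, -1 all contralesional.
    (One common sign convention; studies differ.)"""
    total = n_ipsilesional + n_contralesional
    return (n_ipsilesional - n_contralesional) / total if total else 0.0

# e.g. suprathreshold voxel counts during impaired-hand finger tapping
print(f"LI = {laterality_index(420, 610):.3f}")   # -> -0.184
```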

  4. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for the WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, is provided. A GlideInWMS installation with its components is now deployed at CERN, in addition to the GlideInWMS factory located in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each other's time zones by monitoring/debugging pilot jobs sent from the facto...

  5. Quantitative evaluation of photic driving response for computer-aided diagnosis

    Science.gov (United States)

    Fukami, Tadanori; Ishikawa, Fumito; Ishikawa, Bunnoshin; Saito, Yoichi

    2008-12-01

    The aim of our research is the quantification of the photic driving response, a routine electroencephalogram (EEG) examination, for computer-aided diagnosis. It is well known that the EEG responds not only to the fundamental frequency but also to all sub- and higher harmonics of a stimulus. In this study, we propose a method for detecting and evaluating responses in screening data for individuals. This method consists of two comparisons based on statistical tests. One is an intraindividual comparison between the EEG at rest and the photic stimulation (PS) response, reflecting enhancement and suppression by PS; the other is a comparison between data from an individual and a distribution of normals, reflecting the position of the individual's data in the distribution of normals in the normal database. These tests were evaluated using the Z-value based on the Mann-Whitney U-test. We measured EEGs from 130 normal subjects and 30 patients with schizophrenia, dementia or epilepsy. The normal data were divided into two groups, the first consisting of 100 records for database construction and the second of 30 records used as test data. Using our method, a prominent statistical peak of the Z-value was recognized even when the harmonics and the alpha band overlapped. Moreover, we found a statistical difference between patients and the normal database at diagnostically helpful frequencies such as the subharmonics, the fundamental wave, the higher harmonics and the alpha frequency band.
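
    The Z-values described can be reproduced from the Mann-Whitney U statistic through its normal approximation (valid without ties). A hypothetical sketch comparing band power at one stimulation frequency between rest and photic-stimulation epochs:

```python
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(1)
power_rest = rng.gamma(2.0, 1.0, size=40)      # band power, rest epochs (toy)
power_ps = rng.gamma(2.0, 1.4, size=40)        # band power, PS epochs (toy)

u, p = mannwhitneyu(power_ps, power_rest, alternative="two-sided")
n1, n2 = len(power_ps), len(power_rest)
mean_u = n1 * n2 / 2.0
sd_u = np.sqrt(n1 * n2 * (n1 + n2 + 1) / 12.0)  # no-ties approximation
z = (u - mean_u) / sd_u                          # signed: enhancement vs suppression
print(f"U = {u:.0f}, Z = {z:.2f}, p = {p:.3g}")
```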

  6. Seismic Response Prediction of Buildings with Base Isolation Using Advanced Soft Computing Approaches

    Directory of Open Access Journals (Sweden)

    Mosbeh R. Kaloop

    2017-01-01

    Modeling the response of structures under seismic loads is an important factor in civil engineering, as it crucially affects the design and management of structures, especially in high-risk areas. In this study, novel applications of advanced soft computing techniques are utilized for predicting the behavior of concentrically braced frame (CBF) buildings with a lead-rubber bearing (LRB) isolation system under ground-motion effects. These techniques include the least squares support vector machine (LSSVM), wavelet neural networks (WNN), and the adaptive neuro-fuzzy inference system (ANFIS), along with wavelet denoising. The simulation of a 2D frame model and eight ground motions are considered in this study to evaluate the prediction models. The comparison results indicate that the least squares support vector machine is superior to the other techniques in estimating the behavior of smart structures.
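
    In its simplest regression form, the least squares SVM is closely related to kernel ridge regression with an RBF kernel, so a rough stand-in for the first technique can be sketched with scikit-learn; the features and responses below are random placeholders, not the study's 2D-frame simulations:

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(2)
X = rng.normal(size=(400, 6))                      # placeholder ground-motion features
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=400)   # placeholder isolator response

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = KernelRidge(kernel="rbf", alpha=1e-2, gamma=0.5).fit(X_tr, y_tr)
print(f"R^2 on held-out motions: {r2_score(y_te, model.predict(X_te)):.2f}")
```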

  7. Topographical changes in photo-responsive liquid crystal films: a computational analysis.

    Science.gov (United States)

    Liu, Ling; Onck, Patrick R

    2018-03-28

    Switchable materials in response to external stimuli serve as building blocks to construct microscale functionalized actuators and sensors. Azobenzene-modified liquid crystal (LC) polymeric networks, that combine liquid crystalline orientational order and elasticity, reversibly undergo conformational changes powered by light. We present a computational framework to describe photo-induced topographical transformations of azobenzene-modified LC glassy polymer coatings. A nonlinear light penetration model is combined with an opto-mechanical constitutive relation to simulate various ordered and corrugated topographical textures resulting from aligned or randomly distributed LC molecule orientations. Our results shed light on the fundamental physical mechanisms of light-triggered surface undulations and can be used as guidelines to optimize surface modulation and roughness in emerging fields that involve haptics interfacing, friction control and wetting manipulation.

  8. Impulse-response analysis of planar computed tomography for nondestructive test

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Dae Cheon; Kim, Seung Ho; Kim, Ho Kyung [KAERI, Daejeon (Korea, Republic of)

    2016-05-15

    It has been reported that the use of radiation imaging such as digital radiography, computed tomography (CT), and digital tomosynthesis (DTS) for nondestructive testing (NDT) is spreading widely. These methods have merits and demerits of their own in terms of image quality and inspection speed; images for NDT should therefore have acceptable quality and be produced at high speed. In this study, we quantitatively evaluate the impulse responses of images reconstructed with filtered backprojection (FBP), which is most widely used in planar computed tomography (pCT) systems. We first evaluate image performance metrics related to contrast and depth resolution, and then design a figure of merit that includes image performance and system parameters such as tube load and reconstruction speed. The final goal of this study is the application of these methods to nondestructive testing, and further study is needed to accomplish it: first, the artifact spread function (ASF) for various numbers of views; second, the analysis of the modulation transfer function, noise power spectrum, and detective quantum efficiency for various angular ranges and numbers of views.
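
    Of the metrics listed for further study, the modulation transfer function follows directly from an impulse (point or line spread) response: it is the normalized magnitude of its Fourier transform. A generic 1-D sketch with a toy Gaussian spread function, not the authors' code:

```python
import numpy as np

pixel_pitch = 0.1                        # mm, placeholder detector pitch
x = np.arange(-64, 64) * pixel_pitch
lsf = np.exp(-0.5 * (x / 0.15) ** 2)     # toy line spread function

otf = np.fft.rfft(np.fft.ifftshift(lsf))            # center, then transform
mtf = np.abs(otf) / np.abs(otf[0])                   # normalize to 1 at DC
freq = np.fft.rfftfreq(lsf.size, d=pixel_pitch)      # cycles/mm

print(f"MTF at {freq[10]:.2f} cycles/mm: {mtf[10]:.2f}")
```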

  9. Computation of the response functions of spiral waves in active media.

    Science.gov (United States)

    Biktasheva, I V; Barkley, D; Biktashev, V N; Bordyugov, G V; Foulkes, A J

    2009-05-01

    Rotating spiral waves are a form of self-organization observed in spatially extended systems of physical, chemical, and biological natures. A small perturbation causes a gradual change in the spatial location of the spiral's rotation center and frequency, i.e., drift. The response functions (RFs) of a spiral wave are the eigenfunctions of the adjoint linearized operator corresponding to the critical eigenvalues λ = 0, ±iω. The RFs describe the spiral's sensitivity to small perturbations in the sense that a spiral is insensitive to small perturbations where its RFs are close to zero. The velocity of a spiral's drift is proportional to the convolution of the RFs with the perturbation. Here we develop a regular and generic method of computing the RFs of stationary rotating spirals in reaction-diffusion equations. We demonstrate the method on the FitzHugh-Nagumo system and also show convergence of the method with respect to the computational parameters, i.e., discretization steps and size of the medium. The obtained RFs are localized at the spiral's core.
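
    In symbols (schematic notation, following the abstract's statement rather than any one paper's conventions), the drift of the rotation centre R under a small perturbation h reads:

```latex
% drift velocity as a convolution of the response functions with the perturbation
\dot{\mathbf{R}}(t) \;\propto\; \int \mathbf{W}^{*}\bigl(\mathbf{r}-\mathbf{R}(t)\bigr)\, h(\mathbf{r},t)\,\mathrm{d}^{2}r
```

    where W denotes the translational response functions (the eigenfunctions associated with the eigenvalues λ = ±iω).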

  10. COMPUTING

    CERN Multimedia

    M. Kasemann

    CMS relies on a well functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental for site commissioning in order to increase availability of more sites such that they are available to participate in CSA07 and are ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned, it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a 4 times increase in throughput with respect to LCG Resource Broker is observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and to keep exercised all data transfer routes in the CMS PhE-DEx topology. Since mid-February, a transfer volume of about 12 P...

  11. Brushless DC motor control system responsive to control signals generated by a computer or the like

    Science.gov (United States)

    Packard, D. T.

    1985-04-01

    A control system for a brushless DC motor responsive to digital control signals is disclosed. The motor includes a multiphase wound stator and a permanent magnet rotor. The motor is arranged so that each phase winding, when energized from a DC source, will drive the rotor through a predetermined angular position or step. A commutation signal generator responsive to the shaft position provides a commutation signal for each winding. A programmable control signal generator such as a computer or microprocessor produces individual digital control signals for each phase winding. The control signals and commutation signals associated with each winding are applied to an AND gate for that phase winding. Each gate controls a switch connected in series with the associated phase winding and the DC source so that each phase winding is energized only when the commutation signal and the control signal associated with that phase winding are present. The motor shaft may be advanced one step at a time to a desired position by applying a predetermined number of control signals in the proper sequence to the AND gates, and the torque generated by the motor may be regulated by applying a separate control signal to each AND gate which is pulse width modulated to control the total time that each switch connects its associated winding to the DC source during each commutation period.

  12. Brushless DC motor control system responsive to control signals generated by a computer or the like

    Science.gov (United States)

    Packard, Douglas T. (Inventor); Schmitt, Donald E. (Inventor)

    1987-01-01

    A control system for a brushless DC motor responsive to digital control signals is disclosed. The motor includes a multiphase wound stator and a permanent magnet rotor. The rotor is arranged so that each phase winding, when energized from a DC source, will drive the rotor through a predetermined angular position or step. A commutation signal generator responsive to the shaft position provides a commutation signal for each winding. A programmable control signal generator such as a computer or microprocessor produces individual digital control signals for each phase winding. The control signals and commutation signals associated with each winding are applied to an AND gate for that phase winding. Each gate controls a switch connected in series with the associated phase winding and the DC source so that each phase winding is energized only when the commutation signal and the control signal associated with that phase winding are present. The motor shaft may be advanced one step at a time to a desired position by applying a predetermined number of control signals in the proper sequence to the AND gates and the torque generated by the motor may be regulated by applying a separate control signal to each AND gate which is pulse width modulated to control the total time that each switch connects its associated winding to the DC source during each commutation period.
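
    The gating logic both patent records describe is compact enough to state directly: a winding is energized only while its commutation signal and its (pulse-width-modulated) control signal are simultaneously asserted. A minimal truth-table sketch, not the patent's circuit:

```python
def winding_drives(commutation, control):
    """AND gate per phase: energize a winding only when both its
    commutation signal (derived from shaft position) and its
    computer-issued control signal are present."""
    return [c and g for c, g in zip(commutation, control)]

# three-phase example: shaft position enables phase B, while the
# computer is pulse-width modulating phases B and C
print(winding_drives([False, True, False], [False, True, True]))
# -> [False, True, False]
```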

  13. Computational Model of Antidepressant Response Heterogeneity as Multi-pathway Neuroadaptation

    Directory of Open Access Journals (Sweden)

    Mariam B. Camacho

    2017-12-01

    Current hypotheses cannot fully explain the clinically observed heterogeneity in antidepressant response. The therapeutic latency of antidepressants suggests that therapeutic outcomes are achieved not by the acute effects of the drugs, but rather by the homeostatic changes that occur as the brain adapts to their chronic administration. We present a computational model that represents the known interactions between the monoaminergic neurotransmitter-producing brain regions and associated non-monoaminergic neurotransmitter systems, and use the model to explore the possible ways in which the brain can homeostatically adjust to chronic antidepressant administration. The model also represents the neuron-specific neurotransmitter receptors that are known to adjust their strengths (expressions or sensitivities) in response to chronic antidepressant administration, and neuroadaptation in the model occurs through sequential adjustments in these receptor strengths. The main result is that the model can reach similar levels of adaptation to chronic administration of the same antidepressant drug or combination along many different pathways, arriving correspondingly at many different receptor strength configurations, but not all of those adapted configurations are also associated with therapeutic elevations in monoamine levels. When expressed as the percentage of adapted configurations that are also associated with elevations in one or more of the monoamines, our modeling results largely agree with the percentage efficacy rates of antidepressants and antidepressant combinations observed in clinical trials. Our neuroadaptation model provides an explanation for the clinical reports of heterogeneous outcomes among patients chronically administered the same antidepressant drug regimen.

  14. Computer Simulation as a Tool for Assessing Decision-Making in Pandemic Influenza Response Training

    Directory of Open Access Journals (Sweden)

    James M Leaming

    2013-05-01

    Introduction: We sought to develop and test a computer-based, interactive simulation of a hypothetical pandemic influenza outbreak. Fidelity was enhanced with integrated video and branching decision trees, built upon the 2007 federal planning assumptions. We conducted a before-and-after study of the simulation's effectiveness in assessing participants' beliefs regarding their own hospitals' mass casualty incident preparedness. Methods: Development: Using a Delphi process, we finalized a simulation that serves up over 50 key decisions to 6 role-players on networked laptops in a conference area. The simulation played out an 8-week scenario, beginning with pre-incident decisions. Testing: Role-players and trainees (N=155) were facilitated to make decisions during the pandemic. Because decision responses vary, the simulation plays out differently, and a casualty counter quantifies hypothetical losses. The facilitator reviews and critiques key factors for casualty control, including effective communications, working with external organizations, development of internal policies and procedures, maintaining supplies and services, technical infrastructure support, public relations and training. Pre- and post-survey data were compared for trainees. Results: Post-simulation, trainees indicated a greater likelihood of needing to improve their organization in terms of communications, mass casualty incident planning, public information and training. Participants also recognized which key factors required immediate attention at their own home facilities. Conclusion: The use of a computer simulation was effective in providing a facilitated environment for determining the perception of preparedness, evaluating general preparedness concepts and introducing participants to critical decisions involved in handling a regional pandemic influenza surge. [West J Emerg Med. 2013;14(3):236–242.]

  15. Computational Models of the Cardiovascular System and Its Response to Microgravity

    Science.gov (United States)

    Kamm, Roger D.

    1999-01-01

    Computational models of the cardiovascular system are powerful adjuncts to ground-based and in-flight experiments. We will provide NSBRI with a model capable of simulating the short-term effects of gravity on cardiovascular function. The model from this project will: (1) provide a rational framework which quantitatively defines interactions among complex cardiovascular parameters and which supports the critical interpretation of experimental results and testing of hypotheses; and (2) permit predictions of the impact of specific countermeasures in the context of various hypothetical cardiovascular abnormalities induced by microgravity. Major progress has been made during the first 18 months of the program: (1) We have developed an operational first-order computer model capable of simulating the cardiovascular response to orthostatic stress. The model consists of a lumped parameter hemodynamic model and a complete reflex control system. The latter includes cardiopulmonary and carotid sinus reflex limbs and interactions between the two. (2) We have modeled the physiologic stress of tilt table experiments and lower body negative pressure (LBNP) procedures. We have verified our model's predictions by comparing them with experimental findings from the literature. (3) We have established collaborative efforts with leading investigators interested in experimental studies of orthostatic intolerance, cardiovascular control, and physiologic responses to space flight. (4) We have established a standardized method of transferring data to our laboratory from the ongoing NSBRI bedrest studies. We use these data to estimate input parameters to our model and compare our model predictions to actual data to further verify our model. (5) We are in the process of systematically simulating current hypotheses concerning the mechanism underlying orthostatic intolerance by matching our simulations to stand test data from astronauts pre- and post-flight. (6) We are in the process of developing a
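
    A minimal illustration of the lumped-parameter idea mentioned in this record is sketched below in Python: a two-element Windkessel, in which arterial compliance and peripheral resistance shape the pressure response to pulsatile inflow. The parameter values and the inflow waveform are illustrative assumptions, not values from the NSBRI model.

    import numpy as np
    from scipy.integrate import solve_ivp

    R = 1.0   # peripheral resistance (mmHg*s/mL), assumed value
    C = 1.5   # arterial compliance (mL/mmHg), assumed value

    def inflow(t):
        # Crude pulsatile cardiac inflow (mL/s): half-sine ejection in a 1 s cycle.
        phase = t % 1.0
        return 300.0 * np.sin(np.pi * phase / 0.3) if phase < 0.3 else 0.0

    def dP_dt(t, P):
        # Two-element Windkessel: C * dP/dt = Q_in(t) - P / R
        return [(inflow(t) - P[0] / R) / C]

    sol = solve_ivp(dP_dt, (0.0, 10.0), [80.0], max_step=0.005)
    print(f"arterial pressure range: {sol.y[0].min():.1f}-{sol.y[0].max():.1f} mmHg")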

  16. Cloud Computing for Science Data Processing in Support of Emergency Response

    Data.gov (United States)

    National Aeronautics and Space Administration — Cloud computing enables users to create virtual computers, each one with the optimal configuration of hardware and software for a job. The number of virtual...

  17. A Multiscale Computational Model of the Response of Swine Epidermis After Acute Irradiation

    Science.gov (United States)

    Hu, Shaowen; Cucinotta, Francis A.

    2012-01-01

    Radiation exposure from Solar Particle Events can lead to very high skin dose for astronauts on exploration missions outside the protection of the Earth's magnetic field [1]. The detrimental effects to human skin under such adverse conditions could be predicted by conducting terrestrial experiments on animal models. In this study we apply a computational approach to simulate the experimental data of the radiation response of swine epidermis, which is closely similar to human epidermis [2]. Incorporating experimentally measured histological and cell kinetic parameters into a multiscale tissue modeling framework, we obtain results of population kinetics and proliferation index comparable to unirradiated and acutely irradiated swine experiments [3]. It is noted that the basal cell doubling time is 10 to 16 days in the intact population, but drops to 13.6 hr in the regenerating populations surviving irradiation. This roughly 30-fold variation is proposed to be attributable to the shortening of the G1 phase duration. We investigate this radiation-induced effect by considering at the sub-cellular level the expression and signaling of TGF-beta, as it is recognized as a key regulatory factor of tissue formation and wound healing [4]. This integrated model will allow us to test the validity of various basic biological rules at the cellular level and sub-cellular mechanisms by qualitatively comparing simulation results with published research, and should lead to a fuller understanding of the pathophysiological effects of ionizing radiation on the skin.

  18. Correlation of uptake patterns on single-photon emission computed tomography/computed tomography (SPECT/CT)and treatment response in patients with knee pain

    Energy Technology Data Exchange (ETDEWEB)

    Koh, Geon; Hwang, Kyung Hoon; Lee, Hae Jin; Kim, Seog Gyun; Lee, Beom Koo [Gachon University Gil Hospital, Incheon (Korea, Republic of)

    2016-06-15

    To determine whether treatment response in patients with knee pain could be predicted using uptake patterns on single-photon emission computed tomography/computed tomography (SPECT/CT) images. Ninety-five patients with knee pain who had undergone SPECT/CT were included in this retrospective study. Subjects were divided into three groups: increased focal tracer uptake (FTU), increased irregular tracer uptake (ITU), and no tracer uptake (NTU). A numeric rating scale (NRS-11) assessed pain intensity. We analyzed the association between uptake patterns and treatment response using Pearson's chi-square test and Fisher's exact test. Uptake was quantified from SPECT/CT with region of interest (ROI) counting, and an intraclass correlation coefficient (ICC) was calculated to assess agreement. We used Student's t-test to test for statistically significant differences in counts between groups and the Pearson correlation to measure the relationship between counts and initial NRS-11. Multivariate logistic regression analysis determined which variables were significantly associated with uptake. The FTU group included 32 patients; ITU, 39; and NTU, 24. With conservative management, 64% of patients with increased tracer uptake (TU, both focal and irregular) and 36% with NTU showed positive response. Conservative treatment response of FTU was better than that of NTU, but did not differ from that of ITU. Conservative treatment response of TU was significantly different from that of NTU (OR 3.1; p = 0.036). Moderate positive correlation was observed between ITU and initial NRS-11. Age and initial NRS-11 significantly predicted uptake. Patients with uptake in their knee(s) on SPECT/CT showed positive treatment response under conservative treatment.
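
    As a worked illustration of the association test reported above, the Python sketch below runs Pearson's chi-square and Fisher's exact test on a 2x2 uptake-by-response table. The counts are hypothetical stand-ins, not the study's data.

    import numpy as np
    from scipy.stats import chi2_contingency, fisher_exact

    # Rows: increased tracer uptake (TU), no tracer uptake (NTU);
    # columns: positive response, no response. Counts are hypothetical.
    table = np.array([[45, 26],
                      [9, 15]])

    chi2, p, dof, expected = chi2_contingency(table)
    odds_ratio, p_exact = fisher_exact(table)
    print(f"chi2 = {chi2:.2f}, p = {p:.3f}; OR = {odds_ratio:.2f} (Fisher p = {p_exact:.3f})")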

  19. Social Skills Instruction for Urban Learners with Emotional and Behavioral Disorders: A Culturally Responsive and Computer-Based Intervention

    Science.gov (United States)

    Robinson-Ervin, Porsha; Cartledge, Gwendolyn; Musti-Rao, Shobana; Gibson, Lenwood, Jr.; Keyes, Starr E.

    2016-01-01

    This study examined the effects of culturally relevant/responsive, computer-based social skills instruction on the social skill acquisition and generalization of 6 urban African American sixth graders with emotional and behavioral disorders (EBD). A multiple-probe across participants design was used to evaluate the effects of the social skills…

  20. Becoming Technosocial Change Agents: Intersectionality and Culturally Responsive Pedagogies as Vital Resources for Increasing Girls' Participation in Computing

    Science.gov (United States)

    Ashcraft, Catherine; Eger, Elizabeth K.; Scott, Kimberly A.

    2017-01-01

    Drawing from our two-year ethnography, we juxtapose the experiences of two cohorts in one culturally responsive computing program, examining how the program fostered girls' emerging identities as technosocial change agents. In presenting this in-depth and up-close exploration, we simultaneously identify conditions that both facilitated and limited…

  1. RESPONSE FUNCTIONS FOR COMPUTING ABSORBED DOSE TO SKELETAL TISSUES FROM NEUTRON IRRADIATION

    Science.gov (United States)

    Bahadori, Amir A.; Johnson, Perry; Jokisch, Derek W.; Eckerman, Keith F.; Bolch, Wesley E.

    2016-01-01

    Spongiosa in the adult human skeleton consists of three tissues - active marrow (AM), inactive marrow (IM), and trabecularized mineral bone (TB). Active marrow is considered to be the target tissue for assessment of both long-term leukemia risk and acute marrow toxicity following radiation exposure. The total shallow marrow (TM50), defined as all tissues lying within the first 50 μm of the bone surfaces, is considered to be the radiation target tissue of relevance for radiogenic bone cancer induction. For irradiation by sources external to the body, kerma to homogeneous spongiosa has been used as a surrogate for absorbed dose to both of these tissues, as direct dose calculations are not possible using computational phantoms with homogenized spongiosa. Recent microCT imaging of a 40-year-old male cadaver has allowed for the accurate modeling of the fine microscopic structure of spongiosa in many regions of the adult skeleton [Hough et al PMB (2011)]. This microstructure, along with associated masses and tissue compositions, was used to compute specific absorbed fraction (SAF) values for protons originating in axial and appendicular bone sites [Jokisch et al PMB (submitted)]. These proton SAFs, bone masses, tissue compositions, and proton production cross-sections were subsequently used to construct neutron dose response functions (DRFs) for both AM and TM50 targets in each bone of the reference adult male. Kerma conditions were assumed for other resultant charged particles. For comparison, active marrow, total shallow marrow, and spongiosa kerma coefficients were also calculated. At low incident neutron energies, AM kerma coefficients for neutrons correlate well with values of the AM DRF, while total marrow (TM) kerma coefficients correlate well with values of the TM50 DRF. At high incident neutron energies, all kerma coefficients and DRFs tend to converge as charged particle equilibrium (CPE) is established across the bone site. In the range of 10 eV to 100 MeV

  2. Computation method for available response time due to tripping at minimum foot clearance.

    Science.gov (United States)

    Nagano, H; Begg, R; Sparrow, W A

    2013-01-01

    Falls prevention is important for older individuals to maintain healthy lifestyles and is an essential challenge in sustaining the socioeconomic structure of many advanced nations. Tripping has been recognized as the largest cause of falls and accordingly, understanding tripping-induced anterior balance loss is necessary in reducing the overall frequency of falls among older adults. Hazardous anterior balance loss due to tripping can be attributed to the mid-swing phase event, minimum foot clearance (MFC). The mechanism of tripping-induced anterior balance loss can be described as anterior movement of the center of mass (CoM) passing the frontal boundary of the supporting base between the swing and stance toes. The first aim of the current study was to establish a computational method for determining available response time (ART) to anterior balance loss due to tripping at MFC, in other words, the time taken for the CoM to reach the anterior boundary and therefore the time limit for balance recovery. Kinematic information of the CoM and both toes, in addition to simulated impact force due to tripping at MFC, was used to estimate ART. The second aim was to apply correlation analysis to a range of gait parameters to identify the factors influencing ART. ART for balance loss in the forward direction due to tripping was on average 0.11 s for both the dominant and non-dominant limbs' simulated tripping at MFC. Correlation analysis revealed five factors at MFC that prolong ART, including: 1) greater fore-aft distance from CoM to stance toe, 2) greater sideway distance from CoM to swing toe, 3) longer distance from CoM to the frontal boundary of the supporting base, 4) slower CoM forward velocity and 5) slower horizontal toe velocity. The established ART computation method can be utilized to examine the effects of ageing and various gait tasks on the likelihood of tripping-induced anterior balance loss and associated falls.
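
    The core of the computation lends itself to a one-line kinematic estimate, sketched below in Python under the simplifying assumption of constant CoM forward velocity at the MFC instant; the hypothetical distance and velocity are chosen only to reproduce an ART near the reported 0.11 s average.

    def available_response_time(com_x, boundary_x, com_vx):
        # Time (s) for the CoM, at forward position com_x (m) and forward
        # velocity com_vx (m/s), to reach the anterior boundary boundary_x (m).
        if com_vx <= 0.0:
            return float("inf")   # CoM not moving forward: no anterior balance loss
        return (boundary_x - com_x) / com_vx

    # Hypothetical MFC-instant values: CoM 0.14 m behind the boundary,
    # moving forward at 1.3 m/s.
    print(f"ART = {available_response_time(0.0, 0.14, 1.3):.2f} s")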

  3. Positron emission tomography/computed tomography and biomarkers for early treatment response evaluation in metastatic colon cancer

    DEFF Research Database (Denmark)

    Engelmann, Bodil E.; Loft, Annika; Kjær, Andreas

    2014-01-01

    BACKGROUND: Treatment options for metastatic colon cancer (mCC) are widening. We prospectively evaluated serial 2-deoxy-2-[18F]fluoro-d-glucose positron-emission tomography/computed tomography (PET/CT) and measurements of tissue inhibitor of metalloproteinases-1 (TIMP-1), carcinoembryonic antigen...... evaluated by PET/CT before treatment, after one and four treatment series. Morphological and metabolic response was independently assessed according to Response Evaluation Criteria in Solid Tumors and European Organization for Research and Treatment of Cancer PET criteria. Plasma TIMP-1, plasma u...

  4. Novel application of quantitative single-photon emission computed-tomography/computed tomography to predict early response to methimazole in Graves' disease

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hyun Joo; Bang, Ji In; Kim, Ji Young; Moon, Jae Hoon [Seoul National University Bundang Hospital, Seoul National University College of Medicine, Seongnam (Korea, Republic of); So, Young [Dept. of Nuclear Medicine, Konkuk University Medical Center, Seoul (Korea, Republic of); Lee, Won Woo [Institute of Radiation Medicine, Medical Research Center, Seoul National University, Seoul (Korea, Republic of)

    2017-06-15

    Since Graves' disease (GD) is resistant to antithyroid drugs (ATDs), an accurate quantitative thyroid function measurement is required for the prediction of early responses to ATD. Quantitative parameters derived from the novel technology, single-photon emission computed tomography/computed tomography (SPECT/CT), were investigated for the prediction of achievement of euthyroidism after methimazole (MMI) treatment in GD. A total of 36 GD patients (10 males, 26 females; mean age, 45.3 ± 13.8 years) were enrolled for this study, from April 2015 to January 2016. They underwent quantitative thyroid SPECT/CT 20 minutes post-injection of 99mTc-pertechnetate (5 mCi). Association between the time to biochemical euthyroidism after MMI treatment and uptake, standardized uptake value (SUV), functional thyroid mass (SUVmean × thyroid volume) from the SPECT/CT, and clinical/biochemical variables, was investigated. GD patients had a significantly greater %uptake (6.9 ± 6.4%) than historical control euthyroid patients (n = 20, 0.8 ± 0.5%, p < 0.001) from the same quantitative SPECT/CT protocol. Euthyroidism was achieved in 14 patients at 156 ± 62 days post-MMI treatment, but 22 patients had still not achieved euthyroidism by the last follow-up time-point (208 ± 80 days). In the univariate Cox regression analysis, the initial MMI dose (p = 0.014), %uptake (p = 0.015), and functional thyroid mass (p = 0.016) were significant predictors of euthyroidism in response to MMI treatment. However, only %uptake remained significant in a multivariate Cox regression analysis (p = 0.034). An uptake cutoff of 5.0% dichotomized the faster-responding from the slower-responding GD patients (p = 0.006). A novel parameter of thyroid uptake from quantitative SPECT/CT is a predictive indicator of an early response to MMI in GD patients.

  5. Evaluation of linearity of response and angular dependence of an ionization chamber for dosimetry in computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Perini, Ana P.; Neves, Lucio P.; Xavier, Marcos; Caldas, Linda V.E., E-mail: mxavier@ipen.b, E-mail: lcaldas@ipen.b [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil); Khoury, Helen J., E-mail: khoury@ufpe.b [Universidade Federal de Pernambuco (UFPE), Recife, PE (Brazil). Dept. de Energia Nuclear

    2011-07-01

    In this paper a pencil-type ionization chamber designed and manufactured at the Instituto de Pesquisas Energeticas e Nucleares was evaluated for dosimetric applications in computed tomography beams. To evaluate the performance of this chamber, two tests were undertaken: linearity of response and angular dependence. The results of both tests were good, within the international recommendations. Moreover, this homemade ionization chamber is easy to manufacture, inexpensive, and efficient. (author)

  6. Error processing and response inhibition in excessive computer game players: an event-related potential study

    NARCIS (Netherlands)

    Littel, M.; Berg, I. van den; Luijten, M.; Rooij, A.J. van; Keemink, L.; Franken, I.H.A.

    2012-01-01

    Excessive computer gaming has recently been proposed as a possible pathological illness. However, research on this topic is still in its infancy and underlying neurobiological mechanisms have not yet been identified. The determination of underlying mechanisms of excessive gaming might be useful for

  7. Computation-Guided Design of a Stimulus-Responsive Multienzyme Supramolecular Assembly.

    Science.gov (United States)

    Yang, Lu; Dolan, Elliott M; Tan, Sophia K; Lin, Tianyun; Sontag, Eduardo D; Khare, Sagar D

    2017-10-18

    The construction of stimulus-responsive supramolecular complexes of metabolic pathway enzymes, inspired by natural multienzyme assemblies (metabolons), provides an attractive avenue for efficient and spatiotemporally controllable one-pot biotransformations. We have constructed a phosphorylation- and optically responsive metabolon for the biodegradation of the environmental pollutant 1,2,3-trichloropropane. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  8. Computational micromechanics analysis of electron hopping and interfacial damage induced piezoresistive response in carbon nanotube-polymer nanocomposites

    International Nuclear Information System (INIS)

    Chaurasia, A K; Seidel, G D; Ren, X

    2014-01-01

    Carbon nanotube (CNT)-polymer nanocomposites have been observed to exhibit an effective macroscale piezoresistive response, i.e., change in macroscale resistivity when subjected to applied deformation. The macroscale piezoresistive response of CNT-polymer nanocomposites leads to deformation/strain sensing capabilities. It is believed that the nanoscale phenomenon of electron hopping is the major driving force behind the observed macroscale piezoresistivity of such nanocomposites. Additionally, CNT-polymer nanocomposites provide damage sensing capabilities because electron hopping pathways at the nanoscale change locally as damage initiates and evolves. The primary focus of the current work is to explore the effect of interfacial separation and damage at the nanoscale CNT-polymer interface on the effective macroscale piezoresistive response. Interfacial separation and damage are allowed to evolve at the CNT-polymer interface through coupled electromechanical cohesive zones, within a finite element based computational micromechanics framework, resulting in electron hopping based current density across the separated CNT-polymer interface. The macroscale effective material properties and gauge factors are evaluated using micromechanics techniques based on electrostatic energy equivalence. The impact of the electron hopping mechanism, nanoscale interface separation and damage evolution on the effective nanocomposite electrostatic and piezoresistive response is studied in comparison with the perfectly bonded interface. The effective electrostatic/piezoresistive response for the perfectly bonded interface is obtained based on a computational micromechanics model developed in the authors' earlier work. It is observed that the macroscale effective gauge factors are highly sensitive to strain-induced formation/disruption of electron hopping pathways, interface separation and the initiation/evolution of interfacial damage. (paper)

  9. Computer programs for calculation of matrix stability and frequency response from a state-space system description

    Science.gov (United States)

    Seidel, R. C.

    1974-01-01

    FORTRAN computer subroutines stemming from requirements to process state variable system equations for systems of high order are presented. They find the characteristic equation of a matrix using the method of Danilevsky, count the number of roots with positive real parts using the Routh-Hurwitz alternate formulation, convert a state variable system description to a Laplace transfer function using the method of Bollinger, and evaluate that transfer function to obtain its frequency response. A sample problem is presented to demonstrate use of the subroutines.
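
    A present-day Python equivalent of this workflow, sketched below with NumPy/SciPy, checks stability from the eigenvalues of the state matrix (in place of the Danilevsky and Routh-Hurwitz steps), converts the state-space description to a transfer function, and evaluates its frequency response. The system matrices are an arbitrary illustrative example.

    import numpy as np
    from scipy import signal

    # Illustrative second-order system in state-space form.
    A = np.array([[0.0, 1.0], [-4.0, -0.4]])
    B = np.array([[0.0], [1.0]])
    C = np.array([[1.0, 0.0]])
    D = np.array([[0.0]])

    eigs = np.linalg.eigvals(A)
    print("roots with positive real parts:", int(np.sum(eigs.real > 0)))

    num, den = signal.ss2tf(A, B, C, D)          # Laplace transfer function
    print("characteristic polynomial coefficients:", np.round(den, 3))

    w, H = signal.freqresp((A, B, C, D), n=500)  # frequency response
    peak = np.argmax(np.abs(H))
    print(f"peak gain {np.abs(H)[peak]:.2f} at {w[peak]:.2f} rad/s")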

  10. How the Interval between Prime and Boost Injection Affects the Immune Response in a Computational Model of the Immune System

    Directory of Open Access Journals (Sweden)

    F. Castiglione

    2012-01-01

    Full Text Available The immune system is able to respond more vigorously to the second contact with a given antigen than to the first contact. Vaccination protocols generally include at least two doses, in order to obtain high antibody titers. We want to analyze the relation between the time elapsed from the first dose (priming) to the second dose (boost) and the antibody titers. In this paper, we couple in vivo experiments with computer simulations to assess the effect of delaying the second injection. We observe that an interval of several weeks between the prime and the boost is necessary to obtain optimal antibody responses.

  11. Computed tomographic detection of sinusitis responsible for intracranial and extracranial infections

    Energy Technology Data Exchange (ETDEWEB)

    Carter, B.L.; Bankoff, M.S.; Fisk, J.D.

    1983-06-01

    Computed tomography (CT) is now used extensively for the evaluation of orbital, facial, and intracranial infections. Nine patients are presented to illustrate the importance of detecting underlying and unsuspected sinusitis. Prompt treatment of the sinusitis is essential to minimize the morbidity and mortality associated with complications such as brain abscess, meningitis, orbital cellulitis, and osteomyelitis. A review of the literature documents the persistence of these complications despite the widespread use of antibiotic therapy. Recognition of the underlying sinusitis is now possible with CT if the region of the sinuses is included and bone-window settings are used during the examination of patients with orbital and intracranial infection.

  12. Response Surface Modeling of Combined-Cycle Propulsion Components using Computational Fluid Dynamics

    Science.gov (United States)

    Steffen, C. J., Jr.

    2002-01-01

    Three examples of response surface modeling with CFD are presented for combined cycle propulsion components. The examples include a mixed-compression inlet during hypersonic flight, a hydrogen-fueled scramjet combustor during hypersonic flight, and a ducted-rocket nozzle during all-rocket flight. Three different experimental strategies were examined, including full factorial, fractionated central-composite, and D-optimal with embedded Plackett-Burman designs. The response variables have been confined to integral data extracted from multidimensional CFD results. Careful attention has been paid to uncertainty assessment and modeling bias. The importance of automating the experimental setup and of effectively communicating statistical results is emphasized.
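
    The response-surface step itself is simple to illustrate. The Python sketch below fits a quadratic surface to responses at the points of a 3x3 full factorial design by least squares; the "CFD" responses here are synthetic stand-ins for integral quantities extracted from runs.

    import numpy as np

    rng = np.random.default_rng(0)

    # 3x3 full factorial design in two coded factors on [-1, 1].
    x1, x2 = np.meshgrid([-1.0, 0.0, 1.0], [-1.0, 0.0, 1.0])
    x1, x2 = x1.ravel(), x2.ravel()

    # Synthetic stand-in for an integral response extracted from CFD runs.
    y = (2.0 + 0.5 * x1 - 0.3 * x2 + 0.2 * x1 * x2 - 0.1 * x1**2
         + rng.normal(0.0, 0.01, x1.size))

    # Quadratic surface: y ~ b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2
    X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
    coef = np.linalg.lstsq(X, y, rcond=None)[0]
    print("fitted coefficients:", np.round(coef, 3))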

  13. Emergency Response Equipment and Related Training: Airborne Radiological Computer System (Model II)

    Energy Technology Data Exchange (ETDEWEB)

    David P. Colton

    2007-02-28

    The materials included in the Airborne Radiological Computer System, Model-II (ARCS-II) were assembled with several considerations in mind. First, the system was designed to measure and record the airborne gamma radiation levels and the corresponding latitude and longitude coordinates, and to provide a first overview of the extent and severity of an accident's impact. Second, the portable system had to be light enough and durable enough that it could be mounted in an aircraft, ground vehicle, or watercraft. Third, the system must control the collection and storage of the data, as well as provide a real-time display of the data collection results to the operator. The notebook computer and color graphics printer components of the system would only be used for analyzing and plotting the data. In essence, the provided equipment is composed of an acquisition system and an analysis system. The data can be transferred from the acquisition system to the analysis system at the end of the data collection or at some other agreeable time.

  14. Emergency Response Equipment and Related Training: Airborne Radiological Computer System (Model II) user's manual

    International Nuclear Information System (INIS)

    David P. Colton

    2007-01-01

    The materials included in the Airborne Radiological Computer System, Model-II (ARCS-II) were assembled with several considerations in mind. First, the system was designed to measure and record the airborne gamma radiation levels and the corresponding latitude and longitude coordinates, and to provide a first overview of the extent and severity of an accident's impact. Second, the portable system had to be light enough and durable enough that it could be mounted in an aircraft, ground vehicle, or watercraft. Third, the system must control the collection and storage of the data, as well as provide a real-time display of the data collection results to the operator. The notebook computer and color graphics printer components of the system would only be used for analyzing and plotting the data. In essence, the provided equipment is composed of an acquisition system and an analysis system. The data can be transferred from the acquisition system to the analysis system at the end of the data collection or at some other agreeable time.

  15. Evaluation of response variables in computer-simulated virtual cataract surgery

    Science.gov (United States)

    Söderberg, Per G.; Laurell, Carl-Gustaf; Simawi, Wamidh; Nordqvist, Per; Skarman, Eva; Nordh, Leif

    2006-02-01

    We have developed a virtual reality (VR) simulator for phacoemulsification (phaco) surgery. The current work aimed at evaluating the precision in the estimation of response variables identified for measurement of the performance of VR phaco surgery. We identified 31 response variables measuring: the overall procedure, the foot pedal technique, the phacoemulsification technique, erroneous manipulation, and damage to ocular structures. In total, 8 medical or optometry students with a good knowledge of ocular anatomy and physiology but naive to cataract surgery performed three sessions each of VR phaco surgery. For measurement, the surgical procedure was divided into a sculpting phase and an evacuation phase. The 31 response variables were measured for each phase in all three sessions. The variance components for individuals and for iterations of sessions within individuals were estimated with an analysis of variance assuming a hierarchical model. The consequences of the estimated variabilities for sample size requirements were determined. It was found that generally there was more variability for iterated sessions within individuals for measurements of the sculpting phase than for measurements of the evacuation phase. This resulted in larger required sample sizes for detecting differences between independent groups, or change within a group, for the sculpting phase as compared to the evacuation phase. It is concluded that several of the identified response variables can be measured with sufficient precision for evaluation of VR phaco surgery.

  16. High Efficiency Computation of the Variances of Structural Evolutionary Random Responses

    Directory of Open Access Journals (Sweden)

    J.H. Lin

    2000-01-01

    Full Text Available For structures subjected to stationary or evolutionary white/colored random noise, their various response variances satisfy algebraic or differential Lyapunov equations. The solution of these Lyapunov equations used to be very difficult. A precise integration method is proposed in the present paper, which solves such Lyapunov equations accurately and very efficiently.
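
    For the stationary white-noise case described above, the response covariance solves an algebraic Lyapunov equation, and the Python sketch below solves one directly with SciPy. The matrices are an illustrative single-degree-of-freedom example, and SciPy's solver stands in for the paper's precise integration method.

    import numpy as np
    from scipy.linalg import solve_continuous_lyapunov

    # Stable structural system dx = A x dt + B dW driven by white noise.
    A = np.array([[0.0, 1.0], [-10.0, -0.8]])
    B = np.array([[0.0], [1.0]])

    # Stationary covariance P solves A P + P A^T + B B^T = 0.
    P = solve_continuous_lyapunov(A, -B @ B.T)
    print("stationary response variances:", np.diag(P))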

  17. Computing level-impulse responses of log-specified VAR systems

    NARCIS (Netherlands)

    Wieringa, J.E.; Horvath, C.

    2005-01-01

    Impulse response functions (IRFs) are often used to analyze the dynamic behavior of a vector autoregressive (VAR) system. In many applications of VAR modelling, the variables are log-transformed before the model is estimated. If this is the case, the results of the IRFs do not have a direct
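
    As a hedged illustration of the setting, the Python sketch below fits a VAR to simulated log-transformed data with statsmodels and computes its impulse responses; these IRFs are on the log (approximately percentage) scale, which is exactly why translating them into level responses requires care. The data-generating process is invented for the example.

    import numpy as np
    from statsmodels.tsa.api import VAR

    rng = np.random.default_rng(1)
    A = np.array([[0.5, 0.1], [0.0, 0.3]])   # stable VAR(1) in logs
    y = np.zeros((300, 2))
    for t in range(1, 300):
        y[t] = A @ y[t - 1] + rng.normal(0.0, 0.05, 2)

    fit = VAR(y).fit(maxlags=2, ic="aic")
    irf = fit.irf(10)                         # IRFs on the log scale, horizon 10
    print("orthogonalized IRF array shape:", irf.orth_irfs.shape)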

  18. Combining Human Computing and Machine Learning to Make Sense of Big (Aerial) Data for Disaster Response.

    Science.gov (United States)

    Ofli, Ferda; Meier, Patrick; Imran, Muhammad; Castillo, Carlos; Tuia, Devis; Rey, Nicolas; Briant, Julien; Millet, Pauline; Reinhard, Friedrich; Parkan, Matthew; Joost, Stéphane

    2016-03-01

    Aerial imagery captured via unmanned aerial vehicles (UAVs) is playing an increasingly important role in disaster response. Unlike satellite imagery, aerial imagery can be captured and processed within hours rather than days. In addition, the spatial resolution of aerial imagery is an order of magnitude higher than the imagery produced by the most sophisticated commercial satellites today. Both the United States Federal Emergency Management Agency (FEMA) and the European Commission's Joint Research Center (JRC) have noted that aerial imagery will inevitably present a big data challenge. The purpose of this article is to get ahead of this future challenge by proposing a hybrid crowdsourcing and real-time machine learning solution to rapidly process large volumes of aerial data for disaster response in a time-sensitive manner. Crowdsourcing can be used to annotate features of interest in aerial images (such as damaged shelters and roads blocked by debris). These human-annotated features can then be used to train a supervised machine learning system to learn to recognize such features in new unseen images. In this article, we describe how this hybrid solution for image analysis can be implemented as a module (i.e., Aerial Clicker) to extend an existing platform called Artificial Intelligence for Disaster Response (AIDR), which has already been deployed to classify microblog messages during disasters using its Text Clicker module and in response to Cyclone Pam, a category 5 cyclone that devastated Vanuatu in March 2015. The hybrid solution we present can be applied to both aerial and satellite imagery and has applications beyond disaster response such as wildlife protection, human rights, and archeological exploration. As a proof of concept, we recently piloted this solution using very high-resolution aerial photographs of a wildlife reserve in Namibia to support rangers with their wildlife conservation efforts (SAVMAP project, http://lasig.epfl.ch/savmap ). The

  19. Computational Analysis Reveals a Key Regulator of Cryptococcal Virulence and Determinant of Host Response

    Directory of Open Access Journals (Sweden)

    Stacey R. Gish

    2016-04-01

    Full Text Available Cryptococcus neoformans is a ubiquitous, opportunistic fungal pathogen that kills over 600,000 people annually. Here, we report integrated computational and experimental investigations of the role and mechanisms of transcriptional regulation in cryptococcal infection. Major cryptococcal virulence traits include melanin production and the development of a large polysaccharide capsule upon host entry; shed capsule polysaccharides also impair host defenses. We found that both transcription and translation are required for capsule growth and that Usv101 is a master regulator of pathogenesis, regulating melanin production, capsule growth, and capsule shedding. It does this by directly regulating genes encoding glycoactive enzymes and genes encoding three other transcription factors that are essential for capsule growth: GAT201, RIM101, and SP1. Murine infection with cryptococci lacking Usv101 significantly alters the kinetics and pathogenesis of disease, with extended survival and, unexpectedly, death by pneumonia rather than meningitis. Our approaches and findings will inform studies of other pathogenic microbes.

  20. An integrated computational framework for simulating the failure response of carbon fiber reinforced polymer composites

    Science.gov (United States)

    Ahmadian, Hossein; Liang, Bowen; Soghrati, Soheil

    2017-12-01

    A new computational framework is introduced for the automated finite element (FE) modeling of fiber reinforced composites and simulating their micromechanical behavior. The proposed methodology relies on a new microstructure reconstruction algorithm that implements the centroidal Voronoi tessellation (CVT) to generate an initial uniform distribution of fibers with desired volume fraction and size distribution in a repeating unit cell of the composite. The genetic algorithm (GA) is then employed to optimize locations of fibers such that they replicate the target spatial arrangement. We also use a non-iterative mesh generation algorithm, named conforming to interface structured adaptive mesh refinement (CISAMR), to create FE models of the CFRPC. The CVT-GA-CISAMR framework is then employed to investigate the appropriate size of the composite's representative volume element. We also study the strength and failure mechanisms in the CFRPC subject to varying uniaxial and mixed-mode loadings.

  1. Sensorimotor rhythm-based brain-computer interface training: the impact on motor cortical responsiveness

    Science.gov (United States)

    Pichiorri, F.; De Vico Fallani, F.; Cincotti, F.; Babiloni, F.; Molinari, M.; Kleih, S. C.; Neuper, C.; Kübler, A.; Mattia, D.

    2011-04-01

    The main purpose of electroencephalography (EEG)-based brain-computer interface (BCI) technology is to provide an alternative channel to support communication and control when motor pathways are interrupted. Despite the considerable amount of research focused on the improvement of EEG signal detection and translation into output commands, little is known about how learning to operate a BCI device may affect brain plasticity. This study investigated if and how sensorimotor rhythm-based BCI training would induce persistent functional changes in motor cortex, as assessed with transcranial magnetic stimulation (TMS) and high-density EEG. Motor imagery (MI)-based BCI training in naïve participants led to a significant increase in motor cortical excitability, as revealed by post-training TMS mapping of the hand muscle's cortical representation; peak amplitude and volume of the motor evoked potentials recorded from the opponens pollicis muscle were significantly higher only in those subjects who develop a MI strategy based on imagination of hand grasping to successfully control a computer cursor. Furthermore, analysis of the functional brain networks constructed using a connectivity matrix between scalp electrodes revealed a significant decrease in the global efficiency index for the higher-beta frequency range (22-29 Hz), indicating that the brain network changes its topology with practice of hand grasping MI. Our findings build the neurophysiological basis for the use of non-invasive BCI technology for monitoring and guidance of motor imagery-dependent brain plasticity and thus may render BCI a viable tool for post-stroke rehabilitation.

  2. Computer program for analysis of hemodynamic response to head-up tilt test

    Science.gov (United States)

    Świątek, Eliza; Cybulski, Gerard; Koźluk, Edward; Piątkowska, Agnieszka; Niewiadomski, Wiktor

    2014-11-01

    The aim of this work was to create a computer program, written in the MATLAB environment, which enables the visualization and analysis of hemodynamic parameters recorded during a passive tilt test using the CNS Task Force Monitor System. The application was created to help in the assessment of the relationship between the values and dynamics of changes of the selected parameters and the risk of orthostatic syncope. The signal analysis included: R-R intervals (RRI), heart rate (HR), systolic blood pressure (sBP), diastolic blood pressure (dBP), mean blood pressure (mBP), stroke volume (SV), stroke index (SI), cardiac output (CO), cardiac index (CI), total peripheral resistance (TPR), total peripheral resistance index (TPRI), ventricular ejection time (LVET) and thoracic fluid content (TFC). The program enables the user to visualize waveforms for a selected parameter and to perform smoothing with selected moving average parameters. It allows one to construct the graph of means for any range, and the Poincaré plot for a selected time range. The program automatically determines the average value of the parameter before tilt, its minimum and maximum values immediately after the change of position, and the times of their occurrence. It is possible to correct the automatically detected points manually. For the R-R interval, it determines the acceleration index (AI) and the brake index (BI). It is possible to save the calculated values to an XLS file with a name specified by the user. The application has a user-friendly graphical interface and can run on a computer that has no MATLAB software.

  3. Computational modeling predicts the ionic mechanism of late-onset responses in unipolar brush cells.

    Science.gov (United States)

    Subramaniyam, Sathyaa; Solinas, Sergio; Perin, Paola; Locatelli, Francesca; Masetto, Sergio; D'Angelo, Egidio

    2014-01-01

    Unipolar Brush Cells (UBCs) have been suggested to play a critical role in cerebellar functioning, yet the corresponding cellular mechanisms remain poorly understood. UBCs have recently been reported to generate, in addition to early-onset glutamate receptor-dependent synaptic responses, a late-onset response (LOR) composed of a slow depolarizing ramp followed by a spike burst (Locatelli et al., 2013). The LOR activates as a consequence of synaptic activity and involves an intracellular cascade modulating H- and TRP-current gating. In order to assess the LOR mechanisms, we have developed a UBC multi-compartmental model (including soma, dendrite, initial segment, and axon) incorporating biologically realistic representations of ionic currents and a cytoplasmic coupling mechanism regulating TRP and H channel gating. The model finely reproduced UBC responses to current injection, including a burst triggered by a low-threshold spike (LTS) sustained by CaLVA currents, a persistent discharge sustained by CaHVA currents, and a rebound burst following hyperpolarization sustained by H- and CaLVA-currents. Moreover, the model predicted that H- and TRP-current regulation was necessary and sufficient to generate the LOR and its dependence on the intensity and duration of mossy fiber activity. Therefore, the model showed that, using a basic set of ionic channels, UBCs generate a rich repertoire of bursts, which could effectively implement tunable delay-lines in the local microcircuit.

  4. A novel computational approach of image analysis to quantify behavioural response to heat shock in Chironomus ramosus larvae (Diptera: Chironomidae)

    Directory of Open Access Journals (Sweden)

    Bimalendu B. Nath

    2015-07-01

    Full Text Available All living cells respond to temperature stress through coordinated cellular, biochemical and molecular events known as “heat shock response” and its genetic basis has been found to be evolutionarily conserved. Despite marked advances in stress research, this ubiquitous heat shock response has never been analysed quantitatively at the whole organismal level using behavioural correlates. We have investigated behavioural response to heat shock in a tropical midge Chironomus ramosus Chaudhuri, Das and Sublette. The filter-feeding aquatic Chironomus larvae exhibit characteristic undulatory movement. This innate pattern of movement was taken as a behavioural parameter in the present study. We have developed a novel computer-aided image analysis tool “Chiro” for the quantification of behavioural responses to heat shock. Behavioural responses were quantified by recording the number of undulations performed by each larva per unit time at a given ambient temperature. Quantitative analysis of undulation frequency was carried out and this innate behavioural pattern was found to be modulated as a function of ambient temperature. Midge larvae are known to be bioindicators of aquatic environments. Therefore, the “Chiro” technique can be tested using other potential biomonitoring organisms obtained from natural aquatic habitats using undulatory motion as a behavioural parameter.
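
    A toy version of such undulation quantification is sketched below in Python: given a one-dimensional body-displacement signal over time (synthesized here; in practice it would be extracted from video frames), undulations per unit time are counted by peak detection. The frame rate, undulation frequency, and detection thresholds are all invented for the illustration, not parameters of "Chiro".

    import numpy as np
    from scipy.signal import find_peaks

    fps = 25                                  # assumed video frame rate
    t = np.arange(0.0, 60.0, 1.0 / fps)       # a 60 s recording
    undulation_hz = 0.8                       # hypothetical rate at the test temperature
    displacement = np.sin(2 * np.pi * undulation_hz * t)
    displacement += np.random.default_rng(2).normal(0.0, 0.1, t.size)

    # One undulation per displacement peak; enforce a minimum peak spacing.
    peaks, _ = find_peaks(displacement, height=0.5, distance=fps // 2)
    print(f"undulations per minute: {len(peaks)}")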

  5. Quantifying fish swimming behavior in response to acute exposure of aqueous copper using computer assisted video and digital image analysis

    Science.gov (United States)

    Calfee, Robin D.; Puglis, Holly J.; Little, Edward E.; Brumbaugh, William G.; Mebane, Christopher A.

    2016-01-01

    Behavioral responses of aquatic organisms to environmental contaminants can be precursors of other effects such as survival, growth, or reproduction. However, these responses may be subtle, and measurement can be challenging. Using juvenile white sturgeon (Acipenser transmontanus) with copper exposures, this paper illustrates techniques used for quantifying behavioral responses using computer assisted video and digital image analysis. In previous studies severe impairments in swimming behavior were observed among early life stage white sturgeon during acute and chronic exposures to copper. Sturgeon behavior was rapidly impaired and to the extent that survival in the field would be jeopardized, as fish would be swept downstream, or readily captured by predators. The objectives of this investigation were to illustrate protocols to quantify swimming activity during a series of acute copper exposures to determine time to effect during early lifestage development, and to understand the significance of these responses relative to survival of these vulnerable early lifestage fish. With mortality being on a time continuum, determining when copper first affects swimming ability helps us to understand the implications for population level effects. The techniques used are readily adaptable to experimental designs with other organisms and stressors.

  6. Oxygen Modulates the Effectiveness of Granuloma Mediated Host Response to Mycobacterium tuberculosis: a Multi-scale Computational Biological Approach

    Directory of Open Access Journals (Sweden)

    Cheryl L. Sershen

    2016-02-01

    Full Text Available In several mammalian hosts, including non-human primates, Mycobacterium tuberculosis granulomas are often hypoxic, although this has not been observed in wild type murine infection models. Mtb associated granuloma formation can be viewed as a structural immune response that can contain and halt the spread of the pathogen. While a presumed consequence, the structural contribution of the granuloma to oxygen limitation and the concomitant impact on Mtb metabolic viability and persistence remains to be fully explored. We develop a multi-scale computational model to test to what extent in vivo Mtb granulomas become hypoxic, and investigate the effects of hypoxia on host immune response efficacy and mycobacterial persistence. Our study integrates a model of oxygen dynamics in the extracellular space of alveolar tissue, an agent-based model of cellular immune response, and a systems biology-based model of Mtb metabolic dynamics. Our theoretical studies suggest that the dynamics of granuloma organization mediates oxygen availability and illustrates the immunological contribution of this structural host response to infection outcome. Furthermore, our integrated model demonstrates the link between structural immune response and mechanistic drivers influencing Mtb's adaptation to its changing microenvironment and the qualitative infection outcome scenarios: clearance, containment, dissemination, and a newly observed theoretical outcome of transient containment. We observed hypoxic regions in the containment granuloma similar in size to granulomas found in mammalian in vivo models of Mtb infection. In the case of the containment outcome, our model uniquely demonstrates that immune response mediated hypoxic conditions help foster the shift down of bacteria through the two stages of adaptation similar to the in vitro non-replicating persistence (NRP) observed in the Wayne model of Mtb dormancy. The adaptation in part contributes to the ability of Mtb to remain

  7. Oxygen Modulates the Effectiveness of Granuloma Mediated Host Response to Mycobacterium tuberculosis: A Multiscale Computational Biology Approach

    Science.gov (United States)

    Sershen, Cheryl L.; Plimpton, Steven J.; May, Elebeoba E.

    2016-01-01

    Mycobacterium tuberculosis associated granuloma formation can be viewed as a structural immune response that can contain and halt the spread of the pathogen. In several mammalian hosts, including non-human primates, Mtb granulomas are often hypoxic, although this has not been observed in wild type murine infection models. While a presumed consequence, the structural contribution of the granuloma to oxygen limitation and the concomitant impact on Mtb metabolic viability and persistence remains to be fully explored. We develop a multiscale computational model to test to what extent in vivo Mtb granulomas become hypoxic, and investigate the effects of hypoxia on host immune response efficacy and mycobacterial persistence. Our study integrates a physiological model of oxygen dynamics in the extracellular space of alveolar tissue, an agent-based model of cellular immune response, and a systems biology-based model of Mtb metabolic dynamics. Our theoretical studies suggest that the dynamics of granuloma organization mediates oxygen availability and illustrates the immunological contribution of this structural host response to infection outcome. Furthermore, our integrated model demonstrates the link between structural immune response and mechanistic drivers influencing Mtb's adaptation to its changing microenvironment and the qualitative infection outcome scenarios of clearance, containment, dissemination, and a newly observed theoretical outcome of transient containment. We observed hypoxic regions in the containment granuloma similar in size to granulomas found in mammalian in vivo models of Mtb infection. In the case of the containment outcome, our model uniquely demonstrates that immune response mediated hypoxic conditions help foster the shift down of bacteria through two stages of adaptation similar to the in vitro non-replicating persistence (NRP) observed in the Wayne model of Mtb dormancy. The adaptation in part contributes to the ability of Mtb to remain dormant

  8. Experimental and computational models of neurite extension at a choice point in response to controlled diffusive gradients

    Science.gov (United States)

    Catig, G. C.; Figueroa, S.; Moore, M. J.

    2015-08-01

    Objective. Axons are guided toward desired targets through a series of choice points that they navigate by sensing cues in the cellular environment. A better understanding of how microenvironmental factors influence neurite growth during development can inform strategies to address nerve injury. Therefore, there is a need for biomimetic models to systematically investigate the influence of guidance cues at such choice points. Approach. We ran an adapted in silico biased turning axon growth model under the influence of nerve growth factor (NGF) and compared the results to corresponding in vitro experiments. We examined if growth simulations were predictive of neurite population behavior at a choice point. We used a biphasic micropatterned hydrogel system consisting of an outer cell restrictive mold that enclosed a bifurcated cell permissive region and placed a well near a bifurcating end to allow proteins to diffuse and form a gradient. Experimental diffusion profiles in these constructs were used to validate a diffusion computational model that utilized experimentally measured diffusion coefficients in hydrogels. The computational diffusion model was then used to establish defined soluble gradients within the permissive region of the hydrogels and maintain the profiles in physiological ranges for an extended period of time. Computational diffusion profiles informed the neurite growth model, which was compared with neurite growth experiments in the bifurcating hydrogel constructs. Main results. Results indicated that when applied to the constrained choice point geometry, the biased turning model predicted experimental behavior closely. Results for both simulated and in vitro neurite growth studies showed a significant chemoattractive response toward the bifurcated end containing an NGF gradient compared to the control, though some neurites were found in the end with no NGF gradient. Significance. The integrated model of neurite growth we describe will allow

  9. Computational methods for predicting the response of critical as-built infrastructure to dynamic loads (architectural surety)

    Energy Technology Data Exchange (ETDEWEB)

    Preece, D.S.; Weatherby, J.R.; Attaway, S.W.; Swegle, J.W.; Matalucci, R.V.

    1998-06-01

    Coupled blast-structural computational simulations using supercomputer capabilities will significantly advance the understanding of how complex structures respond under dynamic loads caused by explosives and earthquakes, an understanding with application to the surety of both federal and nonfederal buildings. Simulation of the effects of explosives on structures is a challenge because the explosive response can best be simulated using Eulerian computational techniques and structural behavior is best modeled using Lagrangian methods. Due to the different methodologies of the two computational techniques and code architecture requirements, they are usually implemented in different computer programs. Modeling the explosive and the structure in two different codes makes it difficult or next to impossible to do coupled explosive/structure interaction simulations. Sandia National Laboratories has developed two techniques for solving this problem. The first is called Smoothed Particle Hydrodynamics (SPH), a relatively new gridless method comparable to Eulerian techniques that is especially suited for treating liquids and gases such as those produced by an explosive. The SPH capability has been fully implemented into the transient dynamics finite element (Lagrangian) codes PRONTO-2D and -3D. A PRONTO-3D/SPH simulation of the effect of a blast on a protective-wall barrier is presented in this paper. The second technique employed at Sandia National Laboratories uses a relatively new code called ALEGRA, which is an ALE (Arbitrary Lagrangian-Eulerian) wave code with specific emphasis on large deformation and shock propagation. ALEGRA is capable of solving many shock-wave physics problems but it is especially suited for modeling problems involving the interaction of decoupled explosives with structures.

  10. Carbon dioxide and climate impulse response functions for the computation of greenhouse gas metrics: a multi-model analysis

    Directory of Open Access Journals (Sweden)

    F. Joos

    2013-03-01

    Full Text Available The responses of carbon dioxide (CO2) and other climate variables to an emission pulse of CO2 into the atmosphere are often used to compute the Global Warming Potential (GWP) and Global Temperature change Potential (GTP), to characterize the response timescales of Earth System models, and to build reduced-form models. In this carbon cycle-climate model intercomparison project, which spans the full model hierarchy, we quantify responses to emission pulses of different magnitudes injected under different conditions. The CO2 response shows the known rapid decline in the first few decades followed by a millennium-scale tail. For a 100 Gt-C emission pulse added to a constant CO2 concentration of 389 ppm, 25 ± 9% is still found in the atmosphere after 1000 yr; the ocean has absorbed 59 ± 12% and the land the remainder (16 ± 14%). The response in global mean surface air temperature is an increase by 0.20 ± 0.12 °C within the first twenty years; thereafter and until year 1000, temperature decreases only slightly, whereas ocean heat content and sea level continue to rise. Our best estimate for the Absolute Global Warming Potential, given by the time-integrated response in CO2 at year 100 multiplied by its radiative efficiency, is 92.5 × 10−15 yr W m−2 per kg-CO2. This value very likely (5 to 95% confidence) lies within the range of (68 to 117) × 10−15 yr W m−2 per kg-CO2. Estimates for the time-integrated response in CO2 published in the IPCC First, Second, and Fourth Assessment and our multi-model best estimate all agree within 15% during the first 100 yr. The integrated CO2 response, normalized by the pulse size, is lower for pre-industrial conditions, compared to present day, and lower for smaller pulses than larger pulses. In contrast, the response in temperature, sea level and ocean heat content is less sensitive to these choices. Although choices in pulse size, background concentration, and model lead to uncertainties, the most important and
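
    The AGWP definition quoted above invites a direct numerical check, sketched below in Python: integrate the CO2 impulse response function to 100 yr and multiply by CO2's radiative efficiency. The IRF amplitudes and time scales are the multi-model fit reported in Joos et al. (2013); the radiative efficiency value is an approximation chosen for the illustration.

    import numpy as np

    # Multi-model CO2 IRF fit (Joos et al., 2013): amplitudes and time scales (yr).
    a = [0.2173, 0.2240, 0.2824, 0.2763]
    tau = [np.inf, 394.4, 36.54, 4.304]
    re = 1.77e-15                      # radiative efficiency, W m^-2 per kg CO2 (approx.)

    t = np.linspace(0.0, 100.0, 10001)
    irf = sum(ai * np.exp(-t / ti) for ai, ti in zip(a, tau))
    agwp_100 = re * np.trapz(irf, t)   # time-integrated IRF times radiative efficiency
    print(f"AGWP(100) ~ {agwp_100:.3g} yr W m^-2 per kg CO2")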

  11. Computation of the Response Surface in the Tensor Train data format

    KAUST Repository

    Dolgov, Sergey

    2014-06-11

    We apply the Tensor Train (TT) approximation to construct the Polynomial Chaos Expansion (PCE) of a random field, and solve the stochastic elliptic diffusion PDE with the stochastic Galerkin discretization. We compare two strategies of the polynomial chaos expansion: sparse and full polynomial (multi-index) sets. In the full set, the polynomial orders are chosen independently in each variable, which provides higher flexibility and accuracy. However, the total number of degrees of freedom grows exponentially with the number of stochastic coordinates. To cope with this curse of dimensionality, the data is kept compressed in the TT decomposition, a recurrent low-rank factorization. PCE computations on sparse grid sets are extensively studied, but the TT representation for PCE is a novel approach that is investigated in this paper. We outline how to deduce the PCE from the covariance matrix, assemble the Galerkin operator, and evaluate some post-processing (mean, variance, Sobol indices), staying within the low-rank framework. Two stages are the most demanding. First, we interpolate PCE coefficients in the TT format using a small number of samples, which is performed via the block cross approximation method. Second, we solve the discretized equation (a large linear system) via the alternating minimal energy algorithm. In the numerical experiments we demonstrate that the full expansion set encapsulated in the TT format is indeed preferable in cases when high accuracy and high polynomial orders are required.
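
    To make the "recurrent low-rank factorization" concrete, the Python sketch below implements a bare-bones TT-SVD that compresses a full tensor into Tensor Train cores by sequential reshapes and truncated SVDs. This is only the textbook construction; the paper itself avoids forming full tensors and uses block cross approximation instead.

    import numpy as np

    def tt_svd(tensor, eps=1e-10):
        # Compress a full tensor into TT cores of shape (r_{k-1}, n_k, r_k)
        # by sequential reshapes and truncated SVDs.
        shape = tensor.shape
        cores, r = [], 1
        mat = tensor.reshape(shape[0], -1)
        for n in shape[:-1]:
            mat = mat.reshape(r * n, -1)
            U, s, Vt = np.linalg.svd(mat, full_matrices=False)
            rank = max(1, int(np.sum(s > eps * s[0])))   # drop negligible modes
            cores.append(U[:, :rank].reshape(r, n, rank))
            mat = s[:rank, None] * Vt[:rank]             # carry the remainder forward
            r = rank
        cores.append(mat.reshape(r, shape[-1], 1))
        return cores

    # A rank-1 8x8x8 tensor compresses to TT ranks of 1.
    v = np.linspace(1.0, 2.0, 8)
    X = np.einsum("i,j,k->ijk", v, v, v)
    print([core.shape for core in tt_svd(X)])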

  12. Understanding the Value of a Computer Emergency Response Capability for Nuclear Security

    Energy Technology Data Exchange (ETDEWEB)

    Gasper, Peter Donald [Idaho National Laboratory; Rodriguez, Julio Gallardo [Idaho National Laboratory

    2015-06-01

    The international nuclear community has a great understanding of the physical security needs relating to the prevention, detection, and response of malicious acts associated with nuclear facilities and radioactive material. The International Atomic Energy Agency (IAEA) Nuclear Security Recommendations (INFCIRC_225_Rev 5) outline specific guidelines and recommendations for implementing and maintaining an organization’s nuclear security posture. An important element in support of revision 5 is the establishment of a “Cyber Emergency Response Team (CERT)” focused on the international community’s cybersecurity needs to maintain a comprehensive nuclear security posture. Cybersecurity, and the importance of nuclear cybersecurity in particular, requires that there be a specific focus on developing an International Nuclear CERT (NS-CERT). States establishing contingency plans should have an understanding of the cyber threat landscape and the potential impacts to systems in place to protect against and mitigate malicious activities. This paper will outline the necessary components, discuss the relationships needed within the international community, and outline a process by which the NS-CERT identifies, collects, processes, and reports critical information in order to establish situational awareness (SA) and support decision-making.

  13. Hydrologic Response to Climate Change: Missing Precipitation Data Matters for Computed Timing Trends

    Science.gov (United States)

    Daniels, B.

    2016-12-01

    This work demonstrates the derivation of climate timing statistics and their application in determining the resulting hydroclimate impacts. Long-term daily precipitation observations from 50 California stations were used to compute climate trends of precipitation event Intensity, event Duration and Pause between events. Each precipitation event trend was then applied as input to a PRMS hydrology model which showed hydrology changes to recharge, baseflow, streamflow, etc. An important concern was precipitation uncertainty induced by missing observation values, causing errors in the quantification of precipitation trends. Many standard statistical techniques such as ARIMA and simple endogenous or even exogenous imputation were applied but failed to resolve these uncertainties. What resolved them was the use of multiple imputation techniques. This involved fitting Weibull probability distributions to multiple imputed values for the three precipitation trends. Permutation resampling techniques using Monte Carlo processing were then applied to the multiple imputation values to derive significance p-values for each trend. Significance at the 95% level was found for Intensity at 11 of the 50 stations, for Duration at 16, and for Pause at 19, of which 12 were 99% significant. The significance-weighted trends for California are Intensity -4.61% per decade, Duration +3.49% per decade, and Pause +3.58% per decade. Two California basins with PRMS hydrologic models were studied: the Feather River in the northern Sierra Nevada mountains and the central coast Soquel-Aptos. Each local trend was changed without changing the other trends or the total precipitation. The Feather River Basin's critical supply to Lake Oroville and the State Water Project benefited from a total streamflow increase of 1.5%. The Soquel-Aptos Basin water supply was impacted by a total groundwater recharge decrease of -7.5% and streamflow decrease of -3.2%.
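
    The permutation-resampling step can be sketched compactly: estimate the trend by least squares, then rebuild its null distribution by shuffling the series. A minimal sketch on synthetic data; the Weibull fitting, the multiple-imputation layer, and the per-station handling of the study are omitted.

        import numpy as np

        rng = np.random.default_rng(0)

        years = np.arange(1960, 2010)
        # Synthetic annual precipitation-event intensity: weak trend plus noise.
        intensity = 10.0 + 0.02 * (years - years[0]) + rng.normal(0.0, 1.0, years.size)

        def slope(x, y):
            """Ordinary least-squares trend (units of y per year)."""
            return np.polyfit(x, y, 1)[0]

        observed = slope(years, intensity)

        # Permutation null: shuffling the values destroys any time ordering.
        n_perm = 10000
        null = np.array([slope(years, rng.permutation(intensity))
                         for _ in range(n_perm)])

        # Two-sided p-value (Monte Carlo estimate with the +1 correction).
        p = (np.sum(np.abs(null) >= abs(observed)) + 1) / (n_perm + 1)
        print(f"trend = {observed:.4f} per yr, p = {p:.4f}")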

  14. Agent-based computational model investigates muscle-specific responses to disuse-induced atrophy

    Science.gov (United States)

    Martin, Kyle S.; Peirce, Shayn M.

    2015-01-01

    Skeletal muscle is highly responsive to use. In particular, muscle atrophy attributable to decreased activity is a common problem among the elderly and the injured/immobile. However, not every muscle responds the same way. We developed an agent-based model that generates a tissue-level skeletal muscle response to disuse/immobilization. The model incorporates tissue-specific muscle fiber architecture parameters and simulates changes in muscle fiber size as a result of disuse-induced atrophy that are consistent with published experiments. We created simulations of 49 forelimb and hindlimb muscles of the rat by incorporating eight fiber-type and size parameters to explore how these parameters, which vary widely across muscles, influence sensitivity to disuse-induced atrophy. Of the 49 muscles modeled, the soleus exhibited the greatest atrophy after 14 days of simulated immobilization (51% decrease in fiber size), whereas the extensor digitorum communis atrophied the least (32%). Analysis of these simulations revealed that both fiber-type distribution and fiber-size distribution influence the sensitivity to disuse atrophy even though no single tissue architecture parameter correlated with atrophy rate. Additionally, software agents representing fibroblasts were incorporated into the model to investigate cellular interactions during atrophy. Sensitivity analyses revealed that fibroblast agents have the potential to affect disuse-induced atrophy, albeit with a lesser effect than fiber type and size. In particular, muscle atrophy increased slightly with a larger initial fibroblast population and increased production of TNF-α. Overall, the agent-based model provides a novel framework for investigating both tissue adaptations and cellular interactions in skeletal muscle during atrophy. PMID:25722379
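
    The bookkeeping behind such a simulation can be illustrated with a toy fiber population whose atrophy rate depends on fiber type. This is only a sketch of the general idea; the fractions, sizes, and daily rates below are hypothetical and do not reproduce the published agent-based rules or the fibroblast agents.

        import numpy as np

        rng = np.random.default_rng(1)

        # Hypothetical fiber population; in this toy model slow (type I) fibers
        # atrophy faster under disuse. Fractions, sizes and daily rates are
        # illustrative, not the calibrated parameters of the published model.
        n_fibers = 500
        is_slow = rng.random(n_fibers) < 0.8           # soleus-like type I fraction
        csa = rng.normal(3000.0, 600.0, n_fibers)      # cross-sectional area, um^2
        initial_mean = csa.mean()

        rate = np.where(is_slow, 0.050, 0.027)         # fractional loss per day

        for day in range(14):                          # 14 days of simulated disuse
            csa *= 1.0 - rate

        loss = 1.0 - csa.mean() / initial_mean
        print(f"mean fiber-size loss after 14 days: {loss:.0%}")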

  15. [Positron emission tomography combined with computed tomography in the initial evaluation and response assessment in primary central nervous system lymphoma].

    Science.gov (United States)

    Mercadal, Santiago; Cortés-Romera, Montserrat; Vélez, Patricia; Climent, Fina; Gámez, Cristina; González-Barca, Eva

    2015-06-08

    To evaluate the role of positron emission tomography combined with computed tomography (PET-CT) in the initial evaluation and response assessment of primary central nervous system lymphoma (PCNSL). Fourteen patients (8 males) with a median age of 59.5 years diagnosed with PCNSL. A brain PET-CT and magnetic resonance imaging (MRI) were performed in the initial evaluation. In 7 patients a PET-CT after treatment was performed. PET-CT showed at diagnosis 31 hypermetabolic foci and MRI showed 47 lesions, with a good degree of concordance between both (k = 0.61; P = .005). In the response assessment, the correlation between both techniques was good, and PET-CT was helpful in the appraisal of residual MRI lesions. Overall survival at 2 years for negative vs. positive PET-CT at the end of treatment was 100 vs. 37.5%, respectively (P = .045). PET-CT can be useful in the initial evaluation of PCNSL, and especially in the assessment of response. Although PET-CT detects fewer small lesions than MRI, a good correlation between MRI and PET-CT was observed. It is effective in the evaluation of residual lesions. Prospective studies are needed to confirm its possible prognostic value. Copyright © 2014 Elsevier España, S.L.U. All rights reserved.

  16. Striatal disorders dissociate mechanisms of enhanced and impaired response selection — Evidence from cognitive neurophysiology and computational modelling

    Directory of Open Access Journals (Sweden)

    Christian Beste

    2014-01-01

    Full Text Available Paradoxically enhanced cognitive processes in neurological disorders provide vital clues to understanding neural function. However, what determines whether the neurological damage is impairing or enhancing is unclear. Here we use the performance of patients with two disorders of the striatum to dissociate mechanisms underlying cognitive enhancement and impairment resulting from damage to the same system. In a two-choice decision task, Huntington's disease patients were faster and less error prone than controls, yet a patient with the rare condition of benign hereditary chorea (BHC) was both slower and more error prone. EEG recordings confirmed significant differences in neural processing between the groups. Analysis of a computational model revealed that the common loss of connectivity between striatal neurons in BHC and Huntington's disease impairs response selection, but the increased sensitivity of NMDA receptors in Huntington's disease potentially enhances response selection. Crucially, the model shows that there is a critical threshold for increased sensitivity: below that threshold, response selection is impaired. Our data and model thus predict that specific striatal malfunctions can contribute to either impaired or enhanced selection, and provide clues to solving the paradox of how Huntington's disease can lead to both impaired and enhanced cognitive processes.

  17. Computational modeling of the electromechanical response of a ventricular fiber affected by eccentric hypertrophy

    Directory of Open Access Journals (Sweden)

    Bianco Fabrizio Del

    2017-12-01

    Full Text Available The aim of this work is to study the effects of eccentric hypertrophy on the electromechanics of a single myocardial ventricular fiber by means of a one-dimensional finite-element strongly-coupled model. The electrical current flow model is written in the reference configuration and is characterized by two geometric feedbacks, i.e. the conduction and convection ones, and by the mechanoelectric feedback due to stretch-activated channels. First, the influence of such feedbacks is investigated for both a healthy and a hypertrophic fiber in isometric simulations. No relevant discrepancies are found when disregarding one or more feedbacks for either fiber. Then, all feedbacks are taken into account while studying the electromechanical responses of the fibers. The results from isometric tests do not point out any notable difference between the healthy and hypertrophic fibers as regards the action potential duration and conduction velocity. The length-tension relationships instead show increased stretches and reduced peak tension values. The tension-velocity relationships derived from afterloaded isotonic and quick-release tests depict higher contraction velocities at smaller afterloads. Moreover, higher maximum shortenings are achieved during the isotonic contraction. In conclusion, our simulation results are innovative in predicting the electromechanical behavior of eccentric hypertrophic fibers.

  18. Advanced Computational Approaches for Characterizing Stochastic Cellular Responses to Low Dose, Low Dose Rate Exposures

    Energy Technology Data Exchange (ETDEWEB)

    Scott, Bobby, R., Ph.D.

    2003-06-27

    OAK - B135 This project final report summarizes modeling research conducted in the U.S. Department of Energy (DOE), Low Dose Radiation Research Program at the Lovelace Respiratory Research Institute from October 1998 through June 2003. The modeling research described involves critically evaluating the validity of the linear nonthreshold (LNT) risk model as it relates to stochastic effects induced in cells by low doses of ionizing radiation and genotoxic chemicals. The LNT model plays a central role in low-dose risk assessment for humans. With the LNT model, any radiation (or genotoxic chemical) exposure is assumed to increase one's risk of cancer. Based on the LNT model, others have predicted tens of thousands of cancer deaths related to environmental exposure to radioactive material from nuclear accidents (e.g., Chernobyl) and fallout from nuclear weapons testing. Our research has focused on developing biologically based models that explain the shape of dose-response curves for low-dose radiation and genotoxic chemical-induced stochastic effects in cells. Understanding the shape of the dose-response curve for radiation and genotoxic chemical-induced stochastic effects in cells helps to better understand the shape of the dose-response curve for cancer induction in humans. We have used a modeling approach that facilitated model revisions over time, allowing for timely incorporation of new knowledge gained related to the biological basis for low-dose-induced stochastic effects in cells. Both deleterious (e.g., genomic instability, mutations, and neoplastic transformation) and protective (e.g., DNA repair and apoptosis) effects have been included in our modeling. Our most advanced model, NEOTRANS2, involves differing levels of genomic instability. Persistent genomic instability is presumed to be associated with nonspecific, nonlethal mutations and to increase both the risk for neoplastic transformation and for cancer occurrence. Our research results, based on

  19. Fluctuation-dissipation relations and field-free algorithms for the computation of response functions

    Science.gov (United States)

    Corberi, Federico; Lippiello, Eugenio; Sarracino, Alessandro; Zannetti, Marco

    2010-01-01

    We discuss the relation between the fluctuation-dissipation relation derived by Chatelain and Ricci-Tersenghi [C. Chatelain, J. Phys. A 36, 10739 (2003); F. Ricci-Tersenghi, Phys. Rev. E 68, 065104(R) (2003)] and that by Lippiello-Corberi-Zannetti [E. Lippiello, F. Corberi, and M. Zannetti, Phys. Rev. E 71, 036104 (2005)]. In order to do that, we rederive the fluctuation-dissipation relation for systems of discrete variables evolving in discrete time via a stochastic nonequilibrium Markov process. The calculation is carried out in a general formalism comprising the Chatelain and Ricci-Tersenghi result and that by Lippiello-Corberi-Zannetti as special cases. The applicability, generality, and experimental feasibility of the two approaches are thoroughly discussed. Extending the analytical calculation to the variance of the response function, we show the advantage of field-free numerical methods with respect to the standard method, where the perturbation is applied. We also show that the signal-to-noise ratio is better (by a factor of 2) in the algorithm of Lippiello-Corberi-Zannetti with respect to that of Chatelain and Ricci-Tersenghi.

  20. Responsibility Towards The Customers Of Subscription-Based Software Solutions In The Context Of Using The Cloud Computing Technology

    Directory of Open Access Journals (Sweden)

    Bogdan Ștefan Ionescu

    2003-12-01

    Full Text Available The continuous transformation of contemporary society and of its IT environment has led to the emergence of cloud computing technology, which provides access to infrastructure and to subscription-based software services. In the context of a growing number of providers of cloud software, the paper aims to identify the perception of current or potential users of cloud solutions, selected from among students enrolled in the accounting (professional or research) master programs organized by the Bucharest University of Economic Studies, in terms of their expectations for cloud services, as well as the extent to which SaaS providers are responsible for the provided services.

  1. Contempt-LT: a computer program for predicting containment pressure-temperature response to a loss-of-coolant accident

    International Nuclear Information System (INIS)

    Wheat, L.L.; Wagner, R.J.; Niederauer, G.F.; Obenchain, C.F.

    1975-06-01

    CONTEMPT-LT is a digital computer program, written in FORTRAN IV, developed to describe the long-term behavior of water-cooled nuclear reactor containment systems subjected to postulated loss-of-coolant accident (LOCA) conditions. The program calculates the time variation of compartment pressures, temperatures, mass and energy inventories, heat structure temperature distributions, and energy exchange with adjacent compartments. The program is capable of describing the effects of leakage on containment response. Models are provided to describe fan cooler and cooling spray engineered safety systems. Up to four compartments can be modeled with CONTEMPT-LT, and any compartment except the reactor system may have both a liquid pool region and an air-vapor atmosphere region above the pool. Each region is assumed to have a uniform temperature, but the temperatures of the two regions may be different. CONTEMPT-LT can be used to model all current boiling water reactor pressure suppression systems, including containments with either vertical or horizontal vent systems. CONTEMPT-LT can also be used to model pressurized water reactor dry containments, subatmospheric containments, and dual volume containments with an annulus region, and can be used to describe containment responses in experimental containment systems. The program user defines which compartments are used, specifies input mass and energy additions, defines heat structure and leakage systems, and describes the time advancement and output control. CONTEMPT-LT source decks are available in double precision extended-binary-coded-decimal-interchange-code (EBCDIC) versions. Sample problems have been run on the IBM360/75 computer. (U.S.)

  2. Development of a computer program to quantify human error probability by integrating the diagnosis model and the response model

    International Nuclear Information System (INIS)

    Jung, W.D.; Kim, T.W.; Park, C.K.

    1991-01-01

    This paper presents an integrated approach to the prediction of human error probabilities with a computer program, HREP (Human Reliability Evaluation Program). HREP was developed to provide simplicity in Human Reliability Analysis (HRA) and consistency in the obtained results. The basic assumption made in developing HREP is that human behaviors can be quantified in two separate steps. One is the diagnosis error evaluation step and the other the response error evaluation step. HREP integrates the Human Cognitive Reliability (HCR) model and the HRA Event Tree technique. The former corresponds to the Diagnosis model, and the latter to the Response model. HREP consists of HREP-IN and HREP-MAIN. HREP-IN is used to generate input files. HREP-MAIN is used to evaluate selected human errors in a given input file. HREP-MAIN is divided into three subsections: the diagnosis evaluation step, the subaction evaluation step and the modification step. The final modification step takes dependency and/or recovery factors into consideration. (author)
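
    The two-step quantification idea (diagnosis model followed by response model) can be sketched as follows, assuming an HCR-style time-reliability curve for the diagnosis step and a tiny event tree for the response step; all parameter values are illustrative, not those used in HREP.

        import math

        def p_diagnosis_failure(t, t_half=5.0, gamma=0.7, alpha=0.4, beta=1.2):
            """HCR-style time-reliability curve: probability that diagnosis is
            not completed within t minutes. Parameters are illustrative."""
            tn = t / t_half                        # normalized time
            if tn <= gamma:
                return 1.0
            return math.exp(-(((tn - gamma) / alpha) ** beta))

        # Response step as a tiny HRA event tree: two sub-actions must both
        # succeed; each branch carries an illustrative basic error probability.
        p_subaction = [0.003, 0.01]
        p_response_failure = 1.0 - math.prod(1.0 - p for p in p_subaction)

        # Total human error probability: fail to diagnose in time, or diagnose
        # correctly but fail in execution.
        t_available = 20.0                         # minutes, illustrative
        p_diag = p_diagnosis_failure(t_available)
        hep = p_diag + (1.0 - p_diag) * p_response_failure
        print(f"P(diagnosis failure) = {p_diag:.2e}, total HEP = {hep:.2e}")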

  3. Pediatric Evaluation of Disability Inventory Computer Adaptive Test (PEDI-CAT) and Alberta Infant Motor Scale (AIMS): Validity and Responsiveness.

    Science.gov (United States)

    Dumas, Helene M; Fragala-Pinkham, Maria A; Rosen, Elaine L; Lombard, Kelly A; Farrell, Colleen

    2015-11-01

    Although preliminary studies have established a good psychometric foundation for the Pediatric Evaluation of Disability Inventory Computer Adaptive Test (PEDI-CAT) for a broad population of youth with disabilities, additional validation is warranted for young children. The study objective was to (1) examine concurrent validity, (2) evaluate the ability to identify motor delay, and (3) assess responsiveness of the PEDI-CAT Mobility domain and the Alberta Infant Motor Scale (AIMS). Fifty-three infants and young children (<18 months of age) admitted to a pediatric postacute care hospital and referred for a physical therapist examination were included. The PEDI-CAT Mobility domain and the AIMS were completed during the initial physical therapist examination, at 3-month intervals, and at discharge. A Spearman rank correlation coefficient was used to examine concurrent validity. A chi-square analysis of age percentile scores was used to examine the identification of motor delay. Mean score differences from initial assessment to final assessment were analyzed to evaluate responsiveness. A statistically significant, fair association (rs=.313) was found for the 2 assessments. There was no significant difference in motor delay identification between tests; however, the AIMS had a higher percentage of infants with scores at or below the fifth percentile. Participants showed significant changes from initial testing to final testing on the PEDI-CAT Mobility domain and the AIMS. This study included only young patients (<18 months of age) in a pediatric postacute hospital; therefore, the generalizability is limited to this population. The PEDI-CAT Mobility domain is a valid measure for young children admitted to postacute care and is responsive to changes in motor skills. However, further item and standardization development is needed before the PEDI-CAT is used confidently to identify motor delay in children <18 months of age. © 2015 American Physical Therapy Association.

  4. Green Computing

    Directory of Open Access Journals (Sweden)

    K. Shalini

    2013-01-01

    Full Text Available Green computing is all about using computers in a smarter and eco-friendly way. It is the environmentally responsible use of computers and related resources, which includes the implementation of energy-efficient central processing units, servers and peripherals as well as reduced resource consumption and proper disposal of electronic waste. Computers certainly make up a large part of many people's lives and traditionally are extremely damaging to the environment. Manufacturers of computers and their parts have been espousing the green cause to help protect the environment from computers and electronic waste in any way. Research continues into key areas such as making the use of computers as energy-efficient as possible, and designing algorithms and systems for efficiency-related computer technologies.

  5. Music and natural sounds in an auditory steady-state response based brain-computer interface to increase user acceptance.

    Science.gov (United States)

    Heo, Jeong; Baek, Hyun Jae; Hong, Seunghyeok; Chang, Min Hye; Lee, Jeong Su; Park, Kwang Suk

    2017-05-01

    Patients with total locked-in syndrome are conscious; however, they cannot express themselves because most of their voluntary muscles are paralyzed, and many of these patients have lost their eyesight. To improve the quality of life of these patients, there is an increasing need for communication-supporting technologies that leverage the remaining senses of the patient along with physiological signals. The auditory steady-state response (ASSR) is an electro-physiologic response to auditory stimulation that is amplitude-modulated by a specific frequency. By leveraging the phenomenon whereby the ASSR is modulated by mental concentration, a brain-computer interface paradigm was proposed to classify the selective attention of the patient. In this paper, we propose an auditory stimulation method that minimizes auditory stress by replacing the monotone carrier with familiar music and natural sounds for an ergonomic system. Piano and violin instrumentals were employed in the music sessions; the sounds of water streaming and cicadas singing were used in the natural sound sessions. Six healthy subjects participated in the experiment. Electroencephalograms were recorded using four electrodes (Cz, Oz, T7 and T8). Seven sessions were performed using different stimuli. The spectral power at 38 and 42 Hz and their ratio for each electrode were extracted as features. Linear discriminant analysis was utilized to classify the selections for each subject. In offline analysis, the average classification accuracies with a modulation index of 1.0 were 89.67% and 87.67% using music and natural sounds, respectively. In online experiments, the average classification accuracies were 88.3% and 80.0% using music and natural sounds, respectively. Using the proposed method, we obtained significantly higher user-acceptance scores, while maintaining a high average classification accuracy. Copyright © 2017 Elsevier Ltd. All rights reserved.
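
    The feature-extraction and classification pipeline described above (spectral power at 38 and 42 Hz plus their ratio, fed to linear discriminant analysis) can be sketched on synthetic data. The sampling rate, trial length, and signal amplitudes below are assumptions, not the paper's recording parameters.

        import numpy as np
        from scipy.signal import welch
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        rng = np.random.default_rng(0)
        fs = 512                                  # sampling rate, Hz (assumed)

        def make_trial(attended):
            """Synthetic one-channel EEG trial: noise plus two ASSRs at 38 and
            42 Hz, with the attended one stronger."""
            t = np.arange(0, 3.0, 1.0 / fs)       # 3-s trial (assumed)
            x = rng.normal(0.0, 1.0, t.size)
            for f in (38.0, 42.0):
                amp = 0.8 if f == attended else 0.3
                x += amp * np.sin(2 * np.pi * f * t)
            return x

        def features(x):
            """Band power at 38 and 42 Hz and their ratio, from a Welch PSD."""
            f, pxx = welch(x, fs=fs, nperseg=fs)  # 1-Hz frequency resolution
            p38 = pxx[np.argmin(np.abs(f - 38.0))]
            p42 = pxx[np.argmin(np.abs(f - 42.0))]
            return [p38, p42, p38 / p42]

        X, y = [], []
        for label, freq in ((0, 38.0), (1, 42.0)):
            for _ in range(40):
                X.append(features(make_trial(freq)))
                y.append(label)

        clf = LinearDiscriminantAnalysis().fit(X, y)
        print(f"training accuracy: {clf.score(X, y):.2f}")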

  6. A computational study to evaluate the activation pattern of nerve fibers in response to interferential currents stimulation.

    Science.gov (United States)

    Agharezaee, Mahsa; Mahnam, Amin

    2015-08-01

    Interferential current (IFC) is one of the most popular electrical currents used in electrotherapy. However, there have been limited studies investigating how this stimulation affects nerve fibers. The aim of this computational study was to evaluate the temporal and spatial patterns of fiber activation in IFC therapy for different modulation and carrier frequencies. The interferential currents were applied by two pairs of point electrodes perpendicular to each other in an infinite homogeneous medium, and a model of myelinated nerve fibers was implemented in NEURON to study the neural response. The activation thresholds for different positions of the fiber and the resultant firing patterns were evaluated. The results suggest that the fibers may fire continuously or in bursts, with frequencies equal to or higher than the modulation frequency, or may be blocked, depending on their position relative to the electrodes, the modulation frequency and the stimulus strength. The results confirm the traditional belief about the role of the modulation frequency in the firing frequency of nerve fibers and describe a possible mechanism for reduced pain sensation, due to blockage of the fibers by the high-frequency nature of the interferential currents.
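
    The defining property of interferential currents, that two medium-frequency currents sum to a waveform amplitude-modulated at their difference frequency, is easy to verify numerically. A minimal sketch with a 4 kHz carrier and a 100 Hz beat, both typical IFC values but not taken from the study.

        import numpy as np

        fs = 100_000                    # sampling rate, Hz
        t = np.arange(0.0, 0.1, 1.0 / fs)

        f_carrier = 4000.0              # medium-frequency carrier, Hz
        f_beat = 100.0                  # modulation (beat) frequency, Hz

        # Two sinusoidal currents from the two electrode pairs; where they
        # superimpose, the sum is amplitude-modulated at the difference frequency.
        i1 = np.sin(2 * np.pi * f_carrier * t)
        i2 = np.sin(2 * np.pi * (f_carrier + f_beat) * t)
        ifc = i1 + i2

        # Identity: sin(a) + sin(b) = 2 sin((a+b)/2) cos((a-b)/2), so the
        # envelope is |2 cos(pi * f_beat * t)|, i.e. 100 beats per second here.
        envelope = np.abs(2.0 * np.cos(np.pi * f_beat * t))
        assert np.all(np.abs(ifc) <= envelope + 1e-9)
        print(f"max instantaneous amplitude: {ifc.max():.2f} (envelope peak 2.00)")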

  7. Evaluating a Computer Flash-Card Sight-Word Recognition Intervention with Self-Determined Response Intervals in Elementary Students with Intellectual Disability

    Science.gov (United States)

    Cazzell, Samantha; Skinner, Christopher H.; Ciancio, Dennis; Aspiranti, Kathleen; Watson, Tiffany; Taylor, Kala; McCurdy, Merilee; Skinner, Amy

    2017-01-01

    A concurrent multiple-baseline across-tasks design was used to evaluate the effectiveness of a computer flash-card sight-word recognition intervention with elementary-school students with intellectual disability. This intervention allowed the participants to self-determine each response interval and resulted in both participants acquiring…

  8. Improving the Reliability of Student Scores from Speeded Assessments: An Illustration of Conditional Item Response Theory Using a Computer-Administered Measure of Vocabulary

    Science.gov (United States)

    Petscher, Yaacov; Mitchell, Alison M.; Foorman, Barbara R.

    2015-01-01

    A growing body of literature suggests that response latency, the amount of time it takes an individual to respond to an item, may be an important factor to consider when using assessment data to estimate the ability of an individual. Considering that tests of passage and list fluency are being adapted to a computer administration format, it is…

  9. Prediction of lung density changes after radiotherapy by cone beam computed tomography response markers and pre-treatment factors for non-small cell lung cancer patients

    DEFF Research Database (Denmark)

    Bernchou, Uffe; Hansen, Olfred; Schytte, Tine

    2015-01-01

    BACKGROUND AND PURPOSE: This study investigates the ability of pre-treatment factors and response markers extracted from standard cone-beam computed tomography (CBCT) images to predict the lung density changes induced by radiotherapy for non-small cell lung cancer (NSCLC) patients. METHODS...

  10. Affect and the computer game player: the effect of gender, personality, and game reinforcement structure on affective responses to computer game-play

    OpenAIRE

    Chumbley, J; Griffiths, MD

    2006-01-01

    Previous research on computer games has tended to concentrate on their more negative effects (e.g., addiction, increased aggression). This study departs from the traditional clinical and social learning explanations for these behavioral phenomena and examines the effect of personality, in-game reinforcement characteristics, gender, and skill on the emotional state of the game-player. Results demonstrated that in-game reinforcement characteristics and skill significantly affect a number of aff...

  11. Early Assessment of Treatment Responses During Radiation Therapy for Lung Cancer Using Quantitative Analysis of Daily Computed Tomography

    Energy Technology Data Exchange (ETDEWEB)

    Paul, Jijo; Yang, Cungeng [Department of Radiation Oncology, Medical College of Wisconsin, Milwaukee, Wisconsin (United States); Wu, Hui [Department of Radiation Oncology, Medical College of Wisconsin, Milwaukee, Wisconsin (United States); The Affiliated Cancer Hospital of Zhengzhou University, Zhengzhou (China); Tai, An [Department of Radiation Oncology, Medical College of Wisconsin, Milwaukee, Wisconsin (United States); Dalah, Entesar [Department of Radiation Oncology, Medical College of Wisconsin, Milwaukee, Wisconsin (United States); Department of Medical Diagnostic Imaging, College of Health Science, University of Sharjah (United Arab Emirates); Zheng, Cheng [Biostatistics, Joseph. J. Zilber School of Public Health, University of Wisconsin-Milwaukee, Milwaukee, Wisconsin (United States); Johnstone, Candice [Department of Radiation Oncology, Medical College of Wisconsin, Milwaukee, Wisconsin (United States); Kong, Feng-Ming [Department of Radiation Oncology, Indiana University, Indianapolis, Indiana (United States); Gore, Elizabeth [Department of Radiation Oncology, Medical College of Wisconsin, Milwaukee, Wisconsin (United States); Li, X. Allen, E-mail: ali@mcw.edu [Department of Radiation Oncology, Medical College of Wisconsin, Milwaukee, Wisconsin (United States)

    2017-06-01

    Purpose: To investigate early tumor and normal tissue responses during the course of radiation therapy (RT) for lung cancer using quantitative analysis of daily computed tomography (CT) scans. Methods and Materials: Daily diagnostic-quality CT scans acquired using CT-on-rails during CT-guided RT for 20 lung cancer patients were quantitatively analyzed. On each daily CT set, the contours of the gross tumor volume (GTV) and lungs were generated and the radiation dose delivered was reconstructed. The changes in CT image intensity (Hounsfield unit [HU]) features in the GTV and the multiple normal lung tissue shells around the GTV were extracted from the daily CT scans. The associations between the changes in the mean HUs, GTV, accumulated dose during RT delivery, and patient survival rate were analyzed. Results: During the RT course, radiation can induce substantial changes in the HU histogram features on the daily CT scans, with reductions in the GTV mean HUs (dH) observed in the range of 11 to 48 HU (median 30). The dH is statistically related to the accumulated GTV dose (R² > 0.99) and correlates weakly with the change in GTV (R² = 0.3481). Statistically significant increases in patient survival rates (P=.038) were observed for patients with a higher dH in the GTV. In the normal lung, the 4 regions proximal to the GTV showed statistically significant (P<.001) HU reductions from the first to last fraction. Conclusion: Quantitative analysis of the daily CT scans indicated that the mean HUs in lung tumor and surrounding normal tissue were reduced during RT delivery. This reduction was observed in the early phase of the treatment, is patient specific, and correlated with the delivered dose. A larger HU reduction in the GTV correlated significantly with greater patient survival. The changes in daily CT features, such as the mean HU, can be used for early assessment of the radiation response during RT delivery for lung cancer.

  12. Affect and the computer game player: the effect of gender, personality, and game reinforcement structure on affective responses to computer game-play.

    Science.gov (United States)

    Chumbley, Justin; Griffiths, Mark

    2006-06-01

    Previous research on computer games has tended to concentrate on their more negative effects (e.g., addiction, increased aggression). This study departs from the traditional clinical and social learning explanations for these behavioral phenomena and examines the effect of personality, in-game reinforcement characteristics, gender, and skill on the emotional state of the game-player. Results demonstrated that in-game reinforcement characteristics and skill significantly affect a number of affective measures (most notably excitement and frustration). The implications of the impact of game-play on affect are discussed with reference to the concepts of "addiction" and "aggression."

  13. Response

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    Neuromorphic silicon chips have been developed over the last 30 years, inspired by the design of biological nervous systems and offering an alternative paradigm for computation, with real-time massively parallel operation and potentially large power savings with respect to conventional computing architectures. I will present the general principles with a brief investigation of the design choices that have been explored, and I'll discuss how such hardware has been applied to problems such as classification.

  14. Note on: 'EMLCLLER-A program for computing the EM response of a large loop source over a layered earth model' by N.P. Singh and T. Mogi, Computers & Geosciences 29 (2003) 1301-1307

    Science.gov (United States)

    Jamie, Majid

    2016-11-01

    Singh and Mogi (2003) presented a forward modeling (FWD) program, coded in FORTRAN 77 and called "EMLCLLER", which is capable of computing the frequency-domain electromagnetic (EM) response of a large circular loop, in terms of the vertical magnetic component (Hz), over 1D layered earth models; computations in this program can be performed assuming variable transmitter-receiver configurations and incorporating both conduction and displacement currents. Integral equations in this program are computed through digital linear filters based on the Hankel transforms together with analytic solutions based on hyper-geometric functions. Despite the capabilities of EMLCLLER, there are mistakes in this program that make its FWD results unreliable. The mistakes in EMLCLLER arise from using a wrong algorithm for computing the reflection coefficient of the EM wave in TE-mode (rTE), and from using flawed algorithms for computing the phase and normalized phase values relating to Hz; in this paper the corrected forms of these mistakes are presented. Moreover, in order to illustrate how these mistakes can affect FWD results, EMLCLLER and the corrected version of this program presented in this paper, titled "EMLCLLER_Corr", are run on different two- and three-layered earth models; afterwards their FWD results in terms of real and imaginary parts of Hz, its normalized amplitude, and the corresponding normalized phase curves are plotted versus frequency and compared to each other. In addition, in Singh and Mogi (2003) extra derivations for computing the radial component of the magnetic field (Hr) and the angular component of the electric field (Eϕ) are also presented, where the numerical solution presented for Hr is incorrect; in this paper the correct numerical solution for this derivation is also presented.

  15. Calculating buoy response for a wave energy converter—A comparison of two computational methods and experimental results

    Directory of Open Access Journals (Sweden)

    Linnea Sjökvist

    2017-05-01

    Full Text Available When designing a wave power plant, reliable and fast simulation tools are required. Computational fluid dynamics (CFD software provides high accuracy but with a very high computational cost, and in operational, moderate sea states, linear potential flow theories may be sufficient to model the hydrodynamics. In this paper, a model is built in COMSOL Multiphysics to solve for the hydrodynamic parameters of a point-absorbing wave energy device. The results are compared with a linear model where the hydrodynamical parameters are computed using WAMIT, and to experimental results from the Lysekil research site. The agreement with experimental data is good for both numerical models.

  16. Evaluating a computer flash-card sight-word recognition intervention with self-determined response intervals in elementary students with intellectual disability.

    Science.gov (United States)

    Cazzell, Samantha; Skinner, Christopher H; Ciancio, Dennis; Aspiranti, Kathleen; Watson, Tiffany; Taylor, Kala; McCurdy, Merilee; Skinner, Amy

    2017-09-01

    A concurrent multiple-baseline across-tasks design was used to evaluate the effectiveness of a computer flash-card sight-word recognition intervention with elementary-school students with intellectual disability. This intervention allowed the participants to self-determine each response interval and resulted in both participants acquiring previously unknown words across all word sets. Discussion focuses on the need to evaluate and compare computer flash-card sight-word recognition interventions with fixed and self-determined response intervals across students and dependent variables, including rates of inappropriate behavior and self-determination in students with intellectual disability. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  17. Pancreatic tuberculosis: Evaluation of therapeutic response using F-18 fluoro-2-deoxy-D-glucose positron emission tomography/computed tomography

    International Nuclear Information System (INIS)

    Santhosh, Sampath; Bhattacharya, Anish; Rana, Surinder Singh; Bhasin, Deepak Kumar; Srinivasan, Radhika; Mittal, Bhagwant Rai

    2014-01-01

    F-18 fluoro-2-deoxy-D-glucose positron emission tomography/computed tomography (FDG PET/CT) is a functional imaging technique that monitors glucose metabolism in tissues. Pulmonary tuberculosis (TB) has been reported to show intense uptake of FDG, with a decrease in metabolism of the tuberculous lesions after successful anti-tubercular treatment (ATT). The authors present a patient with pancreatic TB and demonstrate the usefulness of FDG PET/CT in monitoring the response to ATT

  18. The impact of irradiation dose on the computed tomography radiographic response of metastatic nodes and clinical outcomes in cervix cancer in a low-resource setting

    OpenAIRE

    McKeever, Matthew Ryan; Hwang, Lindsay; Barclay, Jennifer; Xi, Yin; Bailey, April; Albuquerque, Kevin

    2017-01-01

    Introduction: The aim of this study is to investigate the relationship between the radiation dose to pelvic and para-aortic lymph nodes, nodal response, and clinical outcomes in a resource-poor setting based on computed tomography (CT) nodal size alone. Materials and Methods: This retrospective study from 2009 to 2015 included 46 cervical cancer patients with 133 metastatic pelvic and para-aortic lymph nodes definitively treated with chemoradiation and brachytherapy in a public hospital with ...

  19. Comparison of contrast-enhanced ultrasound and contrast-enhanced computed tomography in evaluating the treatment response to transcatheter arterial chemoembolization of hepatocellular carcinoma using modified RECIST

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Ming; Lin, Man-xia; Xu, Zuo-feng; Wang, Wei; Xie, Xiao-yan [Sun Yat-Sen University, Department of Medical Ultrasonics, Guangzhou (China); Sun Yat-Sen University, The First Affiliated Hospital, Institute of Diagnostic and Interventional Ultrasound, Guangzhou (China); Lu, Ming-de; Kuang, Ming [Sun Yat-Sen University, Department of Hepatobiliary Surgery, Guangzhou (China); Sun Yat-Sen University, The First Affiliated Hospital, Institute of Diagnostic and Interventional Ultrasound, Guangzhou (China); Zheng, Ke-guo [Sun Yat-Sen University, Department of Radiology, Guangzhou (China); Zhuang, Wen-quan [Sun Yat-Sen University, Department of Interventional Radiology, Guangzhou (China)

    2015-08-15

    We aimed to compare contrast-enhanced ultrasound (CEUS) with contrast-enhanced computed tomography (CECT) for evaluating the treatment response to transcatheter arterial chemoembolization (TACE) of hepatocellular carcinoma (HCC). Treatment responses of 130 patients who underwent TACE were evaluated by CEUS and CECT. We initially compared the abilities of CEUS and CECT to detect residual tumour, confirmed by histology or angiography. Then, we compared the tumour response to TACE assessed by CEUS and CECT, according to the Modified Response Evaluation Criteria in Solid Tumours (mRECIST). The sensitivity and accuracy of detecting residual tumour by CEUS vs. CECT were 95.9 % vs. 76.2 % (p < 0.001) and 96.2 % vs. 77.7 % (p < 0.001), respectively. For target lesions, 13 patients were observed as complete response (CR) by CEUS, compared to 36 by CECT (p < 0.001). For nontarget lesions, 12 patients were observed as CR by CEUS, compared to 22 by CECT (p = 0.006). For overall response, eight patients were observed as CR by CEUS, compared to 31 by CECT (p < 0.001). The diagnostic performance of CEUS was superior to CECT for detecting residual tumour after TACE. In clinical practice, CEUS should be recommended as an optional procedure for assessing the tumour response to TACE. (orig.)

  20. Texture analysis of advanced non-small cell lung cancer (NSCLC) on contrast-enhanced computed tomography: prediction of the response to the first-line chemotherapy

    International Nuclear Information System (INIS)

    Farina, Davide; Morassi, Mauro; Maroldi, Roberto; Roca, Elisa; Tassi, Gianfranco; Cavalleri, Giuseppe

    2013-01-01

    To assess whether tumour heterogeneity, quantified by texture analysis (TA) on contrast-enhanced computed tomography (CECT), can predict response to chemotherapy in advanced non-small cell lung cancer (NSCLC). Fifty-three CECT studies of patients with advanced NSCLC who had undergone first-line chemotherapy were retrospectively reviewed. Response to chemotherapy was evaluated according to RECIST1.1. Tumour uniformity was assessed by a TA method based on Laplacian of Gaussian filtering. The resulting parameters were correlated with treatment response and overall survival by multivariate analysis. Thirty-one out of 53 patients were non-responders and 22 were responders. Average overall survival was 13 months (range 4-35); minimum follow-up was 12 months. In the adenocarcinoma group (n = 31), the product of tumour uniformity and grey level (GL*U) was the only independent variable correlating with treatment response. Dividing the GL*U (range 8.5-46.6) into tertiles, lesions belonging to the second and the third tertiles had an 8.3-fold higher probability of treatment response compared with those in the first tertile. No association between texture features and response to treatment was observed in the non-adenocarcinoma group (n = 22). GL*U did not correlate with overall survival. TA on CECT images in advanced lung adenocarcinoma provides an independent predictive indicator of response to first-line chemotherapy. (orig.)

  1. Computational Systems Toxicology: recapitulating the logistical dynamics of cellular response networks in virtual tissue models (Eurotox_2017)

    Science.gov (United States)

    Translating in vitro data and biological information into a predictive model for human toxicity poses a significant challenge. This is especially true for complex adaptive systems such as the embryo where cellular dynamics are precisely orchestrated in space and time. Computer ce...

  2. Collecting Sensitive Self-Report Data with Laptop Computers: Impact on the Response Tendencies of Adolescents in a Home Interview.

    Science.gov (United States)

    Supple, Andrew J.; Aquilino, William S.; Wright, Debra L.

    1999-01-01

    Explored effects of computerized, self-administered data collection techniques in research on adolescents' self-reported substance use and psychological well-being. Adolescents completing sensitive questions on only laptop computers reported higher levels of substance use and indicated higher levels of depression and irritability; they perceived…

  3. The opponent matters: elevated FMRI reward responses to winning against a human versus a computer opponent during interactive video game playing.

    Science.gov (United States)

    Kätsyri, Jari; Hari, Riitta; Ravaja, Niklas; Nummenmaa, Lauri

    2013-12-01

    Winning against an opponent in a competitive video game can be expected to be more rewarding than losing, especially when the opponent is a fellow human player rather than a computer. We show that winning versus losing in a first-person video game activates the brain's reward circuit and the ventromedial prefrontal cortex (vmPFC) differently depending on the type of the opponent. Participants played a competitive tank shooter game against alleged human and computer opponents while their brain activity was measured with functional magnetic resonance imaging. Brain responses to wins and losses were contrasted by fitting an event-related model to the hemodynamic data. Stronger activation to winning was observed in ventral and dorsal striatum as well as in vmPFC. Activation in ventral striatum was associated with participants' self-ratings of pleasure. During winning, ventral striatum showed stronger functional coupling with right insula, and weaker coupling with dorsal striatum, sensorimotor pre- and postcentral gyri, and visual association cortices. The vmPFC and dorsal striatum responses were stronger to winning when the subject was playing against a human rather than a computer. These results highlight the importance of social context in the neural encoding of reward value.

  4. Elaboration of a computer code for the solution of a two-dimensional two-energy group diffusion problem using the matrix response method

    International Nuclear Information System (INIS)

    Alvarenga, M.A.B.

    1980-12-01

    An analytical procedure to solve the neutron diffusion equation in two dimensions and two energy groups was developed. The response matrix method was used, coupled with an expansion of the neutron flux in finite Fourier series. A computer code, 'MRF2D', was developed to implement the above-mentioned procedure for PWR reactor core calculations. Different core symmetry options are allowed by the code, which is also flexible enough to allow for improvements by means of algorithm optimization. The code performance was compared with that of a corner-mesh finite difference code named TVEDIM by using an International Atomic Energy Agency (IAEA) standard problem. The MRF2D code requires 12.7% less computer processing time to reach the same precision on the criticality eigenvalue. (Author) [pt
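
    For reference, the criticality eigenvalue that the timing comparison refers to can be computed with the standard power iteration on a discretized two-group diffusion operator. The sketch below uses a plain one-dimensional finite-difference scheme rather than the response-matrix and Fourier-series formulation of MRF2D, and its cross sections are illustrative, not the IAEA benchmark data.

        import numpy as np

        # One-dimensional, two-group slab reactor with zero-flux boundaries.
        # All cross sections are illustrative placeholders.
        L, n = 200.0, 200                 # slab width (cm), interior mesh points
        h = L / (n + 1)

        D1, D2 = 1.4, 0.4                 # diffusion coefficients (cm)
        sig_r1 = 0.030                    # fast removal (absorption + downscatter)
        sig_s12 = 0.020                   # downscatter, fast -> thermal
        sig_a2 = 0.080                    # thermal absorption
        nu_f1, nu_f2 = 0.005, 0.120       # nu times fission cross sections

        lap = (np.diag(np.full(n, -2.0)) + np.diag(np.ones(n - 1), 1)
               + np.diag(np.ones(n - 1), -1)) / h**2
        A1 = -D1 * lap + sig_r1 * np.eye(n)
        A2 = -D2 * lap + sig_a2 * np.eye(n)

        phi1, phi2, k = np.ones(n), np.ones(n), 1.0
        for _ in range(200):              # power iteration on the fission source
            src = nu_f1 * phi1 + nu_f2 * phi2
            phi1 = np.linalg.solve(A1, src / k)
            phi2 = np.linalg.solve(A2, sig_s12 * phi1)
            src_new = nu_f1 * phi1 + nu_f2 * phi2
            k *= src_new.sum() / src.sum()

        print(f"k-effective ~ {k:.4f}")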

  5. Step responses of a torsional system with multiple clearances: Study of vibro-impact phenomenon using experimental and computational methods

    Science.gov (United States)

    Oruganti, Pradeep Sharma; Krak, Michael D.; Singh, Rajendra

    2018-01-01

    Recently Krak and Singh (2017) proposed a scientific experiment that examined vibro-impacts in a torsional system under a step-down excitation and provided preliminary measurements and limited non-linear model studies. A major goal of this article is to extend the prior work with a focus on the examination of vibro-impact phenomena observed under step responses in a torsional system with one, two or three controlled clearances. First, new measurements are made at several locations with a higher sampling frequency. Measured angular accelerations are examined in both the time and time-frequency domains. Minimal-order non-linear models of the experiment are successfully constructed, using piecewise-linear stiffness and Coulomb friction elements; eight cases of the generic system are examined though only three are experimentally studied. Measured and predicted responses for single and dual clearance configurations exhibit double-sided impacts, and time-varying periods suggest softening trends under the step-down torque. Non-linear models are experimentally validated by comparing results with new measurements and with those previously reported. Several metrics are utilized to quantify and compare the measured and predicted responses (including peak-to-peak accelerations). Eigensolutions and step responses of the corresponding linearized models are utilized to better understand the nature of the non-linear dynamic system. Finally, the effect of step amplitude on the non-linear responses is examined for several configurations, and hardening trends are observed in the torsional system with three clearances.
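
    The essential non-linearity studied here, a piecewise-linear elastic torque with a dead zone, can be reproduced in a single-degree-of-freedom sketch under a step-down excitation. Inertia, stiffness, damping, clearance, and torque levels below are invented for illustration, not the experiment's values.

        import numpy as np
        from scipy.integrate import solve_ivp

        # Single torsional DOF with a clearance (dead zone) of +/- b in the
        # elastic torque; parameters are illustrative, not the experiment's.
        J = 0.01          # inertia, kg m^2
        k = 50.0          # stiffness outside the clearance, N m/rad
        c = 0.05          # viscous damping, N m s/rad
        b = 0.02          # half clearance, rad

        def elastic_torque(theta):
            """Piecewise-linear stiffness: no restoring torque inside the dead zone."""
            if theta > b:
                return k * (theta - b)
            if theta < -b:
                return k * (theta + b)
            return 0.0

        def rhs(t, y):
            theta, omega = y
            torque = 1.0 if t < 0.05 else 0.2      # step-down excitation, N m
            return [omega, (torque - c * omega - elastic_torque(theta)) / J]

        sol = solve_ivp(rhs, (0.0, 1.0), [0.0, 0.0], max_step=1e-3)
        # Angular acceleration, the quantity measured in the experiment:
        acc = np.gradient(sol.y[1], sol.t)
        print(f"peak-to-peak acceleration: {acc.max() - acc.min():.1f} rad/s^2")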

  6. Computer-Mediated Communication in Intimate Relationships: Associations of Boundary Crossing, Intrusion, Relationship Satisfaction, and Partner Responsiveness.

    Science.gov (United States)

    Norton, Aaron M; Baptist, Joyce; Hogan, Bernie

    2018-01-01

    This study examined the impact of technology on couples in committed relationships through the lens of the couple and technology framework. Specifically, we used data from 6,756 European couples to examine associations between online boundary crossing, online intrusion, relationship satisfaction, and partner responsiveness. The results suggest that participants' reports of online boundary crossing were linked with lower relationship satisfaction and partner responsiveness. Also, lower relationship satisfaction and partner responsiveness were associated with increased online boundary crossing. The results suggest that men, but not women, who reported greater acceptability for online boundary crossing were more likely to have partners who reported lower relationship satisfaction in their relationships. Implications for clinicians, relationship educators, and researchers are discussed. © 2017 American Association for Marriage and Family Therapy.

  7. Computer Adaptive Practice of Maths Ability Using a New Item Response Model for on the Fly Ability and Difficulty Estimation

    Science.gov (United States)

    Klinkenberg, S.; Straatemeier, M.; van der Maas, H. L. J.

    2011-01-01

    In this paper we present a model for computerized adaptive practice and monitoring. This model is used in the Maths Garden, a web-based monitoring system, which includes a challenging web environment for children to practice arithmetic. Using a new item response model based on the Elo (1978) rating system and an explicit scoring rule, estimates of…
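
    The on-the-fly estimation idea can be sketched with the classic Elo update: after each response, ability and difficulty move in opposite directions by an amount proportional to the prediction error. The logistic expected score and the K-factors below are simplified placeholders; the actual Maths Garden model also exploits response time through its explicit scoring rule.

        import math

        def expected_score(ability, difficulty):
            """Probability of a correct answer under a logistic (Rasch-like) model."""
            return 1.0 / (1.0 + math.exp(-(ability - difficulty)))

        def elo_update(ability, difficulty, correct, k_user=0.4, k_item=0.3):
            """Move ability and difficulty in opposite directions after one response."""
            e = expected_score(ability, difficulty)
            s = 1.0 if correct else 0.0
            ability += k_user * (s - e)
            difficulty -= k_item * (s - e)
            return ability, difficulty

        # One practice session: the child answers three items.
        ability, difficulty = 0.0, 0.5
        for correct in (True, True, False):
            ability, difficulty = elo_update(ability, difficulty, correct)
            print(f"ability={ability:+.3f}  difficulty={difficulty:+.3f}")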

  8. Computer adaptive practice of Maths ability using a new item response model for on the fly ability and difficulty estimation

    NARCIS (Netherlands)

    Klinkenberg, S.; Straatemeier, M.; van der Maas, H.L.J.

    2011-01-01

    In this paper we present a model for computerized adaptive practice and monitoring. This model is used in the Maths Garden, a web-based monitoring system, which includes a challenging web environment for children to practice arithmetic. Using a new item response model based on the Elo (1978) rating

  9. Numerical Differentiation Methods for Computing Error Covariance Matrices in Item Response Theory Modeling: An Evaluation and a New Proposal

    Science.gov (United States)

    Tian, Wei; Cai, Li; Thissen, David; Xin, Tao

    2013-01-01

    In item response theory (IRT) modeling, the item parameter error covariance matrix plays a critical role in statistical inference procedures. When item parameters are estimated using the EM algorithm, the parameter error covariance matrix is not an automatic by-product of item calibration. Cai proposed the use of Supplemented EM algorithm for…

  10. Semiquantitative prediction of early response of conventional transcatheter arterial chemoembolization for hepatocellular carcinoma using postprocedural plain cone-beam computed tomography.

    Science.gov (United States)

    Minami, Yasunori; Takita, Masahiro; Tsurusaki, Masakatsu; Yagyu, Yukinobu; Ueshima, Kazuomi; Murakami, Takamichi; Kudo, Masatoshi

    2017-03-01

    To investigate whether plain cone-beam computed tomography (CT) immediately after conventional transcatheter arterial chemoembolization (c-TACE) can help to predict tumor response semiquantitatively in patients with hepatocellular carcinoma (HCC). Analysis was carried out retrospectively on 262 targeted HCCs in 169 patients treated with c-TACE. Dynamic CT was performed at baseline and 1-4 months after c-TACE. Receiver-operating characteristic curve analysis was undertaken to evaluate whether voxel values of cone-beam CT could predict a complete response and to identify the cut-off value. Final tumor response assessment and early prediction using the retention pattern of iodized oil, the cut-off value of the density, and the combination of the cut-off density value and retention pattern of iodized oil in HCCs on postprocedural cone-beam CT were compared. Complete response was obtained in 72.9% of lesions. According to the pattern of iodized oil uptake, the sensitivity, specificity, and accuracy for predicting complete response were 85.9%, 70.4%, and 81.7%, respectively, for excellent uptake on cone-beam CT. The area under the curve was 0.86 with the optimal cut-off at a voxel value of 200.13. According to not only the density but also the homogeneity of iodized oil retention, the sensitivity, specificity, and accuracy values for predicting complete response were 86.4%, 95.8%, and 88.9%, respectively. The predictive accuracy was significantly better than that of the pattern of iodized oil retention alone (P = 0.019). The combination of density and a visual estimate of homogeneity is superior to either alone in predicting the tumor response to c-TACE in HCC patients. © 2016 The Japan Society of Hepatology.
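
    The cut-off search reported above (an optimal voxel value near 200 found by receiver-operating-characteristic analysis) can be reproduced on any paired list of voxel values and outcomes. A minimal sketch on invented data using the Youden index, with scikit-learn standing in for whatever statistics software the authors used.

        import numpy as np
        from sklearn.metrics import roc_curve, roc_auc_score

        rng = np.random.default_rng(0)

        # Invented data: cone-beam CT voxel values (iodized-oil density) for
        # lesions with (1) and without (0) a later complete response.
        values = np.concatenate([rng.normal(240, 40, 60), rng.normal(160, 40, 25)])
        labels = np.array([1] * 60 + [0] * 25)

        fpr, tpr, thresholds = roc_curve(labels, values)
        auc = roc_auc_score(labels, values)

        # Youden index: the cutoff maximizing sensitivity + specificity - 1.
        best = np.argmax(tpr - fpr)
        print(f"AUC = {auc:.2f}, optimal cut-off ~ {thresholds[best]:.1f} voxel-value units")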

  11. Simplified response monitoring criteria for multiple myeloma in patients undergoing therapy with novel agents using computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Schabel, Christoph; Horger, Marius; Kum, Sara [Department of Diagnostic and Interventional Radiology, Eberhard-Karls-University Tuebingen, Hoppe-Seyler-Str. 3, 72076 Tuebingen (Germany); Weisel, Katja [Department of Internal Medicine II – Hematology & Oncology, Eberhard-Karls-University Tuebingen, Otfried-Müller-Str. 5, 72076 Tuebingen (Germany); Fritz, Jan [Russell H. Morgan Department of Radiology and Radiological Science, The Johns Hopkins University School of Medicine, 600 N Wolfe St., Baltimore, MD 21287 (United States); Ioanoviciu, Sorin D. [Department of Internal Medicine, Clinical Municipal Hospital Timisoara, Gheorghe Dima Str. 5, 300079 Timisoara (Romania); Bier, Georg, E-mail: georg.bier@med.uni-tuebingen.de [Department of Neuroradiology, Eberhard-Karls-University Tuebingen, Hoppe-Seyler-Str. 3, 72076 Tuebingen (Germany)

    2016-12-15

    Highlights: • A simplified method for response monitoring of multiple myeloma is proposed. • Medullary bone lesions of all limbs were included and analysed. • Diameters of ≥2 medullary bone lesions are sufficient for therapy monitoring. - Abstract: Introduction: Multiple myeloma is a malignant hematological disorder of the mature B-cell lymphocytes originating in the bone marrow. While therapy monitoring is still mainly based on laboratory biomarkers, the additional use of imaging has been advocated due to inaccuracies of serological biomarkers or in a-secretory myelomas. Non-enhanced CT and MRI have similar sensitivities for lesions in yellow marrow-rich bone marrow cavities, with a favourable risk and cost-effectiveness profile of CT. Nevertheless, these methods are still limited by the frequently high numbers of medullary lesions and the time consumption for proper evaluation. Objective: To establish simplified response criteria by correlating size and CT attenuation changes of medullary multiple myeloma lesions in the appendicular skeleton with the course of lytic bone lesions in the entire skeleton. Furthermore, to evaluate these criteria with respect to established hematological myeloma-specific parameters for the prediction of treatment response to bortezomib or lenalidomide. Materials and methods: Non-enhanced reduced-dose whole-body CT examinations of 78 consecutive patients (43 male, 35 female, mean age 63.69 ± 9.2 years) with stage III multiple myeloma were retrospectively re-evaluated. On a per-patient basis, size and mean CT attenuation of 2–4 representative lesions in the limbs were measured at baseline and at a follow-up after a mean of 8 months. Results were compared with the course of lytic bone lesions as well as with that of specific hematological biomarkers. Myeloma response was assessed according to the International Myeloma Working Group (IMWG) uniform response criteria. Testing for correlation between response of medullary lesions (Resp

  12. Large-Scale Sentinel-1 Processing for Solid Earth Science and Urgent Response using Cloud Computing and Machine Learning

    Science.gov (United States)

    Hua, H.; Owen, S. E.; Yun, S. H.; Agram, P. S.; Manipon, G.; Starch, M.; Sacco, G. F.; Bue, B. D.; Dang, L. B.; Linick, J. P.; Malarout, N.; Rosen, P. A.; Fielding, E. J.; Lundgren, P.; Moore, A. W.; Liu, Z.; Farr, T.; Webb, F.; Simons, M.; Gurrola, E. M.

    2017-12-01

    With the increased availability of open SAR data (e.g. Sentinel-1 A/B), new challenges are being faced with processing and analyzing the voluminous SAR datasets to make geodetic measurements. Upcoming SAR missions such as NISAR are expected to generate close to 100TB per day. The Advanced Rapid Imaging and Analysis (ARIA) project can now generate geocoded unwrapped phase and coherence products from Sentinel-1 TOPS mode data in an automated fashion, using the ISCE software. This capability is currently being exercised on various study sites across the United States and around the globe, including Hawaii, Central California, Iceland and South America. The automated and large-scale SAR data processing and analysis capabilities use cloud computing techniques to speed the computations and provide scalable processing power and storage. Aspects such as how to process these voluminous SLCs and interferograms at global scales, how to keep up with the large daily SAR data volumes, and how to handle the high data rates are being explored. Scene-partitioning approaches in the processing pipeline help in handling global-scale processing up to unwrapped interferograms, with stitching done at a late stage. We have built an advanced science data system with rapid search functions to enable access to the derived data products. Rapid image processing of Sentinel-1 data to interferograms and time series is already being applied to natural hazards including earthquakes, floods, volcanic eruptions, and land subsidence due to fluid withdrawal. We will present the status of the ARIA science data system for generating science-ready data products and challenges that arise from being able to process SAR datasets to derived time series data products at large scales. For example, how do we perform large-scale data quality screening on interferograms? What approaches can be used to minimize compute, storage, and data movement costs for time series analysis in the cloud? We will also

  13. Analytical Harmonic Vibrational Frequencies for the Green Fluorescent Protein Computed with ONIOM: Chromophore Mode Character and Its Response to Environment.

    Science.gov (United States)

    Thompson, Lee M; Lasoroski, Aurélie; Champion, Paul M; Sage, J Timothy; Frisch, Michael J; van Thor, Jasper J; Bearpark, Michael J

    2014-02-11

    A systematic comparison of different environmental effects on the vibrational modes of the 4-hydroxybenzylidene-2,3-dimethylimidazolinone (HBDI) chromophore using the ONIOM method allows us to model how the molecule's spectroscopic transitions are modified in the Green Fluorescent Protein (GFP). ONIOM(QM:MM) reduces the expense of normal mode calculations by computing the majority of second derivatives only at the MM level. New developments described here for the efficient solution of the CPHF equations, including contributions from electrostatic interactions with environment charges, mean that QM model systems of ∼100 atoms can be embedded within a much larger MM environment of ∼5000 atoms. The resulting vibrational normal modes, their associated frequencies, and dipole derivative vectors have been used to interpret experimental difference spectra (GFPI2-GFPA), chromophore vibrational Stark shifts, and changes in the difference between electronic and vibrational transition dipoles (mode angles) in the protein environment.
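
    Whatever combination of QM and MM supplies the second derivatives, the frequencies ultimately come from diagonalizing a mass-weighted Hessian. A minimal sketch of that step (a toy 1D diatomic with assumed force constant and masses, unrelated to the actual GFP calculation):

        # Toy normal-mode analysis: diagonalize the mass-weighted Hessian.
        # Force constant and masses are assumed placeholder values.
        import numpy as np

        k = 0.5                                  # force constant (arbitrary units)
        m = np.array([12.0, 16.0])               # masses, e.g. a C-O-like pair

        H = np.array([[ k, -k],                  # Cartesian Hessian of a 1D
                      [-k,  k]])                 # two-atom harmonic oscillator

        H_mw = H / np.sqrt(np.outer(m, m))       # mass-weighting

        w2, modes = np.linalg.eigh(H_mw)         # eigenvalues = squared frequencies
        print(np.sqrt(np.abs(w2)))               # one zero (translation) + one stretch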

  14. New advances in the forced response computation of periodic structures using the wave finite element (WFE) method

    OpenAIRE

    Mencik , Jean-Mathieu

    2014-01-01

    The wave finite element (WFE) method is investigated to describe the harmonic forced response of one-dimensional periodic structures like those composed of complex substructures and encountered in engineering applications. The dynamic behavior of these periodic structures is analyzed over wide frequency bands where complex spatial dynamics, inside the substructures, are likely to occur. Within the WFE framework, the dynamic behavior of periodic structures is described in ...
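
    At the heart of WFE-type methods is an eigenvalue problem on the transfer matrix of a single unit cell: eigenvalues of unit modulus correspond to propagating waves, the rest to evanescent ones. A sketch on the simplest possible periodic structure, a spring-mass chain with assumed parameters (in a real WFE analysis, an FE-condensed substructure would supply this matrix):

        # Wave propagation in a periodic spring-mass chain via its transfer matrix.
        # Stiffness k and cell mass m are assumed illustrative values.
        import numpy as np

        k, m = 1.0, 1.0

        def transfer_matrix(w):
            """Map the state (displacement, force) across one unit cell."""
            return np.array([[1.0,        1.0 / k],
                             [-m * w**2,  1.0 - m * w**2 / k]])

        for w in (0.5, 1.5, 2.5):          # two frequencies in the passband, one above
            lam = np.linalg.eigvals(transfer_matrix(w)).astype(complex)
            # |lambda| == 1 -> propagating wave, otherwise evanescent
            print(f"omega={w}: |lambda| = {np.abs(lam).round(3)}")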

  15. Size determination and response assessment of liver metastases with computed tomography—Comparison of RECIST and volumetric algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Rothe, Jan Holger, E-mail: jan-holger.rothe@charite.de [Klinik für Radiologie, Campus Virchow-Klinikum, Charité – Universitätsmedizin, Berlin (Germany); Grieser, Christian [Klinik für Radiologie, Campus Virchow-Klinikum, Charité – Universitätsmedizin, Berlin (Germany); Lehmkuhl, Lukas [Abteilung für Diagnostische und Interventionelle Radiologie, Herzzentrum Leipzig (Germany); Schnapauff, Dirk; Fernandez, Carmen Perez; Maurer, Martin H.; Mussler, Axel; Hamm, Bernd; Denecke, Timm; Steffen, Ingo G. [Klinik für Radiologie, Campus Virchow-Klinikum, Charité – Universitätsmedizin, Berlin (Germany)

    2013-11-01

    Objective: To compare different three-dimensional volumetric algorithms (3D-algorithms) and RECIST for size measurement and response assessment in liver metastases from colorectal and pancreatic cancer. Methods: The volumes of a total of 102 liver metastases in 45 patients (pancreatic cancer, n = 22; colon cancer, n = 23) were estimated using three volumetric methods (seeded region growing method, slice-based segmentation, threshold-based segmentation) and the RECIST 1.1 method with volume calculation based on the largest axial diameter. Each measurement was performed three times by one observer. All four methods were applied to follow-up on 55 liver metastases in 29 patients undergoing systemic treatment (median follow-up, 3.5 months; range, 1–10 months). Analysis of variance (ANOVA) with post hoc tests was performed to analyze intraobserver variability and intermethod differences. Results: ANOVA showed significantly higher volumes calculated according to the RECIST guideline compared to the other measurement methods (p < 0.001), with relative differences ranging from 0.4% to 41.1%. Intraobserver variability was significantly higher (p < 0.001) for RECIST and threshold-based segmentation (3.6–32.8%) compared with slice-based segmentation (0.4–13.7%) and the seeded region growing method (0.6–10.8%). In the follow-up study, the 3D-algorithms and the assessment following RECIST 1.1 showed a discordant classification of treatment response in 10–21% of the patients. Conclusions: This study supports the use of volumetric measurement methods due to their significantly higher intraobserver reproducibility compared to RECIST. Substantial discrepancies in tumor response classification between RECIST and volumetric methods, depending on the applied thresholds, confirm the need for a consensus on volumetric criteria for response assessment.
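
    The discrepancy the study quantifies is easy to reproduce in outline: RECIST-derived volume treats the lesion as a sphere built from the largest axial diameter, while segmentation sums voxels. A sketch on a synthetic ellipsoidal lesion (voxel spacing and lesion shape are assumptions):

        # Compare a diameter-derived (sphere) volume with a voxel-count volume
        # for a synthetic ellipsoidal lesion. Spacings are assumed values.
        import numpy as np

        voxel_mm = (0.7, 0.7, 5.0)                     # in-plane spacing, slice thickness

        x, y, z = np.ogrid[-30:30, -30:30, -6:6]       # toy CT grid (voxel indices)
        mask = (x / 20.0)**2 + (y / 12.0)**2 + (z / 4.0)**2 <= 1.0

        vol_seg = mask.sum() * np.prod(voxel_mm) / 1000.0        # segmented volume, mL

        diam_mm = 2 * 20 * voxel_mm[0]                 # largest axial diameter
        vol_sphere = np.pi / 6.0 * diam_mm**3 / 1000.0           # RECIST-style volume, mL

        print(f"segmented: {vol_seg:.1f} mL, diameter-derived: {vol_sphere:.1f} mL")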

  16. Genetic and Computational Approaches for Studying Plant Development and Abiotic Stress Responses Using Image-Based Phenotyping

    Science.gov (United States)

    Campbell, M. T.; Walia, H.; Grondin, A.; Knecht, A.

    2017-12-01

    The development of abiotic stress tolerant crops (e.g. tolerant to drought, salinity, or heat stress) requires the discovery of DNA sequence variants associated with stress tolerance-related traits. However, many traits underlying adaptation to abiotic stress involve a suite of physiological pathways that may be induced at different times throughout the duration of stress. Conventional single-point phenotyping approaches fail to fully capture these temporal responses, and thus downstream genetic analysis may identify only a subset of the genetic variants that are important for adaptation to sub-optimal environments. Although genomic resources for crops have advanced tremendously, the collection of phenotypic data for morphological and physiological traits is laborious and remains a significant bottleneck in bridging the phenotype-genotype gap. In recent years, the availability of automated, image-based phenotyping platforms has provided researchers with an opportunity to collect morphological and physiological traits non-destructively in a highly controlled environment. Moreover, these platforms allow abiotic stress responses to be recorded throughout the duration of the experiment, and have facilitated the use of function-valued traits for genetic analyses in major crops. We will present our approaches for addressing abiotic stress tolerance in cereals. This talk will focus on novel open-source software to process and extract biologically meaningful data from images generated by these phenomics platforms. In addition, we will discuss statistical approaches to model longitudinal phenotypes and dissect the genetic basis of dynamic responses to these abiotic stresses throughout development.

  17. A Computational approach in optimizing process parameters of GTAW for SA 106 Grade B steel pipes using Response surface methodology

    Science.gov (United States)

    Sumesh, A.; Sai Ramnadh, L. V.; Manish, P.; Harnath, V.; Lakshman, V.

    2016-09-01

    Welding has been one of the most common metal-joining techniques in industry for decades. In the global manufacturing scenario, products must be cost-effective, so selecting the right process with optimal parameters helps industry minimize production costs. SA 106 Grade B steel has wide application in automobile chassis structures, boiler tubes and pressure vessel industries. Employing a central composite design, the process parameters for gas tungsten arc welding (GTAW) were optimized. The input parameters chosen were weld current, peak current and frequency. The joint tensile strength was the response considered in this study. Analysis of variance was performed to determine the statistical significance of the parameters, and a regression analysis was performed to determine the effect of the input parameters on the response. From the experiment, the maximum tensile strength obtained was 95 kN, reported for a weld current of 95 A, a frequency of 50 Hz and a peak current of 100 A. With the aim of maximizing joint strength, a target value of 100 kN was selected in the response optimizer and the regression models were optimized. The output results are achievable with a weld current of 62.6148 A, a frequency of 23.1821 Hz, and a peak current of 65.9104 A. Using dye penetrant testing, the weld joints were also classified into two categories: good welds and welds with defects. This will also help in obtaining defect-free joints when welding is performed using the GTAW process.
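
    In outline, the statistical recipe is: generate a central composite design, fit a second-order polynomial by least squares, then optimize over the fitted surface. A sketch with fabricated response values (the actual factor levels and strengths are the study's own and are not reproduced here):

        # Central composite design + quadratic response-surface fit.
        # The response values below are synthetic placeholders.
        import numpy as np
        from itertools import product

        alpha = 1.682                                   # rotatable CCD axial distance
        factorial = np.array(list(product([-1.0, 1.0], repeat=3)))
        axial = np.vstack([alpha * np.eye(3), -alpha * np.eye(3)])
        X = np.vstack([factorial, axial, np.zeros((3, 3))])      # 17 coded runs

        rng = np.random.default_rng(0)                  # fabricated strength response
        y = (90 + 1.5 * X[:, 0] - 3 * X[:, 0]**2 - 2 * X[:, 1]**2
             - X[:, 2]**2 + rng.normal(0, 0.5, len(X)))

        def design(X):                                  # full second-order model terms
            c1, c2, c3 = X.T
            return np.column_stack([np.ones(len(X)), c1, c2, c3,
                                    c1**2, c2**2, c3**2, c1*c2, c1*c3, c2*c3])

        beta, *_ = np.linalg.lstsq(design(X), y, rcond=None)
        print(design(np.zeros((1, 3))) @ beta)          # predicted response at the centre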

  18. A second-generation computational modeling of cardiac electrophysiology: response of action potential to ionic concentration changes and metabolic inhibition.

    Science.gov (United States)

    Alaa, Nour Eddine; Lefraich, Hamid; El Malki, Imane

    2014-10-21

    Cardiac arrhythmias are becoming one of the major health care problems in the world, causing numerous serious conditions including stroke and sudden cardiac death. Furthermore, cardiac arrhythmias are intimately related to the signaling ability of cardiac cells and are caused by signaling defects. Consequently, modeling the electrical activity of the heart, and the complex signaling models that subtend dangerous arrhythmias such as tachycardia and fibrillation, necessitates a quantitative model of action potential (AP) propagation. Many electrophysiological models that accurately reproduce the dynamical characteristics of the action potential in cells have been introduced; however, these models are very complex and computationally time consuming. Consequently, a large amount of research is devoted to designing models with less computational complexity. This paper presents a new model for analyzing the propagation of ionic concentrations and electrical potential in space and time. In this model, the transport of ions is governed by the Nernst-Planck flux equation (NP), and the electrical interaction of the species is described by a new cable equation. These equations form a system of coupled nonlinear partial differential equations that is solved numerically. We first describe the mathematical model. To realize the numerical simulation of our model, we proceed by a finite element discretization and then choose an appropriate resolution algorithm. We give numerical simulations obtained for different input scenarios in the case of a suicide-substrate reaction, which are compared with those obtained in the literature. These input scenarios have been chosen so as to provide an intuitive understanding of the dynamics of the model. By accessing the time and space domains, it is shown that interpreting the electrical potential of the cell membrane at steady state is incorrect. This model is general and applies to ions of any charge in space and time.
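
    The transport law at the core of the model is worth writing out: the Nernst-Planck flux J = -D(dc/dx + zF c/(RT) dφ/dx) combines diffusion with electromigration. A minimal 1D discretization with generic potassium-like parameters (not the paper's scheme, which couples this flux to a cable equation via finite elements):

        # One explicit step of the Nernst-Planck continuity equation on a 1D grid.
        # Profiles and constants are generic illustrative choices.
        import numpy as np

        F, R, T = 96485.0, 8.314, 310.0          # C/mol, J/(mol K), K
        D, z = 1.33e-9, 1                        # diffusivity (m^2/s), valence

        x = np.linspace(0.0, 1e-4, 200)          # 100 um domain
        dx = x[1] - x[0]
        c = 5.0 + 140.0 * x / x[-1]              # concentration profile (mol/m^3)
        phi = -0.08 * (1.0 - x / x[-1])          # potential profile (V)

        # Nernst-Planck flux: diffusion + electromigration
        J = -D * (np.gradient(c, dx) + z * F / (R * T) * c * np.gradient(phi, dx))

        dt = 1e-6                                # s; explicit Euler update of dc/dt = -dJ/dx
        c_new = c - dt * np.gradient(J, dx)
        print(c_new[:3])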

  19. Quantified visual scoring of metastatic melanoma patient treatment response using computed tomography: improving on the current standard.

    Science.gov (United States)

    Gottlieb, Ronald H; Krupinski, Elizabeth; Chalasani, Pavani; Cranmer, Lee

    2012-04-01

    To assess whether quantitative visual scoring (QVS) of CT is a better early predictor of progression-free survival (PFS) in patients on chemotherapy for metastatic melanoma than the currently used Response Evaluation Criteria in Solid Tumors (RECIST) standard. Retrospective evaluation of 65 consecutive patients with metastatic melanoma on treatment who had a baseline and a follow-up CT after two cycles of therapy. QVS was used to code imaging findings in the radiology reports, considering size change, brain metastases, new lesions, mixed lesion response, and the number of organ systems involved. RECIST 1.1 criteria placed patients in the progressive disease, stable disease, or partial response groups. Multiple regression analysis was used to correlate the various independent variables with PFS. The Cox proportional hazards ratio, median survival, and Kaplan-Meier curves of the different prognostic groups were calculated. QVS of size change was more sensitive in detecting deteriorating (57.1% versus 37.5%) and improving (23.8% versus 10.7%) patients, correlated more closely with median PFS for the deteriorating (1.8 versus 1.7 months), stable (5.6 versus 4.0 months), and improving (8.3 versus 5.5 months) categories, and was more predictive of PFS (Cox hazard ratio of 3.070 versus 1.860) than RECIST 1.1 categorization. Multiple regression analysis demonstrated that QVS of lesion size correlated most closely with PFS among the variables assessed (r = 0.519, p < ...) ... metastatic melanoma patients likely to have longer PFS.
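
    The survival machinery referenced here (Kaplan-Meier curves, log-rank comparison, Cox hazard ratios) follows a standard pattern; a sketch using the lifelines package on an invented PFS table (column names and values are stand-ins, not study data):

        # Kaplan-Meier, log-rank and Cox PH on toy progression-free survival data.
        import pandas as pd
        from lifelines import KaplanMeierFitter, CoxPHFitter
        from lifelines.statistics import logrank_test

        df = pd.DataFrame({
            "pfs_months": [1.8, 2.0, 5.6, 4.1, 8.3, 9.0, 3.2, 7.5],
            "progressed": [1, 1, 1, 0, 0, 1, 1, 0],
            "qvs_worse":  [1, 1, 0, 0, 0, 0, 1, 0],   # QVS size-change category
        })

        km = KaplanMeierFitter().fit(df["pfs_months"], df["progressed"])
        print(km.median_survival_time_)

        grp = df["qvs_worse"] == 1                     # log-rank: worse vs. rest
        lr = logrank_test(df.pfs_months[grp], df.pfs_months[~grp],
                          df.progressed[grp], df.progressed[~grp])
        print(lr.p_value)

        cph = CoxPHFitter().fit(df, duration_col="pfs_months", event_col="progressed")
        print(cph.hazard_ratios_)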

  20. A Large-Scale Computational Analysis of Corneal Structural Response and Ectasia Risk in Myopic Laser Refractive Surgery.

    Science.gov (United States)

    Dupps, William Joseph; Seven, Ibrahim

    2016-08-01

    To investigate biomechanical strain as a structural susceptibility metric for corneal ectasia in a large-scale computational trial. A finite element modeling study was performed using retrospective Scheimpflug tomography data from 40 eyes of 40 patients. LASIK and PRK were simulated with varied myopic ablation profiles and flap thickness parameters across eyes from LASIK candidates, patients disqualified for LASIK, subjects with atypical topography, and keratoconus subjects in 280 simulations. Finite element analysis output was then interrogated to extract several risk and outcome variables. We tested the hypothesis that strain is greater in known at-risk eyes than in normal eyes, evaluated the ability of a candidate strain variable to differentiate eyes that were empirically disqualified as LASIK candidates, and compared the performance of common risk variables as predictors of this novel susceptibility marker across multiple virtual subjects and surgeries. A candidate susceptibility metric that expressed mean strains across the anterior residual stromal bed was significantly higher in eyes with confirmed ectatic predisposition in preoperative and all postoperative cases (P≤.003). The strain metric was effective at differentiating normal and at-risk eyes (area under receiver operating characteristic curve ≥ 0.83, P≤.002), was highly correlated with thickness-based risk metrics (as high as R(2) = 95%, P < ...) ... ectasia risk, and provides a novel biomechanical construct for expressing structural risk in refractive surgery. Mechanical strain is an effective marker of known ectasia risk and correlates with predicted refractive error after myopic photoablative surgery.

  1. Structural biology response of a collagen hydrogel synthetic extracellular matrix with embedded human fibroblast: computational and experimental analysis.

    Science.gov (United States)

    Manzano, Sara; Moreno-Loshuertos, Raquel; Doblaré, Manuel; Ochoa, Ignacio; Hamdy Doweidar, Mohamed

    2015-08-01

    Adherent cells exert contractile forces which play an important role in the spatial organization of the extracellular matrix (ECM). Due to these forces, the substrate experiences a volume reduction leading to a characteristic shape. ECM contraction is a key step in many biological processes such as embryogenesis, morphogenesis and wound healing. However, little is known about the specific parameters that control this process. With this aim, we present a 3D computational model able to predict the contraction process of a hydrogel matrix due to cell-substrate mechanical interaction. It considers cell-generated forces, substrate deformation, ECM density, cellular migration and proliferation. The model also predicts the cellular spatial distribution and concentration needed to reproduce the contraction process, and confirms the minimum cellular concentration necessary to initiate the process observed experimentally. The obtained continuum formulation has been implemented in a finite element framework. In parallel, in vitro experiments have been performed to obtain the main model parameters and to validate the model. The results demonstrate that cellular forces, migration and proliferation act simultaneously to drive ECM contraction.

  2. How do trees grow? Response from the graphical and quantitative analyses of computed tomography scanning data collected on stem sections.

    Science.gov (United States)

    Dutilleul, Pierre; Han, Li Wen; Beaulieu, Jean

    2014-06-01

    Tree growth, as measured via the width of annual rings, is used for environmental impact assessment and climate back-forecasting. This fascinating natural process has been studied at various scales in the stem (from cell and fiber within a growth ring, to ring and entire stem) in one, two, and three dimensions. A new approach is presented to study tree growth in 3D from stem sections, at a scale sufficiently small to allow the delineation of reliable limits for annual rings and large enough to capture directional variation in growth rates. The technology applied is computed tomography scanning, which provides - for one stem section - millions of data points (indirect measures of wood density) that can be mapped, together with a companion measure of dispersion and growth ring limits in filigree. Graphical and quantitative analyses are reported for white spruce trees with circular vs non-circular growth. Implications for dendroclimatological research are discussed. Copyright © 2014 Académie des sciences. Published by Elsevier SAS. All rights reserved.

  3. Quantitative Assessment of the Physiological Parameters Influencing QT Interval Response to Medication: Application of Computational Intelligence Tools.

    Science.gov (United States)

    Polak, Sebastian; Wiśniowska, Barbara; Mendyk, Aleksander; Pacławski, Adam; Szlęk, Jakub

    2018-01-01

    Human heart electrophysiology is a complex biological phenomenon, which is indirectly assessed by the measured ECG signal. The ECG trace is further analyzed to derive interpretable surrogates including the QT interval, QRS complex, PR interval, and T wave morphology. The QT interval and its modification are the most commonly used surrogates of drug-triggered arrhythmia, but it is known that the QT interval itself is determined by other, non-drug-related parameters, both physiological and pathological. In the current study, we used computational intelligence algorithms to analyze correlations between various simulated physiological parameters and the QT interval. Terfenadine given concomitantly with 8 enzymatic inhibitors was used as an example. The equation developed with the use of the genetic programming technique leads to general reasoning about the changes in the prolonged QT. For small changes of the QT interval, the drug-related IKr and ICa current inhibition potentials have the major impact. The physiological parameters such as body surface area and potassium, sodium, and calcium ion concentrations are negligible. The influence of the physiological variables increases gradually with more pronounced changes in QT. As significant QT prolongation is associated with drug-triggered arrhythmia risk, analysis of the role of physiological parameters influencing the ECG seems to be advisable.
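
    The genetic programming step described, evolving an explicit equation linking inputs to QT change, is a form of symbolic regression; one open-source route is the gplearn package. A hedged sketch on a synthetic dataset (the toy target function below merely stands in for the simulated physiology):

        # Symbolic regression via genetic programming on synthetic data.
        import numpy as np
        from gplearn.genetic import SymbolicRegressor

        rng = np.random.default_rng(0)
        X = rng.uniform(-1, 1, (200, 3))                 # e.g. scaled inhibition/ion inputs
        y = 0.8 * X[:, 0] + 0.3 * X[:, 1] * X[:, 2]      # hidden "QT change" relation

        sr = SymbolicRegressor(population_size=500, generations=10,
                               function_set=("add", "sub", "mul"),
                               random_state=0)
        sr.fit(X, y)
        print(sr._program)                               # evolved symbolic expression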

  4. Predicting response to neoadjuvant chemotherapy in primary breast cancer using volumetric helical perfusion computed tomography: a preliminary study

    International Nuclear Information System (INIS)

    Li, Sonia P.; Makris, Andreas; Gogbashian, Andrew; Simcock, Ian C.; Stirling, J.J.; Goh, Vicky

    2012-01-01

    To investigate whether CT-derived vascular parameters in primary breast cancer predict complete pathological response (pCR) to neoadjuvant chemotherapy (NAC). Twenty prospective patients with primary breast cancer due for NAC underwent volumetric helical perfusion CT to derive whole tumour regional blood flow (BF), blood volume (BV) and flow extraction product (FE) by deconvolution analysis. A pCR was achieved if no residual invasive cancer was detectable on pathological examination. Relationships between baseline BF, BV, FE, tumour size and volume, and pCR were examined using the Mann-Whitney U test. Receiver operating characteristic (ROC) curve analysis was performed to assess the parameter best able to predict response. Intra- and inter-observer variability was assessed using Bland-Altman statistics. Seventeen out of 20 patients completed NAC with four achieving a pCR. Baseline BF and FE were higher in patients who achieved a pCR compared with those who did not (P = 0.032); tumour size and volume were not significantly different (P > 0.05). ROC analysis revealed that BF and FE were able to identify responders effectively (AUC = 0.87; P = 0.03). There was good intra- and inter-observer agreement. Primary breast cancers which exhibited higher levels of perfusion before treatment were more likely to achieve a pCR to NAC. (orig.)

  5. Predicting response to neoadjuvant chemotherapy in primary breast cancer using volumetric helical perfusion computed tomography: a preliminary study

    Energy Technology Data Exchange (ETDEWEB)

    Li, Sonia P.; Makris, Andreas [Academic Oncology Unit, Mount Vernon Cancer Centre, Middlesex (United Kingdom); Gogbashian, Andrew; Simcock, Ian C.; Stirling, J.J. [Paul Strickland Scanner Centre, Mount Vernon Cancer Centre, Middlesex (United Kingdom); Goh, Vicky [Paul Strickland Scanner Centre, Mount Vernon Cancer Centre, Middlesex (United Kingdom); Lambeth Wing, St Thomas' Hospital, Division of Imaging Sciences, Kings College London, London (United Kingdom)

    2012-09-15

    To investigate whether CT-derived vascular parameters in primary breast cancer predict complete pathological response (pCR) to neoadjuvant chemotherapy (NAC). Twenty prospective patients with primary breast cancer due for NAC underwent volumetric helical perfusion CT to derive whole tumour regional blood flow (BF), blood volume (BV) and flow extraction product (FE) by deconvolution analysis. A pCR was achieved if no residual invasive cancer was detectable on pathological examination. Relationships between baseline BF, BV, FE, tumour size and volume, and pCR were examined using the Mann-Whitney U test. Receiver operating characteristic (ROC) curve analysis was performed to assess the parameter best able to predict response. Intra- and inter-observer variability was assessed using Bland-Altman statistics. Seventeen out of 20 patients completed NAC with four achieving a pCR. Baseline BF and FE were higher in patients who achieved a pCR compared with those who did not (P = 0.032); tumour size and volume were not significantly different (P > 0.05). ROC analysis revealed that BF and FE were able to identify responders effectively (AUC = 0.87; P = 0.03). There was good intra- and inter-observer agreement. Primary breast cancers which exhibited higher levels of perfusion before treatment were more likely to achieve a pCR to NAC. (orig.)
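
    The statistical workflow reported, a Mann-Whitney U comparison of baseline parameters between response groups followed by ROC analysis, is compact enough to sketch; the blood-flow values below are fabricated placeholders, not study data:

        # Mann-Whitney U comparison of baseline blood flow, then ROC analysis.
        import numpy as np
        from scipy.stats import mannwhitneyu
        from sklearn.metrics import roc_auc_score

        bf_pcr    = np.array([58.0, 61.2, 70.5, 66.3])               # pCR group
        bf_no_pcr = np.array([31.1, 44.0, 39.5, 28.7, 35.2, 42.8,
                              30.9, 37.4, 33.6, 41.1, 29.8, 36.5, 40.2])

        u, p = mannwhitneyu(bf_pcr, bf_no_pcr, alternative="two-sided")

        labels = np.r_[np.ones(bf_pcr.size), np.zeros(bf_no_pcr.size)]
        auc = roc_auc_score(labels, np.r_[bf_pcr, bf_no_pcr])
        print(f"U={u:.0f}, p={p:.4f}, AUC={auc:.2f}")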

  6. X-Ray Computed Tomography: Semiautomated Volumetric Analysis of Late-Stage Lung Tumors as a Basis for Response Assessments

    Science.gov (United States)

    Bendtsen, C.; Kietzmann, M.; Korn, R.; Mozley, P. D.; Schmidt, G.; Binnig, G.

    2011-01-01

    Background. This study presents a semiautomated approach for volumetric analysis of lung tumors and evaluates the feasibility of using volumes as an alternative to line lengths as a basis for response evaluation criteria in solid tumors (RECIST). The overall goal for the implementation was to accurately, precisely, and efficiently enable the analyses of lesions in the lung under the guidance of an operator. Methods. An anthropomorphic phantom with embedded model masses and 71 time points in 10 clinical cases with advanced lung cancer was analyzed using a semi-automated workflow. The implementation was done using the Cognition Network Technology. Results. Analysis of the phantom showed an average accuracy of 97%. The analyses of the clinical cases showed both intra- and interreader variabilities of approximately 5% on average with an upper 95% confidence interval of 14% and 19%, respectively. Compared to line lengths, the use of volumes clearly shows enhanced sensitivity with respect to determining response to therapy. Conclusions. It is feasible to perform volumetric analysis efficiently with high accuracy and low variability, even in patients with late-stage cancer who have complex lesions. PMID:21747819

  7. X-Ray Computed Tomography: Semiautomated Volumetric Analysis of Late-Stage Lung Tumors as a Basis for Response Assessments

    Directory of Open Access Journals (Sweden)

    C. Bendtsen

    2011-01-01

    Full Text Available Background. This study presents a semiautomated approach for volumetric analysis of lung tumors and evaluates the feasibility of using volumes as an alternative to line lengths as a basis for response evaluation criteria in solid tumors (RECIST). The overall goal for the implementation was to accurately, precisely, and efficiently enable the analyses of lesions in the lung under the guidance of an operator. Methods. An anthropomorphic phantom with embedded model masses and 71 time points in 10 clinical cases with advanced lung cancer was analyzed using a semi-automated workflow. The implementation was done using the Cognition Network Technology. Results. Analysis of the phantom showed an average accuracy of 97%. The analyses of the clinical cases showed both intra- and interreader variabilities of approximately 5% on average with an upper 95% confidence interval of 14% and 19%, respectively. Compared to line lengths, the use of volumes clearly shows enhanced sensitivity with respect to determining response to therapy. Conclusions. It is feasible to perform volumetric analysis efficiently with high accuracy and low variability, even in patients with late-stage cancer who have complex lesions.

  8. The electronic response of pristine, Al and Si doped BC2N nanotubes to a cathinone molecule: Computational study

    Science.gov (United States)

    Nejati, Kamellia; Vessally, Esmail; Delir Kheirollahi Nezhad, Parvaneh; Mofid, Hadi; Bekhradnia, Ahmadreza

    2017-12-01

    Cathinone (CT) is a psychoactive drug whose abuse has been linked to several deaths worldwide. Here, we investigated the electronic response of BC2N nanotubes to the CT drug, using density functional theory calculations. Our results indicate that the CT drug is adsorbed on the pristine tube via its -NH2 group with an adsorption energy of about -14.6 kcal/mol but with no electronic response. To overcome this problem, we doped the tube with an Al or Si atom. Both the Al and Si dopants increase the tube sensitivity and strengthen the interaction. Our calculations demonstrate that despite the high sensitivity of the Al-doped BC2N nanotube to the CT drug, it suffers from a very long recovery time, which makes it unsuitable for application in CT sensors. The calculated recovery time for the Si-doped BC2N nanotube, however, is predicted to be about 0.27 s, which is short and desirable. Also, we showed that the Si-doped tube can be used in humid conditions and in the presence of gases including H2, O2, N2, and CO2. It was concluded that Si-doped BC2N nanotubes may be promising candidates for application in CT sensors, benefiting from a short recovery time, high sensitivity, and selectivity.
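
    Recovery times like the 0.27 s quoted above are conventionally estimated from transition-state theory as tau = nu0^-1 exp(|E_ads|/kT), with an assumed attempt frequency nu0. A sketch of that estimate (the adsorption energies below are illustrative, not the paper's values):

        # Transition-state estimate of sensor recovery time from adsorption energy.
        import numpy as np

        k_B = 8.617e-5            # Boltzmann constant, eV/K
        nu0 = 1.0e12              # attempt frequency, 1/s (typical assumption)
        T = 298.0                 # temperature, K

        def recovery_time(E_ads_ev):
            """tau = nu0^-1 * exp(|E_ads| / kT)."""
            return np.exp(abs(E_ads_ev) / (k_B * T)) / nu0

        for E in (-0.40, -0.63, -0.90):       # illustrative adsorption energies (eV)
            print(f"E_ads={E:5.2f} eV  ->  tau={recovery_time(E):.2e} s")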

  9. Viable tumor volume: Volume of interest within segmented metastatic lesions, a pilot study of proposed computed tomography response criteria for urothelial cancer

    Energy Technology Data Exchange (ETDEWEB)

    Folio, Les Roger, E-mail: Les.folio@nih.gov [Lead Radiologist for CT, NIH Radiology and Imaging Sciences, 10 Center Drive, Bethesda, MD 20892 (United States); Turkbey, Evrim B., E-mail: evrimbengi@yahoo.com [Johns Hopkins University, Baltimore, MD 21218 (United States); Steinberg, Seth M., E-mail: steinbes@mail.nih.gov [Head, Biostatistics and Data Management Section, Office of the Clinical Director, Center for Cancer Research, National Cancer Institute, 9609 Medical Center Drive, Room 2W334, MSC 9716, Bethesda, MD 20892 (United States); Apolo, Andrea B. [Genitourinary Malignancies Branch, National Cancer Institute, National Institutes of Health, 10 Center Drive, Bethesda, MD 20892 (United States)

    2015-09-15

    Highlights: • It is clear that 2D axial measurements are incomplete assessments of metastatic disease, especially in light of evolving antiangiogenic therapies that can result in tumor necrosis. • Our pilot study demonstrates that taking volumetric density into account can better predict overall survival when compared to RECIST, volumetric size, MASS and Choi. • Although volumetric segmentation and further density analysis may not yet be feasible within routine workflows, the authors believe that technology advances may soon make this possible. - Abstract: Objectives: To evaluate the ability of new computed tomography (CT) response criteria for solid tumors such as urothelial cancer (VTV; viable tumor volume) to predict overall survival (OS) in patients with metastatic bladder cancer treated with cabozantinib. Materials and methods: We compared the relative capabilities of VTV, RECIST, MASS (morphology, attenuation, size, and structure), and Choi criteria, as well as volume measurements, to predict OS using serial follow-up contrast-enhanced CT exams in patients with metastatic urothelial carcinoma. Kaplan–Meier curves and 2-tailed log-rank tests compared OS based on early RECIST 1.1 response against each of the other criteria. A Cox proportional hazards model assessed response at follow-up exams as a time-varying covariate for OS. Results: We assessed 141 lesions in 55 CT scans from 17 patients with urothelial metastasis, comparing VTV, RECIST, MASS, and Choi criteria, and volumetric measurements, for response assessment. Median follow-up was 4.5 months (range, 2–14 months). Only the VTV criteria demonstrated a statistical association with OS (p = 0.019; median OS 9.7 vs. 3.5 months). Conclusion: This pilot study suggests that VTV is a promising tool for assessing tumor response and predicting OS, using criteria that incorporate tumor volume and density in patients receiving antiangiogenic therapy for urothelial cancer. Larger studies are warranted to

  10. Computational models can predict response to HIV therapy without a genotype and may reduce treatment failure in different resource-limited settings.

    Science.gov (United States)

    Revell, A D; Wang, D; Wood, R; Morrow, C; Tempelman, H; Hamers, R L; Alvarez-Uria, G; Streinu-Cercel, A; Ene, L; Wensing, A M J; DeWolf, F; Nelson, M; Montaner, J S; Lane, H C; Larder, B A

    2013-06-01

    Genotypic HIV drug-resistance testing is typically 60%-65% predictive of response to combination antiretroviral therapy (ART) and is valuable for guiding treatment changes. Genotyping is unavailable in many resource-limited settings (RLSs). We aimed to develop models that can predict response to ART without a genotype and evaluated their potential as a treatment support tool in RLSs. Random forest models were trained to predict the probability of response to ART (≤400 copies HIV RNA/mL) using the following data from 14 891 treatment change episodes (TCEs) after virological failure, from well-resourced countries: viral load and CD4 count prior to treatment change, treatment history, drugs in the new regimen, time to follow-up and follow-up viral load. Models were assessed by cross-validation during development, with an independent set of 800 cases from well-resourced countries, plus 231 cases from Southern Africa, 206 from India and 375 from Romania. The area under the receiver operating characteristic curve (AUC) was the main outcome measure. The models achieved an AUC of 0.74-0.81 during cross-validation and 0.76-0.77 with the 800 test TCEs. They achieved AUCs of 0.58-0.65 (Southern Africa), 0.63 (India) and 0.70 (Romania). Models were more accurate for data from the well-resourced countries than for cases from Southern Africa and India (P < 0.001), but not Romania. The models identified alternative, available drug regimens predicted to result in virological response for 94% of virological failures in Southern Africa, 99% of those in India and 93% of those in Romania. We developed computational models that predict virological response to ART without a genotype with comparable accuracy to genotyping with rule-based interpretation. These models have the potential to help optimize antiretroviral therapy for patients in RLSs where genotyping is not generally available.
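
    The modelling approach, a random forest trained on baseline markers and regimen composition and judged by AUC, can be sketched with scikit-learn; features and outcomes below are synthetic stand-ins for the TCE records:

        # Random-forest prediction of virological response, evaluated by AUC.
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.metrics import roc_auc_score
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(1)
        n = 1000
        X = np.column_stack([
            rng.normal(4.5, 0.8, n),      # log10 baseline viral load
            rng.normal(250, 120, n),      # baseline CD4 count
            rng.integers(2, 5, n),        # drugs in the new regimen
            rng.integers(0, 12, n),       # prior treatment changes
        ])
        p_resp = 1 / (1 + np.exp(1.5 * (X[:, 0] - 4.5) - 0.004 * (X[:, 1] - 250)))
        y = rng.random(n) < p_resp        # response = viral load suppressed

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
        model = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_tr, y_tr)
        print(roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]))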

  11. Computed-tomography-based finite-element models of long bones can accurately capture strain response to bending and torsion.

    Science.gov (United States)

    Varghese, Bino; Short, David; Penmetsa, Ravi; Goswami, Tarun; Hangartner, Thomas

    2011-04-29

    Finite element (FE) models of long bones constructed from computed-tomography (CT) data are emerging as an invaluable tool in the field of bone biomechanics. However, the performance of such FE models is highly dependent on accurate capture of the geometry and appropriate assignment of material properties. In this work, a combined numerical-experimental study was performed comparing FE-predicted surface strains with strain-gauge measurements. Thirty-six major cadaveric long bones (humerus, radius, femur and tibia), covering a wide range of bone sizes, were tested under three-point bending and torsion. The FE models were constructed from trans-axial volumetric CT scans, and the segmented bone images were corrected for partial-volume effects. The material properties (Young's modulus for cortex, density-modulus relationship for trabecular bone and Poisson's ratio) were calibrated by minimizing the error between experiments and simulations among all bones. The R(2) values of the measured strains versus load under three-point bending and torsion were 0.96-0.99 and 0.61-0.99, respectively, for all bones in our dataset. The errors of the calculated FE strains in comparison to those measured using strain gauges in the mechanical tests ranged from -6% to 7% under bending and from -37% to 19% under torsion. The observation of comparatively low errors and high correlations between the FE-predicted strains and the experimental strains, across the various types of bones and loading conditions (bending and torsion), validates our approach to bone segmentation and our choice of material properties. Copyright © 2011 Elsevier Ltd. All rights reserved.
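
    The material-assignment step such models depend on typically runs CT numbers through a linear Hounsfield-to-density calibration and then a power-law density-modulus relationship. A sketch with literature-style coefficients (assumed values, not the study's calibrated ones):

        # CT number -> apparent density -> Young's modulus pipeline.
        import numpy as np

        hu = np.array([150.0, 400.0, 900.0, 1400.0])      # voxel CT numbers

        rho = 0.0008 * hu + 0.1                           # assumed linear calibration, g/cm^3

        a, b = 6850.0, 1.49                               # power law E = a * rho**b (MPa),
        E = a * rho**b                                    # literature-style coefficients

        for h, r, e in zip(hu, rho, E):
            print(f"HU={h:6.0f}  rho={r:4.2f} g/cm^3  E={e:7.0f} MPa")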

  12. Impact of multidetector computed tomography on the diagnosis and treatment of patients with systemic inflammatory response syndrome or sepsis

    International Nuclear Information System (INIS)

    Schleder, S.; Luerken, L.; Dendl, L.M.; Stroszczynski, C.; Schreyer, A.G.; Redel, A.; Selgrad, M.; Renner, P.

    2017-01-01

    To evaluate the impact of CT scans on diagnosis or change of therapy in patients with systemic inflammatory response syndrome (SIRS) or sepsis and obscure clinical infection. CT records of patients with obscure clinical infection and SIRS or sepsis were retrospectively evaluated. Both confirmation of and changes in the diagnosis or therapy based on CT findings were analysed by means of the hospital information system and radiological information system. A sub-group analysis included differences with regard to anatomical region, medical history and referring department. Of 525 consecutive patients evaluated, 59% had been referred from internal medicine and 41% from surgery. CT examination had confirmed the suspected diagnosis in 26% and had resulted in a different diagnosis in 33% and a change of therapy in 32%. Abdominal scans yielded a significantly higher (p=0.013) change of therapy rate (42%) than thoracic scans (22%). Therapy was changed significantly more often (p=0.016) in surgical patients (38%) than in patients referred from internal medicine (28%). CT examination for detecting an unknown infection focus in patients with SIRS or sepsis is highly beneficial and should be conducted in patients with obscure clinical infection. (orig.)

  13. Impact of multidetector computed tomography on the diagnosis and treatment of patients with systemic inflammatory response syndrome or sepsis

    Energy Technology Data Exchange (ETDEWEB)

    Schleder, S.; Luerken, L.; Dendl, L.M.; Stroszczynski, C.; Schreyer, A.G. [University Medical Centre Regensburg, Department of Radiology, Regensburg (Germany); Redel, A. [University Medical Centre Regensburg, Department of Anaesthesiology, Regensburg (Germany); Selgrad, M. [University Medical Centre Regensburg, Department of Internal Medicine I, Regensburg (Germany); Renner, P. [University Medical Centre Regensburg, Department of Surgery, Regensburg (Germany)

    2017-11-15

    To evaluate the impact of CT scans on diagnosis or change of therapy in patients with systemic inflammatory response syndrome (SIRS) or sepsis and obscure clinical infection. CT records of patients with obscure clinical infection and SIRS or sepsis were retrospectively evaluated. Both confirmation of and changes in the diagnosis or therapy based on CT findings were analysed by means of the hospital information system and radiological information system. A sub-group analysis included differences with regard to anatomical region, medical history and referring department. Of 525 consecutive patients evaluated, 59% had been referred from internal medicine and 41% from surgery. CT examination had confirmed the suspected diagnosis in 26% and had resulted in a different diagnosis in 33% and a change of therapy in 32%. Abdominal scans yielded a significantly higher (p=0.013) change of therapy rate (42%) than thoracic scans (22%). Therapy was changed significantly more often (p=0.016) in surgical patients (38%) than in patients referred from internal medicine (28%). CT examination for detecting an unknown infection focus in patients with SIRS or sepsis is highly beneficial and should be conducted in patients with obscure clinical infection. (orig.)

  14. Computational mechanics

    Energy Technology Data Exchange (ETDEWEB)

    Goudreau, G.L.

    1993-03-01

    The Computational Mechanics thrust area sponsors research into the underlying solid, structural and fluid mechanics and heat transfer necessary for the development of state-of-the-art general purpose computational software. The scale of computational capability spans office workstations, departmental computer servers, and Cray-class supercomputers. The DYNA, NIKE, and TOPAZ codes have achieved world fame through our broad collaborators program, in addition to their strong support of on-going Lawrence Livermore National Laboratory (LLNL) programs. Several technology transfer initiatives have been based on these established codes, teaming LLNL analysts and researchers with counterparts in industry, extending code capability to specific industrial interests of casting, metalforming, and automobile crash dynamics. The next-generation solid/structural mechanics code, ParaDyn, is targeted toward massively parallel computers, which will extend performance from gigaflop to teraflop power. Our work for FY-92 is described in the following eight articles: (1) Solution Strategies: New Approaches for Strongly Nonlinear Quasistatic Problems Using DYNA3D; (2) Enhanced Enforcement of Mechanical Contact: The Method of Augmented Lagrangians; (3) ParaDyn: New Generation Solid/Structural Mechanics Codes for Massively Parallel Processors; (4) Composite Damage Modeling; (5) HYDRA: A Parallel/Vector Flow Solver for Three-Dimensional, Transient, Incompressible Viscous Flow; (6) Development and Testing of the TRIM3D Radiation Heat Transfer Code; (7) A Methodology for Calculating the Seismic Response of Critical Structures; and (8) Reinforced Concrete Damage Modeling.

  15. Do Interviewers' Health Beliefs and Habits Modify Responses to Sensitive Questions? A study using Data Collected from Pregnant women by Means of Computer-assisted Telephone Interviews

    DEFF Research Database (Denmark)

    Andersen, Anne-Marie Nybo; Olsen, Jørn

    2002-01-01

    If interviewers' personal habits or attitudes influence respondents' answers to given questions, this may lead to bias, which should be taken into consideration when analyzing data. The authors examined a potential interviewer effect in a study of pregnant women in which exposure data were obtained ... of pregnancy were collected during the time period October 1, 1997-February 1, 1999. Overall, the authors found little evidence to suggest that interviewers' personal habits or attitudes toward smoking and alcohol consumption during pregnancy had consequences for the responses they obtained; neither did the interviewers' education, age, or parity correlate with the answers they obtained. In these data gathered through computer-assisted telephone interviews, interviewer effects arising from variations in interviewers' health beliefs and personal habits were found to be negligible. Thorough training ...

  16. NUMEL: a computer aided design suite for the assessment of the steady state, static/dynamic stability and transient responses of nuclear steam generators

    International Nuclear Information System (INIS)

    Rowe, D.; Lightfoot, P.

    1988-02-01

    NUMEL is a computer aided design suite for the assessment of the steady state, static/dynamic stability and transient responses of nuclear steam generators. The equations solved are those of a monotube coflow or counterflow heat exchanger. The advantages of NUMEL are its fast execution speed, robustness, extensive validation and flexibility coupled with ease of use. The code can simultaneously model up to four separate sections (e.g. reheater, HP boiler). This document is a user manual and describes in detail the running of the NUMEL suite. In addition, a discussion is presented of the necessary approximations involved in representing a serpentine or helical AGR boiler as a monotube counterflow heat exchanger. To date, NUMEL has been applied to the modelling of AGR, Fast Reactor and once through Magnox and conventional boilers. Other versions of the code are available for specialist applications, e.g. Magnox and conventional recirculation boilers. (author)

  17. CONTEMPT-LT/028: a computer program for predicting containment pressure-temperature response to a loss-of-coolant accident

    International Nuclear Information System (INIS)

    Hargroves, D.W.; Metcalfe, L.J.; Wheat, L.L.; Niederauer, G.F.; Obenchain, C.F.

    1979-03-01

    CONTEMPT-LT is a digital computer program, written in FORTRAN IV, developed to describe the long-term behavior of water-cooled nuclear reactor containment systems subjected to postulated loss-of-coolant accident (LOCA) conditions. The program calculates the time variation of compartment pressures, temperatures, mass and energy inventories, heat structure temperature distributions, and energy exchange with adjacent compartments. The program is capable of describing the effects of leakage on containment response. Models are provided to describe fan cooler and cooling spray engineered safety systems. An annular fan model is also provided to model pressure control in the annular region of dual containment systems. Up to four compartments can be modeled with CONTEMPT-LT, and any compartment except the reactor system may have both a liquid pool region and an air-vapor atmosphere region above the pool. Each region is assumed to have a uniform temperature, but the temperatures of the two regions may be different.

  18. Babcock and Wilcox revisions to CONTEMPT, computer program for predicting containment pressure-temperature response to a loss-of-coolant accident

    International Nuclear Information System (INIS)

    Hsii, Y.H.

    1975-01-01

    The CONTEMPT computer program predicts the pressure-temperature response of a single-volume reactor building to a loss-of-coolant accident. The analytical model used for the program is described. CONTEMPT assumes that the loss-of-coolant accident can be separated into two phases; the primary system blowdown and reactor building pressurization. The results of the blowdown analysis serve as the boundary conditions and are input to the CONTEMPT program. Thus, the containment model is only concerned with the pressure and temperature in the reactor building and the temperature distribution through the reactor building structures. The program also calculates building leakage and the effects of engineered safety features such as reactor building sprays, decay heat coolers, sump coolers, etc. 11 references. (U.S.)

  19. Non-linear least squares curve fitting of a simple theoretical model to radioimmunoassay dose-response data using a mini-computer

    International Nuclear Information System (INIS)

    Wilkins, T.A.; Chadney, D.C.; Bryant, J.; Palmstroem, S.H.; Winder, R.L.

    1977-01-01

    Using the simple univalent antigen univalent-antibody equilibrium model the dose-response curve of a radioimmunoassay (RIA) may be expressed as a function of Y, X and the four physical parameters of the idealised system. A compact but powerful mini-computer program has been written in BASIC for rapid iterative non-linear least squares curve fitting and dose interpolation with this function. In its simplest form the program can be operated in an 8K byte mini-computer. The program has been extensively tested with data from 10 different assay systems (RIA and CPBA) for measurement of drugs and hormones ranging in molecular size from thyroxine to insulin. For each assay system the results have been analysed in terms of (a) curve fitting biases and (b) direct comparison with manual fitting. In all cases the quality of fitting was remarkably good in spite of the fact that the chemistry of each system departed significantly from one or more of the assumptions implicit in the model used. A mathematical analysis of departures from the model's principal assumption has provided an explanation for this somewhat unexpected observation. The essential features of this analysis are presented in this paper together with the statistical analyses of the performance of the program. From these and the results obtained to date in the routine quality control of these 10 assays, it is concluded that the method of curve fitting and dose interpolation presented in this paper is likely to be of general applicability. (orig.) [de
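
    The fitting-and-interpolation loop the program implements maps directly onto modern tools. A sketch with scipy, using the common four-parameter logistic as a stand-in for the paper's equilibrium-model function (data points are fabricated):

        # Non-linear least squares fit of a dose-response curve, then dose
        # interpolation by inverting the fitted function.
        import numpy as np
        from scipy.optimize import curve_fit

        def four_pl(x, a, b, c, d):
            """Response = d + (a - d) / (1 + (x / c)**b)."""
            return d + (a - d) / (1.0 + (x / c)**b)

        dose = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0, 100.0])    # arbitrary units
        resp = np.array([95.0, 92.0, 80.0, 55.0, 28.0, 12.0, 6.0])  # % bound (fabricated)

        popt, pcov = curve_fit(four_pl, dose, resp, p0=(100.0, 1.0, 3.0, 5.0))

        a, b, c, d = popt                     # interpolate the dose at a response of 40%
        y = 40.0
        x = c * ((a - d) / (y - d) - 1.0)**(1.0 / b)
        print(f"fitted params {popt.round(2)}, interpolated dose {x:.2f}")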

  20. Assessment of health and economic effects by PM2.5 pollution in Beijing: a combined exposure-response and computable general equilibrium analysis.

    Science.gov (United States)

    Wang, Guizhi; Gu, SaiJu; Chen, Jibo; Wu, Xianhua; Yu, Jun

    2016-12-01

    Assessment of the health and economic impacts of PM2.5 pollution is of great importance for urban air pollution prevention and control. In this study, we evaluate the damage caused by PM2.5 pollution using Beijing as an example. First, we use exposure-response functions to estimate the adverse health effects attributable to PM2.5 pollution. Then, the corresponding labour loss and excess medical expenditure are computed as two conducting variables. Finally, departing from conventional valuation methods, this paper introduces the two conducting variables into a computable general equilibrium (CGE) model to assess the impacts of PM2.5 pollution on individual sectors and on the whole economic system. The results show that PM2.5 pollution caused substantial health effects among Beijing residents in 2013, including 20,043 premature deaths and about one million other related medical cases. Correspondingly, using the 2010 social accounting data, the loss to Beijing's gross domestic product due to the health impact of PM2.5 pollution is estimated at 1286.97 (95% CI: 488.58-1936.33) million RMB. This demonstrates that PM2.5 pollution not only has adverse health effects, but also causes substantial economic losses.
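
    The first stage, an exposure-response calculation, is compact enough to sketch. Below is a generic log-linear formulation with assumed coefficients (not the study's inputs); its output is the kind of case count that is then converted into labour loss and medical expenditure for the CGE model:

        # Log-linear exposure-response estimate of PM2.5-attributable cases.
        import numpy as np

        pop = 21.0e6                 # exposed population (assumed)
        base_rate = 5.0e-3           # baseline annual incidence of the endpoint (assumed)
        beta = 0.0004                # ER coefficient per ug/m^3 (assumed)
        c, c0 = 89.5, 10.0           # observed and reference PM2.5 (ug/m^3)

        rr = np.exp(beta * (c - c0))             # relative risk at concentration c
        attributable_fraction = (rr - 1.0) / rr
        cases = pop * base_rate * attributable_fraction
        print(f"RR={rr:.3f}, attributable cases ~ {cases:,.0f}")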

  1. Modeling tumor growth and irradiation response in vitro--a combination of high-performance computing and web-based technologies including VRML visualization.

    Science.gov (United States)

    Stamatakos, G S; Zacharaki, E I; Makropoulou, M I; Mouravliansky, N A; Marsh, A; Nikita, K S; Uzunoglu, N K

    2001-12-01

    A simplified three-dimensional Monte Carlo simulation model of in vitro tumor growth and response to fractionated radiotherapeutic schemes is presented in this paper. The paper aims at both the optimization of radiotherapy and the provision of insight into the biological mechanisms involved in tumor development. The basics of the modeling philosophy of Duechting have been adopted and substantially extended. The main processes taken into account by the model are the transitions between the cell cycle phases, the diffusion of oxygen and glucose, and the cell survival probabilities following irradiation. Specific algorithms satisfactorily describing tumor expansion and shrinkage have been applied, whereas a novel approach to the modeling of the tumor response to irradiation has been proposed and implemented. High-performance computing systems in conjunction with Web technologies have coped with the particularly high computer memory and processing demands. A visualization system based on the MATLAB software package and the virtual-reality modeling language has been employed. Its utilization has led to a spectacular representation of both the external surface and the internal structure of the developing tumor. The simulation model has been applied to the special case of small cell lung carcinoma in vitro irradiated according to both the standard and accelerated fractionation schemes. A good qualitative agreement with laboratory experience has been observed in all cases. Accordingly, the hypothesis that advanced simulation models for the in silico testing of tumor irradiation schemes could substantially enhance the radiotherapy optimization process is further strengthened. Currently, our group is investigating extensions of the presented algorithms so that efficient descriptions of the corresponding clinical (in vivo) cases are achieved.
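
    The irradiation-response step of such models usually rests on linear-quadratic cell survival. A minimal Monte Carlo sketch of that step (generic alpha/beta values, not those used in the paper):

        # Per-fraction linear-quadratic survival applied cell-by-cell (Monte Carlo).
        import numpy as np

        rng = np.random.default_rng(42)
        alpha, beta = 0.3, 0.03            # Gy^-1, Gy^-2 (generic tumour values)
        d = 2.0                            # Gy per fraction (standard scheme)
        n_cells = 1_000_000

        s = np.exp(-alpha * d - beta * d**2)          # survival probability per fraction
        alive = np.ones(n_cells, dtype=bool)
        for _ in range(10):                           # 10 fractions of 2 Gy
            alive &= rng.random(n_cells) < s

        print(f"simulated surviving fraction: {alive.mean():.2e} (analytic {s**10:.2e})")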

  2. Activation and inhibition of retinal ganglion cells in response to epiretinal electrical stimulation: a computational modelling study

    Science.gov (United States)

    Abramian, Miganoosh; Lovell, Nigel H.; Morley, John W.; Suaning, Gregg J.; Dokos, Socrates

    2015-02-01

    Objective. Retinal prosthetic devices aim to restore sight in visually impaired people by means of electrical stimulation of surviving retinal ganglion cells (RGCs). This modelling study aims to demonstrate that RGC inhibition caused by high-intensity cathodic pulses greatly influences their responses to epiretinal electrical stimulation and to investigate the impact of this inhibition on spatial activation profiles as well as their implications for retinal prosthetic device design. Another aim is to take advantage of this inhibition to reduce axonal activation in the nerve fibre layer. Approach. A three-dimensional finite-element model of epiretinal electrical stimulation was utilized to obtain RGC activation and inhibition threshold profiles for a range of parameters. Main results. RGC activation and inhibition thresholds were highly dependent on cell and stimulus parameters. Activation thresholds were 1.5, 3.4 and 11.3 μA for monopolar electrodes with 5, 20 and 50 μm radii, respectively. Inhibition to activation threshold ratios were mostly within the range 2-10. Inhibition significantly altered spatial patterns of RGC activation. With concentric electrodes and appropriately high levels of stimulus amplitudes, activation of passing axons was greatly reduced. Significance. RGC inhibition significantly impacts their spatial activation profiles, and therefore it most likely influences patterns of perceived phosphenes induced by retinal prosthetic devices. Thus this inhibition should be taken into account in future studies concerning retinal prosthesis development. It might be possible to utilize this inhibitory effect to bypass activation of passing axons and selectively stimulate RGCs near their somas and dendrites to achieve more localized phosphenes.

  3. Integrin-Targeted Hybrid Fluorescence Molecular Tomography/X-ray Computed Tomography for Imaging Tumor Progression and Early Response in Non-Small Cell Lung Cancer.

    Science.gov (United States)

    Ma, Xiaopeng; Phi Van, Valerie; Kimm, Melanie A; Prakash, Jaya; Kessler, Horst; Kosanke, Katja; Feuchtinger, Annette; Aichler, Michaela; Gupta, Aayush; Rummeny, Ernst J; Eisenblätter, Michel; Siveke, Jens; Walch, Axel K; Braren, Rickmer; Ntziachristos, Vasilis; Wildgruber, Moritz

    2017-01-01

    Integrins play an important role in tumor progression, invasion and metastasis. Therefore we aimed to evaluate a preclinical imaging approach applying αvβ3 integrin-targeted hybrid Fluorescence Molecular Tomography/X-ray Computed Tomography (FMT-XCT) for monitoring tumor progression as well as early therapy response in a syngeneic murine Non-Small Cell Lung Cancer (NSCLC) model. Lewis lung carcinomas were grown orthotopically in C57BL/6J mice and imaged in vivo using an αvβ3-targeted near-infrared fluorescence (NIRF) probe. αvβ3-targeted FMT-XCT was able to track tumor progression. Cilengitide was able to substantially block the binding of the NIRF probe and suppress the imaging signal. Additionally, mice were treated with an established chemotherapy regimen of Cisplatin and Bevacizumab or with a novel MEK inhibitor (Refametinib) for 2 weeks. While μCT revealed only a moderate slowdown of tumor growth, the αvβ3-dependent signal decreased significantly compared with non-treated mice as early as one week post treatment. αvβ3-targeted imaging might therefore become a promising tool for the assessment of early therapy response in the future. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  4. Man and computer

    International Nuclear Information System (INIS)

    Fischbach, K.F.

    1981-01-01

    The discussion of the cultural and sociological consequences of computer evolution is hindered by human prejudice. For example, the sentence 'a computer is at best as intelligent as its programmer' obscures actual developments. The theoretical limits of computer intelligence are the limits of intelligence in general. Modern computer systems replace not only human labour, but also human decision making and thereby human responsibility. The historical situation is unique: human head-work is being automated and man is losing function. (orig.) [de

  5. Morphologic and Metabolic Comparison of Treatment Responsiveness with 18Fludeoxyglucose-Positron Emission Tomography/Computed Tomography According to Lung Cancer Type

    Directory of Open Access Journals (Sweden)

    Mehmet Fatih Börksüz

    2016-06-01

    Full Text Available Objective: The aim of the present study was to evaluate the response to treatment by histopathologic type in patients with lung cancer under follow-up with 18F-fluoro-2-deoxy-glucose positron emission tomography/computed tomography (18F-FDG PET/CT) imaging, using the Response Evaluation Criteria in Solid Tumors (RECIST) and European Organisation for Research and Treatment of Cancer (EORTC) criteria, which evaluate morphologic and metabolic parameters, respectively. Methods: On two separate (pre- and post-treatment) 18F-FDG PET/CT images, the longest dimensions of the primary tumor and of secondary lesions were measured, and the sum of these two measurements was recorded as the total dimension in 40 patients. PET parameters such as the standardized uptake value (SUVmax), metabolic volume and total lesion glycolysis (TLG) were also recorded for these target lesions on the two separate 18F-FDG PET/CT images. The percent (%) change was calculated for all these parameters. Morphologic evaluation was based on RECIST 1.1 and metabolic evaluation was based on EORTC. Results: When evaluated before and after treatment, in spite of the statistically significant change (p<0.05) ... (p>0.05). In histopathologic typing, when the post-treatment change was compared with the treatment responses under the RECIST 1.1 and EORTC criteria, for RECIST 1.1 in the squamous cell lung cancer group, progression was observed in sixteen patients (57%), stability in seven patients (25%) and partial response in five patients (18%); for EORTC, progression was detected in four patients (14%), stability in thirteen patients (47%) and partial response in eleven patients (39%); in 12 of these patients an increase in stage (43%), in 4 of them a decrease in stage (14%), and in 12 of them stability in stage (43%) were determined. But in adenocancer patients (n=7), for RECIST 1.1, progression was determined in four patients (57%), stability in two patients (29%) and partial response in one patient (14%); for EORTC, progression in one patient (14

  6. Can positron emission tomography/computed tomography with the dual tracers fluorine-18 fluoroestradiol and fluorodeoxyglucose predict neoadjuvant chemotherapy response of breast cancer?--A pilot study.

    Directory of Open Access Journals (Sweden)

    Zhongyi Yang

    Full Text Available OBJECTIVE: To assess the clinical value of dual-tracer positron emission tomography/computed tomography (PET/CT) with (18)F-fluoroestradiol ((18)F-FES) and (18)F-fluorodeoxyglucose ((18)F-FDG) in predicting the response to neoadjuvant chemotherapy (NAC) of breast cancer. METHODS: Eighteen consecutive patients with newly diagnosed, non-inflammatory, stage II and III breast cancer undergoing NAC were included. Before chemotherapy, they underwent both (18)F-FES and (18)F-FDG PET/CT scans. Surgery was performed after three to six cycles of chemotherapy. Tumor response was graded and divided into two groups: responders and non-responders. We used the maximum standardized uptake value (SUVmax) to quantify each primary lesion. RESULTS: Pathologic analysis revealed that 10 patients were responders while the other 8 patients were non-responders. There was no statistical difference in SUVmax-FDG or tumor size between these two groups (P>0.05). On the contrary, SUVmax-FES was lower in responders (1.75±0.66 versus 4.42±1.14; U=5, P=0.002), and SUVmax-FES/FDG also showed great value in predicting outcome (0.16±0.06 versus 0.54±0.22; U=5, P=0.002). CONCLUSIONS: Our study showed that (18)F-FES PET/CT might be feasible for predicting response to NAC. However, whether the use of the dual tracers (18)F-FES and (18)F-FDG has complementary value should be further studied.

  7. Mathematical modelling and computational study of two-dimensional and three-dimensional dynamics of receptor-ligand interactions in signalling response mechanisms.

    Science.gov (United States)

    García-Peñarrubia, Pilar; Gálvez, Juan J; Gálvez, Jesús

    2014-09-01

    Cell signalling processes involve receptor trafficking through highly connected networks of interacting components. The binding of surface receptors to their specific ligands is a key factor for the control and triggering of signalling pathways. But the binding process still presents many enigmas and, by analogy with surface catalytic reactions, two different mechanisms can be conceived: the first mechanism is related to the Eley-Rideal (ER) mechanism, i.e. the bulk-dissolved ligand interacts directly by pure three-dimensional (3D) diffusion with the specific surface receptor; the second mechanism is similar to the Langmuir-Hinshelwood (LH) process, i.e. 3D diffusion of the ligand to the cell surface followed by reversible ligand adsorption and subsequent two-dimensional (2D) surface diffusion to the receptor. A situation where both mechanisms simultaneously contribute to the signalling process could also occur. The aim of this paper is to perform a computational study of the behavior of the signalling response when these different mechanisms for ligand-receptor interactions are integrated into a model for signal transduction and ligand transport. To this end, partial differential equations have been used to develop spatio-temporal models that show trafficking dynamics of ligands, cell surface components, and intracellular signalling molecules through the different domains of the system. The mathematical modeling developed for these mechanisms has been applied to the study of two situations frequently found in cell systems: (a) dependence of the signal response on cell density; and (b) enhancement of the signalling response in a synaptic environment.
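
    The two candidate mechanisms can be written down compactly. Below is a hedged sketch in generic notation (the symbols are illustrative, not necessarily the paper's exact equations): L is the bulk ligand concentration with 3D diffusivity D3, Ls the membrane-adsorbed ligand with 2D surface diffusivity D2, R the free receptor and C the bound complex.

    % ER-type direct binding: pure 3D diffusion with a reactive flux
    % boundary condition at the cell surface.
    \[
    \frac{\partial L}{\partial t} = D_3 \nabla^{2} L ,
    \qquad
    -D_3 \left.\frac{\partial L}{\partial n}\right|_{\mathrm{surface}} = k_{\mathrm{on}} \, L\big|_{\mathrm{surface}} \, R .
    \]
    % LH-type binding: reversible adsorption (k_a, k_d) followed by 2D
    % surface diffusion to the receptor (rate constant k_2).
    \[
    \frac{\partial L_s}{\partial t} = D_2 \nabla_{s}^{2} L_s + k_a \, L\big|_{\mathrm{surface}} - k_d L_s ,
    \qquad
    L_s + R \xrightarrow{\;k_2\;} C .
    \]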

  8. Pulpal Responses to Direct Capping with Betamethasone/Gentamicin Cream and Mineral Trioxide Aggregate: Histologic and Micro-Computed Tomography Assessments.

    Science.gov (United States)

    AlShwaimi, Emad; Majeed, Abdul; Ali, Aiman A

    2016-01-01

    This clinical trial was conducted to evaluate the response of human dental pulp to direct capping with betamethasone/gentamicin (BG) cream and mineral trioxide aggregate (MTA). We hypothesized that the results of direct pulp capping with a topical BG combination would be similar to or better than those with MTA. Thirty-six human first premolar teeth scheduled for orthodontic extraction were randomly divided into 4 groups: BG1 group (n = 9), BG cream with 2-week follow-up; BG2 group (n = 10), BG cream with 8-week follow-up; MTA1 group (n = 8), MTA with 2-week follow-up; and MTA2 group (n = 9), MTA with 8-week follow-up. Teeth were extracted and evaluated at the respective time intervals. Micro-computed tomography scanning and histologic analyses were performed for all specimens. Pulp pathology (inflammation, pulp abscesses, and pulp necrosis) and reparative reaction (formation of dentin bridges) were recorded. Both BG cream and MTA resulted in significantly better pulpal responses at 8 weeks than at 2 weeks. Dentin bridge formation was significantly thicker in the MTA group at 8 weeks than in any other group (P < .05) ... MTA. Direct pulp capping with both BG cream and MTA was associated with dentin bridge formation. MTA resulted in a significantly better pulpal response, with less inflammation and a thicker dentin bridge at 8 weeks. Copyright © 2016 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  9. Analysis of ex vivo drug response data of Plasmodium clinical isolates: the pros and cons of different computer programs and online platforms.

    Science.gov (United States)

    Wirjanata, Grennady; Handayuni, Irene; Zaloumis, Sophie G; Chalfein, Ferryanto; Prayoga, Pak; Kenangalem, Enny; Poespoprodjo, Jeanne Rini; Noviyanti, Rintis; Simpson, Julie A; Price, Ric N; Marfurt, Jutta

    2016-03-02

    In vitro drug susceptibility testing of malaria parasites remains an important component of surveillance for anti-malarial drug resistance. The half-maximal inhibition of growth (IC50) is the most commonly reported parameter expressing drug susceptibility, derived by a variety of statistical approaches, each with its own advantages and disadvantages. In this study, the licensed computer programs WinNonlin and GraphPad Prism 6.0 and the open-access programs HN-NonLin, Antimalarial ICEstimator (ICE), and In Vitro Analysis and Reporting Tool (IVART) were tested for their ease of use and ability to estimate reliable IC50 values from raw drug response data from 31 Plasmodium falciparum and 29 P. vivax clinical isolates tested with five anti-malarial agents: chloroquine, amodiaquine, piperaquine, mefloquine, and artesunate. The IC50 and slope estimates were similar across all statistical packages for all drugs tested in both species. There was good correlation between results derived from the alternative statistical programs and non-linear mixed-effects modelling (NONMEM), which models all isolate data simultaneously. User-friendliness varied between packages. While HN-NonLin and IVART allow users to enter the data in 96-well format, IVART and GraphPad Prism 6.0 are capable of analysing multiple isolates and drugs in parallel. WinNonlin, GraphPad Prism 6.0, IVART, and ICE provide alerts for non-fitting data and incorrect data entry, facilitating data interpretation. Data analysis using WinNonlin or ICE took the longest, whilst the offline ability of GraphPad Prism 6.0 to analyse multiple isolates and drugs simultaneously made it the fastest among the programs tested. IC50 estimates obtained from the programs tested were comparable. In view of processing time and ease of analysis, GraphPad Prism 6.0 or IVART are best suited for routine and large-scale drug susceptibility testing.
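
    For readers who prefer to script the fit directly, a minimal sketch of the underlying computation (not one of the packages evaluated above) is to fit a four-parameter sigmoidal dose-response model with SciPy; the concentrations and growth values below are made up for illustration.

    import numpy as np
    from scipy.optimize import curve_fit

    def hill(conc, top, bottom, ic50, slope):
        """Four-parameter logistic (Hill) dose-response curve."""
        return bottom + (top - bottom) / (1.0 + (conc / ic50) ** slope)

    conc = np.array([1, 3, 10, 30, 100, 300, 1000], dtype=float)  # nM (hypothetical)
    growth = np.array([98, 95, 80, 48, 20, 8, 3], dtype=float)    # % of control (hypothetical)

    params, _ = curve_fit(hill, conc, growth, p0=[100.0, 0.0, 30.0, 1.0])
    top, bottom, ic50, slope = params
    print(f"IC50 ~ {ic50:.1f} nM, slope ~ {slope:.2f}")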

  10. Positron emission tomography response criteria in solid tumours criteria for quantitative analysis of [18F]-fluorodeoxyglucose positron emission tomography with integrated computed tomography for treatment response assessment in metastasised solid tumours: All that glitters is not gold.

    Science.gov (United States)

    Willemsen, Annelieke E C A B; Vlenterie, Myrella; van Herpen, Carla M L; van Erp, Nielka P; van der Graaf, Winette T A; de Geus-Oei, Lioe-Fee; Oyen, Wim J G

    2016-03-01

    For solid tumours, quantitative analysis of [(18)F]-fluorodeoxyglucose positron emission tomography with integrated computed tomography can potentially have significant value in early response assessment, and thereby in discriminating between responders and non-responders at an early stage of treatment. Standardised strategies for this analysis have been proposed, and the positron emission tomography response criteria in solid tumours (PERCIST) can be regarded as the current standard for performing quantitative analysis in a research setting, yet they are not implemented in daily practice. However, several exceptions and limitations restrict the feasibility of the PERCIST criteria. In this article, we point out dilemmas that arise when applying proposed criteria such as PERCIST to an expansive set of patients with metastasised solid tumours. Clinicians and scientists should be aware of these limitations to prevent methodological issues from impeding the successful introduction of research data into clinical practice. Therefore, to deliver on the high potential of quantitative imaging, consensus should be reached on a standardised, feasible and clinically useful analysis methodology. This methodology should be applicable in the majority of patients, tumour types and treatments. Copyright © 2015 Elsevier Ltd. All rights reserved.

  11. Lymphocyte density determined by computational pathology validated as a predictor of response to neoadjuvant chemotherapy in breast cancer: secondary analysis of the ARTemis trial.

    Science.gov (United States)

    Ali, H R; Dariush, A; Thomas, J; Provenzano, E; Dunn, J; Hiller, L; Vallier, A-L; Abraham, J; Piper, T; Bartlett, J M S; Cameron, D A; Hayward, L; Brenton, J D; Pharoah, P D P; Irwin, M J; Walton, N A; Earl, H M; Caldas, C

    2017-08-01

    We have previously shown that lymphocyte density, measured using computational pathology, is associated with pathological complete response (pCR) in breast cancer. The clinical validity of this finding in independent studies, among patients receiving different chemotherapy, is unknown. The ARTemis trial randomly assigned 800 women with early stage breast cancer between May 2009 and January 2013 to three cycles of docetaxel, followed by three cycles of fluorouracil, epirubicin and cyclophosphamide once every 21 days, with or without four cycles of bevacizumab. The primary endpoint was pCR (absence of invasive cancer in the breast and lymph nodes). We quantified lymphocyte density within haematoxylin and eosin (H&E) whole slide images using our previously described computational pathology approach: for every detected lymphocyte the average distance to the nearest 50 lymphocytes was calculated and the density derived from this statistic. We analyzed both pre-treatment biopsies and post-treatment surgical samples of the tumour bed. Of the 781 patients originally included in the primary endpoint analysis of the trial, 609 (78%) were included for baseline lymphocyte density analyses and a subset of 383 (49% of 781) for analyses of change in lymphocyte density. The main reason for loss of patients was the limited availability of digitized whole slide images. Pre-treatment lymphocyte density modelled as a continuous variable was associated with pCR on univariate analysis (odds ratio [OR], 2.92; 95% CI, 1.78-4.85; P < 0.001) and after adjustment for clinical covariates (OR, 2.13; 95% CI, 1.24-3.67; P = 0.006). Increased pre- to post-treatment lymphocyte density showed an independent inverse association with pCR (adjusted OR, 0.1; 95% CI, 0.033-0.31; P < 0.001). Lymphocyte density in pre-treatment biopsies was validated as an independent predictor of pCR in breast cancer. Computational pathology is emerging as a viable and objective means of identifying predictive biomarkers.
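
    The density statistic described above is straightforward to reproduce. The sketch below assumes lymphocyte centroids have already been detected in the whole slide image and uses a k-d tree to take, for every lymphocyte, the average distance to its 50 nearest neighbours; converting that distance into a density (here, a simple reciprocal) is a hedged choice, not necessarily the trial's exact derivation.

    import numpy as np
    from scipy.spatial import cKDTree

    rng = np.random.default_rng(0)
    centroids = rng.uniform(0, 10_000, size=(5_000, 2))  # synthetic (x, y) positions in pixels

    tree = cKDTree(centroids)
    # k=51 because the nearest neighbour of each point in its own tree is itself.
    dists, _ = tree.query(centroids, k=51)
    mean_dist_50 = dists[:, 1:].mean(axis=1)  # drop the zero self-distance
    density = 1.0 / mean_dist_50              # one possible density measure

    print(f"median per-cell mean distance: {np.median(mean_dist_50):.1f} px")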

  12. An Idle-State Detection Algorithm for SSVEP-Based Brain-Computer Interfaces Using a Maximum Evoked Response Spatial Filter.

    Science.gov (United States)

    Zhang, Dan; Huang, Bisheng; Wu, Wei; Li, Siliang

    2015-11-01

    Although accurate recognition of the idle state is essential for the application of brain-computer interfaces (BCIs) in real-world situations, it remains a challenging task due to the variability of the idle state. In this study, a novel algorithm was proposed for idle state detection in a steady-state visual evoked potential (SSVEP)-based BCI. The proposed algorithm aims to solve the idle state detection problem by constructing a better model of the control states. For feature extraction, a maximum evoked response (MER) spatial filter was developed to extract neurophysiologically plausible SSVEP responses, by finding the combination of multi-channel electroencephalogram (EEG) signals that maximized the evoked responses while suppressing the unrelated background EEG. The extracted SSVEP responses at the frequencies of both the attended and the unattended stimuli were then used to form feature vectors, and a series of binary classifiers for recognition of each control state and the idle state were constructed. EEG data from nine subjects in a three-target SSVEP BCI experiment with a variety of idle state conditions were used to evaluate the proposed algorithm. Compared to the most popular canonical correlation analysis-based algorithm and the conventional power spectrum-based algorithm, the proposed algorithm outperformed them by achieving an offline control state classification accuracy of 88.0 ± 11.1% and idle state false positive rates (FPRs) ranging from 7.4 ± 5.6% to 14.2 ± 10.1%, depending on the specific idle state conditions. Moreover, the online simulation reported BCI performance close to practical use: 22.0 ± 2.9 out of the 24 control commands were correctly recognized, and FPRs as low as approximately 0.5 events/min were achieved in the idle state conditions with eyes open and 0.05 events/min in the idle state condition with eyes closed. These results demonstrate the potential of the proposed algorithm for implementing practical SSVEP BCI systems.
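
    The record above does not spell out the MER filter in full, but its stated goal (maximize the evoked response while suppressing unrelated background EEG) is naturally expressed as a generalized eigenvalue problem between the covariance of the trial-averaged signal and that of the residual background. The following Python sketch is therefore an assumption-laden illustration of that idea, not the authors' exact algorithm.

    import numpy as np
    from scipy.linalg import eigh

    def mer_like_filter(trials):
        """trials: array (n_trials, n_channels, n_samples) of epoched EEG."""
        evoked = trials.mean(axis=0)               # phase-locked (SSVEP) part
        background = trials - evoked               # trial-by-trial residual EEG
        S = evoked @ evoked.T                      # evoked covariance (channels x channels)
        N = sum(b @ b.T for b in background) / len(trials)
        eigvals, eigvecs = eigh(S, N)              # solve S w = lambda N w
        return eigvecs[:, -1]                      # weights maximizing the power ratio

    rng = np.random.default_rng(1)
    trials = rng.standard_normal((40, 8, 250))     # synthetic 8-channel epochs
    w = mer_like_filter(trials)
    filtered = np.tensordot(w, trials, axes=(0, 1))  # (n_trials, n_samples)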

  13. A computational method for determination of a frequency response characteristic of flexibly supported rigid rotors attenuated by short magnetorheological squeeze film dampers

    Directory of Open Access Journals (Sweden)

    Zapoměl J.

    2011-06-01

    Full Text Available Lateral vibration of rotors can be significantly reduced by inserting damping elements between the shaft and the casing. The theoretical analysis, confirmed by computational simulations, shows that to achieve the optimum compromise between attenuation of the oscillation amplitude and the magnitude of the forces transmitted through the coupling elements between the rotor and the stationary part, the damping effect must be controllable. For this purpose, squeeze film dampers lubricated by magnetorheological fluid can be applied. The damping effect is controlled by changing the intensity of the magnetic field in the lubricating film. This article presents a procedure developed for investigation of the steady state response of rigid rotors coupled with the casing by flexible elements and short magnetorheological dampers. Their lateral vibration is governed by nonlinear (due to the damping forces) equations of motion. The steady state solution is obtained by application of a collocation method, which leads to solving a set of nonlinear algebraic equations. The pressure distribution in the oil film is described by a Reynolds equation modified for the case of short dampers and Bingham fluid. Components of the damping force are calculated by integration of the pressure distribution around the circumference and along the length of the damper. The developed procedure makes it possible to determine the steady state response of rotors excited by their unbalance and the magnitude of the forces transmitted through the coupling elements in the supports into the stationary part, and is intended for proposing the control of the damping effect to achieve optimum performance of the dampers.

  14. Concurrent La and A-Site Vacancy Doping Modulates the Thermoelectric Response of SrTiO3: Experimental and Computational Evidence.

    Science.gov (United States)

    Azough, Feridoon; Jackson, Samuel S; Ekren, Dursun; Freer, Robert; Molinari, Marco; Yeandel, Stephen R; Panchmatia, Pooja M; Parker, Stephen C; Maldonado, David Hernandez; Kepaptsoglou, Demie M; Ramasse, Quentin M

    2017-12-06

    To help understand the factors controlling the performance of one of the most promising n-type oxide thermoelectrics, SrTiO3, we need to explore structural control at the atomic level. In Sr1-xLa2x/3TiO3 ceramics (0.0 ≤ x ≤ 0.9), we determined that the thermal conductivity can be reduced and controlled through an interplay of La substitution and A-site vacancies and the formation of a layered structure. The decrease in thermal conductivity with La and A-site vacancy substitution dominates the trend in the overall thermoelectric response. The maximum dimensionless figure of merit is 0.27 at 1070 K for the composition x = 0.50, where half of the A-sites are occupied by La and vacancies. Atomic-resolution Z-contrast imaging and atomic-scale chemical analysis show that as the La content increases, A-site vacancies initially distribute randomly (x < 0.5) ... thermal conductivity, an important route to enhancement of the thermoelectric performance. A computational study confirmed that the thermal conductivity of SrTiO3 is lowered by the introduction of La and A-site vacancies, as shown by the experiments. The modeling supports that a critical mass of A-site vacancies is needed to reduce thermal conductivity and that the arrangement of La, Sr, and A-site vacancies has a significant impact on thermal conductivity only at high La concentrations.
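
    For reference, the dimensionless figure of merit quoted above is defined in the standard way, with S the Seebeck coefficient, σ the electrical conductivity, κ the thermal conductivity and T the absolute temperature:

    \[
    ZT = \frac{S^{2}\sigma T}{\kappa} ,
    \]

    so a reduction of κ at a roughly unchanged power factor S²σ directly raises ZT, which is why the vacancy-induced suppression of thermal conductivity dominates the thermoelectric response here.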

  15. Primary pulmonary lymphoma-role of fluoro-deoxy-glucose positron emission tomography-computed tomography in the initial staging and evaluating response to treatment - case reports and review of literature

    International Nuclear Information System (INIS)

    Agarwal, Krishan Kant; Dhanapathi, Halanaik; Nazar, Aftab Hasan; Kumar, Rakesh

    2016-01-01

    Primary pulmonary lymphoma (PPL) is an uncommon entity of non-Hodgkin lymphoma, which accounts for <1% of all cases of lymphoma. We present two rare cases of PPL of diffuse large B-cell lymphoma, which underwent fluorine-18 fluoro-deoxy-glucose positron emission tomography-computed tomography for initial staging and response evaluation after chemotherapy.

  17. Your Brain on the Movies: A Computational Approach for Predicting Box-office Performance from Viewer’s Brain Responses to Movie Trailers

    Directory of Open Access Journals (Sweden)

    Christoforos Christoforou

    2017-12-01

    Full Text Available The ability to anticipate the population-wide response of a target audience to a new movie or TV series, before its release, is critical to the film industry. Equally important is the ability to understand the underlying factors that drive or characterize viewers' decision to watch a movie. Traditional approaches (which involve pilot test-screenings, questionnaires, and focus groups) have reached a plateau in their ability to predict the population-wide responses to new movies. In this study, we develop a novel computational approach for extracting neurophysiological electroencephalography (EEG) and eye-gaze based metrics to predict the population-wide behavior of moviegoers. We further explore the connection of the derived metrics to the underlying cognitive processes that might drive moviegoers' decision to watch a movie. Towards that, we recorded neural activity (through the use of EEG) and eye-gaze activity from a group of naive individuals while watching movie trailers of pre-selected movies for which the population-wide preference is captured by the movie's market performance (i.e., box-office ticket sales in the US). Our findings show that the neural based metrics, derived using the proposed methodology, carry predictive information about the broader audience decisions to watch a movie, above and beyond traditional methods. In particular, neural metrics are shown to predict up to 72% of the variance of the films' performance at their premiere and up to 67% of the variance at following weekends, which corresponds to a 23-fold increase in prediction accuracy compared to current neurophysiological or traditional methods. We discuss our findings in the context of existing literature and hypothesize on the possible connection of the derived neurophysiological metrics to cognitive states of focused attention, the encoding of long-term memory, and the synchronization of different components of the brain's rewards network. Beyond the

  18. Your Brain on the Movies: A Computational Approach for Predicting Box-office Performance from Viewer's Brain Responses to Movie Trailers.

    Science.gov (United States)

    Christoforou, Christoforos; Papadopoulos, Timothy C; Constantinidou, Fofi; Theodorou, Maria

    2017-01-01

    The ability to anticipate the population-wide response of a target audience to a new movie or TV series, before its release, is critical to the film industry. Equally important is the ability to understand the underlying factors that drive or characterize viewer's decision to watch a movie. Traditional approaches (which involve pilot test-screenings, questionnaires, and focus groups) have reached a plateau in their ability to predict the population-wide responses to new movies. In this study, we develop a novel computational approach for extracting neurophysiological electroencephalography (EEG) and eye-gaze based metrics to predict the population-wide behavior of movie goers. We further, explore the connection of the derived metrics to the underlying cognitive processes that might drive moviegoers' decision to watch a movie. Towards that, we recorded neural activity (through the use of EEG) and eye-gaze activity from a group of naive individuals while watching movie trailers of pre-selected movies for which the population-wide preference is captured by the movie's market performance (i.e., box-office ticket sales in the US). Our findings show that the neural based metrics, derived using the proposed methodology, carry predictive information about the broader audience decisions to watch a movie, above and beyond traditional methods. In particular, neural metrics are shown to predict up to 72% of the variance of the films' performance at their premiere and up to 67% of the variance at following weekends; which corresponds to a 23-fold increase in prediction accuracy compared to current neurophysiological or traditional methods. We discuss our findings in the context of existing literature and hypothesize on the possible connection of the derived neurophysiological metrics to cognitive states of focused attention, the encoding of long-term memory, and the synchronization of different components of the brain's rewards network. Beyond the practical implication in

  19. The impact of irradiation dose on the computed tomography radiographic response of metastatic nodes and clinical outcomes in cervix cancer in a low-resource setting

    Science.gov (United States)

    McKeever, Matthew Ryan; Hwang, Lindsay; Barclay, Jennifer; Xi, Yin; Bailey, April; Albuquerque, Kevin

    2017-01-01

    Introduction: The aim of this study is to investigate the relationship between the radiation dose to pelvic and para-aortic lymph nodes, nodal response, and clinical outcomes in a resource-poor setting based on computed tomography (CT) nodal size alone. Materials and Methods: This retrospective study from 2009 to 2015 included 46 cervical cancer patients with 133 metastatic pelvic and para-aortic lymph nodes definitively treated with chemoradiation and brachytherapy in a public hospital with limited access to positron emission tomography (PET) scans. Hence, short axis of the lymph node on CT scan was used as a measure of metastatic nodal disease, before and following radiation therapy. Inclusion criteria required the pelvic and para-aortic nodes to have the shortest axis diameter on CT scan of ≥8 mm and ≥10 mm, respectively. Based on PET resolution, a node that decreased to half of its inclusion cutoff size was considered to have a complete response (CR). Relevant clinical outcomes were documented and correlated with nodal features, nodal radiation doses, and treatment characteristics. Results: After controlling for other predictive factors, increased nodal dose was associated with increased probability of CR per study definition (P = 0.005). However, there was no statistically significant association between dose and pelvic/para-aortic, distant and total recurrence (TR), and any recurrence at any location (P = 0.263, 0.785, 1.00, respectively). Patients who had no CR nodes had shorter pelvic/para-aortic recurrence-free survival (PPRFS) and TR-free survival (TRFS) than patients who had at least one CR node (P = 0.027 and 0.046, respectively). Patients with no CR nodes also had shorter PPRFS than patients who had all nodes completely respond (P < 0.05). Conclusions: Using CT-based measures, we found that increased nodal dose is associated with an increased probability of CR (as defined) and nodal CR is associated with increased PPRFS and TRFS. We were unable to
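
    The study's complete-response rule as described above is simple enough to state in code. The sketch below encodes it directly (a node is CR when its post-treatment short axis is at most half the inclusion cutoff: 8 mm for pelvic and 10 mm for para-aortic nodes); the function names are hypothetical.

    CUTOFF_MM = {"pelvic": 8.0, "para_aortic": 10.0}

    def is_complete_response(region: str, post_short_axis_mm: float) -> bool:
        """CR per the study definition: short axis shrinks to half the inclusion cutoff."""
        return post_short_axis_mm <= CUTOFF_MM[region] / 2.0

    print(is_complete_response("pelvic", 3.5))       # True  (<= 4 mm)
    print(is_complete_response("para_aortic", 6.0))  # False (> 5 mm)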

  20. Grid Computing

    Indian Academy of Sciences (India)

    IAS Admin

    A computing grid interconnects resources such as high performance computers, scientific databases, and computer-controlled scientific instruments of cooperating organizations, each of which is autonomous. It precedes and is quite different from cloud computing, which provides computing resources by vendors to ...

  1. The Effect of Device When Using Smartphones and Computers to Answer Multiple-Choice and Open-Response Questions in Distance Education

    Science.gov (United States)

    Wilson, Thomas Royce

    2017-01-01

    Traditionally in higher education, online courses have been designed for computer users. However, the advent of mobile learning (m-learning) and the proliferation of smartphones have created two challenges for online students and instructional designers. First, instruction designed for a larger computer screen often loses its effectiveness when…

  2. Computer group

    International Nuclear Information System (INIS)

    Bauer, H.; Black, I.; Heusler, A.; Hoeptner, G.; Krafft, F.; Lang, R.; Moellenkamp, R.; Mueller, W.; Mueller, W.F.; Schati, C.; Schmidt, A.; Schwind, D.; Weber, G.

    1983-01-01

    The computer group has been reorganized to take charge of the general-purpose computers DEC10 and VAX and of the computer network (Dataswitch, DECnet, IBM connections to GSI and IPP, preparation for Datex-P). (orig.)

  3. Computer Music

    Science.gov (United States)

    Cook, Perry R.

    This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and the study of perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.).

  4. Routine Self-administered, Touch-Screen Computer Based Suicidal Ideation Assessment Linked to Automated Response Team Notification in an HIV Primary Care Setting

    Science.gov (United States)

    Lawrence, Sarah T.; Willig, James H.; Crane, Heidi M.; Ye, Jiatao; Aban, Inmaculada; Lober, William; Nevin, Christa R.; Batey, D. Scott; Mugavero, Michael J.; McCullumsmith, Cheryl; Wright, Charles; Kitahata, Mari; Raper, James L.; Saag, Micheal S.; Schumacher, Joseph E.

    2010-01-01

    Summary: The implementation of routine computer-based screening for suicidal ideation and other psychosocial domains through standardized patient-reported outcome instruments in two high-volume urban HIV clinics is described, and factors associated with an increased risk of self-reported suicidal ideation are determined. Background: HIV/AIDS continues to be associated with an under-recognized risk for suicidal ideation and attempted as well as completed suicide. Suicidal ideation represents an important predictor for subsequent attempted and completed suicide. We sought to implement routine screening of suicidal ideation and associated conditions using computerized patient-reported outcome (PRO) assessments. Methods: Two geographically distinct academic HIV primary care clinics enrolled patients attending scheduled visits from 12/2005 to 2/2009. Touch-screen-based, computerized PRO assessments were implemented into routine clinical care. Substance abuse (ASSIST), alcohol consumption (AUDIT-C), depression (PHQ-9) and anxiety (PHQ-A) were assessed. The PHQ-9 assesses the frequency of suicidal ideation in the preceding two weeks. A response of "nearly every day" triggered an automated page to pre-determined clinic personnel who completed more detailed self-harm assessments. Results: Overall, 1,216 (UAB = 740; UW = 476) patients completed the initial PRO assessment during the study period. Patients were white (53%; n = 646), predominantly male (79%; n = 959), with a mean age of 44 (± 10). Among surveyed patients, 170 (14%) endorsed some level of suicidal ideation, while 33 (3%) admitted suicidal ideation nearly every day. In multivariable analysis, suicidal ideation risk was lower with advancing age (OR = 0.74 per 10 years; 95% CI = 0.58-0.96) and was increased with current substance abuse (OR = 1.88; 95% CI = 1.03-3.44) and more severe depression (OR = 3.91 moderate; 95% CI = 2.12-7.22; OR = 25.55 severe; 95% CI = 12.73-51.30). Discussion: Suicidal ideation was associated with current substance abuse and
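
    The triggering rule described above reduces to a single comparison. The sketch below is a hedged illustration of that logic; the function name and message wording are hypothetical, not the clinics' actual software.

    PHQ9_ITEM9_RESPONSES = (
        "not at all", "several days",
        "more than half the days", "nearly every day",
    )

    def handle_phq9_item9(response: str, notify) -> None:
        if response not in PHQ9_ITEM9_RESPONSES:
            raise ValueError(f"unknown PHQ-9 response: {response!r}")
        if response == "nearly every day":
            # Automated page to pre-determined clinic personnel.
            notify("PHQ-9 item 9: suicidal ideation nearly every day; "
                   "complete detailed self-harm assessment")

    handle_phq9_item9("nearly every day", notify=print)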

  5. Computational composites

    DEFF Research Database (Denmark)

    Vallgårda, Anna K. A.; Redström, Johan

    2007-01-01

    Computational composite is introduced as a new type of composite material. Arguing that this is not just a metaphorical maneuver, we provide an analysis of computational technology as material in design, which shows how computers share important characteristics with other materials used in design... and architecture. We argue that the notion of computational composites provides a precise understanding of the computer as material, and of how computations need to be combined with other materials to come to expression as material. Besides working as an analysis of computers from a designer's point of view..., the notion of computational composites may also provide a link for computer science and human-computer interaction to an increasingly rapid development and use of new materials in design and architecture...

  6. Computer Use by Rural Principals.

    Science.gov (United States)

    Witten, D. W.; And Others

    Very little research is available nationwide that measures the administrative use of computers in rural schools. A state survey of 154 rural Kentucky secondary school principals (representing a 51% response rate) focused on their knowledge about computers and use of computers for school administrative purposes. Only 14% of respondents had a…

  7. A computational approach to the description of individual immune responses. IgE and IgG-subclass allergen-specific antibodies formed during immunotherapy

    DEFF Research Database (Denmark)

    Søndergaard, I; Poulsen, L K; Osterballe, O

    1991-01-01

    are "distance" between antibody responses and "immune response width". The 20 patients included in this study were pollen-allergic patients who underwent specific immunotherapy in a 3-year prospective study. It was found that the immune response during immunotherapy was restricted to IgG1 and IgG4 antibodies......Detailed evaluation of the IgE and IgG-subclass immune response during immunotherapy can now be performed by crossed radio immunoelectrophoresis (CRIE). Some new concepts are introduced facilitating the handling of the vast amount of data obtained by quantitating the immune response. These concepts...... and decreased towards six. For the IgG4 antibodies the number of reactions increased towards 15 antigens and decreased towards four. The increase is generally paralleled by an increase in quantitative immune response as well. For some of the antigens a rise in the IgE antibodies is contrasted by a fall...

  8. Computational Model of the Fathead Minnow Hypothalamic-Pituitary-Gonadal Axis: Incorporating Protein Synthesis in Improving Predictability of Responses to Endocrine Active Chemicals

    Science.gov (United States)

    There is international concern about chemicals that alter endocrine system function in humans and/or wildlife and subsequently cause adverse effects. We previously developed a mechanistic computational model of the hypothalamic-pituitary-gonadal (HPG) axis in female fathead minnow...

  9. Computational models can predict response to HIV therapy without a genotype and may reduce treatment failure in different resource-limited settings

    NARCIS (Netherlands)

    Revell, A. D.; Wang, D.; Wood, R.; Morrow, C.; Tempelman, H.; Hamers, R. L.; Alvarez-Uria, G.; Streinu-Cercel, A.; Ene, L.; Wensing, A. M. J.; DeWolf, F.; Nelson, M.; Montaner, J. S.; Lane, H. C.; Larder, B. A.

    2013-01-01

    Genotypic HIV drug-resistance testing is typically 60-65% predictive of response to combination antiretroviral therapy (ART) and is valuable for guiding treatment changes. Genotyping is unavailable in many resource-limited settings (RLSs). We aimed to develop models that can predict response to ART

  10. Computational Medicine

    DEFF Research Database (Denmark)

    Nygaard, Jens Vinge

    2017-01-01

    The Health Technology Program at Aarhus University applies computational biology to investigate the heterogeneity of tumours.

  11. Computational Composites

    DEFF Research Database (Denmark)

    Vallgårda, Anna K. A.

    The problematic addressed in the dissertation is generally shaped by a sensation that something is amiss within the area of Ubiquitous Computing. Ubiquitous Computing as a vision, as a program, sets out to challenge the idea of the computer as a desktop computer and to explore the potential... of the new microprocessors and network technologies. However, the understanding of the computer represented within this program poses a challenge for the intentions of the program. The computer is understood as a multitude of invisible intelligent information devices, which confines the computer as a tool... to solve well-defined problems within specified contexts, something that rarely exists in practice. Nonetheless, the computer will continue to grow more ubiquitous as Moore's law still applies and as its components become ever cheaper. The question is how, and for what, we will use it? How will it...

  12. Grid Computing

    Indian Academy of Sciences (India)

    A computing grid interconnects resources such as high performance computers, scientific databases, and computer-controlled scientific instruments of cooperating organizations, each of which is autonomous. It precedes and is quite different from cloud computing, which provides computing resources by vendors to customers ...

  13. Phenomenological Computation?

    DEFF Research Database (Denmark)

    Brier, Søren

    2014-01-01

    Open peer commentary on the article "Info-computational Constructivism and Cognition" by Gordana Dodig-Crnkovic. Upshot: The main problems with info-computationalism are: (1) its basic concept of natural computing has neither been defined theoretically nor implemented practically; (2) it cannot en... cybernetics and Maturana and Varela's theory of autopoiesis, which are both erroneously taken to support info-computationalism...

  14. Quantum computers and quantum computations

    International Nuclear Information System (INIS)

    Valiev, Kamil' A

    2005-01-01

    This review outlines the principles of operation of quantum computers and their elements. The theory of ideal computers that do not interact with the environment and are immune to quantum decohering processes is presented. Decohering processes in quantum computers are investigated. The review considers methods for correcting quantum computing errors arising from the decoherence of the state of the quantum computer, as well as possible methods for the suppression of the decohering processes. A brief enumeration of proposed quantum computer realizations concludes the review. (reviews of topical problems)

  15. Quantum Computing for Computer Architects

    CERN Document Server

    Metodi, Tzvetan

    2011-01-01

    Quantum computers can (in theory) solve certain problems far faster than a classical computer running any known classical algorithm. While existing technologies for building quantum computers are in their infancy, it is not too early to consider their scalability and reliability in the context of the design of large-scale quantum computers. To architect such systems, one must understand what it takes to design and model a balanced, fault-tolerant quantum computer architecture. The goal of this lecture is to provide architectural abstractions for the design of a quantum computer and to explore

  16. Pervasive Computing

    NARCIS (Netherlands)

    Silvis-Cividjian, N.

    This book provides a concise introduction to Pervasive Computing, otherwise known as Internet of Things (IoT) and Ubiquitous Computing (Ubicomp) which addresses the seamless integration of computing systems within everyday objects. By introducing the core topics and exploring assistive pervasive

  17. Computational vision

    CERN Document Server

    Wechsler, Harry

    1990-01-01

    The book is suitable for advanced courses in computer vision and image processing. In addition to providing an overall view of computational vision, it contains extensive material on topics that are not usually covered in computer vision texts (including parallel distributed processing and neural networks) and considers many real applications.

  18. Grid Computing

    CERN Document Server

    Yen, Eric

    2008-01-01

    Based on the Grid Computing: International Symposium on Grid Computing (ISGC) 2007, held in Taipei, Taiwan in March of 2007, this title presents the grid solutions and research results in grid operations, grid middleware, biomedical operations, and e-science applications. It is suitable for graduate-level students in computer science.

  19. Optical Computing

    Indian Academy of Sciences (India)

    Optics has been used in computing for a number of years but the main emphasis has been and continues to be to link portions of computers, for communications, or more intrinsically in devices that have some optical application or component (optical pattern recognition, etc). Optical digital computers are still some years ...

  20. Human Computation

    CERN Multimedia

    CERN. Geneva

    2008-01-01

    What if people could play computer games and accomplish work without even realizing it? What if billions of people collaborated to solve important problems for humanity or generate training data for computers? My work aims at a general paradigm for doing exactly that: utilizing human processing power to solve computational problems in a distributed manner. In particular, I focus on harnessing human time and energy for addressing problems that computers cannot yet solve. Although computers have advanced dramatically in many respects over the last 50 years, they still do not possess the basic conceptual intelligence or perceptual capabilities...

  1. Parallel computations

    CERN Document Server

    1982-01-01

    Parallel Computations focuses on parallel computation, with emphasis on algorithms used in a variety of numerical and physical applications and for many different types of parallel computers. Topics covered range from vectorization of fast Fourier transforms (FFTs) and of the incomplete Cholesky conjugate gradient (ICCG) algorithm on the Cray-1 to calculation of table lookups and piecewise functions. Single tridiagonal linear systems and vectorized computation of reactive flow are also discussed.Comprised of 13 chapters, this volume begins by classifying parallel computers and describing techn

  2. Quantum computation

    International Nuclear Information System (INIS)

    Deutsch, D.

    1992-01-01

    As computers become ever more complex, they inevitably become smaller. This leads to a need for components which are fabricated and operate on increasingly smaller size scales. Quantum theory is already taken into account in microelectronics design. This article explores how quantum theory will need to be incorporated into computers in the future in order to give their components functionality. Computation tasks which depend on quantum effects will become possible. Physicists may have to reconsider their perspective on computation in the light of understanding developed in connection with universal quantum computers. (UK)

  3. The Diffraction Response Interpolation Method

    DEFF Research Database (Denmark)

    Jespersen, Søren Kragh; Wilhjelm, Jens Erik; Pedersen, Peder C.

    1998-01-01

    Computer modeling of the output voltage in a pulse-echo system is computationally very demanding, particularly when considering reflector surfaces of arbitrary geometry. A new, efficient computational tool, the diffraction response interpolation method (DRIM), for modeling of reflectors in a fluid ...

  4. Radiology education 2.0--on the cusp of change: part 1. Tablet computers, online curriculums, remote meeting tools and audience response systems.

    Science.gov (United States)

    Bhargava, Puneet; Lackey, Amanda E; Dhand, Sabeen; Moshiri, Mariam; Jambhekar, Kedar; Pandey, Tarun

    2013-03-01

    We are in the midst of an evolving educational revolution. Use of digital devices such as smart phones and tablet computers is rapidly increasing among radiologists, who now regularly use them for medical, technical, and administrative tasks. These electronic devices provide a wide array of new tools to radiologists, allowing for faster, more simplified, and more widespread distribution of educational material. The utility, future potential, and limitations of some of these powerful tools are discussed in this article. Published by Elsevier Inc.

  5. Computer sciences

    Science.gov (United States)

    Smith, Paul H.

    1988-01-01

    The Computer Science Program provides advanced concepts, techniques, system architectures, algorithms, and software for both space and aeronautics information sciences and computer systems. The overall goal is to provide the technical foundation within NASA for the advancement of computing technology in aerospace applications. The research program is improving the state of knowledge of fundamental aerospace computing principles and advancing computing technology in space applications such as software engineering and information extraction from data collected by scientific instruments in space. The program includes the development of special algorithms and techniques to exploit the computing power provided by high performance parallel processors and special purpose architectures. Research is being conducted in the fundamentals of data base logic and improvement techniques for producing reliable computing systems.

  6. Computer science handbook

    CERN Document Server

    Tucker, Allen B

    2004-01-01

    Due to the great response to the famous Computer Science Handbook edited by Allen B. Tucker, … in 2004 Chapman & Hall/CRC published a second edition of this comprehensive reference book. Within more than 70 chapters, every one new or significantly revised, one can find any kind of information and references about computer science one can imagine. … All in all, there is absolutely nothing about computer science that cannot be found in the encyclopedia with its 110 survey articles …-Christoph Meinel, Zentralblatt MATH

  7. Young Children’s Affective Responses to Acceptance and Rejection From Peers: A Computer-based Task Sensitive to Variation in Temperamental Shyness and Gender

    Science.gov (United States)

    Howarth, Grace Z.; Guyer, Amanda E.; Pérez-Edgar, Koraly

    2013-01-01

    This study presents a novel task examining young children’s affective responses to evaluative feedback—specifically, social acceptance and rejection—from peers. We aimed to determine (1) whether young children report their affective responses to hypothetical peer evaluation predictably and consistently, and (2) whether young children’s responses to peer evaluation vary as a function of temperamental shyness and gender. Four- to seven-year-old children (N = 48) sorted pictures of unknown, similar-aged children into those with whom they wished or did not wish to play. Computerized peer evaluation later noted whether the pictured children were interested in a future playdate with participants. Participants then rated their affective responses to each acceptance or rejection event. Children were happy when accepted by children with whom they wanted to play, and disappointed when these children rejected them. Highly shy boys showed a wider range of responses to acceptance and rejection based on initial social interest, and may be particularly sensitive to both positive and negative evaluation. Overall, the playdate task captures individual differences in affective responses to evaluative peer feedback and is potentially amenable to future applications in research with young children, including pairings with psychophysiological measures. PMID:23997429

  8. Computer Literacy: Teaching Computer Ethics.

    Science.gov (United States)

    Troutner, Joanne

    1986-01-01

    Suggests learning activities for teaching computer ethics in three areas: (1) equal access; (2) computer crime; and (3) privacy. Topics include computer time, advertising, class enrollments, copyright law, sabotage ("worms"), the Privacy Act of 1974 and the Freedom of Information Act of 1966. (JM)

  9. The use of computed tomography scans and the Bender Gestalt Test in the assessment of competency to stand trial and criminal responsibility in the field of mental health and law.

    Science.gov (United States)

    Mosotho, Nathaniel Lehlohonolo; Timile, Ino; Joubert, Gina

    Computed tomography and the Bender Gestalt Test are some of the tests used routinely for the assessment of alleged offenders referred under Sections 77 and 78 of the Criminal Procedure Act 51 of 1977. An exploratory retrospective study was conducted at the Free State Psychiatric Complex. The aim of this study was to identify the extent to which the Bender Gestalt Test results and the computed tomography scans are associated with outcomes in the assessment of competency to stand trial and criminal responsibility in individuals referred to the Free State Psychiatric Complex (FSPC) observation unit. This was a cross-sectional study and the entire population of patients admitted in 2013 was included in the study. The clinical and demographic data were obtained from patient files. The majority of participants were black, male, single and unemployed. The most common diagnosis was schizophrenia. The current study showed no statistically significant association between the Bender Gestalt Test Hain's scores and the outcome of criminal responsibility and competency to stand trial. Similarly, the study also showed no statistically significant association between the presence of a brain lesion and the outcome of criminal responsibility and competency to stand trial. It was also concluded that, as CT scans are expensive, patients should be referred for that service only when there is a clear clinical indication to do so. Copyright © 2016 Elsevier Ltd. All rights reserved.

  10. Computer programming and computer systems

    CERN Document Server

    Hassitt, Anthony

    1966-01-01

    Computer Programming and Computer Systems imparts a "reading knowledge" of computer systems. This book describes the aspects of machine-language programming, monitor systems, computer hardware, and advanced programming that every thorough programmer should be acquainted with. This text discusses the automatic electronic digital computers, symbolic language, Reverse Polish Notation, and Fortran into assembly language. The routine for reading blocked tapes, dimension statements in subroutines, general-purpose input routine, and efficient use of memory are also elaborated. This publication is inten

  11. Organic Computing

    CERN Document Server

    Würtz, Rolf P

    2008-01-01

    Organic Computing is a research field emerging around the conviction that problems of organization in complex systems in computer science, telecommunications, neurobiology, molecular biology, ethology, and possibly even sociology can be tackled scientifically in a unified way. From the computer science point of view, the apparent ease in which living systems solve computationally difficult problems makes it inevitable to adopt strategies observed in nature for creating information processing machinery. In this book, the major ideas behind Organic Computing are delineated, together with a sparse sample of computational projects undertaken in this new field. Biological metaphors include evolution, neural networks, gene-regulatory networks, networks of brain modules, hormone system, insect swarms, and ant colonies. Applications are as diverse as system design, optimization, artificial growth, task allocation, clustering, routing, face recognition, and sign language understanding.

  12. Computed Tomography

    Science.gov (United States)

    Castellano, Isabel; Geleijns, Jacob

    After its clinical introduction in 1973, computed tomography developed from an x-ray modality for axial imaging in neuroradiology into a versatile three-dimensional imaging modality for a wide range of applications in, for example, oncology, vascular radiology, cardiology, traumatology and even interventional radiology. Computed tomography is applied for diagnosis, follow-up studies and screening of healthy subpopulations with specific risk factors. This chapter provides a general introduction to computed tomography, covering a short history of computed tomography, technology, image quality, dosimetry, room shielding, quality control and quality criteria.

  13. Biological computation

    CERN Document Server

    Lamm, Ehud

    2011-01-01

    Introduction and Biological Background; Biological Computation; The Influence of Biology on Mathematics - Historical Examples; Biological Introduction; Models and Simulations; Cellular Automata; Biological Background; The Game of Life; General Definition of Cellular Automata; One-Dimensional Automata; Examples of Cellular Automata; Comparison with a Continuous Mathematical Model; Computational Universality; Self-Replication; Pseudo Code; Evolutionary Computation; Evolutionary Biology and Evolutionary Computation; Genetic Algorithms; Example Applications; Analysis of the Behavior of Genetic Algorithms; Lamarckian Evolution; Genet

  14. Leaves to landscapes: using high performance computing to assess patch-scale forest response to regional temperature and trace gas gradients

    Science.gov (United States)

    George E. Host; Harlan W. Stech; Kathryn E. Lenz; Kyle Roskoski; Richard Mather; Michael Donahue

    2007-01-01

    ECOPHYS is one of the early FSTMs that integrated plant physiological and tree architectural models to assess the relative importance of genetic traits in tree growth and to explore the growth response to interacting environmental stresses (Host et al. 1999, Isebrands et al. 1999, Martin et al. 2001). This paper will describe extensions of the ECOPHYS individual tree...

  15. A computational methodology for a micro launcher engine test bench using a combined linear static and dynamic in frequency response analysis

    Directory of Open Access Journals (Sweden)

    Ion DIMA

    2017-03-01

    Full Text Available This article aims to provide a quick methodology for determining the critical values of the forces, displacements and stress as a function of frequency, under a combined linear static (101 Solution - Linear Static) and dynamic load in frequency response (108 Solution - Frequency Response, Direct Method), applied to a micro launcher engine test bench, using NASTRAN 400 Solution - Implicit Nonlinear. NASTRAN/PATRAN software is used. Practically, in the PATRAN preprocessor a linear or nonlinear static load has to be defined at step 1 and a dynamic, time-dependent load in frequency response at step 2. In Analyze the following options are chosen: for Solution Type, Implicit Nonlinear Solution (SOL 400) is selected; for Subcases, Static Load and Transient Dynamic are chosen; and for Subcase Select, the two cases, static and dynamic, are selected. The NASTRAN solver will overlap the results from the static analysis with those of the dynamic analysis. The running time will be reduced three times if using the Krylov solver; the NASTRAN SYSTEM(387) = -1 instruction is used in order to activate the Krylov option. Also, in Analysis the OP2 Output Format shall be selected, meaning that the PARAM, POST, 1 instruction shall be written in the bdf NASTRAN input file. The structural damping can be defined in two different ways: either on the material card or using the PARAM, G, 0.05 instruction (in this example a damping coefficient of 5% was used). The SDAMPING instruction paired with TABDMP1 works only for dynamic in frequency response with the modal method, or with the direct method with viscoelastic material, not for dynamic in frequency response with the direct method (DFREQ) and linear elastic material. The direct method (DFREQ) used in this example is more accurate. A set of translational boundary conditions was defined at the base of the test bench.

  16. Human-computer interaction in multitask situations

    Science.gov (United States)

    Rouse, W. B.

    1977-01-01

    Human-computer interaction in multitask decisionmaking situations is considered, and it is proposed that humans and computers have overlapping responsibilities. Queueing theory is employed to model this dynamic approach to the allocation of responsibility between human and computer. Results of simulation experiments are used to illustrate the effects of several system variables including number of tasks, mean time between arrivals of action-evoking events, human-computer speed mismatch, probability of computer error, probability of human error, and the level of feedback between human and computer. Current experimental efforts are discussed and the practical issues involved in designing human-computer systems for multitask situations are considered.
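
    A toy discrete-event simulation makes the queueing view concrete. The sketch below is my own construction under the variables named above (arrival rate, speed mismatch, error probabilities), not Rouse's actual model: action-evoking events arrive at random and are served by whichever of the human or the computer frees up first.

    import heapq
    import random

    def simulate(n_events=10_000, mean_interarrival=1.0,
                 human_rate=0.8, computer_rate=2.0,
                 p_err_human=0.01, p_err_computer=0.05, seed=0):
        rng = random.Random(seed)
        free_at = [(0.0, "human"), (0.0, "computer")]  # (time the server frees up, name)
        heapq.heapify(free_at)
        t, errors = 0.0, 0
        for _ in range(n_events):
            t += rng.expovariate(1.0 / mean_interarrival)  # next action-evoking event
            free_time, server = heapq.heappop(free_at)     # earliest-free server takes it
            start = max(t, free_time)
            rate = human_rate if server == "human" else computer_rate
            p_err = p_err_human if server == "human" else p_err_computer
            if rng.random() < p_err:
                errors += 1
            heapq.heappush(free_at, (start + rng.expovariate(rate), server))
        return errors / n_events

    print(f"error fraction: {simulate():.3f}")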

  17. Development of a dual-energy computed tomography quality control program: Characterization of scanner response and definition of relevant parameters for a fast-kVp switching dual-energy computed tomography system.

    Science.gov (United States)

    Nute, Jessica L; Jacobsen, Megan C; Stefan, Wolfgang; Wei, Wei; Cody, Dianna D

    2018-04-01

    A prototype QC phantom system and analysis process were developed to characterize the spectral capabilities of a fast kV-switching dual-energy computed tomography (DECT) scanner. This work addresses the current lack of quantitative oversight for this technology, with the goal of identifying relevant scan parameters and test metrics instrumental to the development of a dual-energy quality control (DEQC) program. A prototype elliptical phantom (effective diameter: 35 cm) was designed with multiple material inserts for DECT imaging. Inserts included tissue-equivalent and material rods (including iodine and calcium at varying concentrations). The phantom was scanned on a fast kV-switching DECT system using 16 dual-energy acquisitions (CTDIvol range: 10.3-62 mGy) with varying pitch, rotation time, and tube current. A circular head phantom (22 cm diameter) was scanned using a similar protocol (12 acquisitions; CTDIvol range: 36.7-132.6 mGy). All acquisitions were reconstructed at 50, 70, 110, and 140 keV and using a water-iodine material basis pair. The images were evaluated for iodine quantification accuracy, stability of monoenergetic-reconstruction CT number, noise, and positional constancy. Variance component analysis was used to identify technique parameters that drove deviations in test metrics. Variances were compared to thresholds derived from manufacturer tolerances to determine technique parameters that had a nominally significant effect on test metrics. Iodine quantification error was largely unaffected by any of the technique parameters investigated. Monoenergetic HU stability was found to be affected by mAs, with a threshold under which spectral separation was unsuccessful, diminishing the utility of DECT imaging. Noise was found to be affected by CTDIvol in the DEQC body phantom, and by CTDIvol and mA in the DEQC head phantom. Positional constancy was found to be affected by mAs in the DEQC body phantom and by mA in the DEQC head phantom. A streamlined scan protocol...

  18. JAC2D: A two-dimensional finite element computer program for the nonlinear quasi-static response of solids with the conjugate gradient method

    International Nuclear Information System (INIS)

    Biffle, J.H.; Blanford, M.L.

    1994-05-01

    JAC2D is a two-dimensional finite element program designed to solve quasi-static nonlinear mechanics problems. A set of continuum equations describes the nonlinear mechanics involving large rotation and strain. A nonlinear conjugate gradient method is used to solve the equations. The method is implemented in a two-dimensional setting with various methods for accelerating convergence. Sliding interface logic is also implemented. A four-node Lagrangian uniform strain element is used with hourglass stiffness to control the zero-energy modes. This report documents the elastic and isothermal elastic/plastic material model. Other material models, documented elsewhere, are also available. The program is vectorized for efficient performance on Cray computers. Sample problems described are the bending of a thin beam, the rotation of a unit cube, and the pressurization and thermal loading of a hollow sphere
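
    As a rough illustration of the solution strategy (not the JAC2D formulation itself), the sketch below applies a Polak-Ribiere nonlinear conjugate gradient with a backtracking line search to a toy energy functional; the functional, line search and all constants are invented for the example.

```python
# Polak-Ribiere+ nonlinear conjugate gradient sketch on a toy energy;
# finding a stationary point of the energy solves A x + x^3 = b.
import numpy as np

A = np.array([[4.0, 1.0], [1.0, 3.0]])   # SPD "stiffness" stand-in
b = np.array([1.0, 2.0])                 # "load" vector

def energy(x):
    return 0.5 * x @ A @ x + 0.25 * np.sum(x**4) - b @ x

def grad(x):
    return A @ x + x**3 - b

def nonlinear_cg(x, tol=1e-10, max_iter=200):
    g = grad(x)
    d = -g                                # start with steepest descent
    for _ in range(max_iter):
        alpha = 1.0                       # crude backtracking line search
        while energy(x + alpha * d) > energy(x) and alpha > 1e-12:
            alpha *= 0.5
        x = x + alpha * d
        g_new = grad(x)
        if np.linalg.norm(g_new) < tol:
            break
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))  # PR+ restart rule
        d = -g_new + beta * d
        g = g_new
    return x

x = nonlinear_cg(np.zeros(2))
print(x, np.linalg.norm(grad(x)))         # residual should be ~0
```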

  19. JAC3D -- A three-dimensional finite element computer program for the nonlinear quasi-static response of solids with the conjugate gradient method

    International Nuclear Information System (INIS)

    Biffle, J.H.

    1993-02-01

    JAC3D is a three-dimensional finite element program designed to solve quasi-static nonlinear mechanics problems. A set of continuum equations describes the nonlinear mechanics involving large rotation and strain. A nonlinear conjugate gradient method is used to solve the equation. The method is implemented in a three-dimensional setting with various methods for accelerating convergence. Sliding interface logic is also implemented. An eight-node Lagrangian uniform strain element is used with hourglass stiffness to control the zero-energy modes. This report documents the elastic and isothermal elastic-plastic material model. Other material models, documented elsewhere, are also available. The program is vectorized for efficient performance on Cray computers. Sample problems described are the bending of a thin beam, the rotation of a unit cube, and the pressurization and thermal loading of a hollow sphere

  20. Visualizing the BEC-BCS crossover in a two-dimensional Fermi gas: Pairing gaps and dynamical response functions from ab initio computations

    Science.gov (United States)

    Vitali, Ettore; Shi, Hao; Qin, Mingpu; Zhang, Shiwei

    2017-12-01

    Experiments with ultracold atoms provide a highly controllable laboratory setting with many unique opportunities for precision exploration of quantum many-body phenomena. The nature of such systems, with strong interaction and quantum entanglement, makes reliable theoretical calculations challenging. Especially difficult are excitation and dynamical properties, which are often the most directly relevant to experiment. We carry out exact numerical calculations, by Monte Carlo sampling of imaginary-time propagation of Slater determinants, to compute the pairing gap in the two-dimensional Fermi gas from first principles. Applying state-of-the-art analytic continuation techniques, we obtain the spectral function and the density and spin structure factors providing unique tools to visualize the BEC-BCS crossover. These quantities will allow for a direct comparison with experiments.
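
    The core idea of imaginary-time propagation, repeatedly applying exp(-dtau H) so that excited-state components decay away, can be shown in a few lines. The sketch below uses a 2x2 toy Hamiltonian as a stand-in for the Fermi-gas problem; it is not the authors' Slater-determinant Monte Carlo method.

```python
# Ground-state projection by imaginary-time propagation on a toy system.
import numpy as np
from scipy.linalg import expm

H = np.array([[1.0, 0.4],
              [0.4, -0.5]])              # toy Hermitian Hamiltonian
dtau = 0.1
propagator = expm(-dtau * H)             # imaginary-time step exp(-dtau*H)

psi = np.array([1.0, 1.0])               # trial state with ground overlap
for _ in range(200):
    psi = propagator @ psi               # filter out excited states
    psi /= np.linalg.norm(psi)           # renormalize each step

e0 = psi @ H @ psi                       # ground-state energy estimate
print(e0, np.linalg.eigvalsh(H)[0])      # compare with exact lowest eigenvalue
```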

  1. Computational Deception

    NARCIS (Netherlands)

    Nijholt, Antinus; Acosta, P.S.; Cravo, P.

    2010-01-01

    In the future our daily life interactions with other people, with computers, robots and smart environments will be recorded and interpreted by computers or embedded intelligence in environments, furniture, robots, displays, and wearables. These sensors record our activities, our behaviour, and our

  2. Grid Computing

    Science.gov (United States)

    Foster, Ian

    2001-08-01

    The term "Grid Computing" refers to the use, for computational purposes, of emerging distributed Grid infrastructures: that is, network and middleware services designed to provide on-demand and high-performance access to all important computational resources within an organization or community. Grid computing promises to enable both evolutionary and revolutionary changes in the practice of computational science and engineering based on new application modalities such as high-speed distributed analysis of large datasets, collaborative engineering and visualization, desktop access to computation via "science portals," rapid parameter studies and Monte Carlo simulations that use all available resources within an organization, and online analysis of data from scientific instruments. In this article, I examine the status of Grid computing circa 2000, briefly reviewing some relevant history, outlining major current Grid research and development activities, and pointing out likely directions for future work. I also present a number of case studies, selected to illustrate the potential of Grid computing in various areas of science.

  3. Platform computing

    CERN Multimedia

    2002-01-01

    "Platform Computing releases first grid-enabled workload management solution for IBM eServer Intel and UNIX high performance computing clusters. This Out-of-the-box solution maximizes the performance and capability of applications on IBM HPC clusters" (1/2 page) .

  4. Quantum Computing

    Indian Academy of Sciences (India)

    Quantum Computing - Building Blocks of a Quantum Computer. C S Vijay and Vishal Gupta. General Article, Resonance – Journal of Science Education, Volume 5, Issue 9, September 2000, pp. 69-81.

  5. Computational Pathology

    Science.gov (United States)

    Louis, David N.; Feldman, Michael; Carter, Alexis B.; Dighe, Anand S.; Pfeifer, John D.; Bry, Lynn; Almeida, Jonas S.; Saltz, Joel; Braun, Jonathan; Tomaszewski, John E.; Gilbertson, John R.; Sinard, John H.; Gerber, Georg K.; Galli, Stephen J.; Golden, Jeffrey A.; Becich, Michael J.

    2016-01-01

    Context We define the scope and needs within the new discipline of computational pathology, a discipline critical to the future of both the practice of pathology and, more broadly, medical practice in general. Objective To define the scope and needs of computational pathology. Data Sources A meeting was convened in Boston, Massachusetts, in July 2014 prior to the annual Association of Pathology Chairs meeting, and it was attended by a variety of pathologists, including individuals highly invested in pathology informatics as well as chairs of pathology departments. Conclusions The meeting made recommendations to promote computational pathology, including clearly defining the field and articulating its value propositions; asserting that the value propositions for health care systems must include means to incorporate robust computational approaches to implement data-driven methods that aid in guiding individual and population health care; leveraging computational pathology as a center for data interpretation in modern health care systems; stating that realizing the value proposition will require working with institutional administrations, other departments, and pathology colleagues; declaring that a robust pipeline should be fostered that trains and develops future computational pathologists, for those with both pathology and non-pathology backgrounds; and deciding that computational pathology should serve as a hub for data-related research in health care systems. The dissemination of these recommendations to pathology and bioinformatics departments should help facilitate the development of computational pathology. PMID:26098131

  6. Cloud Computing

    Indian Academy of Sciences (India)

    IAS Admin

    2014-03-01

    ... decade in computing. In this article we define cloud computing, various services available on the cloud infrastructure, and the different types of cloud. We then discuss the technological trends which have led to its emergence, its advantages and disadvantages, and the applications which are appropriate ...

  7. GPGPU COMPUTING

    Directory of Open Access Journals (Sweden)

    BOGDAN OANCEA

    2012-05-01

    Full Text Available Since the first idea of using GPUs for general-purpose computing, things have evolved over the years and now there are several approaches to GPU programming. GPU computing practically began with the introduction of CUDA (Compute Unified Device Architecture) by NVIDIA and Stream by AMD. These are APIs designed by the GPU vendors to be used together with the hardware that they provide. A new emerging standard, OpenCL (Open Computing Language), tries to unify different GPU general-purpose computing API implementations and provides a framework for writing programs executed across heterogeneous platforms consisting of both CPUs and GPUs. OpenCL provides parallel computing using task-based and data-based parallelism. In this paper we focus on the CUDA parallel computing architecture and programming model introduced by NVIDIA. We present the benefits of the CUDA programming model. We also compare the two main approaches, CUDA and AMD APP (STREAM), with the new framework, OpenCL, which tries to unify the GPGPU computing models.
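
    As a small taste of the CUDA model discussed here, the sketch below writes a SAXPY-style kernel. It uses Numba's CUDA bindings from Python rather than CUDA C (an assumption made to keep all examples in one language); it requires an NVIDIA GPU and the numba package.

```python
# SAXPY kernel launched over a 1-D grid of thread blocks, via Numba CUDA.
import numpy as np
from numba import cuda

@cuda.jit
def saxpy(a, x, y, out):
    i = cuda.grid(1)                     # global thread index
    if i < x.size:                       # guard against out-of-range threads
        out[i] = a * x[i] + y[i]

n = 1 << 20
x = np.random.rand(n).astype(np.float32)
y = np.random.rand(n).astype(np.float32)
out = np.zeros_like(x)

threads = 256
blocks = (n + threads - 1) // threads    # enough blocks to cover n elements
saxpy[blocks, threads](2.0, x, y, out)   # Numba copies the arrays to the GPU

assert np.allclose(out, 2.0 * x + y)
```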

  8. Computer Insecurity.

    Science.gov (United States)

    Wilson, David L.

    1994-01-01

    College administrators recently appealed to students and faculty to change their computer passwords after security experts announced that tens of thousands had been stolen by computer hackers. Federal officials are investigating. Such attacks are not uncommon, but the most effective solutions are either inconvenient or cumbersome. (MSE)

  9. Quantum Computing

    Indian Academy of Sciences (India)

    In the first part of this article, we looked at how quantum physics can be harnessed to make the building blocks of a quantum computer. In this concluding part, we look at algorithms which can exploit the power of this computational device, and at some practical difficulties in building such a device.

  10. Cloud Computing

    Indian Academy of Sciences (India)

    IAS Admin

    2014-03-01

    ... Thus the availability of computing as a utility, which allows organizations to pay service providers for what they use and eliminates the need to budget huge amounts to buy and maintain large computing infrastructure, is a welcome development. Amazon, an e-commerce company, started operations in 1995.

  11. Computational Composites

    DEFF Research Database (Denmark)

    Vallgårda, Anna K. A.

    this understanding could entail in terms of developing new expressional appearances of computational technology, new ways of working with it, and new technological possibilities. The investigations are carried out in relation to, or as part of three experiments with computers and materials (PLANKS, Copper...

  12. Computed Tomography (CT) Perfusion as an Early Predictive Marker for Treatment Response to Neoadjuvant Chemotherapy in Gastroesophageal Junction Cancer and Gastric Cancer - A Prospective Study

    DEFF Research Database (Denmark)

    Lundsgaard Hansen, Martin; Fallentin, Eva; Lauridsen, Carsten

    2014-01-01

    OBJECTIVES: To evaluate whether early reductions in CT perfusion parameters predict response to pre-operative chemotherapy prior to surgery for gastroesophageal junction (GEJ) and gastric cancer. MATERIALS AND METHODS: Twenty-eight patients with adenocarcinoma of the gastro-esophageal junction (GEJ) and stomach were included. Patients received three series of chemotherapy before surgery, each consisting of a 3-week cycle of intravenous epirubicin, cisplatin or oxaliplatin, concomitant with peroral capecitabine. The patients were evaluated with a CT perfusion scan prior to and after the first series of ... pre-operative chemotherapy in GEJ and gastric cancer. As a single diagnostic test, CT perfusion only has moderate sensitivity and specificity in response assessment of pre-operative chemotherapy, making it insufficient for clinical decision purposes.

  13. A Computer Simulation for Predicting the Time Course of Thermal and Cardiovascular Responses to Various Combinations of Heat Stress, Clothing and Exercise

    Science.gov (United States)

    1991-06-01

    A Simulation for Predicting the Time Course of Thermal and Cardiovascular Responses to Various Combinations of Heat Stress, Clothing and Exercise ... [report-form residue omitted] ... (muscle, fat, vascular skin, avascular skin) and an interconnecting central blood compartment (Figure 1). A one-segment model of the passive system was chosen ... [Figure 1 residue: cross-section of the model, showing tissue compartments with temperatures (Tcr, Tmu, Tfat, Tvsk, Tsk) and a 2.1 kg central blood compartment (Tbl).]

  14. MLJ Computer Corner.

    Science.gov (United States)

    Brink, Dan

    1986-01-01

    To solve the software piracy problem in education, teachers and professionals in positions of responsibility relative to computing must refuse to "go along with" the practice of unauthorized piracy. Software producers must work out agreements that will permit educational institutions to have quality software at reasonable prices. (CB)

  15. Cloud Computing

    DEFF Research Database (Denmark)

    Krogh, Simon

    2013-01-01

    with technological changes, the paradigmatic pendulum has swung between increased centralization on one side and a focus on distributed computing that pushes IT power out to end users on the other. With the introduction of outsourcing and cloud computing, centralization in large data centers is again dominating...... the IT scene. In line with the views presented by Nicolas Carr in 2003 (Carr, 2003), it is a popular assumption that cloud computing will be the next utility (like water, electricity and gas) (Buyya, Yeo, Venugopal, Broberg, & Brandic, 2009). However, this assumption disregards the fact that most IT production......), for instance, in establishing and maintaining trust between the involved parties (Sabherwal, 1999). So far, research in cloud computing has neglected this perspective and focused entirely on aspects relating to technology, economy, security and legal questions. While the core technologies of cloud computing (e...

  16. Correlation of abnormal response of left ventricular ejection fraction after exercise and left ventricular cavity-to-myocardium count ratio of technetium-99m-tetrofosmin single photon emission computed tomography in patients with coronary artery disease

    Energy Technology Data Exchange (ETDEWEB)

    Ding, Hueisch-J [Kaohsiung Medical Univ., Taiwan (China). School of Technology for Medical Science; Lin, Ching-C; Wang, Jhi-J [Chi-Mei Medical Center, Tainan, Taiwan (China); Ho, Shung-T [National Defense Medical Center, Taipei, Taiwan (China). School of Medicine; Kao, Albert [China Medical Coll., Taichung, Taiwan (China). Hospital

    2002-09-01

    The aim of this study was to assess the value of the left ventricular cavity-to-myocardium count ratio (C/M ratio) of technetium-99m (Tc-99m) tetrofosmin single photon emission computed tomography (SPECT) to identify abnormal left ventricular ejection fraction (LVEF) responses after exercise in patients with coronary artery diseases (CAD). We studied 50 patients with recent CAD undergoing rest and exercise first-pass ventriculography to calculate LVEF and rest and exercise Tc-99m tetrofosmin myocardial perfusion SPECT to calculate left ventricular C/M ratios. Group A, consisting of 25 CAD patients with normal responses (increased LVEF{>=}5% after exercise), had significantly higher rest and exercise C/M ratios than those of the group B, consisting of 25 CAD patients with abnormal responses (increased LVEF <5% after exercise) after exercise. However, the C/M ratios between exercise and rest did not differ significantly between groups A and B. In addition, there was significant correlation between LVEF and C/M ratios in all of the patients. C/M ratios of Tc-99m tetrofosmin myocardial perfusion SPECT are useful parameters for identifying patients with abnormal LVEF responses among patients with CAD. (author)

  17. Computational Streetscapes

    Directory of Open Access Journals (Sweden)

    Paul M. Torrens

    2016-09-01

    Full Text Available Streetscapes have presented a long-standing interest in many fields. Recently, there has been a resurgence of attention on streetscape issues, catalyzed in large part by computing. Because of computing, there is more understanding, vistas, data, and analysis of and on streetscape phenomena than ever before. This diversity of lenses trained on streetscapes permits us to address long-standing questions, such as how people use information while mobile, how interactions with people and things occur on streets, how we might safeguard crowds, how we can design services to assist pedestrians, and how we could better support special populations as they traverse cities. Amid each of these avenues of inquiry, computing is facilitating new ways of posing these questions, particularly by expanding the scope of what-if exploration that is possible. With assistance from computing, consideration of streetscapes now reaches across scales, from the neurological interactions that form among place cells in the brain up to informatics that afford real-time views of activity over whole urban spaces. For some streetscape phenomena, computing allows us to build realistic but synthetic facsimiles in computation, which can function as artificial laboratories for testing ideas. In this paper, I review the domain science for studying streetscapes from vantages in physics, urban studies, animation and the visual arts, psychology, biology, and behavioral geography. I also review the computational developments shaping streetscape science, with particular emphasis on modeling and simulation as informed by data acquisition and generation, data models, path-planning heuristics, artificial intelligence for navigation and way-finding, timing, synthetic vision, steering routines, kinematics, and geometrical treatment of collision detection and avoidance. I also discuss the implications that the advances in computing streetscapes might have on emerging developments in cyber

  18. Computer security engineering management

    International Nuclear Information System (INIS)

    McDonald, G.W.

    1988-01-01

    For best results, computer security should be engineered into a system during its development rather than being appended later on. This paper addresses the implementation of computer security in eight stages through the life cycle of the system, starting with the definition of security policies and ending with continuing support for the security aspects of the system throughout its operational life cycle. Security policy is addressed relative to successive decomposition of security objectives (through policy, standard, and control stages) into system security requirements. This is followed by a discussion of computer security organization and responsibilities. Next the paper turns to analysis and management of security-related risks, followed by discussion of design and development of the system itself. Discussion of security test and evaluation preparations, and of approval to operate (certification and accreditation), is followed by discussion of computer security training for users and by coverage of life-cycle support for the security of the system.

  19. COMPUTATIONAL THINKING

    Directory of Open Access Journals (Sweden)

    Evgeniy K. Khenner

    2016-01-01

    Full Text Available Abstract. The aim of the research is to draw the attention of the educational community to the phenomenon of computational thinking, which has been actively discussed over the last decade in the foreign scientific and educational literature, and to substantiate its importance, practical utility and its right to a place in Russian education. Methods. The research is based on an analysis of foreign studies of the phenomenon of computational thinking and of the ways it is formed in the process of education, and on comparing the notion of «computational thinking» with related concepts used in the Russian scientific and pedagogical literature. Results. The concept «computational thinking» is analyzed from the point of view of intuitive understanding and of scientific and applied aspects. It is shown how computational thinking has evolved with the development of computer hardware and software. The practice-oriented interpretation of computational thinking which dominates among educators is described, along with some ways of forming it. It is shown that computational thinking is a metasubject result of general education as well as one of its tools. From the point of view of the author, the purposeful development of computational thinking should be one of the tasks of Russian education. Scientific novelty. The author gives a theoretical justification of the role of computational thinking schemes as metasubject results of learning. The dynamics of the development of this concept is described; this process is connected with the evolution of computer and information technologies and with the growing number of tasks whose effective solution requires computational thinking. The author substantiates the claim that including «computational thinking» in the set of pedagogical concepts used in the national education system fills an existing gap. Practical significance. New metasubject result of education associated with...

  20. Coping with distributed computing

    International Nuclear Information System (INIS)

    Cormell, L.

    1992-09-01

    The rapid increase in the availability of high-performance, cost-effective RISC/UNIX workstations has been both a blessing and a curse. The blessing of having extremely powerful computing engines available on the desktop is well known to many users. The user has tremendous freedom, flexibility, and control of his environment. That freedom can, however, become the curse of distributed computing. The user must become a system manager to some extent: he must worry about backups, maintenance, upgrades, etc. Traditionally these activities have been the responsibility of a central computing group. The central computing group, however, may find that it can no longer provide all of the traditional services. With the plethora of workstations now found on so many desktops throughout the entire campus or lab, the central computing group may be swamped by support requests. This talk will address several of these computer support and management issues by providing some examples of the approaches taken at various HEP institutions. In addition, a brief review of commercial directions or products for distributed computing and management will be given

  1. New ATLAS Software & Computing Organization

    CERN Multimedia

    Barberis, D

    Following the election by the ATLAS Collaboration Board of Dario Barberis (Genoa University/INFN) as Computing Coordinator and David Quarrie (LBNL) as Software Project Leader, it was considered necessary to modify the organization of the ATLAS Software & Computing ("S&C") project. The new organization is based upon the following principles: separation of the responsibilities for computing management from those of software development, with the appointment of a Computing Coordinator and a Software Project Leader who are both members of the Executive Board; hierarchical structure of responsibilities and reporting lines; coordination at all levels between TDAQ, S&C and Physics working groups; integration of the subdetector software development groups with the central S&C organization. A schematic diagram of the new organization can be seen in Fig.1. Figure 1: new ATLAS Software & Computing organization. Two Management Boards will help the Computing Coordinator and the Software Project...

  2. Computational physics

    International Nuclear Information System (INIS)

    Anon.

    1987-01-01

    Computers have for many years played a vital role in the acquisition and treatment of experimental data, but they have more recently taken up a much more extended role in physics research. The numerical and algebraic calculations now performed on modern computers make it possible to explore consequences of basic theories in a way which goes beyond the limits of both analytic insight and experimental investigation. This was brought out clearly at the Conference on Perspectives in Computational Physics, held at the International Centre for Theoretical Physics, Trieste, Italy, from 29-31 October

  3. Computational physics

    CERN Document Server

    Newman, Mark

    2013-01-01

    A complete introduction to the field of computational physics, with examples and exercises in the Python programming language. Computers play a central role in virtually every major physics discovery today, from astrophysics and particle physics to biophysics and condensed matter. This book explains the fundamentals of computational physics and describes in simple terms the techniques that every physicist should know, such as finite difference methods, numerical quadrature, and the fast Fourier transform. The book offers a complete introduction to the topic at the undergraduate level, and is also suitable for the advanced student or researcher who wants to learn the foundational elements of this important field.

  4. Computer interfacing

    CERN Document Server

    Dixey, Graham

    1994-01-01

    This book explains how computers interact with the world around them and therefore how to make them a useful tool. Topics covered include descriptions of all the components that make up a computer, principles of data exchange, interaction with peripherals, serial communication, input devices, recording methods, computer-controlled motors, and printers.In an informative and straightforward manner, Graham Dixey describes how to turn what might seem an incomprehensible 'black box' PC into a powerful and enjoyable tool that can help you in all areas of your work and leisure. With plenty of handy

  5. Computational Viscoelasticity

    CERN Document Server

    Marques, Severino P C

    2012-01-01

    This text is a guide how to solve problems in which viscoelasticity is present using existing commercial computational codes. The book gives information on codes’ structure and use, data preparation  and output interpretation and verification. The first part of the book introduces the reader to the subject, and to provide the models, equations and notation to be used in the computational applications. The second part shows the most important Computational techniques: Finite elements formulation, Boundary elements formulation, and presents the solutions of Viscoelastic problems with Abaqus.

  6. Monte Carlo Study of the Effect of Collimator Thickness on T-99m Source Response in Single Photon Emission Computed Tomography

    International Nuclear Information System (INIS)

    Islamian, Jalil Pirayesh; Toossi, Mohammad Taghi Bahreyni; Momennezhad, Mahdi; Zakavi, Seyyed Rasoul; Sadeghi, Ramin; Ljungberg, Michael

    2012-01-01

    In single photon emission computed tomography (SPECT), the collimator is a crucial element of the imaging chain and controls the noise-resolution tradeoff of the collected data. The current study evaluates the effects of different thicknesses of a low-energy high-resolution (LEHR) collimator on tomographic spatial resolution in SPECT. The SIMIND Monte Carlo program was used to simulate a SPECT system equipped with an LEHR collimator. A point source of 99mTc, an acrylic cylindrical Jaszczak phantom with cold spheres and rods, and a human anthropomorphic torso phantom (4D-NCAT phantom) were used. Simulated planar images and reconstructed tomographic images were evaluated both qualitatively and quantitatively. Based on the tabulated calculated detector parameters, the contributions of Compton scattering and photoelectric reactions, and the peak-to-Compton (P/C) area in the energy spectra obtained by scanning the sources with 11 collimator thicknesses (ranging from 2.400 to 2.410 cm), we concluded that 2.405 cm is the proper thickness for the LEHR parallel-hole collimator. Image-quality analysis with the structural similarity index (SSIM) algorithm, and also visual inspection, showed that images obtained with a collimator thickness of 2.405 cm were of suitable quality. Projections and reconstructed images prepared with the 2.405 cm LEHR collimator thickness also showed suitable performance-parameter results compared with the other collimator thicknesses
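
    The SSIM comparison step is easy to reproduce in outline. The sketch below scores a degraded image against a reference with scikit-image's SSIM implementation; the random test images are stand-ins for the SPECT reconstructions, and the noise level is an arbitrary assumption.

```python
# Score image similarity with SSIM, as used for the image-quality analysis.
import numpy as np
from skimage.metrics import structural_similarity

reference = np.random.rand(128, 128)             # stand-in reference slice
noisy = reference + 0.1 * np.random.randn(128, 128)

score = structural_similarity(
    reference, noisy,
    data_range=noisy.max() - noisy.min(),        # required for float images
)
print(f"SSIM = {score:.3f}")                     # 1.0 means identical images
```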

  7. Intermediality between Games and Fiction: The “Ludology vs. Narratology” Debate in Computer Game Studies: A Response to Gonzalo Frasca

    Directory of Open Access Journals (Sweden)

    Kokonis Michalis

    2014-12-01

    Full Text Available In the last ten or fourteen years there has been a debate among the so-called ludologists and narratologists in Computer Game Studies as to what is the best methodological approach for the academic study of electronic games. The aim of this paper is to propose a way out of the dilemma, suggesting that both ludology and narratology can be helpful methodologically. However, there is need for a wider theoretical perspective, that of semiotics, in which both approaches can be operative. The semiotic perspective proposed allows research in the field to focus on the similarities between games and traditional narrative forms (since they share narrativity to a greater or lesser extent) as well as on their differences (they have different degrees of interaction); it will facilitate communication among theorists if we want to understand each other when talking about games and stories, and it will lead to a better understanding of the hybrid nature of the medium of the game. In this sense the present paper aims to complement Gonzalo Frasca's reconciliatory attempt made a few years back and expand on his proposal.

  8. A Large-Scale Computational Analysis of Corneal Structural Response and Ectasia Risk in Myopic Laser Refractive Surgery (An American Ophthalmological Society Thesis)

    Science.gov (United States)

    Dupps, William Joseph; Seven, Ibrahim

    2016-01-01

    Purpose: To investigate biomechanical strain as a structural susceptibility metric for corneal ectasia in a large-scale computational trial. Methods: A finite element modeling study was performed using retrospective Scheimpflug tomography data from 40 eyes of 40 patients. LASIK and PRK were simulated with varied myopic ablation profiles and flap thickness parameters across eyes from LASIK candidates, patients disqualified for LASIK, subjects with atypical topography, and keratoconus subjects in 280 simulations. Finite element analysis output was then interrogated to extract several risk and outcome variables. We tested the hypothesis that strain is greater in known at-risk eyes than in normal eyes, evaluated the ability of a candidate strain variable to differentiate eyes that were empirically disqualified as LASIK candidates, and compared the performance of common risk variables as predictors of this novel susceptibility marker across multiple virtual subjects and surgeries. Results: A candidate susceptibility metric that expressed mean strains across the anterior residual stromal bed was significantly higher in eyes with confirmed ectatic predisposition in preoperative and all postoperative cases (P≤.003). The strain metric was effective at differentiating normal and at-risk eyes (area under receiver operating characteristic curve ≥ 0.83, P≤.002) and was highly correlated to thickness-based risk metrics (as high as R2 = 95%, P ...) ... ectasia risk and provides a novel biomechanical construct for expressing structural risk in refractive surgery. Mechanical strain is an effective marker of known ectasia risk and correlates to predicted refractive error after myopic photoablative surgery. PMID:27630372

  9. Computation of interactive effects and optimization of process parameters for alkaline lipase production by mutant strain of Pseudomonas aeruginosa using response surface methodology

    Directory of Open Access Journals (Sweden)

    Deepali Bisht

    2013-01-01

    Full Text Available Alkaline lipase production by a mutant strain of Pseudomonas aeruginosa MTCC 10,055 was optimized in shake-flask batch fermentation using response surface methodology. An empirical model was developed through a Box-Behnken experimental design to describe the relationship among the tested variables (pH, temperature, castor oil, starch and triton-X-100). The second-order quadratic model determined the optimum conditions as castor oil, 1.77 mL.L-1; starch, 15.0 g.L-1; triton-X-100, 0.93 mL.L-1; incubation temperature, 34.12 ºC; and pH 8.1, resulting in maximum alkaline lipase production (3142.57 U.mL-1). The quadratic model was in satisfactory agreement with the experimental data, as evidenced by a high coefficient of determination (R²) value (0.9987). The RSM facilitated the analysis and interpretation of experimental data to ascertain the optimum conditions of the variables for the process and recognized the contribution of individual variables to assess the response under optimal conditions. Hence the Box-Behnken approach could fruitfully be applied for process optimization.
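
    The second-order model at the heart of this analysis is straightforward to reproduce in outline. The sketch below fits a two-factor quadratic response surface by ordinary least squares; the synthetic data stand in for the measured lipase activities, and all coefficients are invented.

```python
# Fit a second-order (quadratic) response-surface model, as used in
# Box-Behnken designs, by ordinary least squares on synthetic data.
import numpy as np

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(15, 2))             # two coded factors, 15 runs
y = (5 + 2*X[:, 0] - X[:, 1] + 1.5*X[:, 0]*X[:, 1]
     - 3*X[:, 0]**2 - 2*X[:, 1]**2
     + 0.1*rng.standard_normal(15))              # "measured" responses

# design matrix: intercept, linear, interaction, and squared terms
D = np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1],
                     X[:, 0]*X[:, 1], X[:, 0]**2, X[:, 1]**2])
beta, *_ = np.linalg.lstsq(D, y, rcond=None)

y_hat = D @ beta
r2 = 1 - np.sum((y - y_hat)**2) / np.sum((y - y.mean())**2)
print("coefficients:", np.round(beta, 2), "R^2 =", round(r2, 4))
```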

  10. Computation of interactive effects and optimization of process parameters for alkaline lipase production by mutant strain of Pseudomonas aeruginosa using response surface methodology

    Science.gov (United States)

    Bisht, Deepali; Yadav, Santosh Kumar; Darmwal, Nandan Singh

    2013-01-01

    Alkaline lipase production by a mutant strain of Pseudomonas aeruginosa MTCC 10,055 was optimized in shake-flask batch fermentation using response surface methodology. An empirical model was developed through a Box-Behnken experimental design to describe the relationship among the tested variables (pH, temperature, castor oil, starch and triton-X-100). The second-order quadratic model determined the optimum conditions as castor oil, 1.77 mL.L−1; starch, 15.0 g.L−1; triton-X-100, 0.93 mL.L−1; incubation temperature, 34.12 °C; and pH 8.1, resulting in maximum alkaline lipase production (3142.57 U.mL−1). The quadratic model was in satisfactory agreement with the experimental data, as evidenced by a high coefficient of determination (R2) value (0.9987). The RSM facilitated the analysis and interpretation of experimental data to ascertain the optimum conditions of the variables for the process and recognized the contribution of individual variables to assess the response under optimal conditions. Hence the Box-Behnken approach could fruitfully be applied for process optimization. PMID:24159311

  11. On teaching computer ethics within a computer science department.

    Science.gov (United States)

    Quinn, Michael J

    2006-04-01

    The author has surveyed a quarter of the accredited undergraduate computer science programs in the United States. More than half of these programs offer a 'social and ethical implications of computing' course taught by a computer science faculty member, and there appears to be a trend toward teaching ethics classes within computer science departments. Although the decision to create an 'in house' computer ethics course may sometimes be a pragmatic response to pressure from the accreditation agency, this paper argues that teaching ethics within a computer science department can provide students and faculty members with numerous benefits. The paper lists topics that can be covered in a computer ethics course and offers some practical suggestions for making the course successful.

  12. Computational Literacy

    DEFF Research Database (Denmark)

    Chongtay, Rocio; Robering, Klaus

    2016-01-01

    In recent years, there has been a growing interest in and recognition of the importance of Computational Literacy, a skill generally considered to be necessary for success in the 21st century. While much research has concentrated on requirements, tools, and teaching methodologies for the acquisition of Computational Literacy at basic educational levels, focus on higher levels of education has been much less prominent. The present paper considers the case of courses for higher education programs within the Humanities. A model is proposed which conceives of Computational Literacy as a layered...

  13. Computing Religion

    DEFF Research Database (Denmark)

    Nielbo, Kristoffer Laigaard; Braxton, Donald M.; Upal, Afzal

    2012-01-01

    The computational approach has become an invaluable tool in many fields that are directly relevant to research in religious phenomena. Yet the use of computational tools is almost absent in the study of religion. Given that religion is a cluster of interrelated phenomena and that research...... concerning these phenomena should strive for multilevel analysis, this article argues that the computational approach offers new methodological and theoretical opportunities to the study of religion. We argue that the computational approach offers 1.) an intermediary step between any theoretical construct...... and its targeted empirical space and 2.) a new kind of data which allows the researcher to observe abstract constructs, estimate likely outcomes, and optimize empirical designs. Because sophisticated multilevel research is a collaborative project we also seek to introduce to scholars of religion some...

  14. Computational Controversy

    NARCIS (Netherlands)

    Timmermans, Benjamin; Kuhn, Tobias; Beelen, Kaspar; Aroyo, Lora

    2017-01-01

    Climate change, vaccination, abortion, Trump: Many topics are surrounded by fierce controversies. The nature of such heated debates and their elements have been studied extensively in the social science literature. More recently, various computational approaches to controversy analysis have

  15. COMPUTERS HAZARDS

    Directory of Open Access Journals (Sweden)

    Andrzej Augustynek

    2007-01-01

    Full Text Available In June 2006, over 12.6 million Polish users of the Web were registered. On average, each of them spent 21 hours and 37 minutes monthly browsing the Web. That is why the psychological aspects of computer utilization have become an urgent research subject. The results of research into the development of the Polish information society, carried out at the AGH University of Science and Technology under the leadership of Leslaw H. Haber from 2000 until the present time, indicate dynamic changes in the ways computers are used and in the circumstances of their use. One of the interesting regularities has been the inversely proportional relation between the level of computer skills and the frequency of Web utilization. It has been found that in 2005, compared to 2000, the following changes occurred:
    - a significant drop in the number of students who never used computers and the Web;
    - a remarkable increase in computer knowledge and skills (particularly pronounced in the case of first-year students);
    - a decreasing gap in computer skills between students of the first and the third year, and between male and female students;
    - a declining popularity of computer games.
    It has also been demonstrated that the hazard of computer-screen addiction was highest in the case of unemployed youth outside the school system: as much as 12% of this group of young people were addicted to the computer. The large amount of leisure time that these youths enjoyed induced excessive use of the Web. Polish housewives are another population group at risk of addiction to the Web. The growing duration of Web chats carried out by ever younger users has been another matter of concern. Since the phenomenon of computer addiction is relatively new, no specific therapy methods have been developed. In general, the therapy applied in relation to computer addiction syndrome is similar to the techniques applied in the cases of alcohol or gambling addiction. Individual and group

  16. Computational sustainability

    CERN Document Server

    Kersting, Kristian; Morik, Katharina

    2016-01-01

    The book at hand gives an overview of the state of the art research in Computational Sustainability as well as case studies of different application scenarios. This covers topics such as renewable energy supply, energy storage and e-mobility, efficiency in data centers and networks, sustainable food and water supply, sustainable health, industrial production and quality, etc. The book describes computational methods and possible application scenarios.

  17. Computing farms

    International Nuclear Information System (INIS)

    Yeh, G.P.

    2000-01-01

    High-energy physics, nuclear physics, space sciences, and many other fields have large challenges in computing. In recent years, PCs have achieved performance comparable to the high-end UNIX workstations, at a small fraction of the price. We review the development and broad applications of commodity PCs as the solution to CPU needs, and look forward to the important and exciting future of large-scale PC computing

  18. A computational investigation of the transient response of an unbalanced rigid rotor flexibly supported and damped by short magnetorheological squeeze film dampers

    Science.gov (United States)

    Zapoměl, J.; Ferfecki, P.; Forte, P.

    2012-10-01

    Due to manufacturing and assembly inaccuracies, real rotors are always slightly imbalanced. This produces their lateral vibration and forces that are transmitted through the bearings to the stationary parts. The oscillation of the system can be reduced if damping devices are added to the constraint elements. To achieve the optimum performance of the rotor in a wide range of angular velocities and when passing through the critical speeds their damping effect must be controllable. For this purpose, the application of semiactive magnetorheological (MR) dampers has been analysed. The investigated problem focuses on studying the influence of their damping effect and of its control on the amplitude of the rotor vibration, on the magnitude of the force transmitted to the rotor casing, and on the amount of dissipative power generated in the MR films. The developed mathematical model assumes cavitation in the lubricating layer, and the MR liquid is modelled as a Bingham material. The derivation of the equation governing the pressure distribution in the oil film is completed by a new methodology making it possible to determine the yielding shear stress needed for its solution. The equations of motion of the rotor are nonlinear due to the damping forces and to solve them a Runge-Kutta integration method was applied. Computer simulations show that a suitably proposed current-rotor angular speed relationship enables one to fully eliminate the resonance peaks and to achieve the optimum compromise between the attenuation of the rotor lateral vibration, the magnitude of the forces transmitted to the rotor casing and the amount of energy dissipated in the lubricating layers.
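
    The overall simulation loop can be illustrated with a much simpler stand-in: a Jeffcott-type unbalanced rotor on flexible supports whose damping coefficient depends on rotor speed, integrated with a Runge-Kutta scheme. This is not the authors' magnetorheological squeeze-film model; the control law, mass, stiffness and unbalance below are invented for illustration.

```python
# Unbalanced rotor with speed-dependent (controllable) damping, integrated
# with an explicit Runge-Kutta scheme; all parameters are illustrative.
import numpy as np
from scipy.integrate import solve_ivp

m, k, e = 10.0, 4.0e5, 1e-4         # mass [kg], stiffness [N/m], unbalance [m]
omega = 180.0                        # rotor angular speed [rad/s]

def damping(omega):
    # placeholder control law: more damping near the critical speed
    omega_crit = np.sqrt(k / m)
    return 200.0 + 1800.0 * np.exp(-((omega - omega_crit) / 30.0) ** 2)

def rhs(t, s):
    x, vx, y, vy = s
    c = damping(omega)
    fx = m * e * omega**2 * np.cos(omega * t)   # rotating unbalance force
    fy = m * e * omega**2 * np.sin(omega * t)
    return [vx, (fx - c*vx - k*x) / m, vy, (fy - c*vy - k*y) / m]

sol = solve_ivp(rhs, (0.0, 2.0), [0.0, 0.0, 0.0, 0.0],
                method="RK45", max_step=1e-3)
r = np.hypot(sol.y[0], sol.y[2])                # whirl radius over time
print("late-time whirl radius ~", r[-2000:].max())
```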

  19. Computational oncology.

    Science.gov (United States)

    Lefor, Alan T

    2011-08-01

    Oncology research has traditionally been conducted using techniques from the biological sciences. The new field of computational oncology has forged a new relationship between the physical sciences and oncology to further advance research. By applying physics and mathematics to oncologic problems, new insights will emerge into the pathogenesis and treatment of malignancies. One major area of investigation in computational oncology centers around the acquisition and analysis of data, using improved computing hardware and software. Large databases of cellular pathways are being analyzed to understand the interrelationship among complex biological processes. Computer-aided detection is being applied to the analysis of routine imaging data including mammography and chest imaging to improve the accuracy and detection rate for population screening. The second major area of investigation uses computers to construct sophisticated mathematical models of individual cancer cells as well as larger systems using partial differential equations. These models are further refined with clinically available information to more accurately reflect living systems. One of the major obstacles in the partnership between physical scientists and the oncology community is communications. Standard ways to convey information must be developed. Future progress in computational oncology will depend on close collaboration between clinicians and investigators to further the understanding of cancer using these new approaches.
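
    To make the PDE-modeling thread concrete, here is a sketch of one classic model from this literature, the Fisher-KPP reaction-diffusion equation for tumor cell density, solved with an explicit finite-difference scheme on a 1-D domain. The scheme and parameters are illustrative and not tied to any model in the article.

```python
# Fisher-KPP tumor growth model: du/dt = D * d2u/dx2 + rho * u * (1 - u).
import numpy as np

D, rho = 0.01, 1.0                   # diffusion and proliferation rates
nx, dx, dt = 200, 0.1, 0.01          # grid and time step (dt < dx^2 / (2*D))
u = np.zeros(nx)
u[nx // 2] = 1.0                     # small initial tumor seed

for _ in range(5000):
    lap = (np.roll(u, 1) - 2*u + np.roll(u, -1)) / dx**2
    u = u + dt * (D * lap + rho * u * (1 - u))   # diffusion + logistic growth
    u[0] = u[-1] = 0.0               # absorbing boundaries

print("invaded fraction:", float((u > 0.5).mean()))
```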

  20. Computers in radiology. The sedation, analgesia, and contrast media computerized simulator: a new approach to train and evaluate radiologists' responses to critical incidents

    Energy Technology Data Exchange (ETDEWEB)

    Medina, L.S.; Racadio, J.M. [Dept. of Radiology, Children' s Hospital Medical Center, Cincinnati, OH (United States); Schwid, H.A. [Dept. of Anesthesia, Veterans Administration Medical Center, University of Washington, Seattle, WA (United States)

    2000-05-01

    Background. Awareness and preparedness to handle sedation, analgesia, and contrast-media complications are key in the daily radiology practice. Objective. The purpose is to create a computerized simulator (PC-Windows-based) that uses a graphical interface to reproduce critical incidents in pediatric and adult patients undergoing a wide spectrum of radiologic sedation, analgesia and contrast media complications. Materials and methods. The computerized simulator has a comprehensive set of physiologic and pharmacologic models that predict patient response to management of sedation, analgesia, and contrast-media complications. Photorealistic images, real-time monitors, and mouse-driven information demonstrate in a virtual-reality fashion the behavior of the patient in crisis. Results. Thirteen pediatric and adult radiology scenarios are illustrated encompassing areas such as pediatric radiology, neuroradiology, interventional radiology, and body imaging. The multiple case scenarios evaluate randomly the diagnostic and management performance of the radiologist in critical incidents such as oversedation, anaphylaxis, aspiration, airway obstruction, apnea, agitation, bronchospasm, hypotension, hypertension, cardiac arrest, bradycardia, tachycardia, and myocardial ischemia. The user must control the airway, breathing and circulation, and administer medications in a timely manner to save the simulated patient. On-line help is available in the program to suggest diagnostic and treatment steps to save the patient, and provide information about the medications. A printout of the case management can be obtained for evaluation or educational purposes. Conclusion. The interactive computerized simulator is a new approach to train and evaluate radiologists' responses to critical incidents encountered during radiologic sedation, analgesia, and contrast-media administration. (orig.)

  1. Computers in radiology. The sedation, analgesia, and contrast media computerized simulator: a new approach to train and evaluate radiologists' responses to critical incidents

    International Nuclear Information System (INIS)

    Medina, L.S.; Racadio, J.M.; Schwid, H.A.

    2000-01-01

    Background. Awareness and preparedness to handle sedation, analgesia, and contrast-media complications are key in the daily radiology practice. Objective. The purpose is to create a computerized simulator (PC-Windows-based) that uses a graphical interface to reproduce critical incidents in pediatric and adult patients undergoing a wide spectrum of radiologic sedation, analgesia and contrast media complications. Materials and methods. The computerized simulator has a comprehensive set of physiologic and pharmacologic models that predict patient response to management of sedation, analgesia, and contrast-media complications. Photorealistic images, real-time monitors, and mouse-driven information demonstrate in a virtual-reality fashion the behavior of the patient in crisis. Results. Thirteen pediatric and adult radiology scenarios are illustrated encompassing areas such as pediatric radiology, neuroradiology, interventional radiology, and body imaging. The multiple case scenarios evaluate randomly the diagnostic and management performance of the radiologist in critical incidents such as oversedation, anaphylaxis, aspiration, airway obstruction, apnea, agitation, bronchospasm, hypotension, hypertension, cardiac arrest, bradycardia, tachycardia, and myocardial ischemia. The user must control the airway, breathing and circulation, and administer medications in a timely manner to save the simulated patient. On-line help is available in the program to suggest diagnostic and treatment steps to save the patient, and provide information about the medications. A printout of the case management can be obtained for evaluation or educational purposes. Conclusion. The interactive computerized simulator is a new approach to train and evaluate radiologists' responses to critical incidents encountered during radiologic sedation, analgesia, and contrast-media administration. (orig.)

  2. Research of predictive factors for cardiac resynchronization therapy: a prospective study comparing data from phase-analysis of gated myocardial perfusion single-photon computed tomography and echocardiography : Trying to anticipate response to CRT.

    Science.gov (United States)

    Gendre, Rémy; Lairez, O; Mondoly, P; Duparc, A; Carrié, D; Galinier, M; Berry, I; Cognet, T

    2017-04-01

    Cardiac resynchronization therapy (CRT) reduces morbidity and mortality in chronic systolic heart failure. About 20% of implanted patients are considered as "non-responders". This study aimed to evaluate gated myocardial perfusion single-photon emission computed tomography (GMPS) phase parameters as compared to echocardiography in the assessment of predictors for response to CRT before and after CRT activation. Forty-two patients were prospectively included during 15 months. A single injection of 99m Tc-tetrofosmin was used to acquire GMPS phase pre- and post-CRT activation. Indicators of positive CRT response were improvement of functional status and 15% reduction in left ventricular end-systolic volume at 3 months. Phase parameters at baseline were similar in the two groups with no influence of perfusion data. Phase parameters after CRT activation were significantly improved in the responders' group (Δ Bandwidth -19° ± 24° vs. 13° ± 31°, p = 0.001; Δ SD -20° ± 30° vs. 26° ± 46°, p = 0.001; Δ Entropy -11 ± 12 vs. 2 ± 6%, p = 0.001). Feasibility and reproducibility were higher for GMPS. Acute phase modifications after CRT activation may predict response to CRT immediately after implantation, but not at baseline, even when adjusted to perfusion data.
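
    The phase parameters reported here (bandwidth, SD, entropy) come from harmonic analysis of the gated count curves. The sketch below shows the standard first-harmonic computation on synthetic curves; the count model, histogram binning, and the simple max-minus-min bandwidth are illustrative assumptions, not the vendor algorithm.

```python
# Phase analysis of gated SPECT count curves: the phase of the first
# Fourier harmonic of each sample's count curve marks contraction onset;
# histogram statistics then summarize mechanical dyssynchrony.
import numpy as np

rng = np.random.default_rng(0)
n_frames, n_pixels = 16, 500
t = np.arange(n_frames)
true_phase = rng.normal(np.pi, 0.3, n_pixels)    # per-pixel onset (toy data)
counts = (100 + 30*np.cos(2*np.pi*t[None, :]/n_frames - true_phase[:, None])
          + rng.normal(0, 2, (n_pixels, n_frames)))

harm1 = np.fft.fft(counts, axis=1)[:, 1]         # first harmonic per pixel
phase = np.degrees(np.angle(harm1)) % 360        # phase in [0, 360) degrees

hist, _ = np.histogram(phase, bins=72, range=(0, 360))
p = hist / hist.sum()
bandwidth = phase.max() - phase.min()            # crude bandwidth surrogate
entropy = -np.sum(p[p > 0] * np.log2(p[p > 0])) / np.log2(len(hist))
print(f"SD={phase.std():.1f} deg, bandwidth={bandwidth:.1f} deg, "
      f"entropy={entropy:.2f}")
```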

  3. Computational creativity

    Directory of Open Access Journals (Sweden)

    López de Mántaras Badia, Ramon

    2013-12-01

    Full Text Available New technologies, and in particular artificial intelligence, are drastically changing the nature of creative processes. Computers are playing very significant roles in creative activities such as music, architecture, fine arts, and science. Indeed, the computer is already a canvas, a brush, a musical instrument, and so on. However, we believe that we must aim at more ambitious relations between computers and creativity. Rather than just seeing the computer as a tool to help human creators, we could see it as a creative entity in its own right. This view has triggered a new subfield of Artificial Intelligence called Computational Creativity. This article addresses the question of the possibility of achieving computational creativity through some examples of computer programs capable of replicating some aspects of creative behavior in the fields of music and science.

  4. Validation of computer-administered clinical rating scale: Hamilton Depression Rating Scale assessment with Interactive Voice Response technology--Japanese version.

    Science.gov (United States)

    Kunugi, Hiroshi; Koga, Norie; Hashikura, Miyako; Noda, Takamasa; Shimizu, Yu; Kobayashi, Takayuki; Yamanaka, Jun; Kanemoto, Noriaki; Higuchi, Teruhiko

    2013-05-01

    The aim of this study was to examine the reliability and validity of the Interactive Voice Response (IVR) program to rate the 17-item Hamilton Rating Scale for Depression (HAM-D) score in Japanese depressive patients. Depression severity was assessed in 60 patients by a clinician and psychologists using HAM-D. Scoring by the IVR program was conducted on the same and the following days. Test-retest reliability, internal consistency, and concurrent validity for total HAM-D scores were examined by calculating intraclass correlation coefficient, Cronbach's alpha, and Pearson's correlation coefficient. Inter-rater consistency for each HAM-D item was examined by Cohen's kappa. Test-retest reliability of the IVR program was high (intraclass correlation coefficient: 0.93). Internal consistency of each total score obtained by the clinician, psychologists, and IVR program was high (Cronbach's alpha: 0.77, 0.79, 0.78, and 0.83). Regarding concurrent validity, correlation coefficients between total scores obtained by the clinician versus IVR and that by the clinician versus psychologists were high (0.81 and 0.93). The HAM-D total score rated by the clinician was 3 points lower than that of IVR. Inter-rater consistency for each HAM-D item evaluated by the clinician versus IVR was estimated to be fair (Cohen's kappa coefficient: 0.02-0.50). Our results suggest that the Japanese IVR HAM-D program is reliable and valid to assess 17-item HAM-D total score in Japanese depressive patients. However, the current program tends to overestimate depression severity, and the score of each item did not always show high agreement with clinician's rating, which warrants further improvement in the program. © 2013 The Authors. Psychiatry and Clinical Neurosciences © 2013 Japanese Society of Psychiatry and Neurology.
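
    Of the reliability statistics used here, Cronbach's alpha is the simplest to reproduce. The sketch below computes it from a subjects-by-items score matrix; the synthetic scores merely stand in for the 17 HAM-D item ratings of the 60 patients.

```python
# Cronbach's alpha: k/(k-1) * (1 - sum(item variances) / variance(total)).
import numpy as np

rng = np.random.default_rng(42)
severity = rng.normal(0, 1, (60, 1))                  # latent severity, 60 patients
items = severity + 0.8 * rng.normal(0, 1, (60, 17))   # 17 correlated item scores

k = items.shape[1]
item_vars = items.var(axis=0, ddof=1)                 # variance of each item
total_var = items.sum(axis=1).var(ddof=1)             # variance of total score
alpha = k / (k - 1) * (1 - item_vars.sum() / total_var)
print(f"Cronbach's alpha = {alpha:.2f}")
```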

  5. Quantum computing

    International Nuclear Information System (INIS)

    Steane, Andrew

    1998-01-01

    The subject of quantum computing brings together ideas from classical information theory, computer science, and quantum physics. This review aims to summarize not just quantum computing, but the whole subject of quantum information theory. Information can be identified as the most general thing which must propagate from a cause to an effect. It therefore has a fundamentally important role in the science of physics. However, the mathematical treatment of information, especially information processing, is quite recent, dating from the mid-20th century. This has meant that the full significance of information as a basic concept in physics is only now being discovered. This is especially true in quantum mechanics. The theory of quantum information and computing puts this significance on a firm footing, and has led to some profound and exciting new insights into the natural world. Among these are the use of quantum states to permit the secure transmission of classical information (quantum cryptography), the use of quantum entanglement to permit reliable transmission of quantum states (teleportation), the possibility of preserving quantum coherence in the presence of irreversible noise processes (quantum error correction), and the use of controlled quantum evolution for efficient computation (quantum computation). The common theme of all these insights is the use of quantum entanglement as a computational resource. It turns out that information theory and quantum mechanics fit together very well. In order to explain their relationship, this review begins with an introduction to classical information theory and computer science, including Shannon's theorem, error correcting codes, Turing machines and computational complexity. The principles of quantum mechanics are then outlined, and the Einstein, Podolsky and Rosen (EPR) experiment described. The EPR-Bell correlations, and quantum entanglement in general, form the essential new ingredient which distinguishes quantum from classical information theory.

  6. GADRAS Detector Response Function.

    Energy Technology Data Exchange (ETDEWEB)

    Mitchell, Dean J.; Harding, Lee; Thoreson, Gregory G; Horne, Steven M.

    2014-11-01

    The Gamma Detector Response and Analysis Software (GADRAS) applies a Detector Response Function (DRF) to compute the output of gamma-ray and neutron detectors when they are exposed to radiation sources. The DRF is fundamental to the ability to perform forward calculations (i.e., computation of the response of a detector to a known source), as well as the ability to analyze spectra to deduce the types and quantities of radioactive material to which the detectors are exposed. This document describes how gamma-ray spectra are computed and the significance of response function parameters that define characteristics of particular detectors.
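
    In highly simplified form, the forward calculation described above is a response matrix applied to a source spectrum: expected counts = DRF x incident photons. The sketch below uses a toy 3-bin matrix whose off-diagonal terms stand in for downscatter; GADRAS's actual DRF parameterization is far more detailed, so treat this only as the shape of the computation.

        import numpy as np

        # Toy response matrix: diagonal = full-energy peak efficiency,
        # upper triangle = downscatter from higher-energy bins into lower channels.
        response = np.array([[0.60, 0.10, 0.05],
                             [0.00, 0.50, 0.10],
                             [0.00, 0.00, 0.40]])
        source = np.array([1000.0, 500.0, 2000.0])   # photons emitted per energy bin
        measured = response @ source                  # expected counts per channel
        print(measured)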

  7. Computational hydraulics

    Science.gov (United States)

    Brebbia, C. A.; Ferrante, A. J.

    Computational hydraulics is discussed in detail, combining classical hydraulics with new methods such as finite elements and boundary elements, both presented in a matrix formulation. The basic properties and concepts of fluids are first reviewed, and pipe flow is treated, giving empirical formulae. Aspects of pipe networks are covered, including energy losses, total systems of equations and their solution, linear and nonlinear analyses and computer programs. Open-channel flow is treated, including the Chezy and Manning formulae, optimum hydraulic section, nonuniform flow, and analysis and computation. Potential flow is addressed, including the application of Euler's equations, flow nets, finite element and boundary element solutions and programs. The application of the Navier-Stokes equations to Newtonian fluids and turbulence is considered. Finally, turbomachinery is discussed.
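
    As a worked example of the Manning formula cited above (SI form: v = (1/n) R^(2/3) S^(1/2), discharge Q = vA), the sketch below evaluates a hypothetical rectangular channel; the dimensions, slope and roughness coefficient are illustrative assumptions, not values from the book.

        def manning_velocity(n, R, S):
            # n: Manning roughness, R: hydraulic radius [m], S: channel slope [-]
            return (1.0 / n) * R ** (2.0 / 3.0) * S ** 0.5

        width, depth = 3.0, 1.2          # rectangular channel [m]
        A = width * depth                # flow area [m^2]
        P = width + 2 * depth            # wetted perimeter [m]
        R = A / P                        # hydraulic radius [m]
        v = manning_velocity(n=0.013, R=R, S=0.001)   # concrete-lined channel
        print(f"v = {v:.2f} m/s, Q = {v * A:.2f} m^3/s")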

  8. Quantum computers.

    Science.gov (United States)

    Ladd, T D; Jelezko, F; Laflamme, R; Nakamura, Y; Monroe, C; O'Brien, J L

    2010-03-04

    Over the past several decades, quantum information science has emerged to seek answers to the question: can we gain some advantage by storing, transmitting and processing information encoded in systems that exhibit unique quantum properties? Today it is understood that the answer is yes, and many research groups around the world are working towards the highly ambitious technological goal of building a quantum computer, which would dramatically improve computational power for particular tasks. A number of physical systems, spanning much of modern physics, are being developed for quantum computation. However, it remains unclear which technology, if any, will ultimately prove successful. Here we describe the latest developments for each of the leading approaches and explain the major challenges for the future.

  9. Computational Psychiatry

    Science.gov (United States)

    Wang, Xiao-Jing; Krystal, John H.

    2014-01-01

    Psychiatric disorders such as autism and schizophrenia arise from abnormalities in brain systems that underlie cognitive, emotional and social functions. The brain is enormously complex, and its abundant feedback loops on multiple scales preclude intuitive explication of circuit functions. In close interplay with experiments, theory and computational modeling are essential for understanding how, precisely, neural circuits generate flexible behaviors and how their impairments give rise to psychiatric symptoms. This Perspective highlights recent progress in applying computational neuroscience to the study of mental disorders. We outline basic approaches, including identification of core deficits that cut across disease categories, biologically realistic modeling bridging cellular and synaptic mechanisms with behavior, and model-aided diagnosis. The need for new research strategies in psychiatry is urgent. Computational psychiatry potentially provides powerful tools for elucidating pathophysiology that may inform both diagnosis and treatment. To achieve this promise will require investment in cross-disciplinary training and research in this nascent field. PMID:25442941

  10. Cloud Computing

    CERN Document Server

    Antonopoulos, Nick

    2010-01-01

    Cloud computing has recently emerged as a subject of substantial industrial and academic interest, though its meaning and scope are hotly debated. For some researchers, clouds are a natural evolution towards the full commercialisation of grid systems, while others dismiss the term as a mere re-branding of existing pay-per-use technologies. From either perspective, 'cloud' is now the label of choice for accountable pay-per-use access to third-party applications and computational resources on a massive scale. Clouds support patterns of less predictable resource use for applications and services...

  11. Computer security

    CERN Document Server

    Gollmann, Dieter

    2011-01-01

    A completely up-to-date resource on computer security. Assuming no previous experience in the field, this must-have book walks you through the many essential aspects of this vast topic, from the newest advances in software and technology to the most recent information on Web application security. This new edition includes sections on Windows NT, CORBA, and Java and discusses cross-site scripting and JavaScript hacking as well as SQL injection. Serving as a helpful introduction, this self-study guide is a wonderful starting point for examining the variety of competing security approaches...

  12. Computational engineering

    CERN Document Server

    2014-01-01

    The book presents state-of-the-art works in computational engineering. Focus is on mathematical modeling, numerical simulation, experimental validation and visualization in engineering sciences. In particular, the following topics are presented: constitutive models and their implementation into finite element codes, numerical models in nonlinear elasto-dynamics including seismic excitations, multiphase models in structural engineering and multiscale models of materials systems, sensitivity and reliability analysis of engineering structures, the application of scientific computing in urban water management and hydraulic engineering, and the application of genetic algorithms for the registration of laser scanner point clouds.

  13. Computational vision

    Science.gov (United States)

    Barrow, H. G.; Tenenbaum, J. M.

    1981-01-01

    The range of fundamental computational principles underlying human vision that equally apply to artificial and natural systems is surveyed. There emerges from research a view of the structuring of vision systems as a sequence of levels of representation, with the initial levels being primarily iconic (edges, regions, gradients) and the highest symbolic (surfaces, objects, scenes). Intermediate levels are constrained by information made available by preceding levels and information required by subsequent levels. In particular, it appears that physical and three-dimensional surface characteristics provide a critical transition from iconic to symbolic representations. A plausible vision system design incorporating these principles is outlined, and its key computational processes are elaborated.

  14. Reconfigurable Computing

    CERN Document Server

    Cardoso, Joao MP

    2011-01-01

    As the complexity of modern embedded systems increases, it becomes less practical to design monolithic processing platforms. As a result, reconfigurable computing is being adopted widely for more flexible design. Reconfigurable computers offer the spatial parallelism and fine-grained customizability of application-specific circuits with the postfabrication programmability of software. To make the most of this unique combination of performance and flexibility, designers need to be aware of both hardware and software issues. FPGA users must think not only about the gates needed to perform a computation...

  15. Computer systems

    Science.gov (United States)

    Olsen, Lola

    1992-01-01

    In addition to the discussions, Ocean Climate Data Workshop hosts gave participants an opportunity to hear about, see, and test for themselves some of the latest computer tools now available for those studying climate change and the oceans. Six speakers described computer systems and their functions. The introductory talks were followed by demonstrations to small groups of participants and some opportunities for participants to get hands-on experience. After this familiarization period, attendees were invited to return during the course of the Workshop and have one-on-one discussions and further hands-on experience with these systems. Brief summaries or abstracts of introductory presentations are addressed.

  16. Computer viruses

    Science.gov (United States)

    Denning, Peter J.

    1988-01-01

    The worm, Trojan horse, bacterium, and virus are destructive programs that attack information stored in a computer's memory. Virus programs, which propagate by incorporating copies of themselves into other programs, are a growing menace in the late-1980s world of unprotected, networked workstations and personal computers. Limited immunity is offered by memory protection hardware, digitally authenticated object programs, and antibody programs that kill specific viruses. Additional immunity can be gained from the practice of digital hygiene, primarily the refusal to use software from untrusted sources. Full immunity requires attention in a social dimension, the accountability of programmers.

  17. Computational artifacts

    DEFF Research Database (Denmark)

    Schmidt, Kjeld; Bansler, Jørgen P.

    2016-01-01

    The key concern of CSCW research is that of understanding computing technologies in the social context of their use, that is, as integral features of our practices and our lives, and to think of their design and implementation under that perspective. However, the question of the nature of that which is actually integrated in our practices is often discussed in confusing ways, if at all. The article aims to try to clarify the issue and in doing so revisits and reconsiders the notion of 'computational artifact'.

  18. Computational Logistics

    DEFF Research Database (Denmark)

    This book constitutes the refereed proceedings of the 4th International Conference on Computational Logistics, ICCL 2013, held in Copenhagen, Denmark, in September 2013. The 19 papers presented in this volume were carefully reviewed and selected for inclusion in the book. They are organized in topical sections named: maritime shipping, road transport, vehicle routing problems, aviation applications, and logistics and supply chain management.

  19. Computational Logistics

    DEFF Research Database (Denmark)

    Pacino, Dario; Voss, Stefan; Jensen, Rune Møller

    2013-01-01

    This book constitutes the refereed proceedings of the 4th International Conference on Computational Logistics, ICCL 2013, held in Copenhagen, Denmark, in September 2013. The 19 papers presented in this volume were carefully reviewed and selected for inclusion in the book. They are organized in topical sections named: maritime shipping, road transport, vehicle routing problems, aviation applications, and logistics and supply chain management.

  20. Computer busses

    CERN Document Server

    Buchanan, William

    2000-01-01

    As more and more equipment is interface or 'bus' driven, either by the use of controllers or directly from PCs, the question of which bus to use is becoming increasingly important, both in industry and in the office. 'Computer Busses' has been designed to help choose the best type of bus for the particular application. There are several books which cover individual busses, but none which provide a complete guide to computer busses. The author provides a basic theory of busses and draws examples and applications from real bus case studies. Busses are analysed from a top-down approach, helping...

  1. Computational screening of Six Antigens for potential MHC class II restricted epitopes and evaluating its CD4+ T-Cell Responsiveness against Visceral Leishmaniasis

    Directory of Open Access Journals (Sweden)

    Manas Ranjan

    2017-12-01

    Full Text Available Visceral leishmaniasis is one of the most neglected tropical diseases, for which no vaccine exists. In spite of extensive efforts, no successful vaccine is available against this dreadful infectious disease. To support vaccine development, an immunoinformatics approach was applied to search for potential MHC class II restricted epitopes that can activate the immune cells. Initially, a total of 37 epitopes derived from six stage-dependent, over-expressed antigens were predicted, which were presented by at least 26 diverse MHC class II alleles including DRB1*0101, DRB1*0301, DRB1*0401, DRB1*0404, DRB1*0405, DRB1*0701, DRB1*0802, DRB1*0901, DRB1*1101, DRB1*1302, DRB1*1501, DRB3*0101, DRB4*0101, DRB5*0101, DPA1*0103-DPB1*0401, DPA1*0103-DPB1*0201, DPA1*0201-DPB1*0101, DPA1*0103-DPB1*0301_DPB1*0401, DPA1*0301-DPB1*0402, DPA1*0201-DPB1*05021, DQA1*0102-DQB1*0602, DQA1*0401-DQB1*0402, DQA1*0501-DQB1*0201, DQA1*0501-DQB1*0301, DQA1*0301-DQB1*0302 and DQA1*0101-DQB1*0501. Based on the population coverage analysis and HLA cross-presentation ability, six epitopes, namely FDLFLFSNGAVVWWG (P1), YPVYPFLASNAALLN (P2), VYPFLASNAALLNLI (P3), LALLIMLYALIATQF (P4), LIMLYALIATQFSDD (P5) and IMLYALIATQFSDDA (P6), were selected for further analysis. Stimulation with synthetic peptide alone or as a cocktail triggered intracellular IFN-γ production. Moreover, specific IgG-class antibodies were detected in the serum of active VL cases against P1, P4, P and P6, in order to evaluate the peptide effect on the humoral immune response. Additionally, most of the peptides, except P2, were found to be non-inducers of CD4+ IL-10 in both active VL and treated VL subjects. Peptide immunogenicity was validated in BALB/c mice immunized with a cocktail of the synthetic peptides emulsified in complete Freund's adjuvant/incomplete Freund's adjuvant. The immunized splenocytes induced strong spleen cell proliferation upon parasite re-stimulation. Furthermore, an increased IFN-γ, IL-12, IL-17 and IL-22 production augmented with...

  2. Riemannian computing in computer vision

    CERN Document Server

    Srivastava, Anuj

    2016-01-01

    This book presents a comprehensive treatise on Riemannian geometric computations and related statistical inferences in several computer vision problems. This edited volume includes chapter contributions from leading figures in the field of computer vision who are applying Riemannian geometric approaches in problems such as face recognition, activity recognition, object detection, biomedical image analysis, and structure-from-motion. Some of the mathematical entities that necessitate a geometric analysis include rotation matrices (e.g. in modeling camera motion), stick figures (e.g. for activity recognition), subspace comparisons (e.g. in face recognition), symmetric positive-definite matrices (e.g. in diffusion tensor imaging), and function-spaces (e.g. in studying shapes of closed contours). The book illustrates Riemannian computing theory through applications in computer vision, machine learning, and robotics, with emphasis on algorithmic advances that will allow re-application in other...

  3. Computational Finance

    DEFF Research Database (Denmark)

    Rasmussen, Lykke

    One of the major challenges in today's post-crisis finance environment is calculating the sensitivities of complex products for hedging and risk management. Historically, these sensitivities have been determined using bump-and-revalue, but the increasing magnitude of these computations...

  4. Optical Computing

    Indian Academy of Sciences (India)

    (For example, the Japanese Earth Simulator, a computer system developed by NEC, uses a ... quite similar to the one shown in Figure 1, except that the phthalocyanine film was replaced by a hollow fiber ... and hence funds were provided accordingly. The areas of space exploration, earth resource utilization, communication...

  5. Computing News

    CERN Multimedia

    McCubbin, N

    2001-01-01

    We are still five years from the first LHC data, so we have plenty of time to get the computing into shape, don't we? Well, yes and no: there is time, but there's an awful lot to do! The recently-completed CERN Review of LHC Computing gives the flavour of the LHC computing challenge. The hardware scale for each of the LHC experiments is millions of 'SpecInt95' (SI95) units of cpu power and tens of PetaBytes of data storage. PCs today are about 20-30SI95, and expected to be about 100 SI95 by 2005, so it's a lot of PCs. This hardware will be distributed across several 'Regional Centres' of various sizes, connected by high-speed networks. How to realise this in an orderly and timely fashion is now being discussed in earnest by CERN, Funding Agencies, and the LHC experiments. Mixed in with this is, of course, the GRID concept...but that's a topic for another day! Of course hardware, networks and the GRID constitute just one part of the computing. Most of the ATLAS effort is spent on software development. What we ...

  6. Statistical Computing

    Indian Academy of Sciences (India)

    Statistical Computing – Understanding Randomness and Random Numbers. Sudhakar Kunte. Series Article, Resonance – Journal of Science Education, Volume 4, Issue 10, October 1999, pp 16-21.

  7. Quantum Computing

    Indian Academy of Sciences (India)

    It was suggested that the dynamics of quantum systems could be used to perform computation in a much more efficient way. After this initial excitement, things slowed down for some time till 1994, when Peter Shor announced his polynomial-time factorization algorithm [1], which uses quantum dynamics. The study of quantum ...

  8. [Grid computing

    CERN Multimedia

    Wolinsky, H

    2003-01-01

    "Turn on a water spigot, and it's like tapping a bottomless barrel of water. Ditto for electricity: Flip the switch, and the supply is endless. But computing is another matter. Even with the Internet revolution enabling us to connect in new ways, we are still limited to self-contained systems running locally stored software, limited by corporate, institutional and geographic boundaries" (1 page).

  9. Optical Computing

    Indian Academy of Sciences (India)

    Debabrata Goswami is at the Tata Institute of Fundamental Research, Mumbai, where he explores the applications of ultrafast shaped pulses to coherent control, high-speed communication and computing. He is also associated as a Visiting Faculty at IIT Kanpur, where he will be teaching a new course on Quantum...

  10. Cloud computing.

    Science.gov (United States)

    Wink, Diane M

    2012-01-01

    In this bimonthly series, the author examines how nurse educators can use Internet and Web-based technologies such as search, communication, and collaborative writing tools; social networking and social bookmarking sites; virtual worlds; and Web-based teaching and learning programs. This article describes how cloud computing can be used in nursing education.

  11. Computational biology

    DEFF Research Database (Denmark)

    Hartmann, Lars Røeboe; Jones, Neil; Simonsen, Jakob Grue

    2011-01-01

    Computation via biological devices has been the subject of close scrutiny since von Neumann's early work some 60 years ago. In spite of the many relevant works in this field, the notion of programming biological devices seems to be, at best, ill-defined. While many devices are claimed or proved to...

  12. Optical Computing

    Indian Academy of Sciences (India)

    Optical Computing – Optical Components and Storage Systems. Debabrata Goswami. General Article, Resonance – Journal of Science Education, Volume 8, Issue 6, June 2003, pp 56-71.

  13. Optical Computing

    Indian Academy of Sciences (India)

    Optical Computing – Research Trends. Debabrata Goswami. General Article, Resonance – Journal of Science Education, Volume 8, Issue 7, July 2003, pp 8-21. Permanent link: http://www.ias.ac.in/article/fulltext/reso/008/07/0008-0021.

  14. Quantum Computation

    Indian Academy of Sciences (India)

    Quantum Computation – Particle and Wave Aspects of Algorithms. Apoorva Patel. General Article, Resonance – Journal of Science Education, Volume 16, Issue 9, September 2011, pp 821-835.

  15. Quantum Computing

    Indian Academy of Sciences (India)

    quantum dynamics. The study of quantum systems for computation has come into its own since then. In this article we will look at a few concepts which make this framework so powerful. 2. Quantum Physics Basics. Consider an electron (say, in an H atom) with two energy levels (ground state and one excited state). In general ...

  16. Targeted concurrent chemoradiotherapy, by using improved microcapsules that release carboplatin in response to radiation, improves detectability by computed tomography as well as antitumor activity while reducing adverse effect in vivo.

    Science.gov (United States)

    Harada, Satoshi; Ehara, Shigeru; Ishii, Keizo; Sato, Takahiro; Koka, Masashi; Kamiya, Tomihiro; Sera, Koichiro; Goto, Shyoko

    2015-03-01

    The effect of alginate-hyaluronate microcapsules that release carboplatin in response to radiation was improved by adding ascorbic acid (AA). Four measures of the effectiveness of the microcapsules were evaluated: 1) release of carboplatin in response to radiation in vitro and in vivo; 2) detectability of their accumulation by computed tomography (CT) in vivo; 3) enhancement of antitumor effects in vivo; and 4) reduction of adverse effects in vivo. Adding AA significantly increased the rupture of microcapsules in vitro. Microcapsules injected subcutaneously around the tumor could be detected by CT, and the alteration of CT values correlated with the accumulation of the microcapsules. These microcapsules released carboplatin and produced a synergistic antitumor effect with concomitant radiation. With the encapsulation of carboplatin, chemotherapeutic effects were still observed two weeks after treatment. However, the addition of AA did not increase the antitumor effect in vivo. A reduction in adverse effects was observed with the encapsulation of carboplatin, through localization of carboplatin around the tumor. Although adding AA to the microcapsule materials did not increase the antitumor effect, encapsulation of carboplatin should be useful as a clinical cancer-therapy option. Copyright © 2015 Elsevier Masson SAS. All rights reserved.

  17. Performing stencil computations

    Energy Technology Data Exchange (ETDEWEB)

    Donofrio, David

    2018-01-16

    A method and apparatus for performing stencil computations efficiently are disclosed. In one embodiment, a processor receives an offset, and in response, retrieves a value from a memory via a single instruction, where the retrieving comprises: identifying, based on the offset, one of a plurality of registers of the processor; loading an address stored in the identified register; and retrieving from the memory the value at the address.
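
    To make the term concrete, here is a minimal NumPy sketch of a classic stencil computation, the five-point discrete Laplacian: each output point is a fixed-offset combination of its neighbours. It only illustrates the access pattern that the patented register-offset mechanism accelerates and says nothing about the hardware itself.

        import numpy as np

        def laplacian_stencil(u):
            # Apply the 5-point Laplacian to the interior of a 2-D grid.
            out = np.zeros_like(u)
            out[1:-1, 1:-1] = (u[:-2, 1:-1] + u[2:, 1:-1] +
                               u[1:-1, :-2] + u[1:-1, 2:] -
                               4.0 * u[1:-1, 1:-1])
            return out

        u = np.random.rand(6, 6)
        print(laplacian_stencil(u))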

  18. Computable Frames in Computable Banach Spaces

    Directory of Open Access Journals (Sweden)

    S.K. Kaushik

    2016-06-01

    Full Text Available We develop some parts of the frame theory in Banach spaces from the point of view of Computable Analysis. We define a computable M-basis and use it to construct a computable Banach space of scalar-valued sequences. Computable Xd frames and computable Banach frames are also defined, and computable versions of sufficient conditions for their existence are obtained.
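
    As background, the classical Hilbert-space frame inequality that the computable Xd-frame and Banach-frame notions above generalize is standard frame theory (not quoted from the paper): a sequence (f_k) in a Hilbert space H is a frame if there exist constants 0 < A <= B such that

        A \|x\|^2 \;\le\; \sum_{k=1}^{\infty} \big|\langle x, f_k \rangle\big|^2 \;\le\; B \|x\|^2 \qquad \text{for all } x \in H.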

  19. 90Y microsphere (TheraSphere) treatment for unresectable colorectal cancer metastases of the liver: response to treatment at targeted doses of 135-150 Gy as measured by [18F]fluorodeoxyglucose positron emission tomography and computed tomographic imaging.

    Science.gov (United States)

    Lewandowski, Robert J; Thurston, Kenneth G; Goin, James E; Wong, Ching-Yee O; Gates, Vanessa L; Van Buskirk, Mark; Geschwind, Jean-Francois H; Salem, Riad

    2005-12-01

    The purpose of this phase II study was to determine the safety and efficacy of TheraSphere treatment (90Y microspheres) in patients with liver-dominant colorectal metastases in whom standard therapies had failed or were judged to be inappropriate. Twenty-seven patients with unresectable hepatic colorectal metastases were treated at a targeted absorbed dose of 135-150 Gy. Safety and toxicity were assessed according to the National Cancer Institute's Common Toxicity Criteria, version 3.0. Response was assessed with use of computed tomography (CT) and was correlated with response on [18F]fluorodeoxyglucose (FDG) positron emission tomography (PET). Survival from first treatment was estimated with use of the Kaplan-Meier method. Tumor response measured by FDG PET imaging exceeded that measured by CT imaging for the first (88% vs 35%) and second (73% vs 36%) treated lobes. Tumor replacement of 25% or less (vs >25%) was associated with a statistically significant increase in median survival (339 days vs 162 days; P = .002). Treatment-related toxicities included mild fatigue (n = 13; 48%), nausea (n = 4; 15%), and vague abdominal pain (n = 5; 19%). There was one case of radiation-induced gastritis from inadvertent deposition of microspheres to the gastrointestinal tract (n = 1; 4%). Three patients (11%) experienced ascites/pleural effusion after treatment with TheraSphere as a consequence of liver failure in advanced-stage metastatic disease. With the exception of these three patients whose sequelae were not considered to be related to treatment, all observed toxicities were transient and resolved without medical intervention. TheraSphere administration appears to provide stabilization of liver disease with minimal toxicity in patients in whom standard systemic chemotherapy regimens have failed.
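
    The survival estimates above come from the Kaplan-Meier method; the following minimal sketch shows how that estimator is computed, using invented follow-up times rather than the study's patient data.

        import numpy as np

        def kaplan_meier(times, events):
            # times: follow-up in days; events: 1 = death observed, 0 = censored.
            order = np.argsort(times)
            times, events = times[order], events[order]
            at_risk = len(times)
            surv, out = 1.0, []
            for t in np.unique(times):
                deaths = int(((times == t) & (events == 1)).sum())
                if deaths > 0:
                    surv *= 1.0 - deaths / at_risk       # KM product-limit step
                    out.append((t, surv))
                at_risk -= int((times == t).sum())       # deaths and censored leave
            return out

        print(kaplan_meier(np.array([90, 162, 200, 339, 400]),
                           np.array([1, 1, 0, 1, 0])))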

  20. Computational Electromagnetics

    CERN Document Server

    Rylander, Thomas; Bondeson, Anders

    2013-01-01

    Computational Electromagnetics is a young and growing discipline, expanding as a result of the steadily increasing demand for software for the design and analysis of electrical devices. This book introduces three of the most popular numerical methods for simulating electromagnetic fields: the finite difference method, the finite element method and the method of moments. In particular it focuses on how these methods are used to obtain valid approximations to the solutions of Maxwell's equations, using, for example, "staggered grids" and "edge elements." The main goal of the book is to make the reader aware of different sources of errors in numerical computations, and also to provide the tools for assessing the accuracy of numerical methods and their solutions. To reach this goal, convergence analysis, extrapolation, von Neumann stability analysis, and dispersion analysis are introduced and used frequently throughout the book. Another major goal of the book is to provide students with enough practical understanding...
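
    The "staggered grid" idea mentioned above fits in a few lines: a 1-D finite-difference time-domain (FDTD) update for Maxwell's equations in normalized units (c = 1), with the time step kept inside the Courant limit that von Neumann stability analysis yields. Grid sizes and the source are illustrative assumptions.

        import numpy as np

        nx, nt = 200, 400
        dx = 1.0
        dt = 0.99 * dx                 # Courant stability condition: c*dt <= dx
        Ez = np.zeros(nx)
        Hy = np.zeros(nx - 1)          # H staggered half a cell relative to E

        for n in range(nt):
            Hy += (dt / dx) * (Ez[1:] - Ez[:-1])            # update H from curl of E
            Ez[1:-1] += (dt / dx) * (Hy[1:] - Hy[:-1])      # update E from curl of H
            Ez[nx // 2] += np.exp(-((n - 30) / 10.0) ** 2)  # soft Gaussian source

        print(f"max |Ez| after {nt} steps: {np.abs(Ez).max():.3f}")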

  1. Spatial Computation

    Science.gov (United States)

    2003-12-01


  2. Computed radiography

    International Nuclear Information System (INIS)

    Pupchek, G.

    2004-01-01

    Computed radiography (CR) is an image acquisition process that is used to create digital, 2-dimensional radiographs. CR employs a photostimulable phosphor-based imaging plate, replacing the standard x-ray film and intensifying screen combination. Conventional radiographic exposure equipment is used with no modification required to the existing system. CR can transform an analog x-ray department into a digital one and eliminates the need for chemicals, water, darkrooms and film processor headaches. (author)

  3. Readers' Survey Results: What is Computer Literacy?

    Science.gov (United States)

    Classroom Computer Learning, 1986

    1986-01-01

    Readers of Classroom Computer Learning were asked for a definition of computer literacy; a summary of their responses is provided. Three opinions offered are that computer literacy instruction should begin early, that it should be made compulsory, and that instruction should also focus on the use of tool programs and education software. (JN)

  4. Considering Thin Client Computing for Higher Education.

    Science.gov (United States)

    Sheehan, Mark

    1998-01-01

    In response to concerns about the cost of keeping up with individual desktop computing technology, several new solutions have emerged. Referred to as "thin clients," or network-centric computers, they include two types of desktop device: the network computer and the Windows terminal. Purchase cost, life span, support costs, and overall total cost…

  5. Computer applications in veterinary medicine | Hassan | Nigerian ...

    African Journals Online (AJOL)

    Computers have become essential tools in almost every field of research and applied technology. The advent of the microcomputer allows us as veterinarians to enter and analyze vast amounts of data on animal health, production and administrative responsibilities. Computers in veterinary medicine have been used for ...

  6. Future computing needs for Fermilab

    International Nuclear Information System (INIS)

    1983-12-01

    The following recommendations are made: (1) Significant additional computing capacity and capability beyond the present procurement should be provided by 1986. A working group with representation from the principal computer user community should be formed to begin immediately to develop the technical specifications. High priority should be assigned to providing a large user memory, software portability and a productive computing environment. (2) A networked system of VAX-equivalent super-mini computers should be established with at least one such computer dedicated to each reasonably large experiment for both online and offline analysis. The laboratory staff responsible for mini computers should be augmented in order to handle the additional work of establishing, maintaining and coordinating this system. (3) The laboratory should move decisively to a more fully interactive environment. (4) A plan for networking both inside and outside the laboratory should be developed over the next year. (5) The laboratory resources devoted to computing, including manpower, should be increased over the next two to five years. A reasonable increase would be 50% over the next two years increasing thereafter to a level of about twice the present one. (6) A standing computer coordinating group, with membership of experts from all the principal computer user constituents of the laboratory, should be appointed by and report to the director. This group should meet on a regularly scheduled basis and be charged with continually reviewing all aspects of the laboratory computing environment

  7. Customizable computing

    CERN Document Server

    Chen, Yu-Ting; Gill, Michael; Reinman, Glenn; Xiao, Bingjun

    2015-01-01

    Since the end of Dennard scaling in the early 2000s, improving the energy efficiency of computation has been the main concern of the research community and industry. The large energy efficiency gap between general-purpose processors and application-specific integrated circuits (ASICs) motivates the exploration of customizable architectures, where one can adapt the architecture to the workload. In this Synthesis lecture, we present an overview and introduction of the recent developments on energy-efficient customizable architectures, including customizable cores and accelerators, on-chip memory

  8. Multiparty Computations

    DEFF Research Database (Denmark)

    Dziembowski, Stefan

    This thesis is based on two papers [1,2]. In [1] we assume that the adversary can corrupt any set from a given adversary structure. In this setting we study the problem of doing efficient VSS and MPC given access to a secret sharing scheme (SS). For all adversary structures where VSS is possible at all, we show that, up to ... We also discuss other problems caused by the adaptiveness. All protocols in the thesis are formally specified and the proofs of their security are given. [1] Ronald Cramer, Ivan Damgård, Stefan Dziembowski, Martin Hirt, and Tal Rabin. Efficient multiparty computations with dishonest minority...

  9. Computer vision

    Science.gov (United States)

    Gennery, D.; Cunningham, R.; Saund, E.; High, J.; Ruoff, C.

    1981-01-01

    The field of computer vision is surveyed and assessed, key research issues are identified, and possibilities for a future vision system are discussed. The problems of descriptions of two and three dimensional worlds are discussed. The representation of such features as texture, edges, curves, and corners are detailed. Recognition methods are described in which cross correlation coefficients are maximized or numerical values for a set of features are measured. Object tracking is discussed in terms of the robust matching algorithms that must be devised. Stereo vision, camera control and calibration, and the hardware and systems architecture are discussed.

  10. Privacy and legal issues in cloud computing

    CERN Document Server

    Weber, Rolf H

    2015-01-01

    Adopting a multi-disciplinary and comparative approach, this book focuses on emerging and innovative attempts to tackle privacy and legal issues in cloud computing, such as personal data privacy, security and intellectual property protection. Leading international academics and practitioners in the fields of law and computer science examine the specific legal implications of cloud computing pertaining to jurisdiction, biomedical practice and information ownership. This collection offers original and critical responses to the rising challenges posed by cloud computing.

  11. Future computing needs for Fermilab

    Energy Technology Data Exchange (ETDEWEB)

    1983-12-01

    The following recommendations are made: (1) Significant additional computing capacity and capability beyond the present procurement should be provided by 1986. A working group with representation from the principal computer user community should be formed to begin immediately to develop the technical specifications. High priority should be assigned to providing a large user memory, software portability and a productive computing environment. (2) A networked system of VAX-equivalent super-mini computers should be established with at least one such computer dedicated to each reasonably large experiment for both online and offline analysis. The laboratory staff responsible for mini computers should be augmented in order to handle the additional work of establishing, maintaining and coordinating this system. (3) The laboratory should move decisively to a more fully interactive environment. (4) A plan for networking both inside and outside the laboratory should be developed over the next year. (5) The laboratory resources devoted to computing, including manpower, should be increased over the next two to five years. A reasonable increase would be 50% over the next two years increasing thereafter to a level of about twice the present one. (6) A standing computer coordinating group, with membership of experts from all the principal computer user constituents of the laboratory, should be appointed by and report to the director. This group should meet on a regularly scheduled basis and be charged with continually reviewing all aspects of the laboratory computing environment.

  12. Fast computation of Krawtchouk moments

    Czech Academy of Sciences Publication Activity Database

    Honarvar Shakibaei Asli, B.; Flusser, Jan

    2014-01-01

    Roč. 288, č. 1 (2014), s. 73-86 ISSN 0020-0255 R&D Projects: GA ČR GAP103/11/1552 Institutional support: RVO:67985556 Keywords : Krawtchouk polynomial * Krawtchouk moment * Geometric moment * Impulse response * Fast computation * Digital filter Subject RIV: JD - Computer Applications, Robotics Impact factor: 4.038, year: 2014 http://library.utia.cas.cz/separaty/2014/ZOI/flusser-0432452.pdf

  13. Advances in photonic reservoir computing

    Directory of Open Access Journals (Sweden)

    Van der Sande Guy

    2017-05-01

    Full Text Available We review a novel paradigm that has emerged in analogue neuromorphic optical computing. The goal is to implement a reservoir computer in optics, where information is encoded in the intensity and phase of the optical field. Reservoir computing is a bio-inspired approach especially suited for processing time-dependent information. The reservoir’s complex and high-dimensional transient response to the input signal is capable of universal computation. The reservoir does not need to be trained, which makes it very well suited for optics. As such, much of the promise of photonic reservoirs lies in their minimal hardware requirements, a tremendous advantage over other hardware-intensive neural network models. We review the two main approaches to optical reservoir computing: networks implemented with multiple discrete optical nodes and the continuous system of a single nonlinear device coupled to delayed feedback.

  14. Advances in photonic reservoir computing

    Science.gov (United States)

    Van der Sande, Guy; Brunner, Daniel; Soriano, Miguel C.

    2017-05-01

    We review a novel paradigm that has emerged in analogue neuromorphic optical computing. The goal is to implement a reservoir computer in optics, where information is encoded in the intensity and phase of the optical field. Reservoir computing is a bio-inspired approach especially suited for processing time-dependent information. The reservoir's complex and high-dimensional transient response to the input signal is capable of universal computation. The reservoir does not need to be trained, which makes it very well suited for optics. As such, much of the promise of photonic reservoirs lies in their minimal hardware requirements, a tremendous advantage over other hardware-intensive neural network models. We review the two main approaches to optical reservoir computing: networks implemented with multiple discrete optical nodes and the continuous system of a single nonlinear device coupled to delayed feedback.
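
    A minimal software analogue of the scheme described in the two records above: a fixed random reservoir whose transient states are merely recorded, with only the linear readout trained (here by ridge regression). The sizes, spectral radius and delayed-recall task are illustrative assumptions, not details from the papers.

        import numpy as np

        rng = np.random.default_rng(0)
        n_res, n_steps = 100, 1000
        W_in = rng.uniform(-0.5, 0.5, n_res)
        W = rng.normal(0.0, 1.0, (n_res, n_res))
        W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # spectral radius < 1

        u = rng.uniform(-1, 1, n_steps)       # input signal
        target = np.roll(u, 3)                # task: recall the input 3 steps back

        x = np.zeros(n_res)
        states = np.empty((n_steps, n_res))
        for t in range(n_steps):
            x = np.tanh(W @ x + W_in * u[t])  # untrained reservoir dynamics
            states[t] = x

        S, y = states[100:], target[100:]     # discard warm-up transients
        W_out = np.linalg.solve(S.T @ S + 1e-6 * np.eye(n_res), S.T @ y)
        print("train MSE:", np.mean((S @ W_out - y) ** 2))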

  15. The social impact of computers

    CERN Document Server

    Rosenberg, Richard S

    1992-01-01

    The Social Impact of Computers should be read as a guide to the social implications of current and future applications of computers. Among the basic themes presented are the following: the changing nature of work in response to technological innovation as well as the threat to jobs; personal freedom in the machine age as manifested by challenges to privacy, dignity, and work; the relationship between advances in computer and communications technology and the possibility of increased centralization of authority; and the emergence and influence of artificial intelligence and its role in decision

  16. Research in Computer Forensics

    Science.gov (United States)

    2002-06-01

    What is computer forensics? Computer forensics involves the identification, extraction, preservation and documentation of computer evidence. This work surveys agencies and vendors providing computer forensic services and may lead to the formulation of computer forensic material for a potential Computer Forensic Course at NPS.

  17. Computer Tree

    Directory of Open Access Journals (Sweden)

    Onur AĞAOĞLU

    2014-12-01

    Full Text Available It is crucial that gifted and talented students be supported by different educational methods according to their interests and skills. The science and arts centres (gifted centres) provide the Supportive Education Program for these students with an interdisciplinary perspective. In line with the program, an ICT lesson entitled "Computer Tree" serves to identify learner readiness levels and define the basic conceptual framework. A language teacher also contributes to the process, since the lesson caters for the creative function of basic linguistic skills. The teaching technique is applied to students aged 9-11. The lesson introduces an evaluation process covering the basic information, skills, and interests of the target group. Furthermore, it includes an observation process by way of peer assessment. The lesson is considered a good sample of planning for any subject, given the unexpected convergence of visual and technical abilities with linguistic abilities.

  18. Brain computer

    Directory of Open Access Journals (Sweden)

    Sarah N. Abdulkader

    2015-07-01

    Full Text Available Brain computer interface technology represents a rapidly growing field of research with application systems. Its contributions to medical fields range from prevention to neuronal rehabilitation for serious injuries. Mind reading and remote communication have their unique fingerprints in numerous fields such as education, self-regulation, production, marketing, security, as well as games and entertainment. It creates a mutual understanding between users and the surrounding systems. This paper shows the application areas that could benefit from brain waves in facilitating or achieving their goals. We also discuss major usability and technical challenges that face the utilization of brain signals in various components of a BCI system. Different solutions that aim to limit and decrease their effects have also been reviewed.

  19. Social Computing

    CERN Multimedia

    CERN. Geneva

    2011-01-01

    The past decade has witnessed a momentous transformation in the way people interact with each other. Content is now co-produced, shared, classified, and rated by millions of people, while attention has become the ephemeral and valuable resource that everyone seeks to acquire. This talk will describe how social attention determines the production and consumption of content within both the scientific community and social media, how its dynamics can be used to predict the future, and the role that social media plays in setting the public agenda. About the speaker: Bernardo Huberman is a Senior HP Fellow and Director of the Social Computing Lab at Hewlett Packard Laboratories. He received his Ph.D. in Physics from the University of Pennsylvania, and is currently a Consulting Professor in the Department of Applied Physics at Stanford University. He originally worked in condensed matter physics, ranging from superionic conductors to two-dimensional superfluids, and made contributions to the theory of critical phenomena...

  20. computer networks

    Directory of Open Access Journals (Sweden)

    N. U. Ahmed

    2002-01-01

    Full Text Available In this paper, we construct a new dynamic model for the Token Bucket (TB) algorithm used in computer networks and use a systems approach for its analysis. This model is then augmented by adding a dynamic model for a multiplexor at an access node where the TB exercises a policing function. In the model, traffic policing, multiplexing and network utilization are formally defined. Based on the model, we study such issues as quality of service (QoS), traffic sizing and network dimensioning. Also, we propose an algorithm using feedback control to improve QoS and network utilization. Applying MPEG video traces as the input traffic to the model, we verify the usefulness and effectiveness of our model.
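
    For concreteness, a plain-Python sketch of the Token Bucket policing rule that the paper models: tokens refill at a constant rate up to the bucket depth, and a packet conforms only if enough tokens are available when it arrives. The rate and packet sizes are illustrative, not taken from the paper.

        class TokenBucket:
            def __init__(self, rate, capacity):
                self.rate = rate           # token refill rate (tokens/second)
                self.capacity = capacity   # bucket depth (burst allowance)
                self.tokens = capacity
                self.last = 0.0

            def conforms(self, packet_size, now):
                # Refill, then admit the packet only if enough tokens remain.
                self.tokens = min(self.capacity,
                                  self.tokens + (now - self.last) * self.rate)
                self.last = now
                if self.tokens >= packet_size:
                    self.tokens -= packet_size
                    return True
                return False

        tb = TokenBucket(rate=1000.0, capacity=1500.0)  # ~1 kB/s, one MTU of burst
        for t, size in [(0.0, 1500), (0.5, 1500), (2.5, 1500)]:
            print(t, tb.conforms(size, t))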

  1. Computational neuroscience

    CERN Document Server

    Blackwell, Kim L

    2014-01-01

    Progress in Molecular Biology and Translational Science provides a forum for discussion of new discoveries, approaches, and ideas in molecular biology. It contains contributions from leaders in their fields and abundant references. This volume brings together different aspects of, and approaches to, molecular and multi-scale modeling, with applications to a diverse range of neurological diseases. Mathematical and computational modeling offers a powerful approach for examining the interaction between molecular pathways and ionic channels in producing neuron electrical activity. It is well accepted that non-linear interactions among diverse ionic channels can produce unexpected neuron behavior and hinder a deep understanding of how ion channel mutations bring about abnormal behavior and disease. Interactions with the diverse signaling pathways activated by G protein coupled receptors or calcium influx adds an additional level of complexity. Modeling is an approach to integrate myriad data sources into a cohesive...

  2. Computed tomography

    International Nuclear Information System (INIS)

    Boyd, D.P.

    1989-01-01

    This paper reports on computed tomographic (CT) scanning, which has improved computer-assisted imaging modalities for radiologic diagnosis. The advantage of this modality is its ability to image thin cross-sectional planes of the body, thus uncovering density information in three dimensions without tissue superposition problems. Because this enables vastly superior imaging of soft tissues in the brain and body, CT scanning was immediately successful and continues to grow in importance as improvements are made in speed, resolution, and cost efficiency. CT scanners are used for general purposes, and the more advanced machines are generally preferred in large hospitals, where the volume and variety of usage justify the cost. For imaging in the abdomen, a fast scanner is preferred because peristalsis, involuntary motion of the diaphragm, and even cardiac motion are present and can significantly degrade image quality; the same holds when contrast media are used. One console allows control of the scanner, immediate review of images, and multiformat hardcopy production. A second console is reserved for the radiologist to read images and perform the several types of image analysis that are available. Since CT images contain quantitative information in terms of density values and contours of organs, quantitation of volumes, areas, and masses is possible. This is accomplished with region-of-interest methods, which involve the electronic outlining of the selected region on the television display monitor with a trackball-controlled cursor. In addition, various image-processing options, such as edge enhancement (for viewing fine details of edges) or smoothing filters (for enhancing the detectability of low-contrast lesions), are useful tools.

  3. Task-Driven Computing

    National Research Council Canada - National Science Library

    Wang, Zhenyu

    2000-01-01

    .... They will want to use the resources to perform computing tasks. Today's computing infrastructure does not support this model of computing very well because computers interact with users in terms of low level abstractions...

  4. Experimental DNA computing

    NARCIS (Netherlands)

    Henkel, Christiaan

    2005-01-01

    Because of their information storing and processing capabilities, nucleic acids are interesting building blocks for molecular scale computers. Potential applications of such DNA computers range from massively parallel computation to computational gene therapy. In this thesis, several implementations

  5. Computer Refurbishment

    International Nuclear Information System (INIS)

    Ichiyen, Norman; Chan, Dominic; Thompson, Paul

    2004-01-01

    The major activity for the 18-month refurbishment outage at the Point Lepreau Generating Station is the replacement of all 380 fuel channel and calandria tube assemblies and the lower portion of the connecting feeder pipes. New Brunswick Power would also take advantage of this outage to conduct a number of repairs, replacements, inspections and upgrades (such as rewinding or replacing the generator, replacement of shutdown system trip computers, replacement of certain valves and expansion joints, inspection of systems not normally accessible, etc.). This would allow for an additional 25 to 30 years of operation. Among the systems to be replaced are the PDCs for both shutdown systems. Assessments have been completed for both the SDS1 and SDS2 PDCs, and it has been decided to replace the SDS2 PDCs with the same hardware and software approach that has been used successfully for the Wolsong 2, 3, and 4 and the Qinshan 1 and 2 SDS2 PDCs. For SDS1, it has been decided to use the same software development methodology that was used successfully for Wolsong and Qinshan, called the IA, and to use a new hardware platform in order to ensure successful operation over the 25-30 year station operating life. The selected supplier is Triconex, whose triple modular redundant architecture will enhance the robustness and fault tolerance of the design with respect to equipment failures.

  6. Computed radiography

    International Nuclear Information System (INIS)

    Itoh, Hiroshi

    1987-01-01

    In an effort to evaluate the feasibility of introducing computed radiography (FCR) into mass screening for lung cancer, the ability of FCR to detect nodules one cm in diameter was examined using a humanoid chest phantom. Based on receiver operating characteristic (ROC) analysis, the detectability of FCR was compared with that of conventional radiography and photofluorography. The values of the area under the ROC curves were higher for FCR (0.963 for the image similar to that of a conventional film-intensifying screen system, image A; and 0.952 for the processed image, image B) than for the other two methods (0.774 for radiography and 0.789 for photofluorography). Degradation of image quality in FCR could be avoided by its wide latitude even when proper exposure techniques were not employed. Images A and B in FCR yielded excellent delineation of nodules in the lung field and in the retrocardiac and subdiaphragmatic regions, respectively. This may have implications for the value of simultaneous interpretation of both images in increasing diagnostic accuracy. Structured noise from the ribs and blood vessels had scarcely any effect on nodule detectability in FCR. The radiation dose could be reduced to one third of the standard dose. It can thus be concluded that FCR is feasible in mass screening for lung cancer in terms of increased diagnostic ability and low radiation doses. (Namekawa, K.)

  7. Analog and hybrid computing

    CERN Document Server

    Hyndman, D E

    2013-01-01

    Analog and Hybrid Computing focuses on the operations of analog and hybrid computers. The book first outlines the history of computing devices that influenced the creation of analog and digital computers. The types of problems to be solved on computers, computing systems, and digital computers are discussed. The text looks at the theory and operation of electronic analog computers, including linear and non-linear computing units and use of analog computers as operational amplifiers. The monograph examines the preparation of problems to be deciphered on computers. Flow diagrams, methods of ampl...

  8. The impact of irradiation dose on the computed tomography radiographic response of metastatic nodes and clinical outcomes in cervix cancer in a low-resource setting

    Directory of Open Access Journals (Sweden)

    Matthew Ryan McKeever

    2017-01-01

    Full Text Available Introduction: The aim of this study is to investigate the relationship between the radiation dose to pelvic and para-aortic lymph nodes, nodal response, and clinical outcomes in a resource-poor setting, based on computed tomography (CT) nodal size alone. Materials and Methods: This retrospective study from 2009 to 2015 included 46 cervical cancer patients with 133 metastatic pelvic and para-aortic lymph nodes definitively treated with chemoradiation and brachytherapy in a public hospital with limited access to positron emission tomography (PET) scans. Hence, the short axis of the lymph node on CT scan was used as a measure of metastatic nodal disease, before and following radiation therapy. Inclusion criteria required the pelvic and para-aortic nodes to have a shortest-axis diameter on CT scan of ≥8 mm and ≥10 mm, respectively. Based on PET resolution, a node that decreased to half of its inclusion cutoff size was considered to have a complete response (CR). Relevant clinical outcomes were documented and correlated with nodal features, nodal radiation doses, and treatment characteristics. Results: After controlling for other predictive factors, increased nodal dose was associated with an increased probability of CR per the study definition (P = 0.005). However, there was no statistically significant association between dose and pelvic/para-aortic, distant, and total recurrence (TR), or any recurrence at any location (P = 0.263, 0.785, and 1.00, respectively). Patients who had no CR nodes had shorter pelvic/para-aortic recurrence-free survival (PPRFS) and TR-free survival (TRFS) than patients who had at least one CR node (P = 0.027 and 0.046, respectively). Patients with no CR nodes also had shorter PPRFS than patients who had all nodes completely respond (P < 0.05). Conclusions: Using CT-based measures, we found that increased nodal dose is associated with an increased probability of CR (as defined) and that nodal CR is associated with increased PPRFS and TRFS. We were...

  9. Computational technologies advanced topics

    CERN Document Server

    Vabishchevich, Petr N

    2015-01-01

    This book discusses questions of numerical solutions of applied problems on parallel computing systems. Nowadays, engineering and scientific computations are carried out on parallel computing systems, which provide parallel data processing on a few computing nodes. In constructing computational algorithms, mathematical problems are separated into relatively independent subproblems in order to solve each of them on a single computing node.

  10. Computing handbook computer science and software engineering

    CERN Document Server

    Gonzalez, Teofilo; Tucker, Allen

    2014-01-01

    Overview of Computer Science: Structure and Organization of Computing (Peter J. Denning); Computational Thinking (Valerie Barr). Algorithms and Complexity: Data Structures (Mark Weiss); Basic Techniques for Design and Analysis of Algorithms (Edward Reingold); Graph and Network Algorithms (Samir Khuller and Balaji Raghavachari); Computational Geometry (Marc van Kreveld); Complexity Theory (Eric Allender, Michael Loui, and Kenneth Regan); Formal Models and Computability (Tao Jiang, Ming Li, and Bala...

  11. Pacing a data transfer operation between compute nodes on a parallel computer

    Science.gov (United States)

    Blocksome, Michael A [Rochester, MN

    2011-09-13

    Methods, systems, and products are disclosed for pacing a data transfer between compute nodes on a parallel computer that include: transferring, by an origin compute node, a chunk of an application message to a target compute node; sending, by the origin compute node, a pacing request to a target direct memory access (`DMA`) engine on the target compute node using a remote get DMA operation; determining, by the origin compute node, whether a pacing response to the pacing request has been received from the target DMA engine; and transferring, by the origin compute node, a next chunk of the application message if the pacing response to the pacing request has been received from the target DMA engine.
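
    The claimed pacing loop (send a chunk, issue a pacing request, wait for the pacing response, repeat) can be mimicked in ordinary code. A toy sketch with threads and queues standing in for compute nodes and DMA engines; all names are illustrative, not the patent's API:

    ```python
    import queue
    import threading

    to_target = queue.Queue()          # models the network to the target node
    pacing_responses = queue.Queue()   # models replies from the target DMA engine

    def target_dma_engine():
        while True:
            item = to_target.get()
            if item is None:           # shutdown sentinel
                break
            kind, _payload = item
            if kind == "pacing_request":
                pacing_responses.put("pacing_response")

    def origin_send(message: bytes, chunk_size: int = 4):
        for i in range(0, len(message), chunk_size):
            to_target.put(("chunk", message[i:i + chunk_size]))
            to_target.put(("pacing_request", None))
            pacing_responses.get()     # block until the target paces us
        to_target.put(None)

    t = threading.Thread(target=target_dma_engine)
    t.start()
    origin_send(b"application message to transfer")
    t.join()
    print("transfer complete")
    ```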

  12. Prediction of tumor response after neoadjuvant chemoradiotherapy in rectal cancer using (18)fluorine-2-deoxy-D-glucose positron emission tomography-computed tomography and serum carcinoembryonic antigen: a prospective study.

    Science.gov (United States)

    Li, Qi-Wen; Zheng, Rong-Liang; Ling, Yi-Hong; Wang, Qiao-Xuan; Xiao, Wei-Wei; Zeng, Zhi-Fan; Fan, Wei; Li, Li-Ren; Gao, Yuan-Hong

    2016-08-01

    To investigate the association between (18)fluorine-2-deoxy-D-glucose positron emission tomography-computed tomography ((18)F-FDG PET/CT) parameters, serum carcinoembryonic antigen (CEA), and tumor response in patients with rectal cancer receiving neoadjuvant chemoradiotherapy (nCRT). Sixty-four patients with T3-4 and/or node-positive rectal cancer receiving nCRT followed by surgery were prospectively studied. PET/CT was performed before, and in 28 patients both before and after, nCRT. The pre-/post-nCRT maximum standardized uptake values (SUVmax), differences between pre-/post-nCRT SUVmax (ΔSUVmax), response index of SUVmax (RI-SUVmax), mean standardized uptake value (SUVmean), metabolic tumor volume (MTV), total lesion glycolysis (TLG), and CEA were measured. The ability of PET/CT parameters and CEA to predict Mandard's tumor regression grade (TRG) and pathological complete remission (pCR) was evaluated. Thirty-one patients were identified as responders (TRG 1-2), and 19 exhibited pCR. For responders, significant differences were found for ΔSUVmax (24.88 vs. 15.39 g/ml, p = 0.037), RI-SUVmax (0.76 vs. 0.63, p = 0.025), ΔSUVmean (14.43 vs. 8.65 g/ml, p = 0.029), RI-SUVmean (0.77 vs. 0.63, p = 0.011), CEA-pre (6.30 vs. 27.86 μg/L, p < 0.001), CEA-post (2.22 vs. 5.49 μg/L, p = 0.002), ΔCEA (4.08 vs. 23.13 μg/L, p < 0.001), and RI-CEA (0.25 vs. 0.55, p = 0.002). Differences between pCR and non-pCR patients were noted for RI-SUVmean (0.77 vs. 0.65, p = 0.043), MTV-pre (9.87 vs. 14.62 cm(3), p = 0.045), CEA-pre (5.62 vs. 22.27 μg/L, p = 0.002), CEA-post (1.95 vs. 4.72 μg/L, p = 0.001), and ΔCEA (3.68 vs. 17.99 μg/L, p = 0.013). Receiver operating characteristic analysis revealed that RI-SUVmean exhibited the greatest accuracy in predicting responders, whereas CEA-post and ΔCEA exhibited the greatest accuracy in predicting pCR. (18)F-FDG PET/CT parameters and CEA are accurate tools for predicting tumor response to nCRT in rectal
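
    The excerpt reports Δ and RI values without spelling out the formulas; a common convention, assumed here, is Δ = pre − post and RI = (pre − post)/pre. A small sketch with invented SUVmax values:

    ```python
    def delta(pre: float, post: float) -> float:
        """Absolute change in a PET parameter (e.g., SUVmax) after nCRT."""
        return pre - post

    def response_index(pre: float, post: float) -> float:
        """Relative change; one common definition of the response index (RI),
        assumed here since the excerpt does not give the formula."""
        return (pre - post) / pre

    suv_pre, suv_post = 24.0, 5.5   # hypothetical SUVmax values
    print(f"dSUVmax   = {delta(suv_pre, suv_post):.2f}")
    print(f"RI-SUVmax = {response_index(suv_pre, suv_post):.2f}")
    ```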

  13. Specialized computer architectures for computational aerodynamics

    Science.gov (United States)

    Stevenson, D. K.

    1978-01-01

    In recent years, computational fluid dynamics has made significant progress in modelling aerodynamic phenomena. Currently, one of the major barriers to future development lies in the compute-intensive nature of the numerical formulations and the relative high cost of performing these computations on commercially available general purpose computers, a cost high with respect to dollar expenditure and/or elapsed time. Today's computing technology will support a program designed to create specialized computing facilities to be dedicated to the important problems of computational aerodynamics. One of the still unresolved questions is the organization of the computing components in such a facility. The characteristics of fluid dynamic problems which will have significant impact on the choice of computer architecture for a specialized facility are reviewed.

  14. The Diffraction Response Interpolation Method

    DEFF Research Database (Denmark)

    Jespersen, Søren Kragh; Wilhjelm, Jens Erik; Pedersen, Peder C.

    1998-01-01

    Computer modeling of the output voltage in a pulse-echo system is computationally very demanding, particularly when considering reflector surfaces of arbitrary geometry. A new, efficient computational tool, the diffraction response interpolation method (DRIM), for modeling of reflectors in a fluid medium, is presented. The DRIM is based on the velocity potential impulse response method, adapted to pulse-echo applications by the use of acoustical reciprocity. Specifically, the DRIM operates by dividing the reflector surface into planar elements, finding the diffraction response at the corners...

  15. Distributed computing at the SSCL

    International Nuclear Information System (INIS)

    Cormell, L.; White, R.

    1993-05-01

    The rapid increase in the availability of high performance, cost-effective RISC/UNIX workstations has been both a blessing and a curse. The blessing of having extremely powerful computing engines available on the desk top is well-known to many users. The user has tremendous freedom, flexibility, and control of his environment. That freedom can, however, become the curse of distributed computing. The user must become a system manager to some extent; he must worry about backups, maintenance, upgrades, etc. Traditionally these activities have been the responsibility of a central computing group. The central computing group, however, may find that it can no longer provide all of the traditional services. With the plethora of workstations now found on so many desktops throughout the entire campus or lab, the central computing group may be swamped by support requests. This talk will address several of these computer support and management issues by discussing the approach taken at the Superconducting Super Collider Laboratory. In addition, a brief review of the future directions of commercial products for distributed computing and management will be given

  16. New computing paradigms suggested by DNA computing: computing by carving.

    Science.gov (United States)

    Manca, V; Martín-Vide, C; Păun, G

    1999-10-01

    Inspired by the experiments in the emerging area of DNA computing, a somewhat unusual type of computation strategy was recently proposed by one of us: to generate a (large) set of candidate solutions of a problem, then remove the non-solutions such that what remains is the set of solutions. This has been called a computation by carving. This idea leads both to a speculation with possible important consequences--computing non-recursively enumerable languages--and to interesting theoretical computer science (formal language) questions.
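
    The generate-then-remove shape of a computation by carving is easy to demonstrate on a toy problem (the problem itself is invented here, not taken from the paper):

    ```python
    from itertools import product

    # Generate every candidate solution...
    candidates = {"".join(bits) for bits in product("01", repeat=5)}

    # ...then carve away the non-solutions; what remains is the answer.
    # Toy constraint: keep 5-bit strings with no two adjacent 1s.
    def is_non_solution(s: str) -> bool:
        return "11" in s

    solutions = {s for s in candidates if not is_non_solution(s)}
    print(sorted(solutions))
    ```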

  17. Computer Literacy Education

    Science.gov (United States)

    1989-01-01

    curricula, systems must reorder their priorities. One question for computer-literacy advocates is this: What is computer literacy more important... "Context." AEDS Journal, 17, 3 (Spring 1984) 1-13. "Reader's Survey Results: What Is Computer Literacy?" Classroom Computer Learning (March 1986) p. 53. "Acquisition of Computer Literacy." Journal of Computer-Based Information, 12, 1 (Winter 1985) 12-16. "What is Computer Literacy?" Article 10c in Cannings

  18. Designing with an underdeveloped computational composite for materials experience

    NARCIS (Netherlands)

    Barati, B.; Karana, E.; Hekkert, P.P.M.; Jönsthövel, I.

    2015-01-01

    In response to the urge for multidisciplinary development of computational composites, designers and material scientists are increasingly involved in collaborative projects to valorize these technology-push materials in the early stages of their development. To further develop the computational

  19. Applied Parallel Computing Industrial Computation and Optimization

    DEFF Research Database (Denmark)

    Madsen, Kaj; Olesen, Dorte

    Proceedings of the Third International Workshop on Applied Parallel Computing in Industrial Problems and Optimization (PARA96).

  20. Further computer appreciation

    CERN Document Server

    Fry, T F

    2014-01-01

    Further Computer Appreciation is a comprehensive cover of the principles and aspects in computer appreciation. The book starts by describing the development of computers from the first to the third computer generations, to the development of processors and storage systems, up to the present position of computers and future trends. The text tackles the basic elements, concepts and functions of digital computers, computer arithmetic, input media and devices, and computer output. The basic central processor functions, data storage and the organization of data by classification of computer files,

  1. Computing at Stanford.

    Science.gov (United States)

    Feigenbaum, Edward A.; Nielsen, Norman R.

    1969-01-01

    This article provides a current status report on the computing and computer science activities at Stanford University, focusing on the Computer Science Department, the Stanford Computation Center, the recently established regional computing network, and the Institute for Mathematical Studies in the Social Sciences. Also considered are such topics…

  2. Information and Computation

    OpenAIRE

    Gershenson, Carlos

    2013-01-01

    In this chapter, concepts related to information and computation are reviewed in the context of human computation. A brief introduction to information theory and different types of computation is given. Two examples of human computation systems, online social networks and Wikipedia, are used to illustrate how these can be described and compared in terms of information and computation.

  3. Democratizing Computer Science

    Science.gov (United States)

    Margolis, Jane; Goode, Joanna; Ryoo, Jean J.

    2015-01-01

    Computer science programs are too often identified with a narrow stratum of the student population, often white or Asian boys who have access to computers at home. But because computers play such a huge role in our world today, all students can benefit from the study of computer science and the opportunity to build skills related to computing. The…

  4. Computer-Assisted Criterion-Referenced Measurement.

    Science.gov (United States)

    Ferguson, Richard L.

    A model for computer-assisted branched testing was developed, implemented, and evaluated in the context of an elementary school using the system of Individually Prescribed Instruction. A computer was used to generate and present items and then score the student's constructed response. Using Wald's sequential probability ratio test, the computer…
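
    A sketch of how Wald's sequential probability ratio test can drive such branched mastery decisions; the proficiency levels and error rates below are illustrative, not taken from the study:

    ```python
    import math

    p0, p1 = 0.60, 0.85          # assumed non-mastery vs. mastery success rates
    alpha, beta = 0.05, 0.05     # tolerated error rates
    upper = math.log((1 - beta) / alpha)   # cross it: decide "mastery"
    lower = math.log(beta / (1 - alpha))   # cross it: decide "non-mastery"

    def sprt(responses):
        llr = 0.0                # running log-likelihood ratio
        for i, correct in enumerate(responses, start=1):
            llr += math.log(p1 / p0) if correct else math.log((1 - p1) / (1 - p0))
            if llr >= upper:
                return f"mastery after {i} items"
            if llr <= lower:
                return f"non-mastery after {i} items"
        return "undecided -- keep testing"

    print(sprt([1, 1, 0] + [1] * 10))   # -> mastery after 13 items
    ```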

  5. Hyperswitch Network For Hypercube Computer

    Science.gov (United States)

    Chow, Edward; Madan, Herbert; Peterson, John

    1989-01-01

    Data-driven dynamic switching enables high speed data transfer. Proposed hyperswitch network based on mixed static and dynamic topologies. Routing header modified in response to congestion or faults encountered as path established. Static topology meets requirement if nodes have switching elements that perform necessary routing header revisions dynamically. Hypercube topology now being implemented with switching element in each computer node aimed at designing very-richly-interconnected multicomputer system. Interconnection network connects great number of small computer nodes, using fixed hypercube topology, characterized by point-to-point links between nodes.
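
    On a hypercube, the routing header can be the XOR of the current and destination addresses, and revising it dynamically lets a message detour around congestion. A minimal sketch (invented details, not the proposed hardware's actual algorithm):

    ```python
    def route(src: int, dst: int, n_dims: int = 3, congested=frozenset()):
        """Dimension-order hypercube routing that skips congested links."""
        node, hops = src, [src]
        while node != dst:
            header = node ^ dst        # bits still to be corrected
            usable = [d for d in range(n_dims)
                      if (header >> d) & 1 and (node, d) not in congested]
            if not usable:
                raise RuntimeError("no usable link; a real switch would retry")
            node ^= 1 << usable[0]     # traverse one link, revising the header
            hops.append(node)
        return hops

    print(route(0b000, 0b101))                          # [0, 1, 5]
    print(route(0b000, 0b101, congested={(0b000, 0)}))  # [0, 4, 5] detour
    ```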

  6. Computed Tomography (CT) -- Head

    Medline Plus

    Full Text Available Computed tomography (CT) of the head uses special ... What is CT Scanning of the Head? Computed tomography, more commonly known as a CT ...

  7. Quantum computation with superconductors

    OpenAIRE

    Irastorza Gabilondo, Amaia

    2017-01-01

    Quantum computation using superconducting qubits. Qubits are the quantum bits used in quantum computers, and superconducting qubits are a strong option for building a quantum computer. Moreover, because they are macroscopic objects, they probe the limits of quantum physics.

  8. DNA computing models

    CERN Document Server

    Ignatova, Zoya; Zimmermann, Karl-Heinz

    2008-01-01

    In this excellent text, the reader is given a comprehensive introduction to the field of DNA computing. The book emphasizes computational methods to tackle central problems of DNA computing, such as controlling living cells, building patterns, and generating nanomachines.

  9. Computer Viruses: An Overview.

    Science.gov (United States)

    Marmion, Dan

    1990-01-01

    Discusses the early history and current proliferation of computer viruses that occur on Macintosh and DOS personal computers, mentions virus detection programs, and offers suggestions for how libraries can protect themselves and their users from damage by computer viruses. (LRW)

  10. Computers and the landscape

    Science.gov (United States)

    Gary H. Elsner

    1979-01-01

    Computers can analyze and help to plan the visual aspects of large wildland landscapes. This paper categorizes and explains current computer methods available. It also contains a futuristic dialogue between a landscape architect and a computer.

  11. Soft computing in computer and information science

    CERN Document Server

    Fray, Imed; Pejaś, Jerzy

    2015-01-01

    This book presents a carefully selected and reviewed collection of papers presented during the 19th Advanced Computer Systems conference ACS-2014. From its beginning, the Advanced Computer Systems conference concentrated on methods and algorithms of artificial intelligence. Subsequent years brought new areas of interest in technical informatics related to soft computing, as well as more technological aspects of computer science such as multimedia and computer graphics, software engineering, web systems, information security and safety, and project management. These topics are represented in the present book under the categories Artificial Intelligence, Design of Information and Multimedia Systems, Information Technology Security and Software Technologies.

  12. Computational Intelligence, Cyber Security and Computational Models

    CERN Document Server

    Anitha, R; Lekshmi, R; Kumar, M; Bonato, Anthony; Graña, Manuel

    2014-01-01

    This book contains cutting-edge research material presented by researchers, engineers, developers, and practitioners from academia and industry at the International Conference on Computational Intelligence, Cyber Security and Computational Models (ICC3) organized by PSG College of Technology, Coimbatore, India during December 19–21, 2013. The materials in the book include theory and applications for design, analysis, and modeling of computational intelligence and security. The book will be useful material for students, researchers, professionals, and academicians. It will help in understanding current research trends and findings and future scope of research in computational intelligence, cyber security, and computational models.

  13. Computational Physics Across the Disciplines

    Science.gov (United States)

    Crespi, Vincent; Lammert, Paul; Engstrom, Tyler; Owen, Ben

    2011-03-01

    In this informal talk, I will present two case studies of the unexpected convergence of computational techniques across disciplines. The first is the marriage of neutron star astrophysics and the materials theory of the mechanical and thermal response of crystalline solids. Although the lower reaches of a neutron star host exotic nuclear physics, the upper few meters of the crust exist in a regime that is surprisingly amenable to standard molecular dynamics simulation, albeit at densities orders of magnitude different from those familiar to most condensed matter folk. Computational results on shear strength, thermal conductivity, and other properties here are very relevant to possible gravitational wave signals from these sources. The second example connects not two disciplines of computational physics, but experimental and computational physics, and not from the traditional direction of computation progressively approaching experiment. Instead, experiment is approaching computation: regular lattices of single-domain magnetic islands whose magnetic microstates can be exhaustively enumerated by magnetic force microscopy. The resulting images of island magnetization patterns look essentially like the results of Monte Carlo simulations of Ising systems... statistical physics with the microstate revealed.

  14. Distributed Computing: An Overview

    OpenAIRE

    Md. Firoj Ali; Rafiqul Zaman Khan

    2015-01-01

    Decrease in hardware costs and advances in computer networking technologies have led to increased interest in the use of large-scale parallel and distributed computing systems. Distributed computing systems offer the potential for improved performance and resource sharing. In this paper we present an overview of distributed computing: the difference between parallel and distributed computing, terminologies used in distributed computing, task allocation in distribute...

  15. Computer Virus and Trends

    OpenAIRE

    Tutut Handayani; Soenarto Usna,Drs.MMSI

    2004-01-01

    Since its first appearance in the mid-1980s, the computer virus has invited controversies that last to this day. Along with the development of computer systems technology, computer viruses keep finding new ways to spread themselves through a variety of existing communications media. This paper discusses several topics related to computer viruses: the definition and history of computer viruses; the basics of computer viruses; the current state of computer viruses; and ...

  16. Computing technology in the 1980's. [computers

    Science.gov (United States)

    Stone, H. S.

    1978-01-01

    Advances in computing technology have been led by consistently improving semiconductor technology. The semiconductor industry has turned out ever faster, smaller, and less expensive devices since transistorized computers were first introduced 20 years ago. For the next decade, there appear to be new advances possible, with the rate of introduction of improved devices at least equal to the historic trends. The implication of these projections is that computers will enter new markets and will truly be pervasive in business, home, and factory as their cost diminishes and their computational power expands to new levels. The computer industry as we know it today will be greatly altered in the next decade, primarily because the raw computer system will give way to computer-based turn-key information and control systems.

  17. Cloud Computing Quality

    Directory of Open Access Journals (Sweden)

    Anamaria Şiclovan

    2013-02-01

    Full Text Available Cloud computing is, and will remain, a new way of providing Internet services and computing. The approach builds on many existing services, such as the Internet, grid computing, and Web services. As a system, cloud computing aims to provide on-demand services that are more acceptable in price and infrastructure. It is, in essence, the transition from the computer as a product to a service offered to consumers and delivered online. This paper describes the quality of cloud computing services, analyzing the advantages and characteristics offered by it. It is a theoretical paper. Keywords: Cloud computing, QoS, quality of cloud computing

  18. Quantum computational supremacy.

    Science.gov (United States)

    Harrow, Aram W; Montanaro, Ashley

    2017-09-13

    The field of quantum algorithms aims to find ways to speed up the solution of computational problems by using a quantum computer. A key milestone in this field will be when a universal quantum computer performs a computational task that is beyond the capability of any classical computer, an event known as quantum supremacy. This would be easier to achieve experimentally than full-scale quantum computing, but involves new theoretical challenges. Here we present the leading proposals to achieve quantum supremacy, and discuss how we can reliably compare the power of a classical computer to the power of a quantum computer.

  19. Computers in nuclear medicine

    International Nuclear Information System (INIS)

    Giannone, Carlos A.

    1999-01-01

    This chapter covers the capture and observation of images in computers and the hardware and software used, including personal computers, networks, and workstations. The use of special filters determines image quality.

  20. Plasticity: modeling & computation

    National Research Council Canada - National Science Library

    Borja, Ronaldo Israel

    2013-01-01

    .... "Plasticity Modeling & Computation" is a textbook written specifically for students who want to learn the theoretical, mathematical, and computational aspects of inelastic deformation in solids...

  1. Computer jargon explained

    CERN Document Server

    Enticknap, Nicholas

    2014-01-01

    Computer Jargon Explained is a feature in Computer Weekly publications that discusses 68 of the most commonly used technical computing terms. The book explains what the terms mean and why the terms are important to computer professionals. The text also discusses how the terms relate to the trends and developments that are driving the information technology industry. Computer jargon irritates non-computer people and in turn causes problems for computer people. The technology and the industry are changing so rapidly that it is very hard even for professionals to keep updated. Computer people do not

  2. Computers and data processing

    CERN Document Server

    Deitel, Harvey M

    1985-01-01

    Computers and Data Processing provides information pertinent to the advances in the computer field. This book covers a variety of topics, including the computer hardware, computer programs or software, and computer applications systems.Organized into five parts encompassing 19 chapters, this book begins with an overview of some of the fundamental computing concepts. This text then explores the evolution of modern computing systems from the earliest mechanical calculating devices to microchips. Other chapters consider how computers present their results and explain the storage and retrieval of

  3. Computer hardware fault administration

    Science.gov (United States)

    Archer, Charles J.; Megerian, Mark G.; Ratterman, Joseph D.; Smith, Brian E.

    2010-09-14

    Computer hardware fault administration carried out in a parallel computer, where the parallel computer includes a plurality of compute nodes. The compute nodes are coupled for data communications by at least two independent data communications networks, where each data communications network includes data communications links connected to the compute nodes. Typical embodiments carry out hardware fault administration by identifying a location of a defective link in the first data communications network of the parallel computer and routing communications data around the defective link through the second data communications network of the parallel computer.
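
    The two-network fallback described above reduces to a simple rule: if the preferred network's route crosses a defective link, send through the second, independent network. A toy sketch with invented names and topology:

    ```python
    class Network:
        def __init__(self, name, defective_links=frozenset()):
            self.name = name
            self.defective = defective_links

        def usable(self, route):
            """True if no hop of the route crosses a defective link."""
            return not (set(zip(route, route[1:])) & self.defective)

    def send(route, primary, secondary):
        for net in (primary, secondary):
            if net.usable(route):
                return f"sent via {net.name}"
        raise RuntimeError("both networks report defective links on this route")

    torus = Network("torus", defective_links={(2, 3)})
    tree  = Network("collective tree")
    print(send([1, 2, 3, 4], torus, tree))   # -> sent via collective tree
    ```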

  4. MELCOR computer code manuals

    Energy Technology Data Exchange (ETDEWEB)

    Summers, R.M.; Cole, R.K. Jr.; Smith, R.C.; Stuart, D.S.; Thompson, S.L. [Sandia National Labs., Albuquerque, NM (United States); Hodge, S.A.; Hyman, C.R.; Sanders, R.L. [Oak Ridge National Lab., TN (United States)

    1995-03-01

    MELCOR is a fully integrated, engineering-level computer code that models the progression of severe accidents in light water reactor nuclear power plants. MELCOR is being developed at Sandia National Laboratories for the U.S. Nuclear Regulatory Commission as a second-generation plant risk assessment tool and the successor to the Source Term Code Package. A broad spectrum of severe accident phenomena in both boiling and pressurized water reactors is treated in MELCOR in a unified framework. These include: thermal-hydraulic response in the reactor coolant system, reactor cavity, containment, and confinement buildings; core heatup, degradation, and relocation; core-concrete attack; hydrogen production, transport, and combustion; fission product release and transport; and the impact of engineered safety features on thermal-hydraulic and radionuclide behavior. Current uses of MELCOR include estimation of severe accident source terms and their sensitivities and uncertainties in a variety of applications. This publication of the MELCOR computer code manuals corresponds to MELCOR 1.8.3, released to users in August, 1994. Volume 1 contains a primer that describes MELCOR's phenomenological scope, organization (by package), and documentation. The remainder of Volume 1 contains the MELCOR Users Guides, which provide the input instructions and guidelines for each package. Volume 2 contains the MELCOR Reference Manuals, which describe the phenomenological models that have been implemented in each package.

  5. MELCOR computer code manuals

    International Nuclear Information System (INIS)

    Summers, R.M.; Cole, R.K. Jr.; Smith, R.C.; Stuart, D.S.; Thompson, S.L.; Hodge, S.A.; Hyman, C.R.; Sanders, R.L.

    1995-03-01

    MELCOR is a fully integrated, engineering-level computer code that models the progression of severe accidents in light water reactor nuclear power plants. MELCOR is being developed at Sandia National Laboratories for the U.S. Nuclear Regulatory Commission as a second-generation plant risk assessment tool and the successor to the Source Term Code Package. A broad spectrum of severe accident phenomena in both boiling and pressurized water reactors is treated in MELCOR in a unified framework. These include: thermal-hydraulic response in the reactor coolant system, reactor cavity, containment, and confinement buildings; core heatup, degradation, and relocation; core-concrete attack; hydrogen production, transport, and combustion; fission product release and transport; and the impact of engineered safety features on thermal-hydraulic and radionuclide behavior. Current uses of MELCOR include estimation of severe accident source terms and their sensitivities and uncertainties in a variety of applications. This publication of the MELCOR computer code manuals corresponds to MELCOR 1.8.3, released to users in August, 1994. Volume 1 contains a primer that describes MELCOR's phenomenological scope, organization (by package), and documentation. The remainder of Volume 1 contains the MELCOR Users Guides, which provide the input instructions and guidelines for each package. Volume 2 contains the MELCOR Reference Manuals, which describe the phenomenological models that have been implemented in each package

  6. Computation Directorate 2008 Annual Report

    Energy Technology Data Exchange (ETDEWEB)

    Crawford, D L

    2009-03-25

    Whether a computer is simulating the aging and performance of a nuclear weapon, the folding of a protein, or the probability of rainfall over a particular mountain range, the necessary calculations can be enormous. Our computers help researchers answer these and other complex problems, and each new generation of system hardware and software widens the realm of possibilities. Building on Livermore's historical excellence and leadership in high-performance computing, Computation added more than 331 trillion floating-point operations per second (teraFLOPS) of power to LLNL's computer room floors in 2008. In addition, Livermore's next big supercomputer, Sequoia, advanced ever closer to its 2011-2012 delivery date, as architecture plans and the procurement contract were finalized. Hyperion, an advanced technology cluster test bed that teams Livermore with 10 industry leaders, made a big splash when it was announced during Michael Dell's keynote speech at the 2008 Supercomputing Conference. The Wall Street Journal touted Hyperion as a 'bright spot amid turmoil' in the computer industry. Computation continues to measure and improve the costs of operating LLNL's high-performance computing systems by moving hardware support in-house, by measuring causes of outages to apply resources asymmetrically, and by automating most of the account and access authorization and management processes. These improvements enable more dollars to go toward fielding the best supercomputers for science, while operating them at less cost and greater responsiveness to the customers.

  7. Advances in unconventional computing

    CERN Document Server

    2017-01-01

    The unconventional computing is a niche for interdisciplinary science, cross-bred of computer science, physics, mathematics, chemistry, electronic engineering, biology, material science and nanotechnology. The aims of this book are to uncover and exploit principles and mechanisms of information processing in and functional properties of physical, chemical and living systems to develop efficient algorithms, design optimal architectures and manufacture working prototypes of future and emergent computing devices. This first volume presents theoretical foundations of the future and emergent computing paradigms and architectures. The topics covered are computability, (non-)universality and complexity of computation; physics of computation, analog and quantum computing; reversible and asynchronous devices; cellular automata and other mathematical machines; P-systems and cellular computing; infinity and spatial computation; chemical and reservoir computing. The book is the encyclopedia, the first ever complete autho...

  8. Evaluation of a computer model to simulate water table response to subirrigation

    Directory of Open Access Journals (Sweden)

    Jadir Aparecido Rosa

    2002-12-01

    Full Text Available The objective of this work was to evaluate the water flow computer model, WATABLE, using experimental field observations on water table management plots from a site located near Hastings, FL, USA. The experimental field had scale drainage systems with provisions for subirrigation with buried microirrigation and conventional seepage irrigation systems. Potato (Solanum tuberosum L.) growing seasons from years 1996 and 1997 were used to simulate the hydrology of the area. Water table levels, precipitation, irrigation, and runoff volumes were continuously monitored. The model simulated the water movement from a buried microirrigation line source and the response of the water table to irrigation, precipitation, evapotranspiration, and deep percolation. The model was calibrated and verified by comparing simulated results with experimental field observations. The model performed very well in simulating seasonal runoff, irrigation volumes, and water table levels during crop growth. The two-dimensional model can be used to investigate different irrigation strategies involving water table management control. Applications of the model include optimization of the water table depth for each growth stage, and duration, frequency, and rate of irrigation.

  9. Experimental quantum computing without entanglement.

    Science.gov (United States)

    Lanyon, B P; Barbieri, M; Almeida, M P; White, A G

    2008-11-14

    Deterministic quantum computation with one pure qubit (DQC1) is an efficient model of computation that uses highly mixed states. Unlike pure-state models, its power is not derived from the generation of a large amount of entanglement. Instead it has been proposed that other nonclassical correlations are responsible for the computational speedup, and that these can be captured by the quantum discord. In this Letter we implement DQC1 in an all-optical architecture, and experimentally observe the generated correlations. We find no entanglement, but large amounts of quantum discord, except in three cases where an efficient classical simulation is always possible. Our results show that even fully separable, highly mixed states can contain intrinsically quantum mechanical correlations and that these could offer a valuable resource for quantum information technologies.
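
    The DQC1 circuit itself is small enough to simulate numerically: one pure ancilla plus n maximally mixed qubits estimate the normalized trace of a unitary. A numpy sketch under that standard textbook formulation (arbitrary random unitary, not the optical experiment):

    ```python
    import numpy as np

    n = 3
    dim = 2 ** n
    rng = np.random.default_rng(0)

    # A random unitary U via QR decomposition.
    a = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
    U, _ = np.linalg.qr(a)

    # Initial state: ancilla |+><+| tensor a maximally mixed register.
    plus = np.array([[0.5, 0.5], [0.5, 0.5]])
    rho = np.kron(plus, np.eye(dim) / dim)

    # Controlled-U with the ancilla as control.
    cU = np.block([[np.eye(dim), np.zeros((dim, dim))],
                   [np.zeros((dim, dim)), U]])
    rho = cU @ rho @ cU.conj().T

    # The ancilla's off-diagonal element encodes the normalized trace.
    coherence = np.trace(rho[:dim, dim:])   # = conj(Tr U) / 2^(n+1)
    print(2 * np.conj(coherence))           # ~ Tr(U) / 2^n
    print(np.trace(U) / dim)                # direct check
    ```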

  10. Responsibility and Responsiveness

    DEFF Research Database (Denmark)

    Nissen, Ulrik Becker

    2011-01-01

    The debate on the role and identity of Christian social ethics in liberal democracy touches upon the question about the relationship between universality and specificity. Rather than argue for the difference between these approaches, it can be argued that they are to be understood in a differentiated unity with each other. This idea can be substantiated by a figurative appropriation of a Chalcedonian Christology, particularly the communicatio idiomatum. The communicative dimension of this concept has been found to be useful for a reinterpretation of the idea of responsibility. By engaging contemporary positions of communicative ethics, H. Richard Niebuhr's understanding of responsibility as responsiveness, and Dietrich Bonhoeffer's Christological concept of responsibility in a constructive dialogue with each other, the article has attempted to outline main tenets of a responsive concept...

  11. Volunteered Cloud Computing for Disaster Management

    Science.gov (United States)

    Evans, J. D.; Hao, W.; Chettri, S. R.

    2014-12-01

    Disaster management relies increasingly on interpreting earth observations and running numerical models; which require significant computing capacity - usually on short notice and at irregular intervals. Peak computing demand during event detection, hazard assessment, or incident response may exceed agency budgets; however some of it can be met through volunteered computing, which distributes subtasks to participating computers via the Internet. This approach has enabled large projects in mathematics, basic science, and climate research to harness the slack computing capacity of thousands of desktop computers. This capacity is likely to diminish as desktops give way to battery-powered mobile devices (laptops, smartphones, tablets) in the consumer market; but as cloud computing becomes commonplace, it may offer significant slack capacity -- if its users are given an easy, trustworthy mechanism for participating. Such a "volunteered cloud computing" mechanism would also offer several advantages over traditional volunteered computing: tasks distributed within a cloud have fewer bandwidth limitations; granular billing mechanisms allow small slices of "interstitial" computing at no marginal cost; and virtual storage volumes allow in-depth, reversible machine reconfiguration. Volunteered cloud computing is especially suitable for "embarrassingly parallel" tasks, including ones requiring large data volumes: examples in disaster management include near-real-time image interpretation, pattern / trend detection, or large model ensembles. In the context of a major disaster, we estimate that cloud users (if suitably informed) might volunteer hundreds to thousands of CPU cores across a large provider such as Amazon Web Services. To explore this potential, we are building a volunteered cloud computing platform and targeting it to a disaster management context. Using a lightweight, fault-tolerant network protocol, this platform helps cloud users join parallel computing projects
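
    The "embarrassingly parallel" workloads mentioned above follow a plain task-farm pattern: split the job into independent subtasks and hand them to whatever cores volunteer. A miniature stand-in using local processes (the names and the tile task are invented):

    ```python
    from multiprocessing import Pool

    def interpret_tile(tile_id: int) -> tuple[int, str]:
        # Stand-in for near-real-time image interpretation of one tile.
        return tile_id, "flooded" if tile_id % 3 == 0 else "clear"

    if __name__ == "__main__":
        with Pool(processes=4) as pool:        # 4 "volunteered" cores
            for tile_id, verdict in pool.map(interpret_tile, range(12)):
                print(f"tile {tile_id:2d}: {verdict}")
    ```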

  12. 8 bit computer

    OpenAIRE

    Jankovskij, Robert

    2018-01-01

    In this paper the author looks into the structure of an eight-bit computer and its components: their structure, pros, and cons. An eight-bit computer that can execute basic instructions and arithmetic operations, such as addition and subtraction of eight-bit numbers, is built out of integrated circuits. Data transfers between computer components are monitored and reviewed.

  13. The Glass Computer

    Science.gov (United States)

    Paesler, M. A.

    2009-01-01

    Digital computers use different kinds of memory, each of which is either volatile or nonvolatile. On most computers only the hard drive memory is nonvolatile, i.e., it retains all information stored on it when the power is off. When a computer is turned on, an operating system stored on the hard drive is loaded into the computer's memory cache and…

  14. Computability and unsolvability

    CERN Document Server

    Davis, Martin

    1985-01-01

    ""A clearly written, well-presented survey of an intriguing subject."" - Scientific American. Classic text considers general theory of computability, computable functions, operations on computable functions, Turing machines self-applied, unsolvable decision problems, applications of general theory, mathematical logic, Kleene hierarchy, computable functionals, classification of unsolvable decision problems and more.

  15. My Computer Romance

    Science.gov (United States)

    Campbell, Gardner

    2007-01-01

    In this article, the author relates the big role computers have played in his life as a writer. The author narrates that he has been using a computer for nearly twenty years now and that computers have set his writing free. When he started writing, he was using an electric typewriter. He also relates that his romance with computers is also a…

  16. Mathematics for computer graphics

    CERN Document Server

    Vince, John

    2006-01-01

    Helps you understand the mathematical ideas used in computer animation, virtual reality, CAD, and other areas of computer graphics. This work also helps you to rediscover the mathematical techniques required to solve problems and design computer programs for computer graphic applications

  17. Adolescents' Computer Art.

    Science.gov (United States)

    Clements, Robert D.

    1985-01-01

    Adolescents react very positively to computer graphics programs. The biggest obstacle to initiation of computer art in schools is teacher attitudes. Things to consider when starting a computer graphics program are discussed, and some illustrations of student computer art are provided. (RM)

  18. How Computer Graphics Work.

    Science.gov (United States)

    Prosise, Jeff

    This document presents the principles behind modern computer graphics without straying into the arcane languages of mathematics and computer science. Illustrations accompany the clear, step-by-step explanations that describe how computers draw pictures. The 22 chapters of the book are organized into 5 sections. "Part 1: Computer Graphics in…

  19. Computational Social Sciences

    OpenAIRE

    Amaral, Inês

    2017-01-01

    Computational social sciences is a research discipline at the interface between computer science and the traditional social sciences. This interdisciplinary and emerging scientific field uses computational methods to analyze and model social phenomena, social structures, and collective behavior. The main computational approaches to the social sciences are social network analysis, automated information extraction systems, social geographic information systems, comp...

  20. Marketers increase computer usage

    Energy Technology Data Exchange (ETDEWEB)

    1984-10-01

    A special study is presented on the use of computers in the fuel oil business. In 1984, 86% of the marketers used a computer, and all of them used it for billing. A large portion, 95%, used computers to schedule delivery, and 91% used the computer to control credit. All of these percentages were similar to those for 1981.

  1. Computer Viruses. Technology Update.

    Science.gov (United States)

    Ponder, Tim, Comp.; Ropog, Marty, Comp.; Keating, Joseph, Comp.

    This document provides general information on computer viruses, how to help protect a computer network from them, measures to take if a computer becomes infected. Highlights include the origins of computer viruses; virus contraction; a description of some common virus types (File Virus, Boot Sector/Partition Table Viruses, Trojan Horses, and…

  2. Great Principles of Computing

    OpenAIRE

    Denning, Peter J.

    2008-01-01

    The Great Principles of Computing is a framework for understanding computing as a field of science. April 2008 (Rev. 8/31/08).

  3. Hippocampal polysynaptic computation.

    Science.gov (United States)

    Kimura, Rie; Kang, Siu; Takahashi, Naoya; Usami, Atsushi; Matsuki, Norio; Fukai, Tomoki; Ikegaya, Yuji

    2011-09-14

    Neural circuitry is a self-organizing arithmetic device that converts input to output and thereby remodels its computational algorithm to produce more desired output; however, experimental evidence regarding the mechanism by which information is modified and stored while propagating across polysynaptic networks is sparse. We used functional multineuron calcium imaging to monitor the spike outputs from thousands of CA1 neurons in response to the stimulation of two independent sites of the dentate gyrus in rat hippocampal networks ex vivo. Only pyramidal cells were analyzed based on post hoc immunostaining. Some CA1 pyramidal cells were observed to fire action potentials only when both sites were simultaneously stimulated (AND-like neurons), whereas other neurons fired in response to either site of stimulation but not to concurrent stimulation (XOR-like neurons). Both types of neurons were interlaced in the same network and altered their logical operation depending on the timing of paired stimulation. Repetitive paired stimulation for brief periods induced a persistent reorganization of AND and XOR operators, suggesting a flexibility in parallel distributed processing. We simulated these network functions in silico and found that synaptic modification of the CA3 recurrent excitation is pivotal to the shaping of logic plasticity. This work provides new insights into how microscopic synaptic properties are associated with the mesoscopic dynamics of complex microcircuits.
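
    The AND-like versus XOR-like distinction mirrors a textbook fact about threshold units: AND is linearly separable (one unit suffices) while XOR is not (it needs an intermediate layer), which is one way to see why plasticity of the recurrent CA3 stage matters in the simulations. A toy illustration with hand-picked weights, not the paper's model:

    ```python
    def step(x):
        return 1 if x > 0 else 0

    def and_unit(a, b):                  # single threshold unit suffices
        return step(a + b - 1.5)

    def xor_unit(a, b):                  # needs a hidden layer
        h_or  = step(a + b - 0.5)        # a OR b
        h_and = step(a + b - 1.5)        # a AND b
        return step(h_or - 2 * h_and - 0.5)  # OR but not AND

    for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]:
        print(a, b, "AND:", and_unit(a, b), "XOR:", xor_unit(a, b))
    ```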

  4. Brief: Managing computing technology

    International Nuclear Information System (INIS)

    Startzman, R.A.

    1994-01-01

    While computing is applied widely in the production segment of the petroleum industry, its effective application is the primary goal of computing management. Computing technology has changed significantly since the 1950's, when computers first began to influence petroleum technology. The ability to accomplish traditional tasks faster and more economically probably is the most important effect that computing has had on the industry. While speed and lower cost are important, are they enough? Can computing change the basic functions of the industry? When new computing technology is introduced improperly, it can clash with traditional petroleum technology. This paper examines the role of management in merging these technologies

  5. Roadmap to greener computing

    CERN Document Server

    Nguemaleu, Raoul-Abelin Choumin

    2014-01-01

    A concise and accessible introduction to green computing and green IT, this book addresses how computer science and the computer infrastructure affect the environment and presents the main challenges in making computing more environmentally friendly. The authors review the methodologies, designs, frameworks, and software development tools that can be used in computer science to reduce energy consumption and still compute efficiently. They also focus on Computer Aided Design (CAD) and describe what design engineers and CAD software applications can do to support new streamlined business directi

  6. Computer mathematics for programmers

    CERN Document Server

    Abney, Darrell H; Sibrel, Donald W

    1985-01-01

    Computer Mathematics for Programmers presents the Mathematics that is essential to the computer programmer.The book is comprised of 10 chapters. The first chapter introduces several computer number systems. Chapter 2 shows how to perform arithmetic operations using the number systems introduced in Chapter 1. The third chapter covers the way numbers are stored in computers, how the computer performs arithmetic on real numbers and integers, and how round-off errors are generated in computer programs. Chapter 4 details the use of algorithms and flowcharting as problem-solving tools for computer p
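
    Two of the book's themes, positional number systems and floating-point round-off, show up in a few lines of code (the example values are mine, not the book's):

    ```python
    # One integer rendered in several number systems.
    n = 173
    print(bin(n), oct(n), hex(n))   # 0b10101101 0o255 0xad

    # 0.1 has no finite binary expansion, so round-off error accumulates.
    total = sum(0.1 for _ in range(10))
    print(total == 1.0)             # False
    print(f"{total:.17f}")          # 0.99999999999999989
    ```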

  7. Empirically Assessing the Importance of Computer Skills

    Science.gov (United States)

    Baker, William M.

    2013-01-01

    This research determines which computer skills are important for entry-level accountants, and whether some skills are more important than others. Students participated before and after internships in public accounting. Longitudinal analysis is also provided; responses from 2001 are compared to those from 2008-2009. Responses are also compared to…

  8. Parallel computing works

    Energy Technology Data Exchange (ETDEWEB)

    1991-10-23

    An account of the Caltech Concurrent Computation Program (C³P), a five-year project that focused on answering the question: Can parallel computers be used to do large-scale scientific computations? As the title indicates, the question is answered in the affirmative, by implementing numerous scientific applications on real parallel computers and doing computations that produced new scientific results. In the process of doing so, C³P helped design and build several new computers, designed and implemented basic system software, developed algorithms for frequently used mathematical computations on massively parallel machines, devised performance models and measured the performance of many computers, and created a high performance computing facility based exclusively on parallel computers. While the initial focus of C³P was the hypercube architecture developed by C. Seitz, many of the methods developed and lessons learned have been applied successfully on other massively parallel architectures.

  9. Research on Comparison of Cloud Computing and Grid Computing

    OpenAIRE

    Liu Yuxi; Wang Jianhua

    2012-01-01

    The development of the computer industry is promoted by progress in distributed computing, parallel computing, and grid computing, out of which the cloud computing movement arose. This study describes the types of cloud computing services and the similarities and differences between cloud computing and grid computing, discusses the aspects in which cloud computing improves on grid computing, and reviews the common problems faced by both, along with some security issues.

  10. Optically Controlled Quantum Dot Spins for Scaleable Quantum Computing

    National Research Council Canada - National Science Library

    Steel, Duncan G

    2006-01-01

    ... Sham is responsible for theoretical support and concept development. The group at Michigan, along with this QuaCGR student, is responsible for key experimental demonstrations for quantum computing...

  11. Toward Cloud Computing Evolution

    OpenAIRE

    Susanto, Heru; Almunawar, Mohammad Nabil; Kang, Chen Chin

    2012-01-01

    Information Technology (IT) has shaped the success of organizations, giving them a solid foundation that increases both their efficiency and their productivity. The computing industry is witnessing a paradigm shift in the way computing is performed worldwide. There is a growing awareness among consumers and enterprises of accessing their IT resources extensively through a "utility" model known as "cloud computing." Cloud computing was initially rooted in distributed grid-based computing. ...

  12. Algorithmically specialized parallel computers

    CERN Document Server

    Snyder, Lawrence; Gannon, Dennis B

    1985-01-01

    Algorithmically Specialized Parallel Computers focuses on the concept and characteristics of an algorithmically specialized computer.This book discusses the algorithmically specialized computers, algorithmic specialization using VLSI, and innovative architectures. The architectures and algorithms for digital signal, speech, and image processing and specialized architectures for numerical computations are also elaborated. Other topics include the model for analyzing generalized inter-processor, pipelined architecture for search tree maintenance, and specialized computer organization for raster

  13. The digital computer

    CERN Document Server

    Parton, K C

    2014-01-01

    The Digital Computer focuses on the principles, methodologies, and applications of the digital computer. The publication takes a look at the basic concepts involved in using a digital computer, simple autocode examples, and examples of working advanced design programs. Discussions focus on transformer design synthesis program, machine design analysis program, solution of standard quadratic equations, harmonic analysis, elementary wage calculation, and scientific calculations. The manuscript then examines commercial and automatic programming, how computers work, and the components of a computer

  14. Cloud computing development in Armenia

    Directory of Open Access Journals (Sweden)

    Vazgen Ghazaryan

    2014-10-01

    Full Text Available Purpose – The purpose of the research is to clarify the benefits and risks, with regard to data protection and cost, that a business can gain from using these new technologies for the implementation and management of an organization's information systems. Design/methodology/approach – Qualitative case study of the results obtained via interviews. Three research questions were raised: Q1: How can a company benefit from using Cloud Computing compared to other solutions? Q2: What are possible issues that occur with Cloud Computing? Q3: How would Cloud Computing change an organization's IT infrastructure? Findings – The calculations provided in the interview section prove the financial advantages, even though the precise degree of flexibility and performance has not been assessed. Cloud Computing offers great scalability. Another benefit that Cloud Computing offers, in addition to better performance and flexibility, is reliable and simple backup data storage, physically distributed and so almost invulnerable to damage. Although the advantages of Cloud Computing more than compensate for the difficulties associated with it, the latter must be carefully considered. Since the cloud architecture is relatively new, so far the best guarantee against all the risks it entails, from a single company's perspective, is a well-formulated service-level agreement, where the terms of service and the shared responsibility and security roles between the client and the provider are defined. Research limitations/implications – The study was carried out on the basis of two companies, which gives a deeper view, but for more widely applicable results a wider analysis is necessary. Originality/Value – The novelty of the research lies in the fact that existing approaches to this problem mainly focus on the technical side of computing. Research type: case study

  15. From Computational Thinking to Computational Empowerment: A 21st Century PD Agenda

    DEFF Research Database (Denmark)

    Iversen, Ole Sejer; Smith, Rachel Charlotte; Dindler, Christian

    2018-01-01

    We propose computational empowerment as an approach, and a Participatory Design response, to challenges related to digitalization of society and the emerging need for digital literacy in K12 education. Our approach extends the current focus on computational thinking to include contextual, human...... technology in education. We argue that PD has the potential to drive a computational empowerment agenda in education, by connecting political PD with contemporary visions for addressing a future digitalized labor market and society....

  16. Synthetic Computation: Chaos Computing, Logical Stochastic Resonance, and Adaptive Computing

    Science.gov (United States)

    Kia, Behnam; Murali, K.; Jahed Motlagh, Mohammad-Reza; Sinha, Sudeshna; Ditto, William L.

    Nonlinearity and chaos can illustrate numerous behaviors and patterns, and one can select different patterns from this rich library of patterns. In this paper we focus on synthetic computing, a field that engineers and synthesizes nonlinear systems to obtain computation. We explain the importance of nonlinearity, and describe how nonlinear systems can be engineered to perform computation. More specifically, we provide an overview of chaos computing, a field that manually programs chaotic systems to build different types of digital functions. Also we briefly describe logical stochastic resonance (LSR), and then extend the approach of LSR to realize combinational digital logic systems via suitable concatenation of existing logical stochastic resonance blocks. Finally we demonstrate how a chaotic system can be engineered and mated with different machine learning techniques, such as artificial neural networks, random searching, and genetic algorithm, to design different autonomous systems that can adapt and respond to environmental conditions.
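
    The "manual programming" of a chaotic system can be caricatured in a few lines: encode the logic inputs as shifts of a logistic map's initial condition, threshold one iterate as the output, and search for parameters that realize a desired truth table. A toy version (the parameters are invented, not those of the published schemes):

    ```python
    import numpy as np

    def logistic(x):
        return 4.0 * x * (1.0 - x)

    DELTA = 0.25  # input encoding step

    def gate(x_star, threshold, i1, i2):
        x = logistic(x_star + DELTA * (i1 + i2))
        return int(x > threshold)

    # Brute-force "programming": find (x*, threshold) that realize AND.
    target = {(0, 0): 0, (0, 1): 0, (1, 0): 0, (1, 1): 1}
    for x_star in np.linspace(0.0, 0.5, 501):
        ts = [t for t in np.linspace(0.0, 1.0, 101)
              if all(gate(x_star, t, a, b) == o for (a, b), o in target.items())]
        if ts:
            print(f"AND realized at x* = {x_star:.3f}, threshold = {ts[0]:.2f}")
            break
    ```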

  17. Know Your Personal Computer Introduction to Computers

    Indian Academy of Sciences (India)

    Siddhartha Kumar Ghoshal. Series Article. Resonance – Journal of Science Education, Volume 1, Issue 1, January 1996, pp. 48-55.

  18. Computers and Computation. Readings from Scientific American.

    Science.gov (United States)

    Fenichel, Robert R.; Weizenbaum, Joseph

    A collection of articles from "Scientific American" magazine has been put together at this time because the current period in computer science is one of consolidation rather than innovation. A few years ago, computer science was moving so swiftly that even the professional journals were more archival than informative; but today it is…

  19. Computational Biology, Advanced Scientific Computing, and Emerging Computational Architectures

    Energy Technology Data Exchange (ETDEWEB)

    None

    2007-06-27

    This CRADA was established at the start of FY02 with $200 K from IBM and matching funds from DOE to support post-doctoral fellows in collaborative research between International Business Machines and Oak Ridge National Laboratory to explore effective use of emerging petascale computational architectures for the solution of computational biology problems. 'No cost' extensions of the CRADA were negotiated with IBM for FY03 and FY04.

  20. Heterotic computing: exploiting hybrid computational devices.

    Science.gov (United States)

    Kendon, Viv; Sebald, Angelika; Stepney, Susan

    2015-07-28

    Current computational theory deals almost exclusively with single models: classical, neural, analogue, quantum, etc. In practice, researchers use ad hoc combinations, realizing only recently that they can be fundamentally more powerful than the individual parts. A Theo Murphy meeting brought together theorists and practitioners of various types of computing, to engage in combining the individual strengths to produce powerful new heterotic devices. 'Heterotic computing' is defined as a combination of two or more computational systems such that they provide an advantage over either substrate used separately. This post-meeting collection of articles provides a wide-ranging survey of the state of the art in diverse computational paradigms, together with reflections on their future combination into powerful and practical applications. © 2015 The Author(s) Published by the Royal Society. All rights reserved.

  1. Cloud Computing for radiologists.

    Science.gov (United States)

    Kharat, Amit T; Safvi, Amjad; Thind, Ss; Singh, Amarjit

    2012-07-01

    Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources, such as computer software and hardware, on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as applications, client, infrastructure, storage, services, and processing power, Cloud computing can help imaging units rapidly scale and descale operations and avoid huge spending on maintenance of costly applications and storage. Cloud computing allows flexibility in imaging. It sets free radiology from the confines of a hospital and creates a virtual mobile office. The downsides to Cloud computing involve security and privacy issues which need to be addressed to ensure the success of Cloud computing in the future.

  2. Cloud Computing for radiologists

    International Nuclear Information System (INIS)

    Kharat, Amit T; Safvi, Amjad; Thind, SS; Singh, Amarjit

    2012-01-01

    Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources, such as computer software and hardware, on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as applications, client, infrastructure, storage, services, and processing power, Cloud computing can help imaging units rapidly scale and descale operations and avoid huge spending on maintenance of costly applications and storage. Cloud computing allows flexibility in imaging. It sets free radiology from the confines of a hospital and creates a virtual mobile office. The downsides to Cloud computing involve security and privacy issues which need to be addressed to ensure the success of Cloud computing in the future.

  3. UCI Computer Arts: Building Gender Equity while Meeting ISTE NETS.

    Science.gov (United States)

    Burge, Kimberly Bisbee

    Multimedia computer learning activities, when designed according to what is known about children's preferences, may help close the gender gap in attitudes about computer usage in schools. This paper includes: a brief overview of gender-gap research; a description of one response--the UCI (University of California Irvine) Computer Arts program,…

  4. 17 CFR 171.4 - Computation of time.

    Science.gov (United States)

    2010-04-01

    ... 17 CFR Commodity and Securities Exchanges, MEMBER RESPONSIBILITY ACTIONS, General Provisions, § 171.4 Computation of time. (a) In general. In ... computation unless the period of time prescribed or allowed is less than seven (7) days. (b) Date of service...

  5. Computer Use by School Teachers in Teaching-Learning Process

    Science.gov (United States)

    Bhalla, Jyoti

    2013-01-01

    Developing countries have a responsibility not merely to provide computers for schools, but also to foster, among the end users of these tools, a habit of infusing a variety of ways in which computers can be integrated into teaching and learning. Earlier research lacked a systematic study of the manner and the extent of computer use by teachers. The…

  6. Reach and get capability in a computing environment

    Science.gov (United States)

    Bouchard, Ann M [Albuquerque, NM; Osbourn, Gordon C [Albuquerque, NM

    2012-06-05

    A reach and get technique includes invoking a reach command from a reach location within a computing environment. A user can then navigate to an object within the computing environment and invoke a get command on the object. In response to invoking the get command, the computing environment is automatically navigated back to the reach location and the object copied into the reach location.
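
    A rough sketch of the described interaction, with all class and method names invented for illustration (the patent abstract above is the only source):

```python
# Hypothetical sketch of the reach-and-get interaction described above.
class Environment:
    def __init__(self):
        self.location = "desktop"
        self.reach_location = None
        self.clipboard = None

    def reach(self):
        # remember where the user was when 'reach' was invoked
        self.reach_location = self.location

    def navigate(self, where):
        self.location = where

    def get(self, obj):
        # copy the object and automatically return to the reach location
        self.clipboard = obj
        self.location = self.reach_location
        return obj

env = Environment()
env.reach()                          # invoke reach at the current location
env.navigate("some/far/folder")      # user navigates to the object
env.get("report.txt")                # object copied, view snaps back
print(env.location, env.clipboard)   # desktop report.txt
```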

  7. Computer Literacy 10. Curriculum Guide=Informatique 10. Guide Pedagogique.

    Science.gov (United States)

    Alberta Dept. of Education, Edmonton.

    This curriculum guide provides information in both English and French for teaching the course, Computer Literacy 10, in the high schools of the Canadian province of Alberta. A basic introductory course developed in response to the need to acquaint high school students with a general understanding of computers and their use, Computer Literacy 10…

  8. 10 CFR Appendix II to Part 504 - Fuel Price Computation

    Science.gov (United States)

    2010-01-01

    ... 10 CFR Part 504, Appendix II, Fuel Price Computation. (a) Introduction. This appendix provides the equations and parameters... responsible for computing the annual fuel price and inflation indices by using Equation II-1 and Equation II-2, respectively. The petitioner may compute the fuel price index specified in Equation II-1 or use his own price...
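
    Equations II-1 and II-2 themselves are elided in this record, so the sketch below is a generic illustration of an indexed price computation, not the regulation's actual formulas; all numbers are made up:

```python
# Hypothetical sketch only: the regulation's Equations II-1 and II-2 are not
# reproduced in this record, so these are generic index computations.
def price_index(current_price, base_year_price):
    """Index of the current fuel price relative to a base year."""
    return current_price / base_year_price

def inflation_index(current_deflator, base_year_deflator):
    """General inflation measured by a price-deflator ratio."""
    return current_deflator / base_year_deflator

def real_fuel_price_index(current_price, base_price, deflator, base_deflator):
    # deflate the nominal fuel price index by general inflation
    return (price_index(current_price, base_price)
            / inflation_index(deflator, base_deflator))

print(real_fuel_price_index(4.20, 3.00, 1.25, 1.00))  # made-up numbers
```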

  9. Note on: 'EMDPLER: A F77 program for modeling the EM response of dipolar sources over the non-magnetic layer earth models' by N.P. Singh and T. Mogi, Computers & Geosciences 36 (2010) 430-440

    Science.gov (United States)

    Jamie, Majid; Mirzaei, Saeid; Mirzaei, Mahmoud

    2017-01-01

    In this paper, several mistakes in Singh and Mogi (2010) are noted and corrected: (1) an incorrect formulation of the intrinsic impedance of the layers of an N-layered earth (Zi) and of the reflection coefficient of the EM wave in TM-mode (rTM); (2) the use of an incorrect, and identical, algorithm for computing the reflection coefficients of the EM wave in both the TE- and the TM-mode (rTE and rTM); and (3) the use of flawed algorithms for computing the phase and normalized phase values of the electric and magnetic components of the EM wave. Moreover, to illustrate how these mistakes affect forward-modeling results, different two- and three-layered earth models, the same as the models used in Singh and Mogi (2010), are chosen; EMDPLER and the corrected version of the program presented in this paper, titled "EMDPLER_Corr", are then run on these models, and the real and imaginary parts of the Hz and Hy components of the magnetic field intensity, their normalized amplitudes (|Hz/Hz0| and |Hy/Hy0|) and the corresponding normalized phases are computed, plotted versus frequency and compared with each other.
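
    The corrected formulas are not reproduced in this record. For orientation only, here is a Python sketch of the textbook impedance recursion commonly used for the TE-mode reflection coefficient of an N-layered earth; the quantities and sign conventions are assumptions, since conventions differ between authors:

```python
import numpy as np

def rte_layered_earth(freq, lam, sigmas, thicknesses, mu0=4e-7 * np.pi):
    """Standard impedance recursion for the TE-mode reflection coefficient
    of an N-layered earth (quasi-static approximation).
    freq: frequency [Hz]; lam: horizontal wavenumber [1/m];
    sigmas: layer conductivities 1..N [S/m];
    thicknesses: layer thicknesses 1..N-1 [m] (last layer is a half-space)."""
    omega = 2 * np.pi * freq
    # vertical wavenumbers u_n and intrinsic impedances Z_n = i*omega*mu0/u_n
    u = np.sqrt(lam**2 + 1j * omega * mu0 * np.asarray(sigmas))
    Z = 1j * omega * mu0 / u
    # recurse upward from the basement half-space
    Zhat = Z[-1]
    for n in range(len(sigmas) - 2, -1, -1):
        t = np.tanh(u[n] * thicknesses[n])
        Zhat = Z[n] * (Zhat + Z[n] * t) / (Z[n] + Zhat * t)
    # free-space impedance above the earth (u0 = lam for sigma = 0)
    Z0 = 1j * omega * mu0 / lam
    return (Zhat - Z0) / (Zhat + Z0)   # sign convention varies by author

# two-layer example: 100 ohm-m overburden on a 10 ohm-m basement
print(rte_layered_earth(freq=1e3, lam=1e-3, sigmas=[0.01, 0.1],
                        thicknesses=[50.0]))
```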

  10. Review of quantum computation

    International Nuclear Information System (INIS)

    Lloyd, S.

    1992-01-01

    Digital computers are machines that can be programmed to perform logical and arithmetical operations. Contemporary digital computers are ''universal,'' in the sense that a program that runs on one computer can, if properly compiled, run on any other computer that has access to enough memory space and time. Any one universal computer can simulate the operation of any other; and the set of tasks that any such machine can perform is common to all universal machines. Since Bennett's discovery that computation can be carried out in a non-dissipative fashion, a number of Hamiltonian quantum-mechanical systems have been proposed whose time-evolutions over discrete intervals are equivalent to those of specific universal computers. The first quantum-mechanical treatment of computers was given by Benioff, who exhibited a Hamiltonian system with a basis whose members corresponded to the logical states of a Turing machine. In order to make the Hamiltonian local, in the sense that its structure depended only on the part of the computation being performed at that time, Benioff found it necessary to make the Hamiltonian time-dependent. Feynman discovered a way to make the computational Hamiltonian both local and time-independent by incorporating the direction of computation in the initial condition. In Feynman's quantum computer, the program is a carefully prepared wave packet that propagates through different computational states. Deutsch presented a quantum computer that exploits the possibility of existing in a superposition of computational states to perform tasks that a classical computer cannot, such as generating purely random numbers, and carrying out superpositions of computations as a method of parallel processing. In this paper, we show that such computers, by virtue of their common function, possess a common form for their quantum dynamics
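
    Deutsch's use of superposition can be made concrete with a standard textbook example (not taken from this paper): his algorithm decides whether a one-bit function is constant or balanced using a single oracle call, where a classical computer needs two. A minimal numpy sketch:

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate

def deutsch(f):
    """Decide whether f: {0,1} -> {0,1} is constant or balanced with one
    call to its unitary oracle U_f |x,y> = |x, y XOR f(x)>."""
    # build the 4x4 permutation matrix of the oracle
    U = np.zeros((4, 4))
    for x in (0, 1):
        for y in (0, 1):
            U[2 * x + (y ^ f(x)), 2 * x + y] = 1
    # |0>|1> -> (H (x) H) -> U_f -> (H (x) I), then measure the first qubit
    state = np.kron([1, 0], [0, 1]).astype(float)
    state = np.kron(H, H) @ state
    state = U @ state
    state = np.kron(H, np.eye(2)) @ state
    p_first_qubit_0 = state[0]**2 + state[1]**2
    return "constant" if p_first_qubit_0 > 0.5 else "balanced"

print(deutsch(lambda x: 0))   # constant
print(deutsch(lambda x: x))   # balanced
```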

  11. Computational fluid mechanics

    Science.gov (United States)

    Hassan, H. A.

    1993-01-01

    Two papers are included in this progress report. In the first, the compressible Navier-Stokes equations have been used to compute leading edge receptivity of boundary layers over parabolic cylinders. Natural receptivity at the leading edge was simulated and Tollmien-Schlichting waves were observed to develop in response to an acoustic disturbance, applied through the farfield boundary conditions. To facilitate comparison with previous work, all computations were carried out at a free stream Mach number of 0.3. The spatial and temporal behavior of the flowfields are calculated through the use of finite volume algorithms and Runge-Kutta integration. The results are dominated by strong decay of the Tollmien-Schlichting wave due to the presence of the mean flow favorable pressure gradient. The effects of numerical dissipation, forcing frequency, and nose radius are studied. The Strouhal number is shown to have the greatest effect on the unsteady results. In the second paper, a transition model for low-speed flows, previously developed by Young et al., which incorporates first-mode (Tollmien-Schlichting) disturbance information from linear stability theory has been extended to high-speed flow by incorporating the effects of second mode disturbances. The transition model is incorporated into a Reynolds-averaged Navier-Stokes solver with a one-equation turbulence model. Results using a variable turbulent Prandtl number approach demonstrate that the current model accurately reproduces available experimental data for first and second-mode dominated transitional flows. The performance of the present model shows significant improvement over previous transition modeling attempts.
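
    The report itself contains no code; as a generic illustration of the Runge-Kutta time integration it mentions, here is the classical fourth-order step applied to a made-up model problem:

```python
import numpy as np

def rk4_step(f, t, u, dt):
    """One classical fourth-order Runge-Kutta step for du/dt = f(t, u),
    the kind of explicit integrator used to advance semi-discretized
    (e.g. finite-volume) flow equations in time."""
    k1 = f(t, u)
    k2 = f(t + dt / 2, u + dt / 2 * k1)
    k3 = f(t + dt / 2, u + dt / 2 * k2)
    k4 = f(t + dt, u + dt * k3)
    return u + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

# toy model problem: a decaying oscillation u' = (i*omega - nu) * u
omega, nu = 2 * np.pi, 0.1
f = lambda t, u: (1j * omega - nu) * u
u, t, dt = 1.0 + 0j, 0.0, 1e-3
for _ in range(1000):
    u = rk4_step(f, t, u, dt)
    t += dt
print(abs(u))   # ~exp(-0.1): amplitude decay after one time unit
```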

  12. Computer techniques for electromagnetics

    CERN Document Server

    Mittra, R

    1973-01-01

    Computer Techniques for Electromagnetics discusses the ways in which computer techniques solve practical problems in electromagnetics. It discusses the impact of the emergence of high-speed computers in the study of electromagnetics. This text provides a brief background on the approaches used by mathematical analysts in solving integral equations. It also demonstrates how to use computer techniques in computing current distribution, radar scattering, and waveguide discontinuities, and inverse scattering. This book will be useful for students looking for a comprehensive text on computer techni

  13. Computers for imagemaking

    CERN Document Server

    Clark, D

    1981-01-01

    Computers for Image-Making tells the computer non-expert all he needs to know about Computer Animation. In the hands of expert computer engineers, computer picture-drawing systems have, since the earliest days of computing, produced interesting and useful images. As a result of major technological developments since then, it no longer requires the expert's skill to draw pictures; anyone can do it, provided they know how to use the appropriate machinery. This collection of specially commissioned articles reflects the diversity of user applications in this expanding field

  14. Explorations in quantum computing

    CERN Document Server

    Williams, Colin P

    2011-01-01

    By the year 2020, the basic memory components of a computer will be the size of individual atoms. At such scales, the current theory of computation will become invalid. ""Quantum computing"" is reinventing the foundations of computer science and information theory in a way that is consistent with quantum physics - the most accurate model of reality currently known. Remarkably, this theory predicts that quantum computers can perform certain tasks breathtakingly faster than classical computers -- and, better yet, can accomplish mind-boggling feats such as teleporting information, breaking suppos

  15. Polymorphous computing fabric

    Science.gov (United States)

    Wolinski, Christophe Czeslaw [Los Alamos, NM; Gokhale, Maya B [Los Alamos, NM; McCabe, Kevin Peter [Los Alamos, NM

    2011-01-18

    Fabric-based computing systems and methods are disclosed. A fabric-based computing system can include a polymorphous computing fabric that can be customized on a per application basis and a host processor in communication with said polymorphous computing fabric. The polymorphous computing fabric includes a cellular architecture that can be highly parameterized to enable a customized synthesis of fabric instances for a variety of enhanced application performances thereof. A global memory concept can also be included that provides the host processor random access to all variables and instructions associated with the polymorphous computing fabric.

  16. SICOEM: emergency response data system

    International Nuclear Information System (INIS)

    Martin, A.; Villota, C.; Francia, L.

    1993-01-01

    The main characteristics of the SICOEM emergency response system are: direct electronic redundant transmission of certain operational parameters and plant status information from the plant process computer to a computer at the Regulatory Body site; the system will be used in emergency situations; SICOEM is not considered a safety class system. 1 fig

  17. SICOEM: emergency response data system

    Energy Technology Data Exchange (ETDEWEB)

    Martin, A.; Villota, C.; Francia, L. (UNESA, Madrid (Spain))

    1993-01-01

    The main characteristics of the SICOEM emergency response system are: direct electronic redundant transmission of certain operational parameters and plant status information from the plant process computer to a computer at the Regulatory Body site; the system will be used in emergency situations; SICOEM is not considered a safety class system. 1 fig.

  18. A Look at Computer-Assisted Testing Operations. The Illinois Series on Educational Application of Computers, No. 12e.

    Science.gov (United States)

    Muiznieks, Viktors; Dennis, J. Richard

    In computer assisted test construction (CATC) systems, the computer is used to perform the mechanical aspects of testing while the teacher retains control over question content. Advantages of CATC systems include question banks, decreased importance of test item security, computer analysis and response to student test answers, item analysis…
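
    As an illustration of the item analysis such systems perform (the statistics below are standard; the data are invented), a short Python sketch computing item difficulty and discrimination:

```python
# Illustrative sketch (not from the source) of two classical item-analysis
# statistics a CATC system can compute: item difficulty and discrimination.
import numpy as np

def item_analysis(responses):
    """responses: matrix of 0/1 answers, shape (students, items)."""
    responses = np.asarray(responses, dtype=float)
    difficulty = responses.mean(axis=0)   # proportion answering correctly
    total = responses.sum(axis=1)
    # discrimination: correlation of each item with the total score
    discrimination = np.array([np.corrcoef(responses[:, i], total)[0, 1]
                               for i in range(responses.shape[1])])
    return difficulty, discrimination

answers = [[1, 1, 0], [1, 0, 0], [1, 1, 1], [0, 0, 0], [1, 1, 1]]
diff, disc = item_analysis(answers)
print("difficulty:", diff)       # easy items have values near 1
print("discrimination:", disc)   # higher = better separates strong/weak
```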

  19. Reversible computing fundamentals, quantum computing, and applications

    CERN Document Server

    De Vos, Alexis

    2010-01-01

    Written by one of the few top internationally recognized experts in the field, this book concentrates on those topics that will remain fundamental, such as low power computing, reversible programming languages, and applications in thermodynamics. It describes reversible computing from various points of view: Boolean algebra, group theory, logic circuits, low-power electronics, communication, software, quantum computing. It is this multidisciplinary approach that makes it unique.Backed by numerous examples, this is useful for all levels of the scientific and academic community, from undergr

  20. Computing networks from cluster to cloud computing

    CERN Document Server

    Vicat-Blanc, Pascale; Guillier, Romaric; Soudan, Sebastien

    2013-01-01

    "Computing Networks" explores the core of the new distributed computing infrastructures we are using today:  the networking systems of clusters, grids and clouds. It helps network designers and distributed-application developers and users to better understand the technologies, specificities, constraints and benefits of these different infrastructures' communication systems. Cloud Computing will give the possibility for millions of users to process data anytime, anywhere, while being eco-friendly. In order to deliver this emerging traffic in a timely, cost-efficient, energy-efficient, and

  1. Electrodermal Response in Gaming

    Directory of Open Access Journals (Sweden)

    J. Christopher Westland

    2011-01-01

    Full Text Available Steady improvements in technologies that measure human emotional response offer new possibilities for making computer games more immersive. This paper reviews the history of designs in a particular branch of affective technologies that acquire electrodermal response readings from human subjects. Electrodermal response meters have gone through continual improvements to better measure these nervous responses, but still fall short of the capabilities of today's technology. Electrodermal response measurement has traditionally been labor intensive: protocols and transcriptions of subject responses were recorded on separate documents, forcing constant shifts of attention between scripts, electrodermal measuring devices, and observations of subject responses. These problems can be resolved by collecting more information and integrating it in a computer interface, that is, by adding relevant sensors in addition to the basic electrodermal resistance reading to untangle (1) body resistance; (2) skin resistance; (3) grip movements; and (4) other factors affecting the neural processing for regulation of the body. A device that solves these problems is presented and discussed. It is argued that electrodermal response datastreams can be enriched through the use of added sensors and digital acquisition and processing of information, which should further experimentation and use of the technology.
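
    One common way to digitally process such an enriched datastream, offered here as an illustration rather than as the paper's method, is to split the skin-conductance signal into a slow tonic level and fast phasic responses:

```python
# Illustrative sketch (not from the paper): a simple moving-average split of
# an electrodermal signal into a slow tonic level and fast phasic responses.
import numpy as np

def split_tonic_phasic(eda, fs, window_s=4.0):
    """eda: skin-conductance samples; fs: sampling rate in Hz."""
    w = max(1, int(window_s * fs))
    kernel = np.ones(w) / w
    tonic = np.convolve(eda, kernel, mode="same")  # slow baseline drift
    phasic = eda - tonic                           # fast event-related part
    return tonic, phasic

fs = 32                                 # 32 Hz acquisition rate (assumed)
t = np.arange(0, 60, 1 / fs)
eda = 5 + 0.01 * t                      # drifting baseline (microsiemens)
eda[t > 30] += 0.5 * np.exp(-(t[t > 30] - 30))   # one simulated response
tonic, phasic = split_tonic_phasic(eda, fs)
print(phasic.max())                     # peak of the detected phasic response
```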

  2. Intelligent Computer Graphics 2012

    CERN Document Server

    Miaoulis, Georgios

    2013-01-01

    In Computer Graphics, the use of intelligent techniques started more recently than in other research areas. However, during these last two decades, the use of intelligent Computer Graphics techniques has grown year after year, and more and more interesting techniques are presented in this area.   The purpose of this volume is to present current work of the Intelligent Computer Graphics community, a community growing year after year. This volume is a kind of continuation of the previously published Springer volumes “Artificial Intelligence Techniques for Computer Graphics” (2008), “Intelligent Computer Graphics 2009” (2009), “Intelligent Computer Graphics 2010” (2010) and “Intelligent Computer Graphics 2011” (2011).   Usually, this kind of volume contains, every year, selected extended papers from the corresponding 3IA Conference of the year. However, the current volume is made from directly reviewed and selected papers, submitted for publication in the volume “Intelligent Computer Gr...

  3. Computed Tomography (CT) -- Sinuses

    Medline Plus

    Full Text Available ... Computed Tomography (CT) - Sinuses. Computed tomography ( ... cross-sectional images generated during a CT scan can be reformatted in multiple planes, and can even ...

  4. Computing for Belle

    CERN Multimedia

    CERN. Geneva

    2004-01-01

    2s-1, 10 times as much as we obtain now. This presentation describes Belle's efficient computing operations, struggles to manage large amount of raw and physics data, and plans for Belle computing for Super KEKB/Belle.

  5. Computed Tomography (CT) -- Sinuses

    Medline Plus

    Full Text Available ... are the limitations of CT of the Sinuses? What is CT (Computed Tomography) of the Sinuses? Computed ... nasal cavity by small openings. What are some common uses of the procedure? CT ...

  6. Book Review: Computational Topology

    DEFF Research Database (Denmark)

    Raussen, Martin

    2011-01-01

    Computational Topology by Herbert Edelsbrunner and John L. Harer. American Mathematical Society, 2010 - ISBN 978-0-8218-4925-5

  7. Computed Tomography (CT) -- Head

    Medline Plus

    Full Text Available ... images. These images can be viewed on a computer monitor, printed on film or transferred to a ... other in a ring, called a gantry. The computer workstation that processes the imaging information is located ...

  8. Computed Tomography (CT) -- Sinuses

    Medline Plus

    Full Text Available ... images. These images can be viewed on a computer monitor, printed on film or transferred to a ... other in a ring, called a gantry. The computer workstation that processes the imaging information is located ...

  9. Computed Tomography (CT) -- Head

    Medline Plus

    Full Text Available ... ring, called a gantry. The computer workstation that processes the imaging information is located in a separate ... follows a spiral path. A special computer program processes this large volume of data to create two- ...

  10. Computed Tomography (CT) -- Sinuses

    Medline Plus

    Full Text Available ... ring, called a gantry. The computer workstation that processes the imaging information is located in a separate ... follows a spiral path. A special computer program processes this large volume of data to create two- ...

  11. Joint Computing Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Raised Floor Computer Space for High Performance Computing. The ERDC Information Technology Laboratory (ITL) provides a robust system of IT facilities to develop and...

  12. Computer Intrusions and Attacks.

    Science.gov (United States)

    Falk, Howard

    1999-01-01

    Examines some frequently encountered unsolicited computer intrusions, including computer viruses, worms, Java applications, trojan horses or vandals, e-mail spamming, hoaxes, and cookies. Also discusses virus-protection software, both for networks and for individual users. (LRW)

  13. Computational Continuum Mechanics

    CERN Document Server

    Shabana, Ahmed A

    2011-01-01

    This text presents the theory of continuum mechanics using computational methods. Ideal for students and researchers, the second edition features a new chapter on computational geometry and finite element analysis.

  14. Cognitive Computing for Security.

    Energy Technology Data Exchange (ETDEWEB)

    Debenedictis, Erik [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Rothganger, Fredrick [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Aimone, James Bradley [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Marinella, Matthew [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Evans, Brian Robert [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Warrender, Christina E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Mickel, Patrick [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-12-01

    Final report for Cognitive Computing for Security LDRD 165613. It reports on the development of a hybrid general-purpose/neuromorphic computer architecture, with an emphasis on potential implementation with memristors.

  15. Computational Science Facility (CSF)

    Data.gov (United States)

    Federal Laboratory Consortium — PNNL Institutional Computing (PIC) is focused on meeting DOE's mission needs and is part of PNNL's overarching research computing strategy. PIC supports large-scale...

  16. ICASE Computer Science Program

    Science.gov (United States)

    1985-01-01

    The Institute for Computer Applications in Science and Engineering computer science program is discussed in outline form. Information is given on such topics as problem decomposition, algorithm development, programming languages, and parallel architectures.

  17. Applications of computer algebra

    CERN Document Server

    1985-01-01

    Today, certain computer software systems exist which surpass the computational ability of researchers when their mathematical techniques are applied to many areas of science and engineering. These computer systems can perform a large portion of the calculations seen in mathematical analysis. Despite this massive power, thousands of people use these systems as a routine resource for everyday calculations. These software programs are commonly called "Computer Algebra" systems. They have names such as MACSYMA, MAPLE, muMATH, REDUCE and SMP. They are receiving credit as a computational aid with increasing regularity in articles in the scientific and engineering literature. When most people think about computers and scientific research these days, they imagine a machine grinding away, processing numbers arithmetically. It is not generally realized that, for a number of years, computers have been performing non-numeric computations. This means, for example, that one inputs an equation and obtains a closed for...

  18. SSCL computer planning

    International Nuclear Information System (INIS)

    Price, L.E.

    1990-01-01

    The SSC Laboratory is in the process of planning the acquisition of a substantial computing system to support the design of detectors. Advice has been sought from users and computer experts in several stages. This paper discusses this process.

  19. Computed Tomography (CT) -- Sinuses

    Medline Plus

    Full Text Available ... the Sinuses? What is CT (Computed Tomography) of the Sinuses? Computed tomography, more commonly known as a ... What are some common uses of the procedure? CT of the sinuses is primarily used ...

  20. Clinical computing in general dentistry.

    Science.gov (United States)

    Schleyer, Titus K L; Thyvalikakath, Thankam P; Spallek, Heiko; Torres-Urquidy, Miguel H; Hernandez, Pedro; Yuhaniak, Jeannie

    2006-01-01

    Measure the adoption and utilization of, opinions about, and attitudes toward clinical computing among general dentists in the United States. Telephone survey of a random sample of 256 general dentists in active practice in the United States. A 39-item telephone interview measuring practice characteristics and information technology infrastructure; clinical information storage; data entry and access; attitudes toward and opinions about clinical computing (features of practice management systems, barriers, advantages, disadvantages, and potential improvements); clinical Internet use; and attitudes toward the National Health Information Infrastructure. The authors successfully screened 1,039 of 1,159 randomly sampled U.S. general dentists in active practice (89.6% response rate). Two hundred fifty-six (24.6%) respondents had computers at chairside and thus were eligible for this study. The authors successfully interviewed 102 respondents (39.8%). Clinical information associated with administration and billing, such as appointments and treatment plans, was stored predominantly on the computer; other information, such as the medical history and progress notes, primarily resided on paper. Nineteen respondents, or 1.8% of all general dentists, were completely paperless. Auxiliary personnel, such as dental assistants and hygienists, entered most data. Respondents adopted clinical computing to improve office efficiency and operations, support diagnosis and treatment, and enhance patient communication and perception. Barriers included insufficient operational reliability, program limitations, a steep learning curve, cost, and infection control issues. Clinical computing is being increasingly adopted in general dentistry. However, future research must address usefulness and ease of use, workflow support, infection control, integration, and implementation issues.

  1. Quantum computing and probability.

    Science.gov (United States)

    Ferry, David K

    2009-11-25

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.
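
    How probability enters can be shown in two lines: outcome probabilities are the squared magnitudes of the state's amplitudes (the Born rule), so identical runs of the same computation yield different outcomes. A minimal sketch (an illustration, not from the paper):

```python
# Outcome probabilities are squared magnitudes of amplitudes (Born rule).
import numpy as np

state = np.array([1, 1j]) / np.sqrt(2)   # equal superposition of |0> and |1>
probs = np.abs(state) ** 2               # P(k) = |amplitude_k|^2
print(probs)                             # [0.5 0.5]

# sample measurement outcomes; repeated runs of the same circuit differ
rng = np.random.default_rng(0)
print(rng.choice([0, 1], size=10, p=probs))
```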

  2. Approximation and Computation

    CERN Document Server

    Gautschi, Walter; Rassias, Themistocles M

    2011-01-01

    Approximation theory and numerical analysis are central to the creation of accurate computer simulations and mathematical models. Research in these areas can influence the computational techniques used in a variety of mathematical and computational sciences. This collection of contributed chapters, dedicated to renowned mathematician Gradimir V. Milovanović, represents the recent work of experts in the fields of approximation theory and numerical analysis. These invited contributions describe new trends in these important areas of research including theoretic developments, new computational alg

  3. Quantum computing and probability

    International Nuclear Information System (INIS)

    Ferry, David K

    2009-01-01

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction. (viewpoint)

  4. Quantum mechanics and computation

    International Nuclear Information System (INIS)

    Cirac Sasturain, J. I.

    2000-01-01

    We review how some of the basic principles of Quantum Mechanics can be used in the field of computation. In particular, we explain why a quantum computer can perform certain tasks in a much more efficient way than the computers we have available nowadays. We give the requirements for a quantum system to be able to implement a quantum computer and illustrate these requirements in some particular physical situations. (Author) 16 refs

  5. COMPUTER GAMES AND EDUCATION

    OpenAIRE

    Sukhov, Anton

    2018-01-01

    This paper is devoted to research on the educational resources and possibilities of modern computer games. The “internal” educational aspects of computer games include an educational mechanism (a separate or integrated “tutorial”) and the representation of a real or even fantastic educational process within virtual worlds. The “external” dimension represents the educational opportunities of computer games for personal and professional development in different genres of computer games (various transport, so...

  6. Computed tomography for radiographers

    International Nuclear Information System (INIS)

    Brooker, M.

    1986-01-01

    Computed tomography is regarded by many as a complicated union of sophisticated x-ray equipment and computer technology. This book overcomes these complexities. The rigid technicalities of the machinery and the clinical aspects of computed tomography are discussed including the preparation of patients, both physically and mentally, for scanning. Furthermore, the author also explains how to set up and run a computed tomography department, including advice on how the room should be designed

  7. COMPUTATIONAL SCIENCE CENTER

    Energy Technology Data Exchange (ETDEWEB)

    DAVENPORT,J.

    2004-11-01

    The Brookhaven Computational Science Center brings together researchers in biology, chemistry, physics, and medicine with applied mathematicians and computer scientists to exploit the remarkable opportunities for scientific discovery which have been enabled by modern computers. These opportunities are especially great in computational biology and nanoscience, but extend throughout science and technology and include for example, nuclear and high energy physics, astrophysics, materials and chemical science, sustainable energy, environment, and homeland security.

  8. Cluster State Quantum Computation

    Science.gov (United States)

    2014-02-01

    Computation Background In the standard quantum circuit model (QCM) paradigm, quantum computations are executed by successive unitary operations acting...overhead in the standard QCM to photon-based quantum computation. In the OWQC approach a quantum computation proceeds as follows: (i) a classical...the MBQC paradigm and its comparison/contrast with the usual QCM approach. Grover’s search algorithm (GSA) serves as an important prototypical

  9. Cluster State Quantum Computing

    Science.gov (United States)

    2012-12-01

    Computation Background In the standard Quantum Circuit Model (QCM) paradigm, quantum computations are executed by successive unitary operations acting upon...resource overhead in the standard QCM to photon-based quantum computation. In the OWQC approach a quantum computation proceeds as follows: (i) A classical...Grover’s search algorithm (GSA) on an unsorted list of elements in the MBQC paradigm, and its comparison/contrast with the usual QCM approach. GSA
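
    Both records use Grover's search algorithm as the prototype. A minimal circuit-model sketch in numpy (a standard textbook construction, not taken from either report) finds a marked item among N = 8 in roughly pi/4 * sqrt(N) iterations:

```python
# Grover's search on an unsorted list of N = 8 items (textbook construction).
import numpy as np

N, marked = 8, 5
state = np.ones(N) / np.sqrt(N)          # uniform superposition

oracle = np.eye(N)
oracle[marked, marked] = -1              # flip the phase of the marked item
diffusion = 2 * np.full((N, N), 1 / N) - np.eye(N)  # inversion about the mean

for _ in range(int(np.floor(np.pi / 4 * np.sqrt(N)))):  # ~O(sqrt(N)) steps
    state = diffusion @ (oracle @ state)

probs = state ** 2
print(np.argmax(probs), probs[marked])   # marked item found with high probability
```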

  10. Computational physics an introduction

    CERN Document Server

    Vesely, Franz J

    1994-01-01

    Author Franz J. Vesely offers students an introductory text on computational physics, providing them with the important basic numerical/computational techniques. His unique text sets itself apart from others by focusing on specific problems of computational physics. The author also provides a selection of modern fields of research. Students will benefit from the appendixes which offer a short description of some properties of computing and machines and outline the technique of 'Fast Fourier Transformation.'

  11. Computer Training at Harwell

    Science.gov (United States)

    Hull, John

    1969-01-01

    By using teletypewriters connected to the Harwell multi-access computing system, lecturers can easily demonstrate the operation of the computer in the classroom; this saves time and eliminates errors, and staff can carry out exercises using the main computer. (EB)

  12. The Computer Revolution.

    Science.gov (United States)

    Berkeley, Edmund C.

    "The Computer Revolution", a part of the "Second Industrial Revolution", is examined with reference to the social consequences of computers. The subject is introduced in an opening section which discusses the revolution in the handling of information and the history, powers, uses, and working s of computers. A second section examines in detail the…

  13. Computer-assisted instruction

    NARCIS (Netherlands)

    Voogt, J.; Fisser, P.; Wright, J.D.

    2015-01-01

    Since the early days of computer technology in education in the 1960s, it was claimed that computers can assist instructional practice and hence improve student learning. Since then computer technology has developed, and its potential for education has increased. In this article, we first discuss

  14. Advances in physiological computing

    CERN Document Server

    Fairclough, Stephen H

    2014-01-01

    This edited collection will provide an overview of the field of physiological computing, i.e. the use of physiological signals as input for computer control. It will cover a breadth of current research, from brain-computer interfaces to telemedicine.

  15. Education for Computers

    Science.gov (United States)

    Heslep, Robert D.

    2012-01-01

    The computer engineers who refer to the education of computers do not have a definite idea of education and do not bother to justify the fuzzy ones to which they allude. Hence, they logically cannot specify the features a computer must have in order to be educable. This paper puts forth a non-standard, but not arbitrary, concept of education that…

  16. Quantum walk computation

    International Nuclear Information System (INIS)

    Kendon, Viv

    2014-01-01

    Quantum versions of random walks have diverse applications that are motivating experimental implementations as well as theoretical studies. Recent results showing quantum walks are “universal for quantum computation” relate to algorithms, to be run on quantum computers. We consider whether an experimental implementation of a quantum walk could provide useful computation before we have a universal quantum computer
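
    A discrete-time coined quantum walk is easy to simulate classically at small sizes, which is one way such experimental proposals are checked. The sketch below (an illustration, not from the paper) shows the ballistic spreading that walk-based algorithms exploit:

```python
# A few steps of a discrete-time coined (Hadamard) quantum walk on a line.
import numpy as np

steps, n = 50, 101                     # positions -50..50
amp = np.zeros((n, 2), dtype=complex)  # position x coin (up/down) amplitudes
amp[n // 2, 0] = 1.0                   # walker starts at the origin

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard coin

for _ in range(steps):
    amp = amp @ H.T                    # coin toss at every position
    shifted = np.zeros_like(amp)
    shifted[1:, 0] = amp[:-1, 0]       # coin 'up' moves right
    shifted[:-1, 1] = amp[1:, 1]       # coin 'down' moves left
    amp = shifted

positions = np.arange(n) - n // 2
prob = (np.abs(amp) ** 2).sum(axis=1)
std = np.sqrt((prob * positions**2).sum() - (prob * positions).sum() ** 2)
print(std)   # grows linearly with steps, unlike the classical sqrt(steps)
```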

  17. Physics of quantum computation

    International Nuclear Information System (INIS)

    Belokurov, V.V.; Khrustalev, O.A.; Sadovnichij, V.A.; Timofeevskaya, O.D.

    2003-01-01

    In the paper, the modern status of the theory of quantum computation is considered. The fundamental principles of quantum computers and their basic notions such as quantum processors and computational basis states of the quantum Turing machine as well as the quantum Fourier transform are discussed. Some possible experimental realizations on the basis of NMR methods are given
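
    Of the notions listed, the quantum Fourier transform is the easiest to exhibit concretely. As an illustration (not from the paper), here is the QFT written out as an explicit unitary matrix:

```python
# The quantum Fourier transform as an explicit unitary matrix on n qubits.
import numpy as np

def qft_matrix(n_qubits):
    N = 2 ** n_qubits
    omega = np.exp(2j * np.pi / N)
    j, k = np.meshgrid(np.arange(N), np.arange(N))
    return omega ** (j * k) / np.sqrt(N)

F = qft_matrix(2)
print(np.allclose(F.conj().T @ F, np.eye(4)))   # True: the QFT is unitary
# a computational basis state maps to an equal-magnitude superposition
print(np.abs(F @ np.eye(4)[1]))                 # all entries 1/2
```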

  18. Optimizing Computer Technology Integration

    Science.gov (United States)

    Dillon-Marable, Elizabeth; Valentine, Thomas

    2006-01-01

    The purpose of this study was to better understand what optimal computer technology integration looks like in adult basic skills education (ABSE). One question guided the research: How is computer technology integration best conceptualized and measured? The study used the Delphi method to map the construct of computer technology integration and…

  19. Computing at Belle II

    International Nuclear Information System (INIS)

    Kuhr, Thomas

    2011-01-01

    The next generation B-factory experiment Belle II will collect a huge data sample which is a challenge for the computing system. In this article, the computing model of the Belle II experiment is presented and the core components of the computing system are introduced.

  20. Computed Tomography (CT) -- Head

    Medline Plus

    Full Text Available ... related topics: CT Perfusion of the Head; CT Angiography (CTA); Stroke; Brain Tumors; CT Safety During Pregnancy; Head and Neck Cancer; Radiation Safety; images and videos related to Computed Tomography (CT) - Head ...