WorldWideScience

Sample records for computing spatial impulse responses

  1. Computational mechanics of nonlinear response of shells

    Energy Technology Data Exchange (ETDEWEB)

    Kraetzig, W.B. (Bochum Univ. (Germany, F.R.). Inst. fuer Statik und Dynamik); Onate, E. (Universidad Politecnica de Cataluna, Barcelona (Spain). Escuela Tecnica Superior de Ingenieros de Caminos) (eds.)

    1990-01-01

    Shell structures and their components are utilized in a wide spectrum of engineering fields, ranging from space and aircraft structures, pipes and pressure vessels, through liquid storage tanks, off-shore installations, cooling towers and domes, to the bodywork of motor vehicles. Their nonlinear behavior, which involves large deformations and large rotations as well as nonlinear material properties, is of continuously increasing importance. The book starts with a survey of nonlinear shell theories from the rigorous point of view of continuum mechanics, a starting point that is unavoidable for modern computational concepts. There follows a series of papers on nonlinear, especially unstable, shell responses, which draw computational connections to well-established tools in the field of static and dynamic stability of systems. Several papers are then concerned with new finite element derivations for nonlinear shell problems, and finally a series of authors contribute specific applications, opening a small window onto the above-mentioned wide spectrum. (orig./HP) With 159 figs.

  2. Computational mechanics of nonlinear response of shells

    International Nuclear Information System (INIS)

    Kraetzig, W.B.; Onate, E.

    1990-01-01

    Shell structures and their components are utilized in a wide spectrum of engineering fields, ranging from space and aircraft structures, pipes and pressure vessels, through liquid storage tanks, off-shore installations, cooling towers and domes, to the bodywork of motor vehicles. Their nonlinear behavior, which involves large deformations and large rotations as well as nonlinear material properties, is of continuously increasing importance. The book starts with a survey of nonlinear shell theories from the rigorous point of view of continuum mechanics, a starting point that is unavoidable for modern computational concepts. There follows a series of papers on nonlinear, especially unstable, shell responses, which draw computational connections to well-established tools in the field of static and dynamic stability of systems. Several papers are then concerned with new finite element derivations for nonlinear shell problems, and finally a series of authors contribute specific applications, opening a small window onto the above-mentioned wide spectrum. (orig./HP) With 159 figs.

  3. Ethical Responsibility Key to Computer Security.

    Science.gov (United States)

    Lynn, M. Stuart

    1989-01-01

    The pervasiveness of powerful computers and computer networks has raised the specter of new forms of abuse and of concomitant ethical issues. Blurred boundaries, hackers, the Computer Worm, ethical issues, and implications for academic institutions are discussed. (MLW)

  4. Computer Security Incident Response Planning at Nuclear Facilities

    International Nuclear Information System (INIS)

    2016-06-01

    The purpose of this publication is to assist Member States in developing comprehensive contingency plans for computer security incidents with the potential to impact nuclear security and/or nuclear safety. It provides an outline and recommendations for establishing a computer security incident response capability as part of a computer security programme, and considers the roles and responsibilities of the system owner, operator, competent authority, and national technical authority in responding to a computer security incident with possible nuclear security repercussions

  5. Ethics and computing living responsibly in a computerized world

    CERN Document Server

    2001-01-01

    "Ethics and Computing, Second Edition promotes awareness of major issues and accepted procedures and policies in the area of ethics and computing using real-world companies, incidents, products and people." "Ethics and Computing, Second Edition is for topical undergraduate courses with chapters and assignments designed to encourage critical thinking and informed ethical decisions. Furthermore, this book will keep abreast computer science, computer engineering, and information systems professionals and their colleagues of current ethical issues and responsibilities."--Jacket.

  6. Computer Modeling of Thoracic Response to Blast

    Science.gov (United States)

    1988-01-01

    be solved at reasonable cost. intrathoracic pressure responses for subjects wearing In order to determine if the gas content of the sheep ballistic...spatial and temporal ries were compared with data. Two extreme cases had distribution of the load can be reasonably predicted by the rumen filled with...to the ap- is that sheep have large, multiple stomachs that have a proximate location where intrathoracic pressure meas- considerable air content . It

  7. Stimulus-response compatibility and affective computing: A review

    NARCIS (Netherlands)

    Lemmens, P.M.C.; Haan, A. de; Galen, G.P. van; Meulenbroek, R.G.J.

    2007-01-01

    Affective computing, a human–factors effort to investigate the merits of emotions while people are working with human–computer interfaces, is gaining momentum. Measures to quantify affect (or its influences) range from EEG, to measurements of autonomic–nervous–system responses (e.g., heart rate,

  8. Computer security incident response team effectiveness : A needs assessment

    NARCIS (Netherlands)

    Kleij, R. van der; Kleinhuis, G.; Young, H.J.

    2017-01-01

    Computer security incident response teams (CSIRTs) respond to a computer security incident when the need arises. Failure of these teams can have far-reaching effects for the economy and national security. CSIRTs often have to work on an ad-hoc basis, in close cooperation with other teams, and in

  9. Computer incident response and forensics team management conducting a successful incident response

    CERN Document Server

    Johnson, Leighton

    2013-01-01

    Computer Incident Response and Forensics Team Management provides security professionals with a complete handbook of computer incident response from the perspective of forensics team management. This unique approach teaches readers the concepts and principles they need to conduct a successful incident response investigation, ensuring that proven policies and procedures are established and followed by all team members. Leighton R. Johnson III describes the processes within an incident response event and shows the crucial importance of skillful forensics team management, including when and where the transition to forensics investigation should occur during an incident response event. The book also provides discussions of key incident response components. Provides readers with a complete handbook on computer incident response from the perspective of forensics team management; identifies the key steps to completing a successful computer incident response investigation; defines the qualities necessary to become a succ...

  10. Analytical predictions of SGEMP response and comparisons with computer calculations

    International Nuclear Information System (INIS)

    de Plomb, E.P.

    1976-01-01

    An analytical formulation for the prediction of SGEMP surface current response is presented. Only two independent dimensionless parameters are required to predict the peak magnitude and rise time of SGEMP induced surface currents. The analysis applies to limited (high fluence) emission as well as unlimited (low fluence) emission. Cause-effect relationships for SGEMP response are treated quantitatively, and yield simple power law dependencies between several physical variables. Analytical predictions for a large matrix of SGEMP cases are compared with an array of about thirty-five computer solutions of similar SGEMP problems, which were collected from three independent research groups. The theoretical solutions generally agree with the computer solutions as well as the computer solutions agree with one another. Such comparisons typically show variations less than a "factor of two".

  11. Prerequisites for building a computer security incident response capability

    CSIR Research Space (South Africa)

    Mooi, M

    2015-08-01

    Full Text Available . 1]. 2) Handbook for Computer Security Incident Response Teams (CSIRTs) [18] (CMU-SEI): Providing guidance on building and running a CSIRT, this handbook has a particular focus on the incident handling service [18, p. xv]. In addition, a basic CSIRT...

  12. Computational methods for coupling microstructural and micromechanical materials response simulations

    Energy Technology Data Exchange (ETDEWEB)

    HOLM,ELIZABETH A.; BATTAILE,CORBETT C.; BUCHHEIT,THOMAS E.; FANG,HUEI ELIOT; RINTOUL,MARK DANIEL; VEDULA,VENKATA R.; GLASS,S. JILL; KNOROVSKY,GERALD A.; NEILSEN,MICHAEL K.; WELLMAN,GERALD W.; SULSKY,DEBORAH; SHEN,YU-LIN; SCHREYER,H. BUCK

    2000-04-01

    Computational materials simulations have traditionally focused on individual phenomena: grain growth, crack propagation, plastic flow, etc. However, real materials behavior results from a complex interplay between phenomena. In this project, the authors explored methods for coupling mesoscale simulations of microstructural evolution and micromechanical response. In one case, massively parallel (MP) simulations for grain evolution and microcracking in alumina stronglink materials were dynamically coupled. In the other, codes for domain coarsening and plastic deformation in CuSi braze alloys were iteratively linked. This program provided the first comparison of two promising ways to integrate mesoscale computer codes. Coupled microstructural/micromechanical codes were applied to experimentally observed microstructures for the first time. In addition to the coupled codes, this project developed a suite of new computational capabilities (PARGRAIN, GLAD, OOF, MPM, polycrystal plasticity, front tracking). The problem of plasticity length scale in continuum calculations was recognized and a solution strategy was developed. The simulations were experimentally validated on stockpile materials.

  13. Computations of nuclear response functions with MACK-IV

    International Nuclear Information System (INIS)

    Abdou, M.A.; Gohar, Y.

    1978-01-01

    The MACK computer program calculates energy pointwise and multigroup nuclear response functions from basic nuclear data in ENDF/B format. The new version of the program, MACK-IV, incorporates major developments and improvements aimed at maximizing the utilization of available nuclear data and ensuring energy conservation in nuclear heating calculations. A new library, MACKLIB-IV, of nuclear response functions was generated in the CTR energy group structure of 171 neutron groups and 36 gamma groups. The library was prepared using MACK-IV and ENDF/B-IV and is suitable for fusion, fusion-fission hybrids, and fission applications
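    As an aside, the multigroup response functions described above are, in essence, flux-weighted averages of pointwise data over energy-group boundaries. The following Python sketch illustrates that collapse operation under stated assumptions (a toy 1/E weighting spectrum, a made-up response function, and trapezoidal integration); it is not MACK-IV code, and all function and variable names are invented.

```python
import numpy as np

# Hypothetical illustration (not MACK-IV itself): collapse a pointwise
# response function sigma(E) into multigroup form by flux-weighted
# averaging over each group, R_g = ∫_g sigma(E) phi(E) dE / ∫_g phi(E) dE.

def collapse_to_groups(energy, sigma, phi, group_bounds):
    """energy, sigma, phi: pointwise grids; group_bounds: ascending group edges."""
    r_group = []
    for lo, hi in zip(group_bounds[:-1], group_bounds[1:]):
        mask = (energy >= lo) & (energy <= hi)
        e, s, p = energy[mask], sigma[mask], phi[mask]
        num = np.trapz(s * p, e)          # reaction-rate integral over the group
        den = np.trapz(p, e)              # flux integral over the group
        r_group.append(num / den if den > 0.0 else 0.0)
    return np.array(r_group)

# Example with a made-up 1/E weighting flux and a smooth toy response function.
energy = np.logspace(0, 7, 2000)                  # 1 eV .. 10 MeV
sigma = 1.0 / np.sqrt(energy) + 1e-3              # toy response function
phi = 1.0 / energy                                # toy 1/E spectrum
bounds = np.logspace(0, 7, 8)                     # 7 coarse groups
print(collapse_to_groups(energy, sigma, phi, bounds))
```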

  14. Computations of nuclear response functions with MACK-IV

    Energy Technology Data Exchange (ETDEWEB)

    Abdou, M A; Gohar, Y

    1978-01-01

    The MACK computer program calculates energy pointwise and multigroup nuclear response functions from basic nuclear data in ENDF/B format. The new version of the program, MACK-IV, incorporates major developments and improvements aimed at maximizing the utilization of available nuclear data and ensuring energy conservation in nuclear heating calculations. A new library, MACKLIB-IV, of nuclear response functions was generated in the CTR energy group structure of 171 neutron groups and 36 gamma groups. The library was prepared using MACK-IV and ENDF/B-IV and is suitable for fusion, fusion-fission hybrids, and fission applications.

  15. Therapy response evaluation with positron emission tomography-computed tomography.

    Science.gov (United States)

    Segall, George M

    2010-12-01

    Positron emission tomography-computed tomography with F-18-fluorodeoxyglucose is widely used for evaluation of therapy response in patients with solid tumors but has not been as readily adopted in clinical trials because of the variability of acquisition and processing protocols and the absence of universal response criteria. Criteria proposed for clinical trials are difficult to apply in clinical practice, and gestalt impression is probably accurate in individual patients, especially with respect to the presence of progressive disease and complete response. Semiquantitative methods of determining tissue glucose metabolism, such as standard uptake value, can be a useful descriptor for levels of tissue glucose metabolism and changes in response to therapy if technical quality control measures are carefully maintained. The terms partial response, complete response, and progressive disease are best used in clinical trials in which the terms have specific meanings and precise definitions. In clinical practice, it may be better to use descriptive terminology agreed upon by imaging physicians and clinicians in their own practice. Copyright © 2010. Published by Elsevier Inc.
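    For readers unfamiliar with the semiquantitative metric mentioned above, the body-weight-normalized standard uptake value is conventionally computed as the decay-corrected activity concentration divided by the injected dose per unit body weight, and therapy response is often summarized as the relative change between baseline and follow-up. The sketch below illustrates that arithmetic with invented numbers only; it does not implement any particular response criteria.

```python
def suv_bw(activity_conc_bq_ml, injected_dose_bq, body_weight_g):
    """Body-weight-normalized standard uptake value (dimensionless).

    Assumes the activity concentration is already decay-corrected to
    injection time, and that 1 g of tissue corresponds to 1 ml.
    """
    return activity_conc_bq_ml / (injected_dose_bq / body_weight_g)

def percent_change(suv_baseline, suv_followup):
    """Relative change in SUV between baseline and follow-up scans."""
    return 100.0 * (suv_followup - suv_baseline) / suv_baseline

# Illustrative numbers only (not from the article).
baseline = suv_bw(12_000.0, 350e6, 75_000.0)   # ~2.6
followup = suv_bw(5_500.0, 350e6, 75_000.0)    # ~1.2
print(round(baseline, 2), round(followup, 2),
      round(percent_change(baseline, followup), 1))
```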

  16. Computer Security Incident Response Team Effectiveness: A Needs Assessment.

    Science.gov (United States)

    Van der Kleij, Rick; Kleinhuis, Geert; Young, Heather

    2017-01-01

    Computer security incident response teams (CSIRTs) respond to a computer security incident when the need arises. Failure of these teams can have far-reaching effects for the economy and national security. CSIRTs often have to work on an ad hoc basis, in close cooperation with other teams, and in time constrained environments. It could be argued that under these working conditions CSIRTs would be likely to encounter problems. A needs assessment was done to see to which extent this argument holds true. We constructed an incident response needs model to assist in identifying areas that require improvement. We envisioned a model consisting of four assessment categories: Organization, Team, Individual and Instrumental. Central to this is the idea that both problems and needs can have an organizational, team, individual, or technical origin or a combination of these levels. To gather data we conducted a literature review. This resulted in a comprehensive list of challenges and needs that could hinder or improve, respectively, the performance of CSIRTs. Then, semi-structured in depth interviews were held with team coordinators and team members of five public and private sector Dutch CSIRTs to ground these findings in practice and to identify gaps between current and desired incident handling practices. This paper presents the findings of our needs assessment and ends with a discussion of potential solutions to problems with performance in incident response.

  17. Computer Security Incident Response Team Effectiveness: A Needs Assessment

    Directory of Open Access Journals (Sweden)

    Rick Van der Kleij

    2017-12-01

    Full Text Available Computer security incident response teams (CSIRTs) respond to a computer security incident when the need arises. Failure of these teams can have far-reaching effects for the economy and national security. CSIRTs often have to work on an ad hoc basis, in close cooperation with other teams, and in time constrained environments. It could be argued that under these working conditions CSIRTs would be likely to encounter problems. A needs assessment was done to see to which extent this argument holds true. We constructed an incident response needs model to assist in identifying areas that require improvement. We envisioned a model consisting of four assessment categories: Organization, Team, Individual and Instrumental. Central to this is the idea that both problems and needs can have an organizational, team, individual, or technical origin or a combination of these levels. To gather data we conducted a literature review. This resulted in a comprehensive list of challenges and needs that could hinder or improve, respectively, the performance of CSIRTs. Then, semi-structured in depth interviews were held with team coordinators and team members of five public and private sector Dutch CSIRTs to ground these findings in practice and to identify gaps between current and desired incident handling practices. This paper presents the findings of our needs assessment and ends with a discussion of potential solutions to problems with performance in incident response.

  18. Splitting method for computing coupled hydrodynamic and structural response

    International Nuclear Information System (INIS)

    Ash, J.E.

    1977-01-01

    A numerical method is developed for application to unsteady fluid dynamics problems, in particular to the mechanics following a sudden release of high energy. Solution of the initial compressible flow phase provides input to a power-series method for the incompressible fluid motions. The system is split into spatial and time domains leading to the convergent computation of a sequence of elliptic equations. Two sample problems are solved, the first involving an underwater explosion and the second the response of a nuclear reactor containment shell structure to a hypothetical core accident. The solutions are correlated with experimental data

  19. A discrete ordinate response matrix method for massively parallel computers

    International Nuclear Information System (INIS)

    Hanebutte, U.R.; Lewis, E.E.

    1991-01-01

    A discrete ordinate response matrix method is formulated for the solution of neutron transport problems on massively parallel computers. The response matrix formulation eliminates iteration on the scattering source. The nodal matrices which result from the diamond-differenced equations are utilized in a factored form which minimizes memory requirements and significantly reduces the required number of operations. The algorithm utilizes massive parallelism by assigning each spatial node to a processor. The algorithm is accelerated effectively by a synthetic method in which the low-order diffusion equations are also solved by massively parallel red/black iterations. The method has been implemented on a 16k Connection Machine-2, and S8 and S16 solutions have been obtained for fixed-source benchmark problems in X-Y geometry.
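    The red/black iteration mentioned for the synthetic (diffusion) acceleration refers to updating alternately coloured grid nodes so that all nodes of one colour can be processed in parallel. The following sketch shows that update pattern on a toy 2-D, 5-point diffusion problem; it is illustrative only and is not the authors' implementation.

```python
import numpy as np

# Illustrative red/black Gauss-Seidel sweeps for a 2-D diffusion-like
# problem (5-point Laplacian with a source). Not the authors' code; it
# only shows the update pattern that parallelizes over same-coloured nodes.

def red_black_sweeps(phi, source, h, n_sweeps=200):
    for _ in range(n_sweeps):
        for color in (0, 1):                       # 0 = "red", 1 = "black"
            for i in range(1, phi.shape[0] - 1):
                for j in range(1, phi.shape[1] - 1):
                    if (i + j) % 2 == color:
                        phi[i, j] = 0.25 * (phi[i - 1, j] + phi[i + 1, j] +
                                            phi[i, j - 1] + phi[i, j + 1] +
                                            h * h * source[i, j])
    return phi

n = 33
phi = np.zeros((n, n))            # boundary values stay at zero
q = np.ones((n, n))               # uniform source
print(red_black_sweeps(phi, q, h=1.0 / (n - 1)).max())
```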

  20. Seismic response computations for a long span bridge

    International Nuclear Information System (INIS)

    McCallen, D.B.

    1994-01-01

    The authors are performing large-scale numerical computations to simulate the earthquake response of a major long-span bridge that crosses the San Francisco Bay. The overall objective of the study is to estimate the response of the bridge to potential large-magnitude earthquakes generated on the nearby San Andreas and Hayward earthquake faults. Generation of a realistic model of the bridge system is complicated by the existence of large pile group foundations that extend deep into soft, saturated clay soils, and by the numerous expansion joints that segment the overall bridge structure. In the current study, advanced, nonlinear, finite element technology is being applied to rigorously model the detailed behavior of the bridge system and to shed light on the influence of the foundations and joints of the bridge

  1. Computational optimization of biodiesel combustion using response surface methodology

    Directory of Open Access Journals (Sweden)

    Ganji Prabhakara Rao

    2017-01-01

    Full Text Available The present work focuses on optimization of biodiesel combustion phenomena through a parametric approach using response surface methodology. Physical properties of biodiesel play a vital role in accurate simulations of the fuel spray, atomization, combustion, and emission formation processes. Typically, methyl-based biodiesel consists of five main types of esters in its composition: methyl palmitate, methyl oleate, methyl stearate, methyl linoleate, and methyl linolenate. Based on the amounts of these methyl esters present, the properties of pongamia biodiesel and its blends were estimated. CONVERGE™ computational fluid dynamics software was used to simulate the fuel spray, turbulence and combustion phenomena. The simulation responses such as indicated specific fuel consumption, NOx, and soot were analyzed using design of experiments. Regression equations were developed for each of these responses. The optimum parameters were found to be compression ratio – 16.75, start of injection – 21.9° before top dead center, and exhaust gas re-circulation – 10.94%. Results have been compared with the baseline case.
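    As a hedged illustration of the response-surface workflow described above (fit regression surfaces to designed-experiment results, then locate the optimum), the sketch below fits a second-order polynomial surface over the three factors named in the abstract and minimizes it within the design bounds. The design points, response values, and all numerical settings are invented; the actual study used CONVERGE CFD results and its own regression equations.

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from scipy.optimize import minimize

# Hypothetical sketch of response surface methodology (RSM): fit a
# second-order polynomial surface to simulation results and minimize it.
# The design points and responses below are made up for illustration.

rng = np.random.default_rng(0)
# Factors: compression ratio, start of injection (deg bTDC), EGR (%)
X = rng.uniform([15.0, 18.0, 5.0], [18.0, 26.0, 15.0], size=(30, 3))
# Toy response standing in for e.g. indicated specific fuel consumption.
y = (0.4 * (X[:, 0] - 16.8) ** 2 + 0.02 * (X[:, 1] - 22.0) ** 2
     + 0.05 * (X[:, 2] - 11.0) ** 2 + 200.0 + rng.normal(0, 0.1, 30))

surface = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
surface.fit(X, y)

# Locate the optimum of the fitted surface within the design bounds.
res = minimize(lambda x: surface.predict(x.reshape(1, -1))[0],
               x0=[16.5, 22.0, 10.0],
               bounds=[(15.0, 18.0), (18.0, 26.0), (5.0, 15.0)])
print("optimum factors:", np.round(res.x, 2),
      "predicted response:", round(float(res.fun), 2))
```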

  2. Experimental and computational investigation of lateral gauge response in polycarbonate

    Science.gov (United States)

    Eliot, Jim; Harris, Ernst; Hazell, Paul; Appleby-Thomas, Gareth; Winter, Ronald; Wood, David; Owen, Gareth

    2011-06-01

    Polycarbonate's use in personal armour systems means its high strain-rate response has been extensively studied. Interestingly, embedded lateral manganin stress gauges in polycarbonate have shown gradients behind incident shocks, suggestive of increasing shear strength. However, such gauges need to be embedded in a central (typically) epoxy interlayer - an inherently invasive approach. Recently, research has suggested that in such metal systems interlayer/target impedance may contribute to observed gradients in lateral stress. Here, experimental T-gauge (Vishay Micro-Measurements® type J2M-SS-580SF-025) traces from polycarbonate targets are compared to computational simulations. This work extends previous efforts such that similar impedance exists between the interlayer and matrix (target) interface. Further, experiments and simulations are presented investigating the effects of a ``dry joint'' in polycarbonate, in which no encapsulating medium is employed.

  3. On computing the geoelastic response to a disk load

    Science.gov (United States)

    Bevis, M.; Melini, D.; Spada, G.

    2016-06-01

    We review the theory of the Earth's elastic and gravitational response to a surface disk load. The solutions for displacement of the surface and the geoid are developed using expansions of Legendre polynomials, their derivatives and the load Love numbers. We provide a MATLAB function called diskload that computes the solutions for both uncompensated and compensated disk loads. In order to numerically implement the Legendre expansions, it is necessary to choose a harmonic degree, nmax, at which to truncate the series used to construct the solutions. We present a rule of thumb (ROT) for choosing an appropriate value of nmax, describe the consequences of truncating the expansions prematurely and provide a means to judiciously violate the ROT when that becomes a practical necessity.
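    To make the truncation issue concrete, the sketch below expands a uniform spherical-cap (disk) load of angular radius alpha in Legendre polynomials using the standard cap coefficients and reconstructs it for different values of nmax. It sketches only the expansion step: the actual diskload function also folds in the load Love numbers to obtain displacement and geoid responses, which is omitted here, and all names are illustrative.

```python
import numpy as np
from scipy.special import eval_legendre

# Sketch only: Legendre expansion of a uniform disk (spherical cap) load
# of angular radius alpha. The degree-n coefficient of the cap indicator
# function is f_0 = (1 - cos a)/2 and, for n >= 1,
#   f_n = [P_{n-1}(cos a) - P_{n+1}(cos a)] / 2.
# Computing displacements or the geoid (as diskload does) would further
# weight each term with the load Love numbers, omitted here.

def cap_coefficients(alpha_deg, nmax):
    x = np.cos(np.radians(alpha_deg))
    n = np.arange(1, nmax + 1)
    coeffs = np.empty(nmax + 1)
    coeffs[0] = 0.5 * (1.0 - x)
    coeffs[1:] = 0.5 * (eval_legendre(n - 1, x) - eval_legendre(n + 1, x))
    return coeffs

def truncated_cap(theta_deg, alpha_deg, nmax):
    """Reconstruct the cap indicator from its first nmax+1 Legendre terms."""
    x = np.cos(np.radians(theta_deg))
    coeffs = cap_coefficients(alpha_deg, nmax)
    return sum(c * eval_legendre(n, x) for n, c in enumerate(coeffs))

# Too small an nmax cannot resolve a small (1 degree) disk: the
# reconstructed value at the disk centre falls well short of 1.
for nmax in (50, 500):
    print(nmax, round(truncated_cap(0.0, alpha_deg=1.0, nmax=nmax), 3))
```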

  4. Towards SSVEP-based, portable, responsive Brain-Computer Interface.

    Science.gov (United States)

    Kaczmarek, Piotr; Salomon, Pawel

    2015-08-01

    A Brain-Computer Interface in motion control applications requires high system responsiveness and accuracy. An SSVEP interface consisting of 2-8 stimuli and a 2-channel EEG amplifier is presented in this paper. The observed stimulus is recognized based on a canonical correlation calculated over a 1-second window, ensuring high interface responsiveness. A threshold classifier with hysteresis (T-H) was proposed for recognition purposes. The obtained results suggest that the T-H classifier significantly increases classifier performance (resulting in an accuracy of 76%, while maintaining an average false positive detection rate for stimuli other than the observed one of between 2-13%, depending on stimulus frequency). It was shown that the parameters of the T-H classifier that maximize the true positive rate can be estimated by gradient-based search, since a single maximum was observed. Moreover, the preliminary results, obtained on a test group (N=4), suggest that for the T-H classifier there exists a certain set of parameters for which the system accuracy is similar to the accuracy obtained for a user-trained classifier.
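    A minimal sketch of the recognition chain described above (canonical correlation over a short window, followed by a threshold classifier with hysteresis) is given below. It assumes a standard CCA-based SSVEP detector with sine/cosine reference signals; the sampling rate, thresholds, and all other parameters are invented, and the code is not the authors' implementation.

```python
import numpy as np
from sklearn.cross_decomposition import CCA

# Illustrative SSVEP pipeline (not the authors' implementation): canonical
# correlation between a 1-second, 2-channel EEG window and sine/cosine
# references at each stimulus frequency, followed by a threshold
# classifier with hysteresis. Threshold values below are made up.

FS = 250  # sampling rate, Hz (assumed)

def reference(freq, n_samples, n_harmonics=2):
    t = np.arange(n_samples) / FS
    return np.column_stack([f(2 * np.pi * h * freq * t)
                            for h in range(1, n_harmonics + 1)
                            for f in (np.sin, np.cos)])

def cca_score(eeg_window, freq):
    """Largest canonical correlation between the EEG window and references."""
    ref = reference(freq, eeg_window.shape[0])
    u, v = CCA(n_components=1).fit_transform(eeg_window, ref)
    return abs(np.corrcoef(u[:, 0], v[:, 0])[0, 1])

class HysteresisClassifier:
    """Activate a stimulus when its score exceeds t_on; release below t_off."""
    def __init__(self, t_on=0.45, t_off=0.35):
        self.t_on, self.t_off, self.active = t_on, t_off, None

    def update(self, scores):                 # scores: {frequency: correlation}
        if self.active is not None and scores[self.active] >= self.t_off:
            return self.active                # hold the current decision
        best = max(scores, key=scores.get)
        self.active = best if scores[best] >= self.t_on else None
        return self.active

# Usage with a synthetic, noisy 12 Hz window on 2 channels.
t = np.arange(FS) / FS
eeg = np.column_stack([np.sin(2 * np.pi * 12 * t), np.cos(2 * np.pi * 12 * t)])
eeg += 0.5 * np.random.default_rng(1).standard_normal(eeg.shape)
clf = HysteresisClassifier()
print(clf.update({f: cca_score(eeg, f) for f in (8.0, 10.0, 12.0, 15.0)}))
```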

  5. Cloud Computing in Support of Synchronized Disaster Response Operations

    Science.gov (United States)

    2010-09-01

    scalable, Web application based on cloud computing technologies to facilitate communication between a broad range of public and private entities without...requiring them to compromise security or competitive advantage. The proposed design applies the unique benefits of cloud computing architectures such as

  6. Effective Response to Attacks On Department of Defense Computer Networks

    National Research Council Canada - National Science Library

    Shaha, Patrick

    2001-01-01

    .... For the Commanders-in-Chief (CINCs), computer networking has proven especially useful in maintaining contact and sharing data with elements forward deployed as well as with host nation governments and agencies...

  7. Multiple-Choice versus Constructed-Response Tests in the Assessment of Mathematics Computation Skills.

    Science.gov (United States)

    Gadalla, Tahany M.

    The equivalence of multiple-choice (MC) and constructed response (discrete) (CR-D) response formats as applied to mathematics computation at grade levels two to six was tested. The difference between total scores from the two response formats was tested for statistical significance, and the factor structure of items in both response formats was…

  8. Low-complexity computer simulation of multichannel room impulse responses

    NARCIS (Netherlands)

    Martínez Castañeda, J.A.

    2013-01-01

    The "telephone'' model has been, for the last one hundred thirty years, the base of modern telecommunications with virtually no changes in its fundamental concept. The arise of smaller and more powerful computing devices have opened new possibilities. For example, to build systems able to give to

  9. A response-modeling alternative to surrogate models for support in computational analyses

    International Nuclear Information System (INIS)

    Rutherford, Brian

    2006-01-01

    Often, the objectives in a computational analysis involve characterization of system performance based on some function of the computed response. In general, this characterization includes (at least) an estimate or prediction for some performance measure and an estimate of the associated uncertainty. Surrogate models can be used to approximate the response in regions where simulations were not performed. For most surrogate modeling approaches, however, (1) estimates are based on smoothing of available data and (2) uncertainty in the response is specified in a point-wise (in the input space) fashion. These aspects of the surrogate model construction might limit their capabilities. One alternative is to construct a probability measure, G(r), for the computer response, r, based on available data. This 'response-modeling' approach will permit probability estimation for an arbitrary event, E(r), based on the computer response. In this general setting, event probabilities can be computed as prob(E) = ∫_r I(E(r)) dG(r), where I is the indicator function. Furthermore, one can use G(r) to calculate an induced distribution on a performance measure, pm. For prediction problems where the performance measure is a scalar, its distribution F_pm is determined by F_pm(z) = ∫_r I(pm(r) ≤ z) dG(r). We introduce response models for scalar computer output and then generalize the approach to more complicated responses that utilize multiple response models.
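    The two integrals above reduce to simple Monte Carlo averages of indicator functions once samples from G(r) are available. The sketch below illustrates this with a toy response distribution and a toy performance measure; it is not the method of the paper, only the estimation step implied by the formulas.

```python
import numpy as np

# Sketch: once a probability measure G(r) for the computed response r is
# available (here represented simply by samples drawn from it), event
# probabilities and the induced CDF of a scalar performance measure pm(r)
# follow by Monte Carlo averaging of indicator functions.

rng = np.random.default_rng(42)
r_samples = rng.normal(loc=1.0, scale=0.3, size=(100_000, 3))  # toy G(r), r in R^3

def pm(r):
    """Toy scalar performance measure of the (vector-valued) response."""
    return np.linalg.norm(r, axis=-1)

# prob(E) = ∫ I(E(r)) dG(r): e.g. the event that every component exceeds 0.5.
event = np.all(r_samples > 0.5, axis=1)
print("prob(E) ≈", event.mean())

# F_pm(z) = ∫ I(pm(r) <= z) dG(r): induced distribution of the performance measure.
z = 2.0
print("F_pm(2.0) ≈", np.mean(pm(r_samples) <= z))
```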

  10. A note on probabilistic computation of earthquake response spectrum amplitudes

    International Nuclear Information System (INIS)

    Anderson, J.G.; Trifunac, M.D.

    1979-01-01

    This paper analyzes a method for computation of Pseudo Relative Velocity (PSV) spectrum and Absolute Acceleration (SA) spectrum so that the amplitudes and the shapes of these spectra reflect the geometrical characteristics of the seismic environment of the site. The estimated spectra also incorporate the geologic characteristics at the site, direction of ground motion and the probability of exceeding these motions. An example of applying this method in a realistic setting is presented and the uncertainties of the results are discussed. (Auth.)

  11. Computational Modeling of Micrometastatic Breast Cancer Radiation Dose Response

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Daniel L.; Debeb, Bisrat G. [Department of Radiation Oncology, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Morgan Welch Inflammatory Breast Cancer Research Program and Clinic, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Thames, Howard D. [Department of Biostatistics, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Woodward, Wendy A., E-mail: wwoodward@mdanderson.org [Department of Radiation Oncology, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Morgan Welch Inflammatory Breast Cancer Research Program and Clinic, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States)

    2016-09-01

    Purpose: Prophylactic cranial irradiation (PCI) involves giving radiation to the entire brain with the goals of reducing the incidence of brain metastasis and improving overall survival. Experimentally, we have demonstrated that PCI prevents brain metastases in a breast cancer mouse model. We developed a computational model to expand on and aid in the interpretation of our experimental results. Methods and Materials: MATLAB was used to develop a computational model of brain metastasis and PCI in mice. Model input parameters were optimized such that the model output would match the experimental number of metastases per mouse from the unirradiated group. An independent in vivo–limiting dilution experiment was performed to validate the model. The effect of whole brain irradiation at different measurement points after tumor cells were injected was evaluated in terms of the incidence, number of metastases, and tumor burden and was then compared with the corresponding experimental data. Results: In the optimized model, the correlation between the number of metastases per mouse and the experimental fits was >95. Our attempt to validate the model with a limiting dilution assay produced 99.9% correlation with respect to the incidence of metastases. The model accurately predicted the effect of whole-brain irradiation given 3 weeks after cell injection but substantially underestimated its effect when delivered 5 days after cell injection. The model further demonstrated that delaying whole-brain irradiation until the development of gross disease introduces a dose threshold that must be reached before a reduction in incidence can be realized. Conclusions: Our computational model of mouse brain metastasis and PCI correlated strongly with our experiments with unirradiated mice. The results further suggest that early treatment of subclinical disease is more effective than irradiating established disease.

  12. Computational method for discovery of estrogen responsive genes

    DEFF Research Database (Denmark)

    Tang, Suisheng; Tan, Sin Lam; Ramadoss, Suresh Kumar

    2004-01-01

    Estrogen has a profound impact on human physiology and affects numerous genes. The classical estrogen reaction is mediated by its receptors (ERs), which bind to the estrogen response elements (EREs) in target gene's promoter region. Due to tedious and expensive experiments, a limited number of hu...

  13. Advanced Computational Modeling Approaches for Shock Response Prediction

    Science.gov (United States)

    Derkevorkian, Armen; Kolaini, Ali R.; Peterson, Lee

    2015-01-01

    Motivation: (1) The activation of pyroshock devices such as explosives, separation nuts, pin-pullers, etc. produces high frequency transient structural response, typically from few tens of Hz to several hundreds of kHz. (2) Lack of reliable analytical tools makes the prediction of appropriate design and qualification test levels a challenge. (3) In the past few decades, several attempts have been made to develop methodologies that predict the structural responses to shock environments. (4) Currently, there is no validated approach that is viable to predict shock environments overt the full frequency range (i.e., 100 Hz to 10 kHz). Scope: (1) Model, analyze, and interpret space structural systems with complex interfaces and discontinuities, subjected to shock loads. (2) Assess the viability of a suite of numerical tools to simulate transient, non-linear solid mechanics and structural dynamics problems, such as shock wave propagation.

  14. Detailed comparison between computed and measured FBR core seismic responses

    International Nuclear Information System (INIS)

    Forni, M.; Martelli, A.; Melloni, R.; Bonacina, G.

    1988-01-01

    This paper presents a detailed comparison between seismic calculations and measurements performed for various mock-ups consisting of groups of seven and nineteen simplified elements of the Italian PEC fast reactor core. Experimental tests had been performed on shaking tables in air and water (simulating sodium) with excitations increasing up to above Safe Shutdown Earthquake. The PEC core-restraint ring had been simulated in some tests. All the experimental tests have been analysed by use of both the one-dimensional computer program CORALIE and the two-dimensional program CLASH. Comparisons have been made for all the instrumented elements, in both the time and the frequency domains. The good agreement between calculations and measurements has confirmed adequacy of the fluid-structure interaction model used for PEC core seismic design verification

  15. Peer-Allocated Instant Response (PAIR): Computional allocation of peer tutors in learning communities

    NARCIS (Netherlands)

    Westera, Wim

    2009-01-01

    Westera, W. (2007). Peer-Allocated Instant Response (PAIR): Computational allocation of peer tutors in learning communities. Journal of Artificial Societies and Social Simulation, http://jasss.soc.surrey.ac.uk/10/2/5.html

  16. Computed tomography assessment of early response to neoadjuvant therapy in colon cancer

    DEFF Research Database (Denmark)

    Dam, Claus; Lund-Rasmussen, Vera; Pløen, John

    2015-01-01

    INTRODUCTION: Using multidetector computed tomography, we aimed to assess the early response of neoadjuvant drug therapy for locally advanced colon cancer. METHODS: Computed tomography with IV contrast was acquired from 67 patients before and after up to three cycles of preoperative treatment. All...

  17. 78 FR 38949 - Computer Security Incident Coordination (CSIC): Providing Timely Cyber Incident Response

    Science.gov (United States)

    2013-06-28

    ... exposed to various forms of cyber attack. In some cases, attacks can be thwarted through the use of...-3383-01] Computer Security Incident Coordination (CSIC): Providing Timely Cyber Incident Response... systems will be successfully attacked. When a successful attack occurs, the job of a Computer Security...

  18. COMPUTATIONAL MODELING OF SIGNALING PATHWAYS MEDIATING CELL CYCLE AND APOPTOTIC RESPONSES TO IONIZING RADIATION MEDIATED DNA DAMAGE

    Science.gov (United States)

    Demonstrated the use of a computational systems biology approach to model dose-response relationships. Also discussed how the biologically motivated dose-response models have only limited reference to the underlying molecular level. Discussed the integration of Computational S...

  19. A Computational Model of Cellular Response to Modulated Radiation Fields

    Energy Technology Data Exchange (ETDEWEB)

    McMahon, Stephen J., E-mail: stephen.mcmahon@qub.ac.uk [Centre for Cancer Research and Cell Biology, Queen's University Belfast, Belfast, Northern Ireland (United Kingdom); Butterworth, Karl T. [Centre for Cancer Research and Cell Biology, Queen's University Belfast, Belfast, Northern Ireland (United Kingdom); McGarry, Conor K. [Centre for Cancer Research and Cell Biology, Queen's University Belfast, Belfast, Northern Ireland (United Kingdom); Radiotherapy Physics, Northern Ireland Cancer Centre, Belfast Health and Social Care Trust, Northern Ireland (United Kingdom); Trainor, Colman [Centre for Cancer Research and Cell Biology, Queen's University Belfast, Belfast, Northern Ireland (United Kingdom); O'Sullivan, Joe M. [Centre for Cancer Research and Cell Biology, Queen's University Belfast, Belfast, Northern Ireland (United Kingdom); Clinical Oncology, Northern Ireland Cancer Centre, Belfast Health and Social Care Trust, Belfast, Northern Ireland (United Kingdom); Hounsell, Alan R. [Centre for Cancer Research and Cell Biology, Queen's University Belfast, Belfast, Northern Ireland (United Kingdom); Radiotherapy Physics, Northern Ireland Cancer Centre, Belfast Health and Social Care Trust, Northern Ireland (United Kingdom); Prise, Kevin M. [Centre for Cancer Research and Cell Biology, Queen's University Belfast, Belfast, Northern Ireland (United Kingdom)

    2012-09-01

    Purpose: To develop a model to describe the response of cell populations to spatially modulated radiation exposures of relevance to advanced radiotherapies. Materials and Methods: A Monte Carlo model of cellular radiation response was developed. This model incorporated damage from both direct radiation and intercellular communication including bystander signaling. The predictions of this model were compared to previously measured survival curves for a normal human fibroblast line (AGO1522) and prostate tumor cells (DU145) exposed to spatially modulated fields. Results: The model was found to be able to accurately reproduce cell survival both in populations which were directly exposed to radiation and those which were outside the primary treatment field. The model predicts that the bystander effect makes a significant contribution to cell killing even in uniformly irradiated cells. The bystander effect contribution varies strongly with dose, falling from a high of 80% at low doses to 25% and 50% at 4 Gy for AGO1522 and DU145 cells, respectively. This was verified using the inducible nitric oxide synthase inhibitor aminoguanidine to inhibit the bystander effect in cells exposed to different doses, which showed significantly larger reductions in cell killing at lower doses. Conclusions: The model presented in this work accurately reproduces cell survival following modulated radiation exposures, both in and out of the primary treatment field, by incorporating a bystander component. In addition, the model suggests that the bystander effect is responsible for a significant portion of cell killing in uniformly irradiated cells, 50% and 70% at doses of 2 Gy in AGO1522 and DU145 cells, respectively. This description is a significant departure from accepted radiobiological models and may have a significant impact on optimization of treatment planning approaches if proven to be applicable in vivo.
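    As a heavily simplified illustration of how a direct-kill term and a bystander term might be combined in a Monte Carlo cell-survival calculation for a spatially modulated field, consider the sketch below. The functional forms and every parameter value are invented for illustration and are not those of the model described in the abstract.

```python
import numpy as np

# Heavily simplified, hypothetical sketch: combine a direct (linear-
# quadratic) kill term with a bystander term driven by signals from
# neighbouring irradiated cells. Parameter values are invented and the
# functional forms are illustrative only, not those of the cited model.

rng = np.random.default_rng(7)
ALPHA, BETA = 0.15, 0.05        # toy LQ parameters (1/Gy, 1/Gy^2)
K_BYSTANDER = 0.3               # toy maximum bystander kill probability
SIGNAL_RANGE = 5                # toy signalling range in cell indices

def survival(dose_map):
    """Monte Carlo survival of a 1-D row of cells under a modulated dose."""
    n = dose_map.size
    surviving = np.ones(n, dtype=bool)
    for i in range(n):
        # Direct component: LQ survival probability for the local dose.
        p_direct = np.exp(-ALPHA * dose_map[i] - BETA * dose_map[i] ** 2)
        # Bystander component: saturating response to signals from
        # irradiated neighbours within range.
        lo, hi = max(0, i - SIGNAL_RANGE), min(n, i + SIGNAL_RANGE + 1)
        signal = np.mean(dose_map[lo:hi] > 0.0)
        p_bystander_kill = K_BYSTANDER * signal
        surviving[i] = (rng.random() < p_direct) and (rng.random() >= p_bystander_kill)
    return surviving.mean()

# Half the field irradiated to 2 Gy, the other half shielded.
dose = np.concatenate([np.full(500, 2.0), np.zeros(500)])
print("surviving fraction ≈", round(survival(dose), 3))
```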

  20. A Computational Model of Cellular Response to Modulated Radiation Fields

    International Nuclear Information System (INIS)

    McMahon, Stephen J.; Butterworth, Karl T.; McGarry, Conor K.; Trainor, Colman; O’Sullivan, Joe M.; Hounsell, Alan R.; Prise, Kevin M.

    2012-01-01

    Purpose: To develop a model to describe the response of cell populations to spatially modulated radiation exposures of relevance to advanced radiotherapies. Materials and Methods: A Monte Carlo model of cellular radiation response was developed. This model incorporated damage from both direct radiation and intercellular communication including bystander signaling. The predictions of this model were compared to previously measured survival curves for a normal human fibroblast line (AGO1522) and prostate tumor cells (DU145) exposed to spatially modulated fields. Results: The model was found to be able to accurately reproduce cell survival both in populations which were directly exposed to radiation and those which were outside the primary treatment field. The model predicts that the bystander effect makes a significant contribution to cell killing even in uniformly irradiated cells. The bystander effect contribution varies strongly with dose, falling from a high of 80% at low doses to 25% and 50% at 4 Gy for AGO1522 and DU145 cells, respectively. This was verified using the inducible nitric oxide synthase inhibitor aminoguanidine to inhibit the bystander effect in cells exposed to different doses, which showed significantly larger reductions in cell killing at lower doses. Conclusions: The model presented in this work accurately reproduces cell survival following modulated radiation exposures, both in and out of the primary treatment field, by incorporating a bystander component. In addition, the model suggests that the bystander effect is responsible for a significant portion of cell killing in uniformly irradiated cells, 50% and 70% at doses of 2 Gy in AGO1522 and DU145 cells, respectively. This description is a significant departure from accepted radiobiological models and may have a significant impact on optimization of treatment planning approaches if proven to be applicable in vivo.

  1. In Law We Trust? Trusted Computing and Legal Responsibility for Internet Security

    Science.gov (United States)

    Danidou, Yianna; Schafer, Burkhard

    This paper analyses potential legal responses and consequences to the anticipated roll out of Trusted Computing (TC). It is argued that TC constitutes such a dramatic shift in power away from users to the software providers, that it is necessary for the legal system to respond. A possible response is to mirror the shift in power by a shift in legal responsibility, creating new legal liabilities and duties for software companies as the new guardians of internet security.

  2. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing readiness challenges (CCRC’08), the CMS computing team had achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files and with a high writing speed to tapes.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1’s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  3. COMPUTING

    CERN Multimedia

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  4. Visual and psychological stress during computer work in healthy, young females-physiological responses.

    Science.gov (United States)

    Mork, Randi; Falkenberg, Helle K; Fostervold, Knut Inge; Thorud, Hanne Mari S

    2018-05-30

    Among computer workers, visual complaints and neck pain are highly prevalent. This study explores how simulated occupational stressors during computer work, like glare and psychosocial stress, affect physiological responses in young females with normal vision. The study was a within-subject laboratory experiment with a counterbalanced, repeated design. Forty-three females performed four 10-min computer-work sessions with different stress exposures: (1) minimal stress; (2) visual stress (direct glare); (3) psychological stress; and (4) combined visual and psychological stress. Muscle activity and muscle blood flow in trapezius, muscle blood flow in orbicularis oculi, heart rate, blood pressure, blink rate and postural angles were continuously recorded. Immediately after each computer-work session, fixation disparity was measured and a questionnaire regarding perceived workstation lighting and stress was completed. Exposure to direct glare resulted in increased trapezius muscle blood flow, increased blink rate, and forward bending of the head. Psychological stress induced a transient increase in trapezius muscle activity and a more forward-bent posture. Bending forward towards the computer screen was correlated with higher productivity (reading speed), indicating a concentration or stress response. Forward bent posture was also associated with changes in fixation disparity. Furthermore, during computer work per se, trapezius muscle activity and blood flow, orbicularis oculi muscle blood flow, and heart rate were increased compared to rest. Exposure to glare and psychological stress during computer work was shown to influence the trapezius muscle, posture, and blink rate in young, healthy females with normal binocular vision, but in different ways. Accordingly, both visual and psychological factors must be taken into account when optimizing computer workstations to reduce physiological responses that may cause excessive eyestrain and musculoskeletal load.

  5. Coupling of EIT with computational lung modeling for predicting patient-specific ventilatory responses.

    Science.gov (United States)

    Roth, Christian J; Becher, Tobias; Frerichs, Inéz; Weiler, Norbert; Wall, Wolfgang A

    2017-04-01

    Providing optimal personalized mechanical ventilation for patients with acute or chronic respiratory failure is still a challenge within a clinical setting for each case anew. In this article, we integrate electrical impedance tomography (EIT) monitoring into a powerful patient-specific computational lung model to create an approach for personalizing protective ventilatory treatment. The underlying computational lung model is based on a single computed tomography scan and able to predict global airflow quantities, as well as local tissue aeration and strains for any ventilation maneuver. For validation, a novel "virtual EIT" module is added to our computational lung model, allowing to simulate EIT images based on the patient's thorax geometry and the results of our numerically predicted tissue aeration. Clinically measured EIT images are not used to calibrate the computational model. Thus they provide an independent method to validate the computational predictions at high temporal resolution. The performance of this coupling approach has been tested in an example patient with acute respiratory distress syndrome. The method shows good agreement between computationally predicted and clinically measured airflow data and EIT images. These results imply that the proposed framework can be used for numerical prediction of patient-specific responses to certain therapeutic measures before applying them to an actual patient. In the long run, definition of patient-specific optimal ventilation protocols might be assisted by computational modeling. NEW & NOTEWORTHY In this work, we present a patient-specific computational lung model that is able to predict global and local ventilatory quantities for a given patient and any selected ventilation protocol. For the first time, such a predictive lung model is equipped with a virtual electrical impedance tomography module allowing real-time validation of the computed results with the patient measurements. First promising results

  6. The position of a standard optical computer mouse affects cardiorespiratory responses during the operation of a computer under time constraints.

    Science.gov (United States)

    Sako, Shunji; Sugiura, Hiromichi; Tanoue, Hironori; Kojima, Makoto; Kono, Mitsunobu; Inaba, Ryoichi

    2014-08-01

    This study investigated the association between task-induced stress and fatigue by examining the cardiovascular responses of subjects using different mouse positions while operating a computer under time constraints. The study was participated by 16 young, healthy men and examined the use of optical mouse devices affixed to laptop computers. Two mouse positions were investigated: (1) the distal position (DP), in which the subjects place their forearms on the desk accompanied by the abduction and flexion of their shoulder joints, and (2) the proximal position (PP), in which the subjects place only their wrists on the desk without using an armrest. The subjects continued each task for 16 min. We assessed differences in several characteristics according to mouse position, including expired gas values, autonomic nerve activities (based on cardiorespiratory responses), operating efficiencies (based on word counts), and fatigue levels (based on the visual analog scale - VAS). Oxygen consumption (VO(2)), the ratio of inspiration time to respiration time (T(i)/T(total)), respiratory rate (RR), minute ventilation (VE), and the ratio of expiration to inspiration (Te/T(i)) were significantly lower when the participants were performing the task in the DP than those obtained in the PP. Tidal volume (VT), carbon dioxide output rates (VCO(2)/VE), and oxygen extraction fractions (VO(2)/VE) were significantly higher for the DP than they were for the PP. No significant difference in VAS was observed between the positions; however, as the task progressed, autonomic nerve activities were lower and operating efficiencies were significantly higher for the DP than they were for the PP. Our results suggest that the DP has fewer effects on cardiorespiratory functions, causes lower levels of sympathetic nerve activity and mental stress, and produces a higher total workload than the PP. This suggests that the DP is preferable to the PP when operating a computer.

  7. The position of a standard optical computer mouse affects cardiorespiratory responses during the operation of a computer under time constraints

    Directory of Open Access Journals (Sweden)

    Shunji Sako

    2014-08-01

    Full Text Available Objectives: This study investigated the association between task-induced stress and fatigue by examining the cardiovascular responses of subjects using different mouse positions while operating a computer under time constraints. Material and Methods: The study was participated by 16 young, healthy men and examined the use of optical mouse devices affixed to laptop computers. Two mouse positions were investigated: (1) the distal position (DP), in which the subjects place their forearms on the desk accompanied by the abduction and flexion of their shoulder joints, and (2) the proximal position (PP), in which the subjects place only their wrists on the desk without using an armrest. The subjects continued each task for 16 min. We assessed differences in several characteristics according to mouse position, including expired gas values, autonomic nerve activities (based on cardiorespiratory responses), operating efficiencies (based on word counts), and fatigue levels (based on the visual analog scale – VAS). Results: Oxygen consumption (VO2), the ratio of inspiration time to respiration time (Ti/Ttotal), respiratory rate (RR), minute ventilation (VE), and the ratio of expiration to inspiration (Te/Ti) were significantly lower when the participants were performing the task in the DP than those obtained in the PP. Tidal volume (VT), carbon dioxide output rates (VCO2/VE), and oxygen extraction fractions (VO2/VE) were significantly higher for the DP than they were for the PP. No significant difference in VAS was observed between the positions; however, as the task progressed, autonomic nerve activities were lower and operating efficiencies were significantly higher for the DP than they were for the PP. Conclusions: Our results suggest that the DP has fewer effects on cardiorespiratory functions, causes lower levels of sympathetic nerve activity and mental stress, and produces a higher total workload than the PP. This suggests that the DP is preferable to the PP when

  8. Evaluating and tuning system response in the MFTF-B control and diagnostics computers

    International Nuclear Information System (INIS)

    Palasek, R.L.; Butner, D.N.; Minor, E.G.

    1983-01-01

    The software system running on the Supervisory Control and Diagnostics System (SCDS) of MFTF-B is, for the major part, an event driven one. Regular, periodic polling of sensors' outputs takes place only at the local level, in the sensors' corresponding local control microcomputers (LCC's). An LCC reports a sensor's value to the supervisory computer only if there was a significant change. This report is passed as a message, routed among and acted upon by a network of applications and systems tasks within the supervisory computer (SCDS). Commands from the operator's console are similarly routed through a network of tasks, but in the opposite direction to the experiment's hardware. In a network such as this, response time is partially determined by system traffic. Because the hardware of MFTF-B will not be connected to the computer system for another two years, we are using the local control computers to simulate the event driven traffic that we expect to see during MFTF-B operation. In this paper we show how we are using the simulator to measure and evaluate response, loading, throughput, and utilization of components within the computer system. Measurement of the system under simulation allows us to identify bottlenecks and verify their unloosening. We also use the traffic simulators to evaluate prototypes of different algorithms for selected tasks, comparing their responses under the spectrum of traffic intensities
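    The report-on-significant-change rule described above is essentially a deadband filter: a local control computer forwards a reading upstream only when it differs from the last reported value by more than a chosen deadband. The sketch below illustrates that rule; the class name and the 0.02 deadband are invented, not taken from the MFTF-B software.

```python
# Sketch of the event-driven reporting rule described above: a local
# control computer forwards a sensor reading to the supervisor only when
# it differs from the last reported value by more than a deadband.
# Names and the deadband value are illustrative, not from MFTF-B.

class DeadbandReporter:
    def __init__(self, deadband):
        self.deadband = deadband
        self.last_reported = None

    def process(self, value):
        """Return the value if it should be reported upstream, else None."""
        if self.last_reported is None or abs(value - self.last_reported) > self.deadband:
            self.last_reported = value
            return value
        return None

reporter = DeadbandReporter(deadband=0.02)
readings = [1.000, 1.005, 1.010, 1.031, 1.032, 1.060, 1.059]
events = [v for v in readings if reporter.process(v) is not None]
print(events)   # only significant changes generate supervisor traffic
```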

  9. Ark of Inquiry: Responsible Research and Innovation through Computer-Based Inquiry Learning

    NARCIS (Netherlands)

    Margus Pedaste; Leo Siiman; Bregje de Vries; Mirjam Burget; Tomi Jaakkola; Emanuele Bardone; Meelis Brikker; Mario Mäeots; Marianne Lind; Koen Veermans

    2015-01-01

    Ark of Inquiry is a learning platform that uses a computer-based inquiry learning approach to raise youth awareness of Responsible Research and Innovation (RRI). It is developed in the context of a large-scale European project (http://www.arkofinquiry.eu) and provides young European citizens

  10. Children's Responses to Computer-Synthesized Speech in Educational Media: Gender Consistency and Gender Similarity Effects

    Science.gov (United States)

    Lee, Kwan Min; Liao, Katharine; Ryu, Seoungho

    2007-01-01

    This study examines children's social responses to gender cues in synthesized speech in a computer-based instruction setting. Eighty 5th-grade elementary school children were randomly assigned to one of the conditions in a full-factorial 2 (participant gender) x 2 (voice gender) x 2 (content gender) experiment. Results show that children apply…

  11. The quantitative assessment of peri-implant bone responses using histomorphometry and micro-computed tomography.

    NARCIS (Netherlands)

    Schouten, C.; Meijer, G.J.; Beucken, J.J.J.P van den; Spauwen, P.H.M.; Jansen, J.A.

    2009-01-01

    In the present study, the effects of implant design and surface properties on peri-implant bone response were evaluated with both conventional histomorphometry and micro-computed tomography (micro-CT), using two geometrically different dental implants (Screw type, St; Push-in, Pi) either or not

  12. Molecular Imaging and Precision Medicine: PET/Computed Tomography and Therapy Response Assessment in Oncology.

    Science.gov (United States)

    Sheikhbahaei, Sara; Mena, Esther; Pattanayak, Puskar; Taghipour, Mehdi; Solnes, Lilja B; Subramaniam, Rathan M

    2017-01-01

    A variety of methods have been developed to assess tumor response to therapy. Standardized qualitative criteria based on 18F-fluoro-deoxyglucose PET/computed tomography have been proposed to evaluate the treatment effectiveness in specific cancers and these allow more accurate therapy response assessment and survival prognostication. Multiple studies have addressed the utility of the volumetric PET biomarkers as prognostic indicators but there is no consensus about the preferred segmentation methodology for these metrics. Heterogeneous intratumoral uptake was proposed as a novel PET metric for therapy response assessment. PET imaging techniques will be used to study the biological behavior of cancers during therapy. Copyright © 2016 Elsevier Inc. All rights reserved.

  13. Implementation of distributed computing system for emergency response and contaminant spill monitoring

    International Nuclear Information System (INIS)

    Ojo, T.O.; Sterling, M.C.Jr.; Bonner, J.S.; Fuller, C.B.; Kelly, F.; Page, C.A.

    2003-01-01

    The availability and use of real-time environmental data greatly enhances emergency response and spill monitoring in coastal and near shore environments. The data would include surface currents, wind speed, wind direction, and temperature. Model predictions (fate and transport) or forensics can also be included. In order to achieve an integrated system suitable for application in spill or emergency response situations, a link is required because this information exists on many different computing platforms. When real-time measurements are needed to monitor a spill, the use of a wide array of sensors and ship-based post-processing methods help reduce the latency in data transfer between field sampling stations and the Incident Command Centre. The common thread linking all these modules is the Transmission Control Protocol/Internet Protocol (TCP/IP), and the result is an integrated distributed computing system (DCS). The in-situ sensors are linked to an onboard computer through the use of a ship-based local area network (LAN) using a submersible device server. The onboard computer serves as both the data post-processor and communications server. It links the field sampling station with other modules, and is responsible for transferring data to the Incident Command Centre. This link is facilitated by a wide area network (WAN) based on wireless broadband communications facilities. This paper described the implementation of the DCS. The test results for the communications link and system readiness were also included. 6 refs., 2 tabs., 3 figs

  14. Psychophysiological Assessment Of Fear Experience In Response To Sound During Computer Video Gameplay

    DEFF Research Database (Denmark)

    Garner, Tom Alexander; Grimshaw, Mark

    2013-01-01

    The potential value of a looping biometric feedback system as a key component of adaptive computer video games is significant. Psychophysiological measures are essential to the development of an automated emotion recognition program, capable of interpreting physiological data into models of affect...... and systematically altering the game environment in response. This article presents empirical data, the analysis of which advocates electrodermal activity and electromyography as suitable physiological measures to work effectively within a computer video game-based biometric feedback loop, within which sound...

  15. Biomaterials and computation: a strategic alliance to investigate emergent responses of neural cells.

    Science.gov (United States)

    Sergi, Pier Nicola; Cavalcanti-Adam, Elisabetta Ada

    2017-03-28

    Topographical and chemical cues drive migration, outgrowth and regeneration of neurons in different and crucial biological conditions. In the natural extracellular matrix, their influences are so closely coupled that they result in complex cellular responses. As a consequence, engineered biomaterials are widely used to simplify in vitro conditions, disentangling intricate in vivo behaviours, and narrowing the investigation on particular emergent responses. Nevertheless, how topographical and chemical cues affect the emergent response of neural cells is still unclear, thus in silico models are used as additional tools to reproduce and investigate the interactions between cells and engineered biomaterials. This work aims at presenting the synergistic use of biomaterials-based experiments and computation as a strategic way to promote the discovering of complex neural responses as well as to allow the interactions between cells and biomaterials to be quantitatively investigated, fostering a rational design of experiments.

  16. Computational Fluid Dynamics Simulation of Combustion Instability in Solid Rocket Motor : Implementation of Pressure Coupled Response Function

    OpenAIRE

    S. Saha; D. Chakraborty

    2016-01-01

    Combustion instability in solid propellant rocket motor is numerically simulated by implementing propellant response function with quasi steady homogeneous one dimensional formulation. The convolution integral of propellant response with pressure history is implemented through a user defined function in commercial computational fluid dynamics software. The methodology is validated against literature reported motor test and other simulation results. Computed amplitude of pressure fluctuations ...
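
    As a rough illustration of the convolution step described above (not the authors' UDF), the sketch below convolves a hypothetical pressure-coupled response kernel with a pressure-fluctuation history in discrete time; the kernel form, the frequencies, and the time step are all assumptions.

```python
import numpy as np

dt = 1.0e-5                          # time step [s] (assumed)
t = np.arange(0.0, 0.02, dt)         # 20 ms of signal

# Hypothetical propellant response kernel: a damped oscillation (assumed form and parameters)
tau, omega = 2.0e-3, 2.0 * np.pi * 800.0
kernel = np.exp(-t / tau) * np.cos(omega * t)

# Hypothetical pressure-fluctuation history p'(t) [Pa]
p_fluct = 1.0e4 * np.sin(2.0 * np.pi * 600.0 * t)

# Convolution integral r(t) = integral of R(t - s) * p'(s) ds, approximated by a discrete sum
response = np.convolve(kernel, p_fluct)[: t.size] * dt
print(response[:5])
```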

  17. Towards an integrative computational model for simulating tumor growth and response to radiation therapy

    Science.gov (United States)

    Marrero, Carlos Sosa; Aubert, Vivien; Ciferri, Nicolas; Hernández, Alfredo; de Crevoisier, Renaud; Acosta, Oscar

    2017-11-01

    Understanding the response to irradiation in cancer radiotherapy (RT) may help devise new strategies with improved tumor local control. Computational models may help unravel the underlying radiosensitivity mechanisms intervening in the dose-response relationship. By using extensive simulations, a wide range of parameters may be evaluated, providing insights on tumor response and thus generating useful data to plan modified treatments. We propose in this paper a computational model of tumor growth and radiation response which allows a whole RT protocol to be simulated. Proliferation of tumor cells, the cell life-cycle, oxygen diffusion, radiosensitivity, RT response and resorption of killed cells were implemented in a multiscale framework. The model was developed in C++, using the Multi-formalism Modeling and Simulation Library (M2SL). Radiosensitivity parameters extracted from the literature enabled us to simulate a prostate cell tissue on a regular (voxel-wise) grid. Histopathological specimens with different aggressiveness levels extracted from patients after prostatectomy were used to initialize the in silico simulations. Results on tumor growth exhibit good agreement with data from in vitro studies. Moreover, standard fractionation of 2 Gy/fraction, with a total dose of 80 Gy as in a real RT treatment, was applied with varying radiosensitivity and oxygen diffusion parameters. As expected, the strong influence of these parameters was observed by measuring the percentage of surviving tumor cells after RT. This work paves the way to further models allowing simulation of increased doses in modified hypofractionated schemes and the development of new patient-specific combined therapies.
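
    A minimal sketch of the radiation-response ingredient of such simulators is the linear-quadratic (LQ) survival model evaluated per voxel and per fraction; the alpha/beta values and the oxygen modification factor below are illustrative assumptions, not the parameters used in the paper.

```python
import numpy as np

def surviving_fraction(dose_per_fraction, n_fractions, alpha=0.15, beta=0.05, omf=1.0):
    """LQ survival after n fractions of size d: exp(-n * (alpha*d_eff + beta*d_eff**2))."""
    d_eff = omf * dose_per_fraction            # crude oxygen-dependent dose modification (assumed)
    return np.exp(-n_fractions * (alpha * d_eff + beta * d_eff ** 2))

# Standard fractionation from the abstract: 2 Gy/fraction, 80 Gy total (40 fractions)
print(surviving_fraction(2.0, 40))             # well-oxygenated voxel
print(surviving_fraction(2.0, 40, omf=0.5))    # hypoxic voxel (assumed modification factor)
```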

  18. Predictive computational modeling of the mucosal immune responses during Helicobacter pylori infection.

    Directory of Open Access Journals (Sweden)

    Adria Carbo

    Full Text Available T helper (Th) cells play a major role in the immune response and pathology at the gastric mucosa during Helicobacter pylori infection. There is a limited mechanistic understanding regarding the contributions of CD4+ T cell subsets to gastritis development during H. pylori colonization. We used two computational approaches, ordinary differential equation (ODE)-based and agent-based modeling (ABM), to study the mechanisms underlying cellular immune responses to H. pylori and how CD4+ T cell subsets influenced initiation, progression and outcome of disease. To calibrate the model, in vivo experimentation was performed by infecting C57BL/6 mice intragastrically with H. pylori and assaying immune cell subsets in the stomach and gastric lymph nodes (GLN) on days 0, 7, 14, 30 and 60 post-infection. Our computational model reproduced the dynamics of effector and regulatory pathways in the gastric lamina propria (LP) in silico. Simulation results show the induction of a Th17 response and a dominant Th1 response, together with a regulatory response characterized by high levels of mucosal Treg cells. We also investigated the potential role of peroxisome proliferator-activated receptor γ (PPARγ) activation in the modulation of host responses to H. pylori by using loss-of-function approaches. Specifically, in silico results showed a predominance of Th1 and Th17 cells in the stomach of the cell-specific PPARγ knockout system when compared to the wild-type simulation. Spatio-temporal, object-oriented ABM approaches suggested similar dynamics in induction of host responses, showing analogous T cell distributions to ODE modeling and facilitating tracking of lesion formation. In addition, sensitivity analysis predicted a crucial contribution of Th1 and Th17 effector responses as mediators of histopathological changes in the gastric mucosa during chronic stages of infection, which were experimentally validated in mice. These integrated immunoinformatics approaches

  19. Reduced computational cost in the calculation of worst case response time for real time systems

    OpenAIRE

    Urriza, José M.; Schorb, Lucas; Orozco, Javier D.; Cayssials, Ricardo

    2009-01-01

    Modern Real Time Operating Systems require reducing computational costs even though the microprocessors become more powerful each day. It is usual that Real Time Operating Systems for embedded systems have advanced features to administer the resources of the applications that they support. In order to guarantee either the schedulability of the system or the schedulability of a new task in a dynamic Real Time System, it is necessary to know the Worst Case Response Time of the Real Time tasks ...
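
    For fixed-priority preemptive tasks, the classical way to obtain the worst-case response time is the fixed-point iteration R = C + sum over higher-priority tasks of ceil(R/T_j)*C_j. The sketch below implements that textbook recurrence on a hypothetical task set; it is not the reduced-cost algorithm proposed in the paper.

```python
import math

def wcrt(task, higher_priority, bound=10**9):
    """Iterate R = C + sum(ceil(R / T_j) * C_j) until convergence (or divergence past bound)."""
    C = task["C"]
    R = C
    while True:
        R_next = C + sum(math.ceil(R / hp["T"]) * hp["C"] for hp in higher_priority)
        if R_next == R:
            return R           # converged: worst-case response time
        if R_next > bound:
            return None        # no fixed point below the bound: unschedulable
        R = R_next

# Hypothetical task set, highest priority first: C = worst-case execution time, T = period
tasks = [{"C": 1, "T": 4}, {"C": 2, "T": 6}, {"C": 3, "T": 12}]
for i, task in enumerate(tasks):
    print(wcrt(task, tasks[:i]))     # prints 1, 3, 10
```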

  20. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  1. Computer-aided breast MR image feature analysis for prediction of tumor response to chemotherapy

    International Nuclear Information System (INIS)

    Aghaei, Faranak; Tan, Maxine; Liu, Hong; Zheng, Bin; Hollingsworth, Alan B.; Qian, Wei

    2015-01-01

    Purpose: To identify a new clinical marker based on quantitative kinetic image features analysis and assess its feasibility to predict tumor response to neoadjuvant chemotherapy. Methods: The authors assembled a dataset involving breast MR images acquired from 68 cancer patients before undergoing neoadjuvant chemotherapy. Among them, 25 patients had complete response (CR) and 43 had partial and nonresponse (NR) to chemotherapy based on the response evaluation criteria in solid tumors. The authors developed a computer-aided detection scheme to segment breast areas and tumors depicted on the breast MR images and computed a total of 39 kinetic image features from both tumor and background parenchymal enhancement regions. The authors then applied and tested two approaches to classify between CR and NR cases. The first one analyzed each individual feature and applied a simple feature fusion method that combines classification results from multiple features. The second approach tested an attribute selected classifier that integrates an artificial neural network (ANN) with a wrapper subset evaluator, which was optimized using a leave-one-case-out validation method. Results: In the pool of 39 features, 10 yielded relatively higher classification performance with the areas under receiver operating characteristic curves (AUCs) ranging from 0.61 to 0.78 to classify between CR and NR cases. Using a feature fusion method, the maximum AUC = 0.85 ± 0.05. Using the ANN-based classifier, AUC value significantly increased to 0.96 ± 0.03 (p < 0.01). Conclusions: This study demonstrated that quantitative analysis of kinetic image features computed from breast MR images acquired prechemotherapy has potential to generate a useful clinical marker in predicting tumor response to chemotherapy

  2. Computer-aided breast MR image feature analysis for prediction of tumor response to chemotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Aghaei, Faranak; Tan, Maxine; Liu, Hong; Zheng, Bin, E-mail: Bin.Zheng-1@ou.edu [School of Electrical and Computer Engineering, University of Oklahoma, Norman, Oklahoma 73019 (United States); Hollingsworth, Alan B. [Mercy Women’s Center, Mercy Health Center, Oklahoma City, Oklahoma 73120 (United States); Qian, Wei [Department of Electrical and Computer Engineering, University of Texas, El Paso, Texas 79968 (United States)

    2015-11-15

    Purpose: To identify a new clinical marker based on quantitative kinetic image features analysis and assess its feasibility to predict tumor response to neoadjuvant chemotherapy. Methods: The authors assembled a dataset involving breast MR images acquired from 68 cancer patients before undergoing neoadjuvant chemotherapy. Among them, 25 patients had complete response (CR) and 43 had partial and nonresponse (NR) to chemotherapy based on the response evaluation criteria in solid tumors. The authors developed a computer-aided detection scheme to segment breast areas and tumors depicted on the breast MR images and computed a total of 39 kinetic image features from both tumor and background parenchymal enhancement regions. The authors then applied and tested two approaches to classify between CR and NR cases. The first one analyzed each individual feature and applied a simple feature fusion method that combines classification results from multiple features. The second approach tested an attribute selected classifier that integrates an artificial neural network (ANN) with a wrapper subset evaluator, which was optimized using a leave-one-case-out validation method. Results: In the pool of 39 features, 10 yielded relatively higher classification performance with the areas under receiver operating characteristic curves (AUCs) ranging from 0.61 to 0.78 to classify between CR and NR cases. Using a feature fusion method, the maximum AUC = 0.85 ± 0.05. Using the ANN-based classifier, AUC value significantly increased to 0.96 ± 0.03 (p < 0.01). Conclusions: This study demonstrated that quantitative analysis of kinetic image features computed from breast MR images acquired prechemotherapy has potential to generate a useful clinical marker in predicting tumor response to chemotherapy.
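
    The evaluation protocol described above can be prototyped with a leave-one-case-out loop that pools the held-out predictions and scores them with an ROC AUC. The sketch below uses randomly generated features as a stand-in for the 68-case kinetic-feature matrix and a small scikit-learn network in place of the authors' ANN-plus-wrapper classifier.

```python
import numpy as np
from sklearn.model_selection import LeaveOneOut
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
X = rng.normal(size=(68, 10))         # placeholder for 10 selected kinetic features
y = rng.integers(0, 2, size=68)       # placeholder labels: 1 = complete response, 0 = non-response

scores = np.empty(len(y))
for train_idx, test_idx in LeaveOneOut().split(X):
    model = make_pipeline(StandardScaler(),
                          MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0))
    model.fit(X[train_idx], y[train_idx])
    scores[test_idx] = model.predict_proba(X[test_idx])[:, 1]

print("pooled leave-one-out AUC:", roc_auc_score(y, scores))
```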

  3. New computational method for non-LTE, the linear response matrix

    International Nuclear Information System (INIS)

    Fournier, K.B.; Grasiani, F.R.; Harte, J.A.; Libby, S.B.; More, R.M.; Zimmerman, G.B.

    1998-01-01

    My coauthors have done extensive theoretical and computational calculations that lay the groundwork for a linear response matrix method to calculate non-LTE (local thermodynamic equilibrium) opacities. I will briefly review some of their work and list references. Then I will describe what has been done to utilize this theory to create a computational package to rapidly calculate mild non-LTE emission and absorption opacities suitable for use in hydrodynamic calculations. The opacities are obtained by performing table look-ups on data that have been generated with a non-LTE package. This scheme is currently under development. We can see that it offers a significant computational speed advantage. It is suitable for mild non-LTE, quasi-steady conditions, and it offers a new insertion path for high-quality non-LTE data. Currently, the linear response matrix data file is created using XSN. These data files could be generated by more detailed and rigorous calculations without changing any part of the implementation in the hydro code. The scheme is running in Lasnex and is being tested and developed

  4. Parallel Implementation of Triangular Cellular Automata for Computing Two-Dimensional Elastodynamic Response on Arbitrary Domains

    Science.gov (United States)

    Leamy, Michael J.; Springer, Adam C.

    In this research we report parallel implementation of a Cellular Automata-based simulation tool for computing elastodynamic response on complex, two-dimensional domains. Elastodynamic simulation using Cellular Automata (CA) has recently been presented as an alternative, inherently object-oriented technique for accurately and efficiently computing linear and nonlinear wave propagation in arbitrarily-shaped geometries. The local, autonomous nature of the method should lead to straightforward and efficient parallelization. We address this notion on symmetric multiprocessor (SMP) hardware using a Java-based object-oriented CA code implementing triangular state machines (i.e., automata) and the MPI bindings written in Java (MPJ Express). We use MPJ Express to reconfigure our existing CA code to distribute a domain's automata to cores present on a dual quad-core shared-memory system (eight total processors). We note that this message passing parallelization strategy is directly applicable to cluster computing, which will be the focus of follow-on research. Results on the shared memory platform indicate nearly-ideal, linear speed-up. We conclude that the CA-based elastodynamic simulator is easily configured to run in parallel, and yields excellent speed-up on SMP hardware.

  5. Computation Offloading for Frame-Based Real-Time Tasks under Given Server Response Time Guarantees

    Directory of Open Access Journals (Sweden)

    Anas S. M. Toma

    2014-11-01

    Full Text Available Computation offloading has been adopted to improve the performance of embedded systems by offloading the computation of some tasks, especially computation-intensive tasks, to servers or clouds. This paper explores computation offloading for real-time tasks in embedded systems, provided response time guarantees are given by the servers, to decide which tasks should be offloaded to get the results in time. We consider frame-based real-time tasks with the same period and relative deadline. When the execution order of the tasks is given, the problem can be solved in linear time. However, when the execution order is not specified, we prove that the problem is NP-complete. We develop a pseudo-polynomial-time algorithm for deriving feasible schedules, if they exist. An approximation scheme is also developed to trade off the error of the algorithm against its complexity. Our algorithms are extended to minimize the period/relative deadline of the tasks for performance maximization. The algorithms are evaluated with a case study for a surveillance system and synthesized benchmarks.
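
    The decision problem can be illustrated with a deliberately simple brute-force baseline (not the paper's linear-time or pseudo-polynomial algorithm). The timing model here is an assumption: a local task occupies the device for its worst-case execution time, whereas an offloaded task occupies the device only while its input is sent and then completes within the server's response-time guarantee, overlapping later local execution. All task parameters below are hypothetical.

```python
from itertools import product

def feasible(tasks, deadline, decisions):
    """tasks: dicts with 'local' WCET, 'send' upload time, 'server' response-time guarantee."""
    t = 0.0
    finish_times = []
    for task, offload in zip(tasks, decisions):
        if offload:
            t += task["send"]                        # device busy only during upload
            finish_times.append(t + task["server"])  # result returned within the guarantee
        else:
            t += task["local"]
            finish_times.append(t)
    return max(finish_times) <= deadline

tasks = [{"local": 4.0, "send": 1.0, "server": 6.0},
         {"local": 5.0, "send": 0.5, "server": 3.0},
         {"local": 2.0, "send": 0.5, "server": 2.0}]
for decisions in product([False, True], repeat=len(tasks)):
    if feasible(tasks, deadline=10.0, decisions=decisions):
        print(decisions)
```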

  6. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  7. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies ramping to 50 M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS specific tests to be included in Service Availa...

  8. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and re-enforcing shift and operational procedures for data production and transfer, MC production and on user support. Planning of the computing resources in view of the new LHC calendar in ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity discussing the impact and for addressing issues and solutions to the main challenges facing CMS computing. The lack of manpower is particul...

  9. Computational methods for describing the laser-induced mechanical response of tissue

    Energy Technology Data Exchange (ETDEWEB)

    Trucano, T.; McGlaun, J.M.; Farnsworth, A.

    1994-02-01

    Detailed computational modeling of laser surgery requires treatment of the photoablation of human tissue by high intensity pulses of laser light and the subsequent thermomechanical response of the tissue. Three distinct physical regimes must be considered to accomplish this: (1) the immediate absorption of the laser pulse by the tissue and following tissue ablation, which is dependent upon tissue light absorption characteristics; (2) the near field thermal and mechanical response of the tissue to this laser pulse, and (3) the potential far field (and longer time) mechanical response of witness tissue. Both (2) and (3) are dependent upon accurate constitutive descriptions of the tissue. We will briefly review tissue absorptivity and mechanical behavior, with an emphasis on dynamic loads characteristic of the photoablation process. In this paper our focus will center on the requirements of numerical modeling and the uncertainties of mechanical tissue behavior under photoablation. We will also discuss potential contributions that computational simulations can make in the design of surgical protocols which utilize lasers, for example, in assessing the potential for collateral mechanical damage by laser pulses.

  10. A qualitatively validated mathematical-computational model of the immune response to the yellow fever vaccine.

    Science.gov (United States)

    Bonin, Carla R B; Fernandes, Guilherme C; Dos Santos, Rodrigo W; Lobosco, Marcelo

    2018-05-25

    Although a safe and effective yellow fever vaccine was developed more than 80 years ago, several issues regarding its use remain unclear. For example, what is the minimum dose that can provide immunity against the disease? A useful tool that can help researchers answer this and other related questions is a computational simulator that implements a mathematical model describing the human immune response to vaccination against yellow fever. This work uses a system of ten ordinary differential equations to represent a few important populations in the response process generated by the body after vaccination. The main populations include viruses, APCs, CD8+ T cells, short-lived and long-lived plasma cells, B cells and antibodies. In order to qualitatively validate our model, four experiments were carried out, and their computational results were compared to experimental data obtained from the literature. The four experiments were: a) simulation of a scenario in which an individual was vaccinated against yellow fever for the first time; b) simulation of a booster dose ten years after the first dose; c) simulation of the immune response to the yellow fever vaccine in individuals with different levels of naïve CD8+ T cells; and d) simulation of the immune response to distinct doses of the yellow fever vaccine. This work shows that the simulator was able to qualitatively reproduce some of the experimental results reported in the literature, such as the amount of antibodies and viremia throughout time, as well as to reproduce other behaviors of the immune response reported in the literature, such as those that occur after a booster dose of the vaccine.
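
    A heavily reduced sketch of this kind of model is shown below: three populations (virus, effector CD8+ T cells, antibodies) instead of the paper's ten, integrated with SciPy. All rate constants and initial conditions are illustrative assumptions, not the calibrated values of the simulator.

```python
import numpy as np
from scipy.integrate import solve_ivp

def rhs(t, y):
    V, E, A = y
    dV = 1.5 * V - 0.05 * E * V - 0.02 * A * V    # viral replication, T-cell and antibody clearance
    dE = 0.01 * V * E - 0.1 * E                   # effector expansion driven by antigen, then contraction
    dA = 0.5 * E - 0.05 * A                       # antibody production and decay
    return [dV, dE, dA]

sol = solve_ivp(rhs, (0.0, 60.0), [1.0, 1.0, 0.0], dense_output=True)   # 60 days post-vaccination
days = np.linspace(0.0, 60.0, 7)
print(np.round(sol.sol(days), 2))
```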

  11. Semiquantitative visual approach to scoring lung cancer treatment response using computed tomography: a pilot study.

    Science.gov (United States)

    Gottlieb, Ronald H; Kumar, Prasanna; Loud, Peter; Klippenstein, Donald; Raczyk, Cheryl; Tan, Wei; Lu, Jenny; Ramnath, Nithya

    2009-01-01

    Our objective was to compare a newly developed semiquantitative visual scoring (SVS) method with the current standard, the Response Evaluation Criteria in Solid Tumors (RECIST) method, in the categorization of treatment response and reader agreement for patients with metastatic lung cancer followed by computed tomography. The 18 subjects (5 women and 13 men; mean age, 62.8 years) were from an institutional review board-approved phase 2 study that evaluated a second-line chemotherapy regimen for metastatic (stages III and IV) non-small cell lung cancer. Four radiologists, blinded to the patient outcome and each other's reads, evaluated the change in the patients' tumor burden from the baseline to the first restaging computed tomographic scan using either the RECIST or the SVS method. We compared the numbers of patients placed into the partial response, the stable disease (SD), and the progressive disease (PD) categories (Fisher exact test) and observer agreement (kappa statistic). Requiring the concordance of 3 of the 4 readers resulted in the RECIST placing 17 (100%) of 17 patients in the SD category compared with the SVS placing 9 (60%) of 15 patients in the partial response, 5 (33%) of the 15 patients in the SD, and 1 (6.7%) of the 15 patients in the PD categories (P < 0.0001). Interobserver agreement was higher among the readers using the SVS method (kappa, 0.54; P < 0.0001) compared with that of the readers using the RECIST method (kappa, -0.01; P = 0.5378). Using the SVS method, the readers more finely discriminated between the patient response categories with superior agreement compared with the RECIST method, which could potentially result in large differences in early treatment decisions for advanced lung cancer.
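
    Inter-observer agreement of the kind reported above is commonly quantified with a kappa statistic. The sketch below implements two-reader Cohen's kappa on made-up response categories (PR/SD/PD); the study's four-reader statistic would require a multi-rater generalization such as Fleiss' kappa.

```python
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    n = len(ratings_a)
    categories = set(ratings_a) | set(ratings_b)
    p_observed = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    freq_a, freq_b = Counter(ratings_a), Counter(ratings_b)
    p_expected = sum((freq_a[c] / n) * (freq_b[c] / n) for c in categories)
    return (p_observed - p_expected) / (1.0 - p_expected)

# Hypothetical category assignments for 8 patients by two readers
reader1 = ["PR", "PR", "SD", "SD", "PD", "PR", "SD", "PR"]
reader2 = ["PR", "SD", "SD", "SD", "PD", "PR", "PR", "PR"]
print(round(cohens_kappa(reader1, reader2), 3))    # ~0.58: moderate agreement
```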

  12. Computational Analysis of Single Nucleotide Polymorphisms Associated with Altered Drug Responsiveness in Type 2 Diabetes

    Directory of Open Access Journals (Sweden)

    Valerio Costa

    2016-06-01

    Full Text Available Type 2 diabetes (T2D) is one of the most frequent mortality causes in western countries, with rapidly increasing prevalence. Anti-diabetic drugs are the first therapeutic approach, although many patients develop drug resistance. Most drug responsiveness variability can be explained by genetic causes. Inter-individual variability is principally due to single nucleotide polymorphisms, and differential drug responsiveness has been correlated to alterations in genes involved in drug metabolism (CYP2C9) or insulin signaling (IRS1, ABCC8, KCNJ11 and PPARG). However, most genome-wide association studies did not provide clues about the contribution of DNA variations to impaired drug responsiveness. Thus, characterizing T2D drug responsiveness variants is needed to guide clinicians toward tailored therapeutic approaches. Here, we extensively investigated polymorphisms associated with altered drug response in T2D, predicting their effects in silico. Combining different computational approaches, we focused on the expression pattern of genes correlated to drug resistance and inferred evolutionary conservation of polymorphic residues, computationally predicting the biochemical properties of polymorphic proteins. Using RNA-Sequencing followed by targeted validation, we identified and experimentally confirmed that two nucleotide variations in the CAPN10 gene—currently annotated as intronic—fall within two new transcripts in this locus. Additionally, we found that a Single Nucleotide Polymorphism (SNP), currently reported as intergenic, maps to the intron of a new transcript, harboring CAPN10 and GPR35 genes, which undergoes nonsense-mediated decay. Finally, we analyzed variants that fall into non-coding regulatory regions of yet underestimated functional significance, predicting that some of them can potentially affect gene expression and/or post-transcriptional regulation of mRNAs affecting splicing.

  13. Impaired Expected Value Computations Coupled With Overreliance on Stimulus-Response Learning in Schizophrenia.

    Science.gov (United States)

    Hernaus, Dennis; Gold, James M; Waltz, James A; Frank, Michael J

    2018-04-03

    While many have emphasized impaired reward prediction error signaling in schizophrenia, multiple studies suggest that some decision-making deficits may arise from overreliance on stimulus-response systems together with a compromised ability to represent expected value. Guided by computational frameworks, we formulated and tested two scenarios in which maladaptive representations of expected value should be most evident, thereby delineating conditions that may evoke decision-making impairments in schizophrenia. In a modified reinforcement learning paradigm, 42 medicated people with schizophrenia and 36 healthy volunteers learned to select the most frequently rewarded option in a 75-25 pair: once when presented with a more deterministic (90-10) pair and once when presented with a more probabilistic (60-40) pair. Novel and old combinations of choice options were presented in a subsequent transfer phase. Computational modeling was employed to elucidate contributions from stimulus-response systems (actor-critic) and expected value (Q-learning). People with schizophrenia showed robust performance impairments with increasing value difference between two competing options, which strongly correlated with decreased contributions from expected value-based learning (Q-learning). Moreover, a subtle yet consistent contextual choice bias for the probabilistic 75 option was present in people with schizophrenia, which could be accounted for by a context-dependent reward prediction error in the actor-critic. We provide evidence that decision-making impairments in schizophrenia increase monotonically with demands placed on expected value computations. A contextual choice bias is consistent with overreliance on stimulus-response learning, which may signify a deficit secondary to the maladaptive representation of expected value. These results shed new light on conditions under which decision-making impairments may arise. Copyright © 2018 Society of Biological Psychiatry. Published by
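
    The two learning systems contrasted in this work can be illustrated with their core update rules: a Q-learning (expected value) delta rule versus an actor-critic update in which a critic's reward prediction error reinforces stimulus-response weights. The learning rates and the single rewarded trial below are illustrative, not fitted parameters from the study.

```python
def q_learning_update(q, action, reward, alpha=0.1):
    q[action] += alpha * (reward - q[action])             # delta rule on the expected value of the action

def actor_critic_update(actor_w, critic_v, action, reward, alpha_actor=0.1, alpha_critic=0.1):
    delta = reward - critic_v[0]                          # critic's reward prediction error (single state)
    critic_v[0] += alpha_critic * delta                   # update the state value
    actor_w[action] += alpha_actor * delta                # reinforce the chosen stimulus-response weight

q = {"opt75": 0.0, "opt25": 0.0}                          # Q-values for the 75% and 25% rewarded options
actor_w = {"opt75": 0.0, "opt25": 0.0}                    # actor weights for the same options
critic_v = [0.0]

q_learning_update(q, "opt75", reward=1.0)                 # one rewarded choice of the 75% option
actor_critic_update(actor_w, critic_v, "opt75", reward=1.0)
print(q, actor_w, critic_v)
```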

  14. SHOCK-JR: a computer program to analyze impact response of shipping container

    International Nuclear Information System (INIS)

    Ikushima, Takeshi; Nakazato, Chikara; Shimoda, Osamu; Uchino, Mamoru.

    1983-02-01

    This report describes the use of a computer program, SHOCK-JR, which analyzes the impact response of shipping containers. Described are the mathematical model, the method of analysis, the structure of the program, and the input and output variables. The program solves the equations of motion for a one-dimensional, lumped-mass and nonlinear-spring model. The solution procedure uses the Runge-Kutta-Gill and Newmark-β methods. SHOCK-JR is a revised version of SHOCK, which was developed by ORNL. In SHOCK-JR, SI units are used and graphical output is available. (author)
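
    One of the numerical ingredients named above, the Newmark-β scheme, can be sketched for a single lumped mass; a linear spring stands in for the nonlinear one for brevity. The average-acceleration parameters β = 1/4, γ = 1/2 are the standard choice, while the mass, damping, stiffness and impact pulse are illustrative assumptions.

```python
import numpy as np

def newmark_step(m, c, k, u, v, a, f_next, dt, beta=0.25, gamma=0.5):
    """Advance displacement u, velocity v and acceleration a by one time step under load f_next."""
    k_eff = k + gamma / (beta * dt) * c + m / (beta * dt**2)
    f_eff = (f_next
             + m * (u / (beta * dt**2) + v / (beta * dt) + (1.0 / (2.0 * beta) - 1.0) * a)
             + c * (gamma / (beta * dt) * u + (gamma / beta - 1.0) * v
                    + dt * (gamma / (2.0 * beta) - 1.0) * a))
    u_next = f_eff / k_eff
    a_next = (u_next - u) / (beta * dt**2) - v / (beta * dt) - (1.0 / (2.0 * beta) - 1.0) * a
    v_next = v + dt * ((1.0 - gamma) * a + gamma * a_next)
    return u_next, v_next, a_next

m, c, k, dt = 10.0, 2.0, 4.0e3, 1.0e-3          # assumed lumped-mass properties and time step
u, v, a = 0.0, 0.0, 0.0
for step in range(200):
    t = (step + 1) * dt
    f = 500.0 * np.sin(np.pi * t / 0.02) if t <= 0.02 else 0.0   # assumed 20 ms half-sine impact
    u, v, a = newmark_step(m, c, k, u, v, a, f, dt)
print(u, v)
```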

  15. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, not only by using opportunistic resources like the San Diego Supercomputer Center which was accessible, to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February collecting almost 500 T. Figure 3: Number of events per month (data) In LS1, our emphasis is to increase efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  16. Computational Model and Numerical Simulation for Submerged Mooring Monitoring Platform’s Dynamical Response

    Directory of Open Access Journals (Sweden)

    He Kongde

    2015-01-01

    Full Text Available A computational model and numerical simulation of a submerged mooring monitoring platform were formulated to study its dynamical response under flow forces. The model is based on Hopkinson impact load theory, accounts for the catenary effect of the mooring cable, and corrects the difference between the cable tension and its tangential force component through an equivalent modulus of elasticity. The equations were solved using hydraulics theory and the structural mechanics of ocean engineering to study the response of the buoy to flow forces. The validity of the model was checked and the results were in good agreement; they show that the buoy undergoes sizeable heave and sway displacements, but the sway displacement stabilizes quickly while the heave displacement continues to oscillate owing to the vortex-induced action of the flow.

  17. Computational systems biology and dose-response modeling in relation to new directions in toxicity testing.

    Science.gov (United States)

    Zhang, Qiang; Bhattacharya, Sudin; Andersen, Melvin E; Conolly, Rory B

    2010-02-01

    The new paradigm envisioned for toxicity testing in the 21st century advocates shifting from the current animal-based testing process to a combination of in vitro cell-based studies, high-throughput techniques, and in silico modeling. A strategic component of the vision is the adoption of the systems biology approach to acquire, analyze, and interpret toxicity pathway data. As key toxicity pathways are identified and their wiring details elucidated using traditional and high-throughput techniques, there is a pressing need to understand their qualitative and quantitative behaviors in response to perturbation by both physiological signals and exogenous stressors. The complexity of these molecular networks makes the task of understanding cellular responses merely by human intuition challenging, if not impossible. This process can be aided by mathematical modeling and computer simulation of the networks and their dynamic behaviors. A number of theoretical frameworks were developed in the last century for understanding dynamical systems in science and engineering disciplines. These frameworks, which include metabolic control analysis, biochemical systems theory, nonlinear dynamics, and control theory, can greatly facilitate the process of organizing, analyzing, and understanding toxicity pathways. Such analysis will require a comprehensive examination of the dynamic properties of "network motifs"--the basic building blocks of molecular circuits. Network motifs like feedback and feedforward loops appear repeatedly in various molecular circuits across cell types and enable vital cellular functions like homeostasis, all-or-none response, memory, and biological rhythm. These functional motifs and associated qualitative and quantitative properties are the predominant source of nonlinearities observed in cellular dose response data. Complex response behaviors can arise from toxicity pathways built upon combinations of network motifs. While the field of computational cell
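
    As a toy illustration of one motif discussed here, the sketch below integrates a two-variable negative-feedback loop with simple Euler steps and shows the homeostatic buffering it provides: a five-fold step in the stressor input produces a much smaller change in the response variable. The Hill-type equations and rate constants are illustrative, not taken from any specific toxicity pathway.

```python
def steady_response(stressor, steps=20000, dt=0.01):
    x, y = 1.0, 1.0                                   # x: stress-response output, y: feedback repressor
    for i in range(steps):
        s = stressor if i * dt > 50.0 else 1.0        # step increase in the stressor at t = 50
        dx = s / (1.0 + y**2) - 0.1 * x               # y represses production of x (Hill-type term)
        dy = 0.2 * x - 0.1 * y                        # x induces its own repressor
        x += dt * dx
        y += dt * dy
    return x

# A 5-fold increase in input changes the output by less than 2-fold
print(steady_response(1.0), steady_response(5.0))
```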

  18. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much large at roughly 11 MB per event of RAW. The central collisions are more complex and...

  19. COMPUTING

    CERN Multimedia

    M. Kasemann P. McBride Edited by M-C. Sawley with contributions from: P. Kreuzer D. Bonacorsi S. Belforte F. Wuerthwein L. Bauerdick K. Lassila-Perini M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  20. Behavioral response and pain perception to computer controlled local anesthetic delivery system and cartridge syringe

    Directory of Open Access Journals (Sweden)

    T D Yogesh Kumar

    2015-01-01

    Full Text Available Aim: The present study evaluated and compared the pain perception, behavioral response, physiological parameters, and the role of topical anesthetic administration during local anesthetic administration with a cartridge syringe and a computer controlled local anesthetic delivery system (CCLAD). Design: A randomized controlled crossover study was carried out with 120 children aged 7-11 years. They were randomly divided into Group A: receiving injection with CCLAD during the first visit; Group B: receiving injection with cartridge syringe during the first visit. They were further subdivided into three subgroups based on the topical application used: (a) 20% benzocaine; (b) pressure with a cotton applicator; (c) no topical application. Pulse rate and blood pressure were recorded before and during the injection procedure. Objective evaluation of disruptive behavior and subjective evaluation of pain were done using the face-legs-activity-cry-consolability (FLACC) scale and the modified facial image scale, respectively. The washout period between the two visits was 1 week. Results: Injections with CCLAD produced significantly lower pain response, disruptive behavior (P < 0.001), and pulse rate (P < 0.05) when compared to cartridge syringe injections. Application of benzocaine produced a lower pain response and less disruptive behavior when compared to the other two subgroups, although the result was not significant. Conclusion: Usage of techniques which enhance behavioral response in children, such as injections with CCLAD, can be considered a possible step toward achieving a pain-free pediatric dental practice.

  1. A noninvasive brain computer interface using visually-induced near-infrared spectroscopy responses.

    Science.gov (United States)

    Chen, Cheng-Hsuan; Ho, Ming-Shan; Shyu, Kuo-Kai; Hsu, Kou-Cheng; Wang, Kuo-Wei; Lee, Po-Lei

    2014-09-19

    Visually-induced near-infrared spectroscopy (NIRS) response was utilized to design a brain computer interface (BCI) system. Four circular checkerboards driven by distinct flickering sequences were displayed on an LCD screen as visual stimuli to induce subjects' NIRS responses. Each flickering sequence was a concatenation of alternating flickering segments and resting segments. The flickering segment was designed with a fixed duration of 3 s, whereas the resting segment was chosen randomly within 15-20 s to create mutual independence among the different flickering sequences. Six subjects were recruited in this study and were requested to gaze at the four visual stimuli one after another in a random order. Since visual responses in the human brain are time-locked to the onsets of visual stimuli and the flicker sequences of distinct visual stimuli were designed to be mutually independent, the NIRS responses induced by the user's gazed target can be discerned from non-gazed targets by applying a simple averaging process. The accuracies for the six subjects were higher than 90% after 10 or more epochs were averaged. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
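
    The discrimination principle described above (time-locked averaging over mutually independent onset sequences) can be sketched with synthetic data: averaging epochs aligned to the gazed stimulus' onsets reinforces its hemodynamic response, while epochs aligned to another stimulus' onsets average toward the noise floor. The sampling rate, onset times and response template below are assumptions.

```python
import numpy as np

fs = 10                                                                # NIRS sampling rate [Hz] (assumed)
signal = np.random.default_rng(1).normal(0.0, 0.2, size=fs * 300)     # 5 min of synthetic HbO noise
template = np.concatenate([np.linspace(0.0, 1.0, 3 * fs),             # toy hemodynamic response, 10 s long
                           np.linspace(1.0, 0.0, 7 * fs)])

onsets_gazed = np.array([20, 60, 100, 140, 180, 220]) * fs            # onsets of the gazed stimulus
for onset in onsets_gazed:
    signal[onset:onset + template.size] += template                   # embed time-locked responses

def epoch_average(sig, onsets, length):
    return np.mean([sig[o:o + length] for o in onsets], axis=0)

onsets_other = np.array([35, 75, 115, 155, 195, 235]) * fs            # an independent stimulus' onsets
print(epoch_average(signal, onsets_gazed, template.size).max())       # large, time-locked response
print(epoch_average(signal, onsets_other, template.size).max())       # close to the noise level
```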

  2. A three-dimensional computer code for the nonlinear dynamic response of an HTGR core

    International Nuclear Information System (INIS)

    Subudhi, M.; Lasker, L.; Koplik, B.; Curreri, J.; Goradia, H.

    1979-01-01

    A three-dimensional dynamic code has been developed to determine the nonlinear response of an HTGR core. The HTGR core consists of several thousands of hexagonal core blocks. These are arranged in layers stacked together. Each layer contains many core blocks surrounded on their outer periphery by reflector blocks. The entire assembly is contained within a prestressed concrete reactor vessel. Gaps exist between adjacent blocks in any horizontal plane. Each core block in a given layer is connected to the blocks directly above and below it via three dowel pins. The present analytical study is directed towards an investigation of the nonlinear response of the reactor core blocks in the event of a seismic occurrence. The computer code is developed for a specific mathematical model which represents a vertical arrangement of layers of blocks. This comprises a 'block module' of core elements which would be obtained by cutting a cylindrical portion consisting of seven fuel blocks per layer. It is anticipated that a number of such modules properly arranged could represent the entire core. Hence, the predicted response of this module would exhibit the response characteristics of the core. (orig.)

  3. Three-dimensional computer code for the nonlinear dynamic response of an HTGR core

    International Nuclear Information System (INIS)

    Subudhi, M.; Lasker, L.; Koplik, B.; Curreri, J.; Goradia, H.

    1979-01-01

    A three-dimensional dynamic code has been developed to determine the nonlinear response of an HTGR core. The HTGR core consists of several thousands of hexagonal core blocks. These are arranged in layers stacked together. Each layer contains many core blocks surrounded on their outer periphery by reflector blocks. The entire assembly is contained within a prestressed concrete reactor vessel. Gaps exist between adjacent blocks in any horizontal plane. Each core block in a given layer is connected to the blocks directly above and below it via three dowel pins. The present analytical study is directed towards an investigation of the nonlinear response of the reactor core blocks in the event of a seismic occurrence. The computer code is developed for a specific mathematical model which represents a vertical arrangement of layers of blocks. This comprises a block module of core elements which would be obtained by cutting a cylindrical portion consisting of seven fuel blocks per layer. It is anticipated that a number of such modules properly arranged could represent the entire core. Hence, the predicted response of this module would exhibit the response characteristics of the core

  4. Positron computed tomography studies of cerebral metabolic responses to complex motor tasks

    International Nuclear Information System (INIS)

    Phelps, M.E.; Mazziotta, J.C.

    1984-01-01

    Human motor system organization was explored in 8 right-handed male subjects using 18F-fluorodeoxyglucose and positron computed tomography to measure cerebral glucose metabolism. Five subjects had triple studies (eyes closed) including: control (hold pen in right hand without moving), normal size writing (subject repeatedly writes name) and large (10-15 X normal) name writing. In these studies normal and large size writing had a similar distribution of metabolic responses when compared to control studies. Activations (percent change from control) were in the range of 12-20% and occurred in the striatum bilaterally > contralateral Rolandic cortex > contralateral thalamus. No significant activations were observed in the ipsilateral thalamus, Rolandic cortex or cerebellum (supplementary motor cortex was not examined). The magnitude of the metabolic response in the striatum was greater with the large versus normal sized writing. This differential response may be due to an increased number and topographic distribution of neurons responding with the same average activity between tasks or an increase in the functional activity of the same neuronal population between the two tasks (present spatial resolution inadequate to differentiate). When subjects (N=3) performed novel sequential finger movements, the maximal metabolic response was in the contralateral Rolandic cortex > striatum. Such studies provide a means of exploring human motor system organization, motor learning and provide a basis for examining patients with motor system disorders

  5. Computational biomechanics of bone's responses to dental prostheses - osseointegration, remodeling and resorption

    International Nuclear Information System (INIS)

    Li Wei; Rungsiyakull, Chaiy; Field, Clarice; Lin, Daniel; Zhang Leo; Li Qing; Swain, Michael

    2010-01-01

    Clinical and experimental studies showed that human bone has the ability to remodel itself to better adapt to its biomechanical environment by changing both its material properties and geometry. As a consequence of the rapid development and extensive applications of major dental restorations such as implantation and fixed partial denture (FPD), the effect of bone remodeling on the success of a dental restorative surgery is becoming critical for prosthetic design and pre-surgical assessment. This paper aims to provide a computational biomechanics framework to address dental bone's responses as a result of dental restoration. It explored three important issues of resorption, apposition and osseointegration in terms of remodeling simulation. The published remodeling data in long bones were regulated to drive the computational remodeling prediction for the dental bones by correlating the results to clinical data. It is anticipated that the study will provide a more predictive model of dental bone response and help develop a new design methodology for patient-specific dental prosthetic restoration.

  6. Experimental and computational analysis of pressure response in a multiphase flow loop

    Science.gov (United States)

    Morshed, Munzarin; Amin, Al; Rahman, Mohammad Azizur; Imtiaz, Syed

    2016-07-01

    The characteristics of multiphase fluid flow in pipes are useful to understand the fluid mechanics encountered in the oil and gas industries. Present-day oil and gas exploration increasingly involves subsea operations in deep-sea and arctic conditions. During the transport of petroleum products, understanding the fluid dynamics inside the pipe network is important for flow assurance. In this case, information regarding the static and dynamic pressure response, pressure loss, optimum flow rate, pipe diameter, etc. represents important parameters for flow assurance. The principal aim of this research is to present a computational and an experimental analysis of multiphase (liquid/gas) flow in a pipe network. The computational study considers two-phase fluid flow through a horizontal flow loop at different Reynolds numbers in order to determine the pressure distribution and frictional pressure loss profiles by the volume of fluid (VOF) method. The numerical simulations are validated against the experimental data. The experiment is conducted in a 76.20 mm ID transparent circular pipe using water and air in the flow loop. Static pressure transducers are used to measure the local pressure response in the multiphase pipeline.

  7. Analysis of the computational methods on the equipment shock response based on ANSYS environments

    International Nuclear Information System (INIS)

    Wang Yu; Li Zhaojun

    2005-01-01

    With the development and completion of equipment shock vibration theory, mathematical calculation methods, simulation techniques and other aspects, equipment shock calculation methods are gradually developing from static to dynamic and from linear to nonlinear. At present, the equipment shock calculation methods applied worldwide in engineering practice mostly include the equivalent static force method, the Dynamic Design Analysis Method (abbreviated to DDAM) and the real-time simulation method. The DDAM is a method based on modal analysis theory, which inputs the shock design spectrum as the shock load and obtains the shock response of the integrated system by applying a separate cross-modal integration method within the frequency domain. The real-time simulation method carries out the computational analysis of the equipment shock response within the time domain: it uses time-history curves obtained from real-time measurement or spectrum transformation as the equipment shock load and finds an iterative solution of the differential equation of system motion by using a computational procedure within the time domain. Conclusions: Using both DDAM and the real-time simulation method, this paper carried out the shock analysis of a three-dimensional frame floating raft in the ANSYS environment, analyzed the results, and drew the following conclusions. Because DDAM does not account for damping, nonlinear effects or phase differences between modal responses, its result is much larger than that of the real-time simulation method. The coupled response is more complex when the modal results of a three-dimensional structure are calculated, and the coupled response in the non-shock direction is also much larger with DDAM than with the real-time simulation method. Both DDAM and the real-time simulation method have their merits and scope of application. Designers should select the design method that is economical and appropriate according to the features and anti

  8. Rayleigh radiance computations for satellite remote sensing: accounting for the effect of sensor spectral response function.

    Science.gov (United States)

    Wang, Menghua

    2016-05-30

    To understand and assess the effect of the sensor spectral response function (SRF) on the accuracy of the top of the atmosphere (TOA) Rayleigh-scattering radiance computation, new TOA Rayleigh radiance lookup tables (LUTs) over global oceans and inland waters have been generated. The new Rayleigh LUTs include spectral coverage of 335-2555 nm, all possible solar-sensor geometries, and surface wind speeds of 0-30 m/s. Using the new Rayleigh LUTs, the sensor SRF effect on the accuracy of the TOA Rayleigh radiance computation has been evaluated for spectral bands of the Visible Infrared Imaging Radiometer Suite (VIIRS) on the Suomi National Polar-orbiting Partnership (SNPP) satellite and the Joint Polar Satellite System (JPSS)-1, showing some important uncertainties for VIIRS-SNPP particularly for large solar- and/or sensor-zenith angles as well as for large Rayleigh optical thicknesses (i.e., short wavelengths) and bands with broad spectral bandwidths. To accurately account for the sensor SRF effect, a new correction algorithm has been developed for VIIRS spectral bands, which improves the TOA Rayleigh radiance accuracy to ~0.01% even for the large solar-zenith angles of 70°-80°, compared with the error of ~0.7% without applying the correction for the VIIRS-SNPP 410 nm band. The same methodology that accounts for the sensor SRF effect on the Rayleigh radiance computation can be used for other satellite sensors. In addition, with the new Rayleigh LUTs, the effect of surface atmospheric pressure variation on the TOA Rayleigh radiance computation can be calculated precisely, and no specific atmospheric pressure correction algorithm is needed. There are some other important applications and advantages to using the new Rayleigh LUTs for satellite remote sensing, including an efficient and accurate TOA Rayleigh radiance computation for hyperspectral satellite remote sensing, detector-based TOA Rayleigh radiance computation, Rayleigh radiance calculations for high altitude
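
    The band-averaging at issue can be written as L_band = ∫ SRF(λ) L(λ) dλ / ∫ SRF(λ) dλ, i.e., the spectral radiance weighted by the sensor spectral response function. The sketch below evaluates this for a hypothetical Gaussian SRF centered at 410 nm and a λ⁻⁴ Rayleigh-like spectral shape; neither is actual VIIRS data.

```python
import numpy as np

wavelength = np.arange(380.0, 445.0, 0.5)                      # nm
srf = np.exp(-0.5 * ((wavelength - 410.0) / 8.0) ** 2)         # hypothetical band response centered at 410 nm
radiance = (wavelength / 410.0) ** -4.0                        # Rayleigh-like spectral shape (relative units)

band_avg = np.trapz(srf * radiance, wavelength) / np.trapz(srf, wavelength)
center_only = 1.0                                              # value if the SRF were ignored (monochromatic 410 nm)
print(band_avg, 100.0 * (band_avg - center_only) / center_only, "% difference")
```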

  9. Computation of Dielectric Response in Molecular Solids for High Capacitance Organic Dielectrics.

    Science.gov (United States)

    Heitzer, Henry M; Marks, Tobin J; Ratner, Mark A

    2016-09-20

    The dielectric response of a material is central to numerous processes spanning the fields of chemistry, materials science, biology, and physics. Despite this broad importance across these disciplines, describing the dielectric environment of a molecular system at the level of first-principles theory and computation remains a great challenge and is of importance to understand the behavior of existing systems as well as to guide the design and synthetic realization of new ones. Furthermore, with recent advances in molecular electronics, nanotechnology, and molecular biology, it has become necessary to predict the dielectric properties of molecular systems that are often difficult or impossible to measure experimentally. In these scenarios, it would be highly desirable to be able to determine dielectric response through efficient, accurate, and chemically informative calculations. A good example of where theoretical modeling of dielectric response would be valuable is in the development of high-capacitance organic gate dielectrics for unconventional electronics such as those that could be fabricated by high-throughput printing techniques. Gate dielectrics are fundamental components of all transistor-based logic circuitry, and the combination of high dielectric constant and nanoscopic thickness (i.e., high capacitance) is essential to achieving high switching speeds and low power consumption. Molecule-based dielectrics offer the promise of cheap, flexible, and mass producible electronics when used in conjunction with unconventional organic or inorganic semiconducting materials to fabricate organic field effect transistors (OFETs). The molecular dielectrics developed to date typically have limited dielectric response, which results in low capacitances, translating into poor performance of the resulting OFETs. Furthermore, the development of better performing dielectric materials has been hindered by the current highly empirical and labor-intensive pace of synthetic

  10. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  11. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave good indication about the readiness of the WLCG infrastructure with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated to be fully capable for performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  12. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  13. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  14. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  15. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  16. Improving the psychometric properties of dot-probe attention measures using response-based computation.

    Science.gov (United States)

    Evans, Travis C; Britton, Jennifer C

    2018-09-01

    Abnormal threat-related attention in anxiety disorders is most commonly assessed and modified using the dot-probe paradigm; however, poor psychometric properties of reaction-time measures may contribute to inconsistencies across studies. Typically, standard attention measures are derived using average reaction-times obtained in experimentally-defined conditions. However, current approaches based on experimentally-defined conditions are limited. In this study, the psychometric properties of a novel response-based computation approach to analyze dot-probe data are compared to standard measures of attention. 148 adults (19.19 ± 1.42 years, 84 women) completed a standardized dot-probe task including threatening and neutral faces. We generated both standard and response-based measures of attention bias, attentional orientation, and attentional disengagement. We compared overall internal consistency, number of trials necessary to reach internal consistency, test-retest reliability (n = 72), and criterion validity obtained using each approach. Compared to standard attention measures, response-based measures demonstrated uniformly high levels of internal consistency with relatively few trials and varying improvements in test-retest reliability. Additionally, response-based measures demonstrated specific evidence of anxiety-related associations above and beyond both standard attention measures and other confounds. Future studies are necessary to validate this approach in clinical samples. Response-based attention measures demonstrate superior psychometric properties compared to standard attention measures, which may improve the detection of anxiety-related associations and treatment-related changes in clinical samples. Copyright © 2018 Elsevier Ltd. All rights reserved.
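
    For reference, a minimal sketch (assumed, not taken from the study) of the standard condition-averaged attention-bias score that the response-based measures are compared against; the trial coding and names are illustrative:

        import numpy as np

        def standard_attention_bias(rt_ms, probe_at_threat):
            # Conventional dot-probe bias: mean RT when the probe replaces the neutral face
            # minus mean RT when it replaces the threat face; positive = vigilance toward threat.
            rt_ms = np.asarray(rt_ms, dtype=float)
            probe_at_threat = np.asarray(probe_at_threat, dtype=bool)
            return rt_ms[~probe_at_threat].mean() - rt_ms[probe_at_threat].mean()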

  17. COMPUTING

    CERN Multimedia

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences.  Operations Office Figure 6: Transfers from all sites in the last 90 days For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  18. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort was focused during the last period on the preparation and monitoring of the February tests of the Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads, and identify possible bottlenecks. Many tests are scheduled together with other VOs, allowing a full-scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  19. COMPUTING

    CERN Multimedia

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns lead by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier1 and Tier2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking the roles of a Computing Run Coordinator and regular computing shifts monitoring the services and infrastructure as well as interfacing to the data operations tasks are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  20. COMPUTING

    CERN Multimedia

    P. MacBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format and the samples were run through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  1. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing operation has been lower as the Run 1 samples are completing and smaller samples for upgrades and preparations are ramping up. Much of the computing activity is focusing on preparations for Run 2 and improvements in data access and flexibility of using resources. Operations Office Data processing was slow in the second half of 2013 with only the legacy re-reconstruction pass of 2011 data being processed at the sites.   Figure 1: MC production and processing was more in demand with a peak of over 750 Million GEN-SIM events in a single month.   Figure 2: The transfer system worked reliably and efficiently and transferred on average close to 520 TB per week with peaks at close to 1.2 PB.   Figure 3: The volume of data moved between CMS sites in the last six months   The tape utilisation was a focus for the operation teams with frequent deletion campaigns from deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...

  2. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

      Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users, we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently.  Operations Office Figure 2: Number of events per month, for 2012 Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  3. Considerations on command and response language features for a network of heterogeneous autonomous computers

    Science.gov (United States)

    Engelberg, N.; Shaw, C., III

    1984-01-01

    The design of a uniform command language to be used in a local area network of heterogeneous, autonomous nodes is considered. After examining the major characteristics of such a network, and after considering the profile of a scientist using the computers on the net as an investigative aid, a set of reasonable requirements for the command language are derived. Taking into account the possible inefficiencies in implementing a guest-layered network operating system and command language on a heterogeneous net, the authors examine command language naming, process/procedure invocation, parameter acquisition, help and response facilities, and other features found in single-node command languages, and conclude that some features may extend simply to the network case, others extend after some restrictions are imposed, and still others require modifications. In addition, it is noted that some requirements considered reasonable (user accounting reports, for example) demand further study before they can be efficiently implemented on a network of the sort described.

  4. Application of a brain-computer interface for person authentication using EEG responses to photo stimuli.

    Science.gov (United States)

    Mu, Zhendong; Yin, Jinhai; Hu, Jianfeng

    2018-01-01

    In this paper, a person authentication system that can effectively identify individuals by generating unique electroencephalogram signal features in response to self-face and non-self-face photos is presented. In order to achieve good stability, the position of the self-face photo in the sequence of visual stimuli, including first-occurrence and non-first-occurrence positions, is taken into account. In addition, a Fisher linear classification method and an event-related potential technique are adopted for feature analysis, yielding remarkably better outcomes than most of the existing methods in the field. The results have shown that EEG-based person authentication via a brain-computer interface can be considered a suitable approach for a biometric authentication system.

  5. Seismic Response Prediction of Buildings with Base Isolation Using Advanced Soft Computing Approaches

    Directory of Open Access Journals (Sweden)

    Mosbeh R. Kaloop

    2017-01-01

    Full Text Available Modeling response of structures under seismic loads is an important factor in Civil Engineering as it crucially affects the design and management of structures, especially for the high-risk areas. In this study, novel applications of advanced soft computing techniques are utilized for predicting the behavior of centrically braced frame (CBF buildings with lead-rubber bearing (LRB isolation system under ground motion effects. These techniques include least square support vector machine (LSSVM, wavelet neural networks (WNN, and adaptive neurofuzzy inference system (ANFIS along with wavelet denoising. The simulation of a 2D frame model and eight ground motions are considered in this study to evaluate the prediction models. The comparison results indicate that the least square support vector machine is superior to other techniques in estimating the behavior of smart structures.
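
    A minimal sketch of the least square support vector machine (LSSVM) step named above, in the usual formulation where training reduces to a single linear solve of the dual system; the RBF kernel, hyperparameter values and names are illustrative assumptions rather than the authors' configuration:

        import numpy as np

        def rbf_kernel(A, B, sigma=1.0):
            d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
            return np.exp(-d2 / (2.0 * sigma ** 2))

        def lssvm_fit(X, y, gamma=10.0, sigma=1.0):
            # Solve the (n+1) x (n+1) LSSVM system: [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y].
            n = len(y)
            A = np.zeros((n + 1, n + 1))
            A[0, 1:] = 1.0
            A[1:, 0] = 1.0
            A[1:, 1:] = rbf_kernel(X, X, sigma) + np.eye(n) / gamma
            sol = np.linalg.solve(A, np.concatenate(([0.0], np.asarray(y, dtype=float))))
            return sol[1:], sol[0]                      # alpha weights, bias

        def lssvm_predict(X_train, alpha, bias, X_new, sigma=1.0):
            return rbf_kernel(X_new, X_train, sigma) @ alpha + bias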

  6. A computational relationship between thalamic sensory neural responses and contrast perception.

    Science.gov (United States)

    Jiang, Yaoguang; Purushothaman, Gopathy; Casagrande, Vivien A

    2015-01-01

    Uncovering the relationship between sensory neural responses and perceptual decisions remains a fundamental problem in neuroscience. Decades of experimental and modeling work in the sensory cortex have demonstrated that a perceptual decision pool is usually composed of tens to hundreds of neurons, the responses of which are significantly correlated not only with each other, but also with the behavioral choices of an animal. Few studies, however, have measured neural activity in the sensory thalamus of awake, behaving animals. Therefore, it remains unclear how many thalamic neurons are recruited and how the information from these neurons is pooled at subsequent cortical stages to form a perceptual decision. In a previous study we measured neural activity in the macaque lateral geniculate nucleus (LGN) during a two alternative forced choice (2AFC) contrast detection task, and found that single LGN neurons were significantly correlated with the monkeys' behavioral choices, despite their relatively poor contrast sensitivity and a lack of overall interneuronal correlations. We have now computationally tested a number of specific hypotheses relating these measured LGN neural responses to the contrast detection behavior of the animals. We modeled the perceptual decisions with different numbers of neurons and using a variety of pooling/readout strategies, and found that the most successful model consisted of about 50-200 LGN neurons, with individual neurons weighted differentially according to their signal-to-noise ratios (quantified as d-primes). These results supported the hypothesis that in contrast detection the perceptual decision pool consists of multiple thalamic neurons, and that the response fluctuations in these neurons can influence contrast perception, with the more sensitive thalamic neurons likely to exert a greater influence.
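
    A minimal sketch of the d-prime-weighted pooling rule described above, in which each thalamic neuron's single-trial response is weighted by its sensitivity before a yes/no contrast decision is made; the names and the decision criterion are illustrative assumptions:

        import numpy as np

        def pooled_contrast_decision(responses, d_primes, criterion):
            # Weight each neuron's response in proportion to its signal-to-noise ratio (d').
            weights = np.asarray(d_primes, dtype=float)
            weights = weights / weights.sum()
            pooled = np.dot(weights, responses)
            # Report "target present" when the pooled decision variable exceeds the criterion.
            return pooled > criterion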

  7. Impulse-response analysis of planar computed tomography for nondestructive test

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Dae Cheon; Kim, Seung Ho; Kim, Ho Kyung [KAERI, Daejeon (Korea, Republic of)

    2016-05-15

    It has been reported that the use of radiation imaging, such as digital radiography, computed tomography (CT) and digital tomosynthesis (DTS), for nondestructive testing (NDT) is spreading widely. These methods have merits and demerits of their own in terms of image quality and inspection speed. Therefore, images from these methods for NDT should have acceptable image quality and be acquired at high speed. In this study, we quantitatively evaluate impulse responses of images reconstructed with filtered backprojection (FBP), which is most widely used in planar computed tomography (pCT) systems. We first evaluate image performance metrics related to contrast and depth resolution, and then we design a figure of merit that includes both image performance and system parameters, such as tube load and reconstruction speed. The final goal of this study is the application of these methods to nondestructive testing. In order to accomplish this, further study is needed: first, the results of the ASF for various numbers of views; second, the analysis of the modulation transfer function, noise power spectrum, and detective quantum efficiency for various angular ranges and numbers of views.
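
    A minimal sketch (assumed, not taken from the paper) of how a modulation transfer function could be estimated from a measured one-dimensional impulse-response profile of an FBP reconstruction; the names are illustrative:

        import numpy as np

        def mtf_from_impulse_profile(profile, pixel_pitch_mm):
            # Normalise the impulse (line-spread) profile to unit area ...
            lsf = np.asarray(profile, dtype=float)
            lsf = lsf / lsf.sum()
            # ... then the MTF is the magnitude of its Fourier transform, with MTF(0) = 1.
            mtf = np.abs(np.fft.rfft(lsf))
            freqs = np.fft.rfftfreq(lsf.size, d=pixel_pitch_mm)   # spatial frequency, cycles/mm
            return freqs, mtf / mtf[0]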

  8. A computer tool for a minimax criterion in binary response and heteroscedastic simple linear regression models.

    Science.gov (United States)

    Casero-Alonso, V; López-Fidalgo, J; Torsney, B

    2017-01-01

    Binary response models are used in many real applications. For these models the Fisher information matrix (FIM) is proportional to the FIM of a weighted simple linear regression model. The same is also true when the weight function has a finite integral. Thus, optimal designs for one binary model are also optimal for the corresponding weighted linear regression model. The main objective of this paper is to provide a tool for the construction of MV-optimal designs, minimizing the maximum of the variances of the estimates, for a general design space. MV-optimality is a potentially difficult criterion because of its nondifferentiability at equal variance designs. A methodology for obtaining MV-optimal designs where the design space is a compact interval [a, b] will be given for several standard weight functions. The methodology will allow us to build a user-friendly computer tool based on Mathematica to compute MV-optimal designs. Some illustrative examples will show a representation of MV-optimal designs in the Euclidean plane, taking a and b as the axes. The applet will be explained using two relevant models. In the first one the case of a weighted linear regression model is considered, where the weight function is directly chosen from a typical family. In the second example a binary response model is assumed, where the probability of the outcome is given by a typical probability distribution. Practitioners can use the provided applet to identify the solution and to know the exact support points and design weights. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  9. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for the WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, is provided. The GlideInWMS and components installation are now deployed at CERN, which is added to the GlideInWMS factory placed in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each other's time zones by monitoring/debugging pilot jobs sent from the facto...

  10. A statistical mechanical approach for the computation of the climatic response to general forcings

    Directory of Open Access Journals (Sweden)

    V. Lucarini

    2011-01-01

    Full Text Available The climate belongs to the class of non-equilibrium forced and dissipative systems, for which most results of quasi-equilibrium statistical mechanics, including the fluctuation-dissipation theorem, do not apply. In this paper we show for the first time how the Ruelle linear response theory, developed for studying rigorously the impact of perturbations on general observables of non-equilibrium statistical mechanical systems, can be applied with great success to analyze the climatic response to general forcings. The crucial value of the Ruelle theory lies in the fact that it allows to compute the response of the system in terms of expectation values of explicit and computable functions of the phase space averaged over the invariant measure of the unperturbed state. We choose as test bed a classical version of the Lorenz 96 model, which, in spite of its simplicity, has a well-recognized prototypical value as it is a spatially extended one-dimensional model and presents the basic ingredients, such as dissipation, advection and the presence of an external forcing, of the actual atmosphere. We recapitulate the main aspects of the general response theory and propose some new general results. We then analyze the frequency dependence of the response of both local and global observables to perturbations having localized as well as global spatial patterns. We derive analytically several properties of the corresponding susceptibilities, such as asymptotic behavior, validity of Kramers-Kronig relations, and sum rules, whose main ingredient is the causality principle. We show that all the coefficients of the leading asymptotic expansions as well as the integral constraints can be written as linear function of parameters that describe the unperturbed properties of the system, such as its average energy. Some newly obtained empirical closure equations for such parameters allow to define such properties as an explicit function of the unperturbed forcing
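
    A minimal sketch of the Lorenz 96 test bed named above, together with a brute-force estimate of the static response of a global observable (mean energy) to a small increment of the forcing; the integrator, parameter values and observable are illustrative assumptions rather than the authors' setup:

        import numpy as np

        def lorenz96_rhs(x, F):
            # dx_i/dt = (x_{i+1} - x_{i-2}) * x_{i-1} - x_i + F, with cyclic indices.
            return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + F

        def mean_energy(F, n=40, dt=0.01, steps=60000, burn=10000, seed=0):
            x = F + 0.01 * np.random.default_rng(seed).standard_normal(n)
            total, count = 0.0, 0
            for k in range(steps):
                # Classical 4th-order Runge-Kutta step.
                k1 = lorenz96_rhs(x, F)
                k2 = lorenz96_rhs(x + 0.5 * dt * k1, F)
                k3 = lorenz96_rhs(x + 0.5 * dt * k2, F)
                k4 = lorenz96_rhs(x + dt * k3, F)
                x = x + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
                if k >= burn:
                    total += 0.5 * np.dot(x, x)
                    count += 1
            return total / count

        # Finite-difference estimate of the long-time response d<E>/dF around F = 8.
        dF = 0.25
        response = (mean_energy(8.0 + dF) - mean_energy(8.0)) / dF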

  11. Neural and cortisol responses during play with human and computer partners in children with autism

    Science.gov (United States)

    Edmiston, Elliot Kale; Merkle, Kristen

    2015-01-01

    Children with autism spectrum disorder (ASD) exhibit impairment in reciprocal social interactions, including play, which can manifest as failure to show social preference or discrimination between social and nonsocial stimuli. To explore mechanisms underlying these deficits, we collected salivary cortisol from 42 children 8–12 years with ASD or typical development during a playground interaction with a confederate child. Participants underwent functional MRI during a prisoner’s dilemma game requiring cooperation or defection with a human (confederate) or computer partner. Search region of interest analyses were based on previous research (e.g. insula, amygdala, temporal parietal junction—TPJ). There were significant group differences in neural activation based on partner and response pattern. When playing with a human partner, children with ASD showed limited engagement of a social salience brain circuit during defection. Reduced insula activation during defection in the ASD children relative to TD children, regardless of partner type, was also a prominent finding. Insula and TPJ BOLD during defection was also associated with stress responsivity and behavior in the ASD group under playground conditions. Children with ASD engage social salience networks less than TD children during conditions of social salience, supporting a fundamental disturbance of social engagement. PMID:25552572

  12. Brushless DC motor control system responsive to control signals generated by a computer or the like

    Science.gov (United States)

    Packard, Douglas T. (Inventor); Schmitt, Donald E. (Inventor)

    1987-01-01

    A control system for a brushless DC motor responsive to digital control signals is disclosed. The motor includes a multiphase wound stator and a permanent magnet rotor. The rotor is arranged so that each phase winding, when energized from a DC source, will drive the rotor through a predetermined angular position or step. A commutation signal generator responsive to the shaft position provides a commutation signal for each winding. A programmable control signal generator such as a computer or microprocessor produces individual digital control signals for each phase winding. The control signals and commutation signals associated with each winding are applied to an AND gate for that phase winding. Each gate controls a switch connected in series with the associated phase winding and the DC source so that each phase winding is energized only when the commutation signal and the control signal associated with that phase winding are present. The motor shaft may be advanced one step at a time to a desired position by applying a predetermined number of control signals in the proper sequence to the AND gates and the torque generated by the motor may be regulated by applying a separate control signal to each AND gate which is pulse width modulated to control the total time that each switch connects its associated winding to the DC source during each commutation period.
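
    A minimal sketch of the gating logic described above, in which a phase winding is connected to the DC source only when both its commutation signal and its digital control signal are present; the names and the three-phase example are illustrative:

        def energized_phases(commutation, control):
            # Each winding's switch closes only when commutation AND control are both asserted.
            return [bool(c) and bool(g) for c, g in zip(commutation, control)]

        # Rotor position enables phase B; the controller commands phases A and B,
        # so only phase B is actually energized during this commutation period.
        print(energized_phases(commutation=[0, 1, 0], control=[1, 1, 0]))   # [False, True, False]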

  13. Computational model of dose response for low-LET-induced complex chromosomal aberrations

    International Nuclear Information System (INIS)

    Eidelman, Y.A.; Andreev, S.G.

    2015-01-01

    Experiments with full-colour mFISH chromosome painting have revealed high yield of radiation-induced complex chromosomal aberrations (CAs). The ratio of complex to simple aberrations is dependent on cell type and linear energy transfer. Theoretical analysis has demonstrated that the mechanism of CA formation as a result of interaction between lesions at a surface of chromosome territories does not explain high complexes-to-simples ratio in human lymphocytes. The possible origin of high yields of γ-induced complex CAs was investigated in the present work by computer simulation. CAs were studied on the basis of chromosome structure and dynamics modelling and the hypothesis of CA formation on nuclear centres. The spatial organisation of all chromosomes in a human interphase nucleus was predicted by simulation of mitosis-to-interphase chromosome structure transition. Two scenarios of CA formation were analysed, 'static' (existing in a nucleus prior to irradiation) centres and 'dynamic' (formed in response to irradiation) centres. The modelling results reveal that under certain conditions, both scenarios explain quantitatively the dose-response relationships for both simple and complex γ-induced inter-chromosomal exchanges observed by mFISH chromosome painting in the first post-irradiation mitosis in human lymphocytes. (authors)

  14. Computational Model of Antidepressant Response Heterogeneity as Multi-pathway Neuroadaptation

    Directory of Open Access Journals (Sweden)

    Mariam B. Camacho

    2017-12-01

    Full Text Available Current hypotheses cannot fully explain the clinically observed heterogeneity in antidepressant response. The therapeutic latency of antidepressants suggests that therapeutic outcomes are achieved not by the acute effects of the drugs, but rather by the homeostatic changes that occur as the brain adapts to their chronic administration. We present a computational model that represents the known interactions between the monoaminergic neurotransmitter-producing brain regions and associated non-monoaminergic neurotransmitter systems, and use the model to explore the possible ways in which the brain can homeostatically adjust to chronic antidepressant administration. The model also represents the neuron-specific neurotransmitter receptors that are known to adjust their strengths (expressions or sensitivities in response to chronic antidepressant administration, and neuroadaptation in the model occurs through sequential adjustments in these receptor strengths. The main result is that the model can reach similar levels of adaptation to chronic administration of the same antidepressant drug or combination along many different pathways, arriving correspondingly at many different receptor strength configurations, but not all of those adapted configurations are also associated with therapeutic elevations in monoamine levels. When expressed as the percentage of adapted configurations that are also associated with elevations in one or more of the monoamines, our modeling results largely agree with the percentage efficacy rates of antidepressants and antidepressant combinations observed in clinical trials. Our neuroadaptation model provides an explanation for the clinical reports of heterogeneous outcomes among patients chronically administered the same antidepressant drug regimen.

  15. The quantitative assessment of peri-implant bone responses using histomorphometry and micro-computed tomography.

    Science.gov (United States)

    Schouten, Corinne; Meijer, Gert J; van den Beucken, Jeroen J J P; Spauwen, Paul H M; Jansen, John A

    2009-09-01

    In the present study, the effects of implant design and surface properties on peri-implant bone response were evaluated with both conventional histomorphometry and micro-computed tomography (micro-CT), using two geometrically different dental implants (Screw type, St; Push-in, Pi) with or without surface modification (non-coated, CaP-coated, or CaP-coated+TGF-beta1). After 12 weeks of implantation in a goat femoral condyle model, peri-implant bone response was evaluated in three different zones (inner: 0-500 microm; middle: 500-1000 microm; and outer: 1000-1500 microm) around the implant. Results indicated superiority of conventional histomorphometry over micro-CT, as the latter is hampered by deficits in the discrimination at the implant/tissue interface. Beyond this interface, both analysis techniques can be regarded as complementary. Histomorphometrical analysis showed an overall higher bone volume around St compared to Pi implants, but no effects of surface modification were observed. St implants showed lowest bone volumes in the outer zone, whereas inner zones were lowest for Pi implants. These results imply that for Pi implants bone formation started from two different directions (contact- and distance osteogenesis). For St implants it was concluded that the undersized implantation technique and loosening of bone fragments compress the zones for contact and distant osteogenesis, thereby improving bone volume at the interface significantly.

  16. Topics in Modeling of Cochlear Dynamics: Computation, Response and Stability Analysis

    Science.gov (United States)

    Filo, Maurice G.

    This thesis touches upon several topics in cochlear modeling. Throughout the literature, mathematical models of the cochlea vary according to the degree of biological realism to be incorporated. This thesis casts the cochlear model as a continuous space-time dynamical system using operator language. This framework encompasses a wider class of cochlear models and makes the dynamics more transparent and easier to analyze before applying any numerical method to discretize space. In fact, several numerical methods are investigated to study the computational efficiency of the finite dimensional realizations in space. Furthermore, we study the effects of the active gain perturbations on the stability of the linearized dynamics. The stability analysis is used to explain possible mechanisms underlying spontaneous otoacoustic emissions and tinnitus. Dynamic Mode Decomposition (DMD) is introduced as a useful tool to analyze the response of nonlinear cochlear models. Cochlear response features are illustrated using DMD which has the advantage of explicitly revealing the spatial modes of vibrations occurring in the Basilar Membrane (BM). Finally, we address the dynamic estimation problem of BM vibrations using Extended Kalman Filters (EKF). Due to the limitations of noninvasive sensing schemes, such algorithms are inevitable to estimate the dynamic behavior of a living cochlea.
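
    A minimal sketch of the Dynamic Mode Decomposition step mentioned above, in its standard snapshot-matrix form; the rank truncation and names are illustrative assumptions:

        import numpy as np

        def dmd(snapshots, rank=None):
            # Split the snapshot matrix (states x time) into consecutive pairs X -> Y.
            X, Y = snapshots[:, :-1], snapshots[:, 1:]
            U, s, Vh = np.linalg.svd(X, full_matrices=False)
            if rank is not None:
                U, s, Vh = U[:, :rank], s[:rank], Vh[:rank]
            # Low-rank approximation of the linear propagator A in Y ~ A X.
            A_tilde = (U.conj().T @ Y @ Vh.conj().T) / s
            eigvals, W = np.linalg.eig(A_tilde)
            modes = U @ W                    # projected DMD modes (spatial patterns)
            return eigvals, modes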

  17. COMPUTING

    CERN Multimedia

    M. Kasemann

    CMS relies on a well functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental for site commissioning in order to increase availability of more sites such that they are available to participate in CSA07 and are ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned, it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a 4 times increase in throughput with respect to LCG Resource Broker is observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and to keep exercised all data transfer routes in the CMS PhE-DEx topology. Since mid-February, a transfer volume of about 12 P...

  18. Operational mesoscale atmospheric dispersion prediction using high performance parallel computing cluster for emergency response

    International Nuclear Information System (INIS)

    Srinivas, C.V.; Venkatesan, R.; Muralidharan, N.V.; Das, Someshwar; Dass, Hari; Eswara Kumar, P.

    2005-08-01

    An operational atmospheric dispersion prediction system is implemented on a cluster super computer for 'Online Emergency Response' for Kalpakkam nuclear site. The numerical system constitutes a parallel version of a nested grid meso-scale meteorological model MM5 coupled to a random walk particle dispersion model FLEXPART. The system provides 48 hour forecast of the local weather and radioactive plume dispersion due to hypothetical air borne releases in a range of 100 km around the site. The parallel code was implemented on different cluster configurations like distributed and shared memory systems. Results of MM5 run time performance for 1-day prediction are reported on all the machines available for testing. A reduction of 5 times in runtime is achieved using 9 dual Xeon nodes (18 physical/36 logical processors) compared to a single node sequential run. Based on the above run time results a cluster computer facility with 9-node Dual Xeon is commissioned at IGCAR for model operation. The run time of a triple nested domain MM5 is about 4 h for 24 h forecast. The system has been operated continuously for a few months and results were ported on the IMSc home page. Initial and periodic boundary condition data for MM5 are provided by NCMRWF, New Delhi. An alternative source is found to be NCEP, USA. These two sources provide the input data to the operational models at different spatial and temporal resolutions and using different assimilation methods. A comparative study on the results of forecast is presented using these two data sources for present operational use. Slight improvement is noticed in rainfall, winds, geopotential heights and the vertical atmospheric structure while using NCEP data probably because of its high spatial and temporal resolution. (author)

  19. Computer Simulation as a Tool for Assessing Decision-Making in Pandemic Influenza Response Training

    Directory of Open Access Journals (Sweden)

    James M Leaming

    2013-05-01

    Full Text Available Introduction: We sought to develop and test a computer-based, interactive simulation of a hypothetical pandemic influenza outbreak. Fidelity was enhanced with integrated video and branching decision trees, built upon the 2007 federal planning assumptions. We conducted a before-and-after study of the simulation's effectiveness in assessing participants' beliefs regarding their own hospitals' mass casualty incident preparedness. Methods: Development: Using a Delphi process, we finalized a simulation that presents more than 50 key decisions to 6 role-players on networked laptops in a conference area. The simulation played out an 8-week scenario, beginning with pre-incident decisions. Testing: Role-players and trainees (N=155) were facilitated to make decisions during the pandemic. Because decision responses vary, the simulation plays out differently, and a casualty counter quantifies hypothetical losses. The facilitator reviews and critiques key factors for casualty control, including effective communications, working with external organizations, development of internal policies and procedures, maintaining supplies and services, technical infrastructure support, public relations and training. Pre- and post-survey data were compared on trainees. Results: Post-simulation trainees indicated a greater likelihood of needing to improve their organization in terms of communications, mass casualty incident planning, public information and training. Participants also recognized which key factors required immediate attention at their own home facilities. Conclusion: The use of a computer simulation was effective in providing a facilitated environment for determining the perception of preparedness, evaluating general preparedness concepts and introducing participants to critical decisions involved in handling a regional pandemic influenza surge. [West J Emerg Med. 2013;14(3):236–242.]

  20. Cloud Computing for Science Data Processing in Support of Emergency Response

    Data.gov (United States)

    National Aeronautics and Space Administration — Cloud computing enables users to create virtual computers, each one with the optimal configuration of hardware and software for a job. The number of virtual...

  1. Computer-aided sperm analysis: a useful tool to evaluate patient's response to varicocelectomy.

    Science.gov (United States)

    Ariagno, Julia I; Mendeluk, Gabriela R; Furlan, María J; Sardi, M; Chenlo, P; Curi, Susana M; Pugliese, Mercedes N; Repetto, Herberto E; Cohen, Mariano

    2017-01-01

    Preoperative and postoperative sperm parameter values from infertile men with varicocele were analyzed by computer-aided sperm analysis (CASA) to assess if sperm characteristics improved after varicocelectomy. Semen samples of men with proven fertility (n = 38) and men with varicocele-related infertility (n = 61) were also analyzed. Conventional semen analysis was performed according to WHO (2010) criteria and a CASA system was employed to assess kinetic parameters and sperm concentration. Seminal parameters values in the fertile group were very far above from those of the patients, either before or after surgery. No significant improvement in the percentage normal sperm morphology (P = 0.10), sperm concentration (P = 0.52), total sperm count (P = 0.76), subjective motility (%) (P = 0.97) nor kinematics (P = 0.30) was observed after varicocelectomy when all groups were compared. Neither was significant improvement found in percentage normal sperm morphology (P = 0.91), sperm concentration (P = 0.10), total sperm count (P = 0.89) or percentage motility (P = 0.77) after varicocelectomy in paired comparisons of preoperative and postoperative data. Analysis of paired samples revealed that the total sperm count (P = 0.01) and most sperm kinetic parameters: curvilinear velocity (P = 0.002), straight-line velocity (P = 0.0004), average path velocity (P = 0.0005), linearity (P = 0.02), and wobble (P = 0.006) improved after surgery. CASA offers the potential for accurate quantitative assessment of each patient's response to varicocelectomy.

  2. A Multiscale Computational Model of the Response of Swine Epidermis After Acute Irradiation

    Science.gov (United States)

    Hu, Shaowen; Cucinotta, Francis A.

    2012-01-01

    Radiation exposure from Solar Particle Events can lead to very high skin dose for astronauts on exploration missions outside the protection of the Earth s magnetic field [1]. Assessing the detrimental effects to human skin under such adverse conditions could be predicted by conducting territorial experiments on animal models. In this study we apply a computational approach to simulate the experimental data of the radiation response of swine epidermis, which is closely similar to human epidermis [2]. Incorporating experimentally measured histological and cell kinetic parameters into a multiscale tissue modeling framework, we obtain results of population kinetics and proliferation index comparable to unirradiated and acutely irradiated swine experiments [3]. It is noted the basal cell doubling time is 10 to 16 days in the intact population, but drops to 13.6 hr in the regenerating populations surviving irradiation. This complex 30-fold variation is proposed to be attributed to the shortening of the G1 phase duration. We investigate this radiation induced effect by considering at the sub-cellular level the expression and signaling of TGF-beta, as it is recognized as a key regulatory factor of tissue formation and wound healing [4]. This integrated model will allow us to test the validity of various basic biological rules at the cellular level and sub-cellular mechanisms by qualitatively comparing simulation results with published research, and should lead to a fuller understanding of the pathophysiological effects of ionizing radiation on the skin.

  3. Computer simulation of the hydroelastic response of a pressurized water reactor to a sudden depressurization

    International Nuclear Information System (INIS)

    Dienes, J.K.; Hirt, C.W.; Stein, L.R.

    1977-03-01

    A computer program is being developed to analyze the response of the core support barrel to a sudden loss of coolant in a pressurized water reactor. This program, SOLA-FLX, combines SOLA-DF, a two-dimensional, two-phase, hydrodynamic code with FLX, a finite-difference code that integrates the Timoshenko equations of elastic shell motion. The programs are coupled so that the shell motion determined by FLX is used as a boundary condition by SOLA. In turn, the pressure determined by SOLA is the forcing term that controls the shell motion. An axisymmetric version was first developed to provide a basis for comparing with a simple set of experiments and to serve as a test case for the more general, unsymmetric version. The unsymmetric version is currently under development. The report describes the hydrodynamic code, the symmetric shell code, the unsymmetric shell code, and the method of coupling. Test problems used to verify the shell codes and coupled codes are also reported. Work is continuing to verify both the symmetric and unsymmetric codes by making comparisons with experimental data and with theoretical test problems

  4. Blinded prospective evaluation of computer-based mechanistic schizophrenia disease model for predicting drug response.

    Directory of Open Access Journals (Sweden)

    Hugo Geerts

    Full Text Available The tremendous advances in understanding the neurobiological circuits involved in schizophrenia have not translated into more effective treatments. An alternative strategy is to use a recently published 'Quantitative Systems Pharmacology' computer-based mechanistic disease model of cortical/subcortical and striatal circuits based upon preclinical physiology, human pathology and pharmacology. The physiology of 27 relevant dopamine, serotonin, acetylcholine, norepinephrine, gamma-aminobutyric acid (GABA) and glutamate-mediated targets is calibrated using retrospective clinical data on 24 different antipsychotics. The model was challenged to predict quantitatively the clinical outcome in a blinded fashion of two experimental antipsychotic drugs: JNJ37822681, a highly selective low-affinity dopamine D(2) antagonist, and ocaperidone, a very high affinity dopamine D(2) antagonist, using only pharmacology and human positron emission tomography (PET) imaging data. The model correctly predicted the lower performance of JNJ37822681 on the positive and negative syndrome scale (PANSS) total score and the higher extra-pyramidal symptom (EPS) liability compared to olanzapine and the relative performance of ocaperidone against olanzapine, but did not predict the absolute PANSS total score outcome and EPS liability for ocaperidone, possibly due to placebo responses and EPS assessment methods. Because of its virtual nature, this modeling approach can support central nervous system research and development by accounting for unique human drug properties, such as human metabolites, exposure, genotypes and off-target effects, and can be a helpful tool for drug discovery and development.

  5. Correlation of uptake patterns on single-photon emission computed tomography/computed tomography (SPECT/CT) and treatment response in patients with knee pain

    International Nuclear Information System (INIS)

    Koh, Geon; Hwang, Kyung Hoon; Lee, Hae Jin; Kim, Seog Gyun; Lee, Beom Koo

    2016-01-01

    To determine whether treatment response in patients with knee pain could be predicted using uptake patterns on single-photon emission computed tomography/computed tomography (SPECT/CT) images. Ninety-five patients with knee pain who had undergone SPECT/CT were included in this retrospective study. Subjects were divided into three groups: increased focal uptake (FTU), increased irregular tracer uptake (ITU), and no tracer uptake (NTU). A numeric rating scale (NRS-11) assessed pain intensity. We analyzed the association between uptake patterns and treatment response using Pearson's chi-square test and Fisher's exact test. Uptake was quantified from SPECT/CT with region of interest (ROI) counting, and an intraclass correlation coefficient (ICC) was used to assess agreement. We used Student's t-test to test for statistically significant differences in counts between groups and the Pearson correlation to measure the relationship between counts and initial NRS-11. Multivariate logistic regression analysis determined which variables were significantly associated with uptake. The FTU group included 32 patients; ITU, 39; and NTU, 24. With conservative management, 64 % of patients with increased tracer uptake (TU, both focal and irregular) and 36 % with NTU showed positive response. Conservative treatment response of FTU was better than that of NTU, but did not differ from that of ITU. Conservative treatment response of TU was significantly different from that of NTU (OR 3.1; p = 0.036). A moderate positive correlation was observed between ITU and initial NRS-11. Age and initial NRS-11 significantly predicted uptake. Patients with uptake in their knee(s) on SPECT/CT showed positive treatment response under conservative treatment.

  6. Correlation of uptake patterns on single-photon emission computed tomography/computed tomography (SPECT/CT) and treatment response in patients with knee pain

    Energy Technology Data Exchange (ETDEWEB)

    Koh, Geon; Hwang, Kyung Hoon; Lee, Hae Jin; Kim, Seog Gyun; Lee, Beom Koo [Gachon University Gil Hospital, Incheon (Korea, Republic of)

    2016-06-15

    To determine whether treatment response in patients with knee pain could be predicted using uptake patterns on single-photon emission computed tomography/computed tomography (SPECT/CT) images. Ninety-five patients with knee pain who had undergone SPECT/CT were included in this retrospective study. Subjects were divided into three groups: increased focal uptake (FTU), increased irregular tracer uptake (ITU), and no tracer uptake (NTU). A numeric rating scale (NRS-11) assessed pain intensity. We analyzed the association between uptake patterns and treatment response using Pearson's chi-square test and Fisher's exact test. Uptake was quantified from SPECT/CT with region of interest (ROI) counting, and an intraclass correlation coefficient (ICC) was used to assess agreement. We used Student's t-test to test for statistically significant differences in counts between groups and the Pearson correlation to measure the relationship between counts and initial NRS-11. Multivariate logistic regression analysis determined which variables were significantly associated with uptake. The FTU group included 32 patients; ITU, 39; and NTU, 24. With conservative management, 64 % of patients with increased tracer uptake (TU, both focal and irregular) and 36 % with NTU showed positive response. Conservative treatment response of FTU was better than that of NTU, but did not differ from that of ITU. Conservative treatment response of TU was significantly different from that of NTU (OR 3.1; p = 0.036). A moderate positive correlation was observed between ITU and initial NRS-11. Age and initial NRS-11 significantly predicted uptake. Patients with uptake in their knee(s) on SPECT/CT showed positive treatment response under conservative treatment.

  7. Adaptive Response in Female Fathead Minnows Exposed to an Aromatase Inhibitor: Computational Modeling of the Hypothalamic-Pituitary-Gonadal Axis

    Science.gov (United States)

    Exposure to endocrine disrupting chemicals can affect reproduction and development in both humans and wildlife. We are developing a mechanistic computational model of the hypothalamic-pituitary-gonadal (HPG) axis in female fathead minnows to predict dose-response and time-course ...

  8. A comparison of computational models with and without genotyping for prediction of response to second-line HIV therapy

    NARCIS (Netherlands)

    Revell, A. D.; Boyd, M. A.; Wang, D.; Emery, S.; Gazzard, B.; Reiss, P.; van Sighem, A. I.; Montaner, J. S.; Lane, H. C.; Larder, B. A.

    2014-01-01

    We compared the use of computational models developed with and without HIV genotype vs. genotyping itself to predict effective regimens for patients experiencing first-line virological failure. Two sets of models predicted virological response for 99 three-drug regimens for patients on a failing

  9. Social Skills Instruction for Urban Learners with Emotional and Behavioral Disorders: A Culturally Responsive and Computer-Based Intervention

    Science.gov (United States)

    Robinson-Ervin, Porsha; Cartledge, Gwendolyn; Musti-Rao, Shobana; Gibson, Lenwood, Jr.; Keyes, Starr E.

    2016-01-01

    This study examined the effects of culturally relevant/responsive, computer-based social skills instruction on the social skill acquisition and generalization of 6 urban African American sixth graders with emotional and behavioral disorders (EBD). A multiple-probe across participants design was used to evaluate the effects of the social skills…

  10. Becoming Technosocial Change Agents: Intersectionality and Culturally Responsive Pedagogies as Vital Resources for Increasing Girls' Participation in Computing

    Science.gov (United States)

    Ashcraft, Catherine; Eger, Elizabeth K.; Scott, Kimberly A.

    2017-01-01

    Drawing from our two-year ethnography, we juxtapose the experiences of two cohorts in one culturally responsive computing program, examining how the program fostered girls' emerging identities as technosocial change agents. In presenting this in-depth and up-close exploration, we simultaneously identify conditions that both facilitated and limited…

  11. Response functions for computing absorbed dose to skeletal tissues from neutron irradiation

    Science.gov (United States)

    Bahadori, Amir A.; Johnson, Perry; Jokisch, Derek W.; Eckerman, Keith F.; Bolch, Wesley E.

    2011-11-01

    Spongiosa in the adult human skeleton consists of three tissues—active marrow (AM), inactive marrow (IM) and trabecularized mineral bone (TB). AM is considered to be the target tissue for assessment of both long-term leukemia risk and acute marrow toxicity following radiation exposure. The total shallow marrow (TM50), defined as all tissues lying within the first 50 µm of the bone surfaces, is considered to be the radiation target tissue of relevance for radiogenic bone cancer induction. For irradiation by sources external to the body, kerma to homogeneous spongiosa has been used as a surrogate for absorbed dose to both of these tissues, as direct dose calculations are not possible using computational phantoms with homogenized spongiosa. Recent micro-CT imaging of a 40 year old male cadaver has allowed for the accurate modeling of the fine microscopic structure of spongiosa in many regions of the adult skeleton (Hough et al 2011 Phys. Med. Biol. 56 2309-46). This microstructure, along with associated masses and tissue compositions, was used to compute specific absorbed fraction (SAF) values for protons originating in axial and appendicular bone sites (Jokisch et al 2011 Phys. Med. Biol. 56 6857-72). These proton SAFs, bone masses, tissue compositions and proton production cross sections, were subsequently used to construct neutron dose-response functions (DRFs) for both AM and TM50 targets in each bone of the reference adult male. Kerma conditions were assumed for other resultant charged particles. For comparison, AM, TM50 and spongiosa kerma coefficients were also calculated. At low incident neutron energies, AM kerma coefficients for neutrons correlate well with values of the AM DRF, while total marrow (TM) kerma coefficients correlate well with values of the TM50 DRF. At high incident neutron energies, all kerma coefficients and DRFs tend to converge as charged-particle equilibrium is established across the bone site. In the range of 10 eV to 100 Me
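
    A minimal sketch (assumed, not taken from the paper) of how such a dose-response function would typically be applied, folding an incident neutron fluence spectrum with the tabulated DRF for a skeletal target tissue to obtain the absorbed dose; the units and names are illustrative:

        import numpy as np

        def absorbed_dose_from_drf(energy_MeV, fluence_per_MeV_cm2, drf_Gy_cm2):
            # Dose to the target tissue = integral over energy of fluence(E) * DRF(E),
            # evaluated here with a simple trapezoidal rule on the tabulated grid.
            integrand = np.asarray(fluence_per_MeV_cm2) * np.asarray(drf_Gy_cm2)
            return float(np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(energy_MeV)))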

  12. WE-FG-207B-02: Material Reconstruction for Spectral Computed Tomography with Detector Response Function

    International Nuclear Information System (INIS)

    Liu, J; Gao, H

    2016-01-01

    Purpose: Different from conventional computed tomography (CT), spectral CT based on energy-resolved photon-counting detectors is able to provide unprecedented material composition information. However, an important missing piece for accurate spectral CT is to incorporate the detector response function (DRF), which is distorted by factors such as pulse pileup and charge-sharing. In this work, we propose material reconstruction methods for spectral CT with DRF. Methods: The polyenergetic X-ray forward model takes the DRF into account for accurate material reconstruction. Two image reconstruction methods are proposed: a direct method based on the nonlinear data fidelity from the DRF-based forward model, and a linear-data-fidelity based method that relies on spectral rebinning so that the corresponding DRF matrix is invertible. The image reconstruction problem is then regularized with an isotropic TV term and solved by the alternating direction method of multipliers. Results: The simulation results suggest that the proposed methods provided more accurate material compositions than the standard method without DRF. Moreover, the proposed method with linear data fidelity had improved reconstruction quality compared with the proposed method with nonlinear data fidelity. Conclusion: We have proposed material reconstruction methods for spectral CT with DRF, which provided more accurate material compositions than the standard methods without DRF. Moreover, the proposed method with linear data fidelity had improved reconstruction quality compared with the proposed method with nonlinear data fidelity. Jiulong Liu and Hao Gao were partially supported by the NSFC (#11405105), the 973 Program (#2015CB856000), and the Shanghai Pujiang Talent Program (#14PJ1404500).
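
    The role of the DRF in the forward model can be illustrated with a short sketch. The example below is a minimal, hypothetical toy model (the spectrum, attenuation curves and response matrix are invented for illustration and are not taken from the paper): detected counts in each recorded energy bin are obtained by attenuating a polyenergetic source spectrum with the material line integrals and then applying a detector response matrix that maps true photon energies to recorded bins.

    ```python
    import numpy as np

    # Hypothetical energy grid, source spectrum, and two basis materials
    energies = np.linspace(20, 120, 101)                   # keV
    source = np.exp(-0.5 * ((energies - 70) / 20) ** 2)    # toy X-ray spectrum
    mu = np.vstack([0.2 * (60 / energies) ** 3,            # toy attenuation, material 1
                    0.05 * (60 / energies) ** 1.5])        # toy attenuation, material 2

    def detected_counts(thickness, drf):
        """Polyenergetic forward model: attenuate the spectrum by the material
        line integrals, then apply the detector response function (DRF), which
        maps true photon energies to recorded energy bins."""
        transmitted = source * np.exp(-mu.T @ thickness)   # ideal spectrum at detector
        return drf @ transmitted                           # counts per recorded bin

    # Toy DRF: Gaussian blurring of the true energy axis into 5 recorded bins
    bin_centers = np.linspace(30, 110, 5)
    drf = np.exp(-0.5 * ((bin_centers[:, None] - energies[None, :]) / 10) ** 2)
    drf /= drf.sum(axis=1, keepdims=True)

    print(detected_counts(np.array([1.0, 2.0]), drf))
    ```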

  13. Computer-assisted detection (CAD) methodology for early detection of response to pharmaceutical therapy in tuberculosis patients

    Science.gov (United States)

    Lieberman, Robert; Kwong, Heston; Liu, Brent; Huang, H. K.

    2009-02-01

    The chest x-ray radiological features of tuberculosis patients are well documented, and the radiological features that change in response to successful pharmaceutical therapy can be followed with longitudinal studies over time. The patients can also be classified as either responsive or resistant to pharmaceutical therapy based on clinical improvement. We have retrospectively collected time series chest x-ray images of 200 patients diagnosed with tuberculosis receiving the standard pharmaceutical treatment. Computer algorithms can be created to utilize image texture features to assess the temporal changes in the chest x-rays of the tuberculosis patients. This methodology provides a framework for a computer-assisted detection (CAD) system that may provide physicians with the ability to detect poor treatment response earlier in pharmaceutical therapy. Early detection allows physicians to respond with more timely treatment alternatives and improved outcomes. Such a system has the potential to increase treatment efficacy for millions of patients each year.

  14. Computer-mediated communication and time pressure induce higher cardiovascular responses in the preparatory and execution phases of cooperative tasks.

    Science.gov (United States)

    Costa Ferrer, Raquel; Serrano Rosa, Miguel Ángel; Zornoza Abad, Ana; Salvador Fernández-Montejo, Alicia

    2010-11-01

    The cardiovascular (CV) response to social challenge and stress is associated with the etiology of cardiovascular diseases. New ways of communication, time pressure and different types of information are common in our society. In this study, the cardiovascular response to two different tasks (open vs. closed information) was examined employing different communication channels (computer-mediated vs. face-to-face) and with different pace control (self vs. external). Our results indicate that there was a higher CV response in the computer-mediated condition, on the closed information task and in the externally paced condition. The role of these factors should be considered when studying the consequences of social stress and their underlying mechanisms.

  15. INTRANS. A computer code for the non-linear structural response analysis of reactor internals under transient loads

    International Nuclear Information System (INIS)

    Ramani, D.T.

    1977-01-01

    The 'INTRANS' system is a general-purpose computer code designed to perform linear and non-linear structural stress and deflection analysis of impacting or non-impacting nuclear reactor internals components coupled with the reactor vessel, shield building and external as well as internal gapped spring support systems. This paper describes in general a unique computational procedure for evaluating the dynamic response of reactor internals, discretised as a beam and lumped-mass structural system and subjected to external transient loads such as seismic and LOCA time-history forces. The computational procedure is outlined in the INTRANS code, which computes component flexibilities of a discrete lumped-mass planar model of reactor internals by idealising an assemblage of finite elements consisting of linear elastic beams with bending, torsional and shear stiffnesses, interacting with external or internal linear as well as non-linear multi-gapped spring supports. The method of analysis is based on the displacement method, and the code uses the fourth-order Runge-Kutta numerical integration technique as the basis for solving the dynamic equilibrium equations of motion of the system. During the computing process, the dynamic response of each lumped mass is calculated at each instant of time using a well-known step-by-step procedure. At any instant of time, the transient dynamic motions of the system are held stationary and are based on the predicted motions and internal forces of the previous instant, from which the complete response at any time-step of interest may then be computed. Using this iterative process, the relationship between motions and internal forces is satisfied step by step throughout the time interval.
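
    The step-by-step solution of the dynamic equilibrium equations described above can be sketched generically. The following is a minimal illustration (not the INTRANS formulation; the matrices and load are invented) of one fourth-order Runge-Kutta step for a lumped-mass system M x'' + C x' + K x = F(t):

    ```python
    import numpy as np

    def rk4_step(x, v, t, dt, M, C, K, force):
        """One fourth-order Runge-Kutta step for M x'' + C x' + K x = F(t),
        advancing displacement x and velocity v of a lumped-mass model."""
        Minv = np.linalg.inv(M)

        def accel(x, v, t):
            return Minv @ (force(t) - C @ v - K @ x)

        k1x, k1v = v,               accel(x, v, t)
        k2x, k2v = v + 0.5*dt*k1v,  accel(x + 0.5*dt*k1x, v + 0.5*dt*k1v, t + 0.5*dt)
        k3x, k3v = v + 0.5*dt*k2v,  accel(x + 0.5*dt*k2x, v + 0.5*dt*k2v, t + 0.5*dt)
        k4x, k4v = v + dt*k3v,      accel(x + dt*k3x, v + dt*k3v, t + dt)

        x_new = x + dt/6.0 * (k1x + 2*k2x + 2*k3x + k4x)
        v_new = v + dt/6.0 * (k1v + 2*k2v + 2*k3v + k4v)
        return x_new, v_new

    # Two-mass toy model driven by a short transient load
    M = np.diag([1.0, 2.0]); C = 0.05 * np.eye(2)
    K = np.array([[4.0, -2.0], [-2.0, 2.0]])
    force = lambda t: np.array([np.sin(10*t), 0.0]) if t < 1.0 else np.zeros(2)
    x, v = np.zeros(2), np.zeros(2)
    for step in range(1000):
        x, v = rk4_step(x, v, step*1e-3, 1e-3, M, C, K, force)
    print(x)
    ```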

  16. Improvements in fast-response flood modeling: desktop parallel computing and domain tracking

    Energy Technology Data Exchange (ETDEWEB)

    Judi, David R [Los Alamos National Laboratory]; Mcpherson, Timothy N [Los Alamos National Laboratory]; Burian, Steven J [Univ. of Utah]

    2009-01-01

    It is becoming increasingly important to have the ability to accurately forecast flooding, as flooding accounts for the largest losses from natural disasters both worldwide and in the United States. Flood inundation modeling has been dominated by one-dimensional approaches. These models are computationally efficient and are considered by many engineers to produce reasonably accurate water surface profiles. However, because the profiles estimated in these models must be superimposed on digital elevation data to create a two-dimensional map, the result may be sensitive to the ability of the elevation data to capture relevant features (e.g. dikes/levees, roads, walls, etc.). Moreover, one-dimensional models do not explicitly represent the complex flow processes present in floodplains and urban environments; because two-dimensional models based on the shallow water equations have a significantly greater ability to determine flow velocity and direction, the National Research Council (NRC) has recommended that two-dimensional models be used over one-dimensional models for flood inundation studies. This paper has shown that two-dimensional flood modeling computational time can be greatly reduced through the use of Java multithreading on multi-core computers, which effectively provides a means for parallel computing on a desktop computer. In addition, this paper has shown that when desktop parallel computing is coupled with a domain tracking algorithm, significant computation time can be eliminated when computations are completed only on inundated cells. The drastic reduction in computational time shown here enhances the ability of two-dimensional flood inundation models to be used as a near-real-time flood forecasting, engineering design, or planning tool. Perhaps even of greater significance, the reduction in computation time makes the incorporation of risk and uncertainty/ensemble forecasting more feasible for flood inundation modeling (NRC 2000; Sayers et al
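
    A minimal sketch of the domain tracking idea, assuming a hypothetical regular grid and a placeholder cell update (the real model solves the shallow water equations): only cells that are wet, or that border a wet cell, are updated, and the active rows are split across a small thread pool in the spirit of the paper's Java multithreading (Python threads are used here purely for illustration).

    ```python
    import numpy as np
    from concurrent.futures import ThreadPoolExecutor

    def update_active_cells(depth, dt, threshold=1e-3, workers=4):
        """Hypothetical 'domain tracking' step: only wet cells (or cells that
        border a wet cell) are updated, and the row blocks are processed by a
        small thread pool."""
        wet = depth > threshold
        # Grow the wet mask by one cell so water can spread to dry neighbours
        active = wet.copy()
        active[1:, :] |= wet[:-1, :]; active[:-1, :] |= wet[1:, :]
        active[:, 1:] |= wet[:, :-1]; active[:, :-1] |= wet[:, 1:]

        new_depth = depth.copy()

        def relax_rows(rows):
            # Placeholder update: simple diffusion of depth on active cells only
            for i in rows:
                for j in np.nonzero(active[i])[0]:
                    nbrs = depth[max(i-1, 0):i+2, max(j-1, 0):j+2]
                    new_depth[i, j] = depth[i, j] + dt * (nbrs.mean() - depth[i, j])

        row_blocks = np.array_split(np.arange(depth.shape[0]), workers)
        with ThreadPoolExecutor(max_workers=workers) as pool:
            list(pool.map(relax_rows, row_blocks))
        return new_depth

    depth = np.zeros((200, 200)); depth[100, 100] = 2.0   # initial flood source
    for _ in range(10):
        depth = update_active_cells(depth, dt=0.1)
    print(depth.max(), (depth > 1e-3).sum())
    ```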

  17. Development of methodology and computer programs for the ground response spectrum and the probabilistic seismic hazard analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Joon Kyoung [Semyung Univ., Research Institute of Industrial Science and Technol , Jecheon (Korea, Republic of)

    1996-12-15

    The objective of this study is to investigate and develop the methodologies and corresponding computer codes, compatible with the domestic seismological and geological environments, for estimating the ground response spectrum and the probabilistic seismic hazard. Using the PSHA computer program, the cumulative probability functions (CPDF) and probability functions (PDF) of the annual exceedance have been investigated for the analysis of the uncertainty space of the annual probability at ten seismic hazard levels of interest (0.1 g to 0.99 g). The cumulative probability functions and probability functions of the annual exceedance have also been compared to the results from the different input parameter spaces.

  18. Detection of advance item knowledge using response times in computer adaptive testing

    NARCIS (Netherlands)

    Meijer, R.R.; Sotaridona, Leonardo

    2006-01-01

    We propose a new method for detecting item preknowledge in a CAT based on an estimate of “effective response time” for each item. Effective response time is defined as the time required for an individual examinee to answer an item correctly. An unusually short response time relative to the expected

  19. Positron emission tomography/computed tomography and biomarkers for early treatment response evaluation in metastatic colon cancer

    DEFF Research Database (Denmark)

    Engelmann, Bodil E.; Loft, Annika; Kjær, Andreas

    2014-01-01

    BACKGROUND: Treatment options for metastatic colon cancer (mCC) are widening. We prospectively evaluated serial 2-deoxy-2-[18F]fluoro-d-glucose positron-emission tomography/computed tomography (PET/CT) and measurements of tissue inhibitor of metalloproteinases-1 (TIMP-1), carcinoembryonic antigen...... evaluated by PET/CT before treatment, after one and four treatment series. Morphological and metabolic response was independently assessed according to Response Evaluation Criteria in Solid Tumors and European Organization for Research and Treatment of Cancer PET criteria. Plasma TIMP-1, plasma u...

  20. Novel application of quantitative single-photon emission computed-tomography/computed tomography to predict early response to methimazole in Graves' disease

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hyun Joo; Bang, Ji In; Kim, Ji Young; Moon, Jae Hoon [Seoul National University Bundang Hospital, Seoul National University College of Medicine, Seongnam (Korea, Republic of); So, Young [Dept. of Nuclear Medicine, Konkuk University Medical Center, Seoul (Korea, Republic of); Lee, Won Woo [Institute of Radiation Medicine, Medical Research Center, Seoul National University, Seoul (Korea, Republic of)

    2017-06-15

    Since Graves' disease (GD) is resistant to antithyroid drugs (ATDs), an accurate quantitative thyroid function measurement is required for the prediction of early responses to ATD. Quantitative parameters derived from the novel technology, single-photon emission computed tomography/computed tomography (SPECT/CT), were investigated for the prediction of achievement of euthyroidism after methimazole (MMI) treatment in GD. A total of 36 GD patients (10 males, 26 females; mean age, 45.3 ± 13.8 years) were enrolled for this study, from April 2015 to January 2016. They underwent quantitative thyroid SPECT/CT 20 minutes post-injection of {sup 99m}Tc-pertechnetate (5 mCi). Association between the time to biochemical euthyroidism after MMI treatment and uptake, standardized uptake value (SUV), functional thyroid mass (SUVmean × thyroid volume) from the SPECT/CT, and clinical/biochemical variables, were investigated. GD patients had a significantly greater %uptake (6.9 ± 6.4%) than historical control euthyroid patients (n = 20, 0.8 ± 0.5%, p < 0.001) from the same quantitative SPECT/CT protocol. Euthyroidism was achieved in 14 patients at 156 ± 62 days post-MMI treatment, but 22 patients had still not achieved euthyroidism by the last follow-up time-point (208 ± 80 days). In the univariate Cox regression analysis, the initial MMI dose (p = 0.014), %uptake (p = 0.015), and functional thyroid mass (p = 0.016) were significant predictors of euthyroidism in response to MMI treatment. However, only uptake remained significant in a multivariate Cox regression analysis (p = 0.034). A uptake cutoff of 5.0% dichotomized the faster responding versus the slower responding GD patients (p = 0.006). A novel parameter of thyroid uptake from quantitative SPECT/CT is a predictive indicator of an early response to MMI in GD patients.

  1. Research on computer aided testing of pilot response to critical in-flight events

    Science.gov (United States)

    Giffin, W. C.; Rockwell, T. H.; Smith, P. J.

    1984-01-01

    Experiments on pilot decision making are described. The development of models of pilot decision making in critical in-flight events (CIFE) is emphasized. Progress is reported on the development of: (1) a frame system representation describing how pilots use their knowledge in a fault diagnosis task; (2) assessment of script norms, distance measures, and Markov models developed from computer-aided testing (CAT) data; and (3) performance ranking of subject data. It is demonstrated that interactive computer-aided testing, either by touch CRTs or personal computers, is a useful research and training device for measuring pilot information management in diagnosing system failures in simulated flight situations. Performance is dictated by knowledge of aircraft subsystems, initial pilot structuring of the failure symptoms and efficient testing of plausible causal hypotheses.

  2. High fidelity computational characterization of the mechanical response of thermally aged polycarbonate

    Science.gov (United States)

    Zhang, Zesheng; Zhang, Lili; Jasa, John; Li, Wenlong; Gazonas, George; Negahban, Mehrdad

    2017-07-01

    A representative all-atom molecular dynamics (MD) system of polycarbonate (PC) is built and conditioned to capture and predict the behaviours of PC in response to a broad range of thermo-mechanical loadings for various thermal aging. The PC system is constructed to have a distribution of molecular weights comparable to a widely used commercial PC (LEXAN 9034), and thermally conditioned to produce models for aged and unaged PC. The MD responses of these models are evaluated through comparisons to existing experimental results carried out at much lower loading rates, but done over a broad range of temperatures and loading modes. These experiments include monotonic extension/compression/shear, unilaterally and bilaterally confined compression, and load-reversal during shear. It is shown that the MD simulations show both qualitative and quantitative similarity with the experimental response. The quantitative similarity is evaluated by comparing the dilatational response under bilaterally confined compression, the shear flow viscosity and the equivalent yield stress. The consistency of the in silico response to real laboratory experiments strongly suggests that the current PC models are physically and mechanically relevant and potentially can be used to investigate thermo-mechanical response to loading conditions that would not easily be possible. These MD models may provide valuable insight into the molecular sources of certain observations, and could possibly offer new perspectives on how to develop constitutive models that are based on better understanding the response of PC under complex loadings. To this latter end, the models are used to predict the response of PC to complex loading modes that would normally be difficult to do or that include characteristics that would be difficult to measure. These include the responses of unaged and aged PC to unilaterally confined extension/compression, cyclic uniaxial/shear loadings, and saw-tooth extension/compression/shear.

  3. Speed-accuracy trade-offs in computing spatial impulse responses for simulating medical ultrasound imaging

    DEFF Research Database (Denmark)

    Jensen, Jørgen Arendt

    2001-01-01

    Medical ultrasound imaging can be simulated realistically using linear acoustics. One of the most powerful approaches is to employ spatial impulse responses. Hereby both emitted fields and pulse-echo responses from point scatterers can be determined. Also any kind of dynamic focusing... A high sampling frequency is unnecessary in the final signals, since the transducers used in medical ultrasound are band limited. Approaches to reduce the sampling frequency are thus needed to make simulation programs efficient. Field II uses time integration of the spatial impulse responses using a continuous...
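
    As a hedged illustration of the linear-acoustics formulation referred to above, the sketch below forms a pulse-echo signal from a single point scatterer as the temporal convolution of the excitation, the transducer's electro-mechanical impulse response, and the transmit and receive spatial impulse responses. All waveforms are invented toy signals, not Field II output.

    ```python
    import numpy as np

    fs = 100e6                          # sampling frequency [Hz]
    t = np.arange(0, 2e-6, 1/fs)

    # Toy signals: excitation pulse, electro-mechanical impulse response, and
    # spatial impulse responses of the transmit and receive apertures at one
    # scatterer position (all invented for illustration).
    f0 = 5e6
    excitation = np.sin(2*np.pi*f0*t) * np.exp(-((t - 0.3e-6) / 0.1e-6) ** 2)
    em_impulse = np.sin(2*np.pi*f0*t) * np.exp(-((t - 0.2e-6) / 0.08e-6) ** 2)
    h_tx = np.exp(-((t - 0.5e-6) / 0.05e-6) ** 2)   # transmit spatial impulse response
    h_rx = np.exp(-((t - 0.5e-6) / 0.05e-6) ** 2)   # receive spatial impulse response

    def pulse_echo(excitation, em_impulse, h_tx, h_rx, fs):
        """Pulse-echo voltage from a point scatterer in the linear model:
        v(t) = excitation * em_impulse * h_tx * h_rx (temporal convolutions)."""
        v = excitation
        for h in (em_impulse, h_tx, h_rx):
            v = np.convolve(v, h) / fs   # 1/fs keeps the continuous-time scaling
        return v

    v = pulse_echo(excitation, em_impulse, h_tx, h_rx, fs)
    print(len(v), np.abs(v).max())
    ```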

  4. Computation-Guided Design of a Stimulus-Responsive Multienzyme Supramolecular Assembly.

    Science.gov (United States)

    Yang, Lu; Dolan, Elliott M; Tan, Sophia K; Lin, Tianyun; Sontag, Eduardo D; Khare, Sagar D

    2017-10-18

    The construction of stimulus-responsive supramolecular complexes of metabolic pathway enzymes, inspired by natural multienzyme assemblies (metabolons), provides an attractive avenue for efficient and spatiotemporally controllable one-pot biotransformations. We have constructed a phosphorylation- and optically responsive metabolon for the biodegradation of the environmental pollutant 1,2,3-trichloropropane.

  5. SONATINA-1: a computer program for seismic response analysis of column in HTGR core

    International Nuclear Information System (INIS)

    Ikushima, Takeshi

    1980-11-01

    A computer program, SONATINA-1, for predicting the behavior of a prismatic high-temperature gas-cooled reactor (HTGR) core under seismic excitation has been developed. In this analytical method, blocks are treated as rigid bodies and are constrained by dowel pins, which restrict relative horizontal movement but allow vertical and rocking motions. Coulomb friction between blocks and between dowel holes and pins is also considered. A spring-dashpot model is used for the collision process between adjacent blocks and between blocks and boundary walls. Analytical results are compared with experimental results and are found to be in good agreement. The computer program can be used to predict the behavior of the HTGR core under seismic excitation. (author)
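
    The spring-dashpot collision model mentioned above can be sketched as a simple penalty contact law. The snippet below is only an illustration under assumed parameter values (it is not the SONATINA-1 formulation): a compressive contact force acts while two blocks overlap, and Coulomb friction opposes sliding.

    ```python
    def impact_force(gap, rel_vel, k=1.0e8, c=1.0e4):
        """Spring-dashpot (penalty) contact force between two adjacent blocks.
        A compressive force acts only while the gap is closed (gap < 0); the
        force is clipped at zero so the contact never pulls the blocks together."""
        if gap >= 0.0:
            return 0.0
        force = -k * gap - c * rel_vel      # gap < 0 means overlap
        return max(force, 0.0)

    def friction_force(normal_force, slip_vel, mu=0.3, eps=1e-6):
        """Coulomb friction opposing the sliding velocity, regularized near zero
        slip so the force is well defined when the block is (almost) sticking."""
        return -mu * normal_force * slip_vel / (abs(slip_vel) + eps)

    # Example: two blocks with 1 mm overlap closing at 0.2 m/s
    N = impact_force(gap=-1.0e-3, rel_vel=-0.2)
    print(N, friction_force(N, slip_vel=0.05))
    ```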

  6. Computer-aided sperm analysis: a useful tool to evaluate patient's response to varicocelectomy

    OpenAIRE

    Ariagno, Julia I; Mendeluk, Gabriela R; Furlan, María J; Sardi, M; Chenlo, P; Curi, Susana M; Pugliese, Mercedes N; Repetto, Herberto E; Cohen, Mariano

    2016-01-01

    Preoperative and postoperative sperm parameter values from infertile men with varicocele were analyzed by computer-aided sperm analysis (CASA) to assess if sperm characteristics improved after varicocelectomy. Semen samples of men with proven fertility (n = 38) and men with varicocele-related infertility (n = 61) were also analyzed. Conventional semen analysis was performed according to WHO (2010) criteria and a CASA system was employed to assess kinetic parameters and sperm concentration. Se...

  7. Computational micromechanics analysis of electron hopping and interfacial damage induced piezoresistive response in carbon nanotube-polymer nanocomposites

    International Nuclear Information System (INIS)

    Chaurasia, A K; Seidel, G D; Ren, X

    2014-01-01

    Carbon nanotube (CNT)-polymer nanocomposites have been observed to exhibit an effective macroscale piezoresistive response, i.e., change in macroscale resistivity when subjected to applied deformation. The macroscale piezoresistive response of CNT-polymer nanocomposites leads to deformation/strain sensing capabilities. It is believed that the nanoscale phenomenon of electron hopping is the major driving force behind the observed macroscale piezoresistivity of such nanocomposites. Additionally, CNT-polymer nanocomposites provide damage sensing capabilities because of local changes in electron hopping pathways at the nanoscale because of initiation/evolution of damage. The primary focus of the current work is to explore the effect of interfacial separation and damage at the nanoscale CNT-polymer interface on the effective macroscale piezoresistive response. Interfacial separation and damage are allowed to evolve at the CNT-polymer interface through coupled electromechanical cohesive zones, within a finite element based computational micromechanics framework, resulting in electron hopping based current density across the separated CNT-polymer interface. The macroscale effective material properties and gauge factors are evaluated using micromechanics techniques based on electrostatic energy equivalence. The impact of the electron hopping mechanism, nanoscale interface separation and damage evolution on the effective nanocomposite electrostatic and piezoresistive response is studied in comparison with the perfectly bonded interface. The effective electrostatic/piezoresistive response for the perfectly bonded interface is obtained based on a computational micromechanics model developed in the authors’ earlier work. It is observed that the macroscale effective gauge factors are highly sensitive to strain induced formation/disruption of electron hopping pathways, interface separation and the initiation/evolution of interfacial damage. (paper)

  8. Computed neutron response of spherical moderator-detector systems for radiation protection monitoring

    International Nuclear Information System (INIS)

    Dhairyawan, M.P.

    1979-01-01

    Neutrons of energies below 500 keV are important from the point of view of radiation protection of personnel working around reactors. However, as no neutron sources are available at lower energies, no measured values of neutron energy response are available between thermal and 0.5 MeV (except for the Sb-Be source at 24 keV). The response functions in this range are, therefore, arrived at theoretically. After giving a comprehensive review of the work done in the field of the response of moderated neutron detectors, a Monte Carlo method developed for this purpose is described and used to calculate energy response functions of two spherical moderator-detector systems, namely, one using a central BF3 counter and the other using a 6LiI(Eu) scintillator of 0.490 dia crystal. The polythene sphere diameter ranged from 2'' to 12''. The results obtained follow the trend predicted by other calculations and experiments, but are a definite improvement over them, because the most recent data on cross sections and angular distributions are used and the opacity of the detector, i.e. the presence and size of the detector within the moderator, is taken into account in the present calculations. The reasons for the discrepancies between the present results and those obtained earlier by other methods are discussed. The response of the Leake counter arrived at by the present method agrees very well with the experimental calibration. (M.G.B.)

  9. A computational model of inferior colliculus responses to amplitude modulated sounds in young and aged rats

    Directory of Open Access Journals (Sweden)

    Cal Francis Rabang

    2012-11-01

    Full Text Available The inferior colliculus (IC) receives ascending excitatory and inhibitory inputs from multiple sources, but how these auditory inputs converge to generate IC spike patterns is poorly understood. Simulating patterns of in vivo spike train data from cellular and synaptic models creates a powerful framework to identify factors that contribute to changes in IC responses, such as those resulting in age-related loss of temporal processing. A conductance-based single neuron IC model was constructed, and its responses were compared to those observed during in vivo IC recordings in rats. IC spike patterns were evoked using amplitude-modulated (AM) tone or noise carriers at 20-40 dB above threshold and were classified as low-pass, band-pass, band-reject, all-pass, or complex based on their rate modulation transfer function (rMTF) tuning shape. Their temporal modulation transfer functions (tMTFs) were also measured. These spike patterns provided experimental measures of rate, vector strength and firing pattern for comparison with model outputs. Patterns of excitatory and inhibitory synaptic convergence to IC neurons were based on anatomical studies and generalized input tuning for modulation frequency. Responses of modeled ascending inputs were derived from experimental data from previous studies. Adapting and sustained IC intrinsic models were created, with adaptation created via calcium-activated potassium currents. Short-term synaptic plasticity was incorporated into the model in the form of synaptic depression, which was shown to have a substantial effect on the magnitude and time course of the IC response. The most commonly observed IC response subtypes were recreated and enabled dissociation of inherited response properties from those that were generated in IC. Furthermore, the model was used to make predictions about the consequences of reduction in inhibition for age-related loss of temporal processing due to a reduction in GABA seen anatomically with

  10. The Coda of the Transient Response in a Sensitive Cochlea: A Computational Modeling Study.

    Directory of Open Access Journals (Sweden)

    Yizeng Li

    2016-07-01

    Full Text Available In a sensitive cochlea, the basilar membrane response to transient excitation of any kind (normal acoustic or artificial intracochlear excitation) consists of not only a primary impulse but also a coda of delayed secondary responses with varying amplitudes but similar spectral content around the characteristic frequency of the measurement location. The coda, sometimes referred to as echoes or ringing, has been described as a form of local, short-term memory which may influence the ability of the auditory system to detect gaps in an acoustic stimulus such as speech. Depending on the individual cochlea, the temporal gap between the primary impulse and the following coda ranges from once to thrice the group delay of the primary impulse (the group delay of the primary impulse is on the order of a few hundred microseconds). The coda is physiologically vulnerable, disappearing when the cochlea is compromised even slightly. The multicomponent sensitive response is not yet completely understood. We use a physiologically-based, mathematical model to investigate (i) the generation of the primary impulse response and the dependence of the group delay on the various stimulation methods, and (ii) the effect of spatial perturbations in the properties of mechanically sensitive ion channels on the generation and separation of delayed secondary responses. The model suggests that the presence of the secondary responses depends on the wavenumber content of a perturbation and the activity level of the cochlea. In addition, the model shows that the varying temporal gaps between adjacent coda seen in experiments depend on the individual profiles of perturbations. Implications for non-invasive cochlear diagnosis are also discussed.

  11. Computer-aided global breast MR image feature analysis for prediction of tumor response to chemotherapy: performance assessment

    Science.gov (United States)

    Aghaei, Faranak; Tan, Maxine; Hollingsworth, Alan B.; Zheng, Bin; Cheng, Samuel

    2016-03-01

    Dynamic contrast-enhanced breast magnetic resonance imaging (DCE-MRI) has been used increasingly in breast cancer diagnosis and assessment of cancer treatment efficacy. In this study, we applied a computer-aided detection (CAD) scheme to automatically segment breast regions depicted on MR images and used the kinetic image features computed from the global breast MR images acquired before neoadjuvant chemotherapy to build a new quantitative model to predict the response of breast cancer patients to the chemotherapy. To assess the performance and robustness of this new prediction model, an image dataset involving breast MR images acquired from 151 cancer patients before undergoing neoadjuvant chemotherapy was retrospectively assembled and used. Among them, 63 patients had a "complete response" (CR) to chemotherapy, in which the enhanced contrast levels inside the tumor volume (pre-treatment) were reduced to the level of the normal enhanced background parenchymal tissues (post-treatment), while 88 patients had a "partial response" (PR), in which high contrast enhancement remained in the tumor regions after treatment. We analyzed the correlation among the 22 global kinetic image features and then selected a set of 4 optimal features. Applying an artificial neural network trained with the fusion of these 4 kinetic image features, the prediction model yielded an area under the ROC curve (AUC) of 0.83+/-0.04. This study demonstrated that, by avoiding tumor segmentation, which is often difficult and unreliable, fusion of kinetic image features computed from global breast MR images can also generate a useful clinical marker in predicting the efficacy of chemotherapy.
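
    A hedged sketch of the modelling step described above: a small neural network is trained on four features and scored with the area under the ROC curve. The data here are randomly generated stand-ins (151 hypothetical patients, 4 features), and scikit-learn's MLPClassifier is used as a generic substitute for the paper's network.

    ```python
    import numpy as np
    from sklearn.model_selection import cross_val_predict
    from sklearn.neural_network import MLPClassifier
    from sklearn.preprocessing import StandardScaler
    from sklearn.pipeline import make_pipeline
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(0)
    # Hypothetical stand-in data: 151 patients, 4 selected kinetic image features,
    # label 1 = complete response (CR), 0 = partial response (PR).
    X = rng.normal(size=(151, 4))
    y = (X @ np.array([0.8, -0.5, 0.3, 0.6]) + rng.normal(scale=1.0, size=151)) > 0

    model = make_pipeline(
        StandardScaler(),
        MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0),
    )
    # Cross-validated probabilities give a less optimistic estimate of the AUC
    proba = cross_val_predict(model, X, y.astype(int), cv=5, method="predict_proba")[:, 1]
    print("AUC:", roc_auc_score(y, proba))
    ```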

  12. Computed tomographic detection of sinusitis responsible for intracranial and extracranial infections

    International Nuclear Information System (INIS)

    Carter, B.L.; Bankoff, M.S.; Fisk, J.D.

    1983-01-01

    Computed tomography (CT) is now used extensively for the evaluation of orbital, facial, and intracranial infections. Nine patients are presented to illustrate the importance of detecting underlying and unsuspected sinusitis. Prompt treatment of the sinusitis is essential to minimize the morbidity and mortality associated with complications such as brain abscess, meningitis, orbital cellulitis, and osteomyelitis. A review of the literature documents the persistence of these complications despite the widespread use of antibiotic therapy. Recognition of the underlying sinusitis is now possible with CT if the region of the sinuses is included and bone-window settings are used during the examination of patients with orbital and intracranial infection

  13. ROCKING. A computer program for seismic response analysis of radioactive materials transport AND/OR storage casks

    International Nuclear Information System (INIS)

    Ikushima, Takeshi

    1995-11-01

    The computer program ROCKING has been developed for seismic response analysis, including rocking and sliding behavior, of radioactive materials transport and/or storage casks. The main features of ROCKING are as follows: (1) the cask is treated as a rigid body; (2) rocking and sliding behavior are considered; (3) impact forces are represented by a spring-dashpot model located at the impact points; (4) the friction force is calculated at the interface between the cask and the floor; (5) forces of wire ropes against tip-over act only as tensile loads. In the paper, the calculation model, the calculation equations, validation calculations and the user's manual are presented. (author)

  14. COMPUGIRLS' Standpoint: Culturally Responsive Computing and Its Effect on Girls of Color

    Science.gov (United States)

    Scott, Kimberly A.; White, Mary Aleta

    2013-01-01

    This article investigates the motivations of African American and Latino girls ("N" = 41) who navigate urban Southwest school districts during the day, but voluntarily attend a 2-year, culturally responsive multimedia program after school and into the summer. Understanding that girls from economically disadvantaged settings are indeed…

  15. An Explanatory Item Response Theory Approach for a Computer-Based Case Simulation Test

    Science.gov (United States)

    Kahraman, Nilüfer

    2014-01-01

    Problem: Practitioners working with multiple-choice tests have long utilized Item Response Theory (IRT) models to evaluate the performance of test items for quality assurance. The use of similar applications for performance tests, however, is often encumbered due to the challenges encountered in working with complicated data sets in which local…

  16. Computing level-impulse responses of log-specified VAR systems

    NARCIS (Netherlands)

    Wieringa, J.E.; Horvath, C.

    2005-01-01

    Impulse response functions (IRFs) are often used to analyze the dynamic behavior of a vector autoregressive (VAR) system. In many applications of VAR modelling, the variables are log-transformed before the model is estimated. If this is the case, the results of the IRFs do not have a direct
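
    The point the (truncated) abstract makes, that IRFs from a log-specified VAR cannot be read directly as level responses, can be illustrated with a short, hypothetical example: assuming the VAR is estimated in log-levels, an impulse response of the log variable has to be exponentiated and scaled by a baseline level before it can be interpreted in the original units. All numbers below are invented.

    ```python
    import numpy as np

    # Hypothetical impulse response of log(sales) to a one-unit shock, taken from
    # an estimated VAR in logs (values invented for illustration).
    irf_log = np.array([0.04, 0.025, 0.012, 0.005, 0.001, 0.0])
    baseline_level = 500.0   # pre-shock level of the variable in original units

    # Because the VAR is in logs, irf_log measures (approximate) percentage
    # responses; the exact level response at each horizon is obtained by
    # exponentiating before subtracting the baseline.
    level_response_exact = baseline_level * (np.exp(irf_log) - 1.0)
    level_response_approx = baseline_level * irf_log   # linear approximation

    print(level_response_exact)
    print(level_response_approx)
    ```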

  17. High Efficiency Computation of the Variances of Structural Evolutionary Random Responses

    Directory of Open Access Journals (Sweden)

    J.H. Lin

    2000-01-01

    Full Text Available For structures subjected to stationary or evolutionary white/colored random noise, their various response variances satisfy algebraic or differential Lyapunov equations. The solution of these Lyapunov equations used to be very difficult. A precise integration method is proposed in the present paper, which solves such Lyapunov equations accurately and very efficiently.
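
    For the stationary white-noise case mentioned above, the governing algebraic Lyapunov equation can be illustrated with a short sketch (this uses SciPy's general-purpose solver rather than the precise integration method proposed in the paper, and the oscillator parameters are invented): the stationary covariance P of the state of a linear system driven by white noise satisfies A P + P A^T + Q = 0.

    ```python
    import numpy as np
    from scipy.linalg import solve_continuous_lyapunov

    # Single-degree-of-freedom oscillator driven by stationary white noise:
    # state x = [displacement, velocity], dx/dt = A x + w, E[w w'] = Q delta(t-s).
    omega, zeta, S0 = 2.0 * np.pi, 0.05, 1.0
    A = np.array([[0.0, 1.0],
                  [-omega**2, -2.0*zeta*omega]])
    Q = np.array([[0.0, 0.0],
                  [0.0, 2.0*np.pi*S0]])   # white-noise intensity on the velocity equation

    # Stationary response covariance P satisfies A P + P A^T + Q = 0.
    P = solve_continuous_lyapunov(A, -Q)
    print("displacement variance:", P[0, 0])
    print("velocity variance:    ", P[1, 1])
    ```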

  18. Emergency Response Equipment and Related Training: Airborne Radiological Computer System (Model II)

    Energy Technology Data Exchange (ETDEWEB)

    David P. Colton

    2007-02-28

    The materials included in the Airborne Radiological Computer System, Model-II (ARCS-II) were assembled with several considerations in mind. First, the system was designed to measure and record the airborne gamma radiation levels and the corresponding latitude and longitude coordinates, and to provide a first overview look of the extent and severity of an accident's impact. Second, the portable system had to be light enough and durable enough that it could be mounted in an aircraft, ground vehicle, or watercraft. Third, the system must control the collection and storage of the data, as well as provide a real-time display of the data collection results to the operator. The notebook computer and color graphics printer components of the system would only be used for analyzing and plotting the data. In essence, the provided equipment is composed of an acquisition system and an analysis system. The data can be transferred from the acquisition system to the analysis system at the end of the data collection or at some other agreeable time.

  19. Emergency Response Equipment and Related Training: Airborne Radiological Computer System (Model II) user's manual

    International Nuclear Information System (INIS)

    David P. Colton

    2007-01-01

    The materials included in the Airborne Radiological Computer System, Model-II (ARCS-II) were assembled with several considerations in mind. First, the system was designed to measure and record the airborne gamma radiation levels and the corresponding latitude and longitude coordinates, and to provide a first overview look of the extent and severity of an accident's impact. Second, the portable system had to be light enough and durable enough that it could be mounted in an aircraft, ground vehicle, or watercraft. Third, the system must control the collection and storage of the data, as well as provide a real-time display of the data collection results to the operator. The notebook computer and color graphics printer components of the system would only be used for analyzing and plotting the data. In essence, the provided equipment is composed of an acquisition system and an analysis system. The data can be transferred from the acquisition system to the analysis system at the end of the data collection or at some other agreeable time

  20. Integral transport computation of gamma detector response with the CPM2 code

    International Nuclear Information System (INIS)

    Jones, D.B.

    1989-12-01

    CPM-2 Version 3 is an enhanced version of the CPM-2 lattice physics computer code which supports the capabilities to (1) perform a two-dimensional gamma flux calculation and (2) perform Restart/Data file maintenance operations. The Gamma Calculation Module implemented in CPM-2 was first developed for EPRI in the CASMO-1 computer code by Studsvik Energiteknik under EPRI Agreement RP2352-01. The gamma transport calculation uses the CPM-HET code module to calculate the transport of gamma rays in two dimensions in a mixed cylindrical-rectangular geometry, where the basic fuel assembly and component regions are maintained in a rectangular geometry, but the fuel pins are represented as cylinders within a square pin cell mesh. Such a capability is needed to represent gamma transport in an essentially transparent medium containing spatially distributed ''black'' cylindrical pins. Under a subcontract to RP2352-01, RPI developed the gamma production and gamma interaction library used for gamma calculation. The CPM-2 gamma calculation was verified against reference results generated by Studsvik using the CASMO-1 program. The CPM-2 Restart/Data file maintenance capabilities provide the user with options to copy files between Restart/Data tapes and to purge files from the Restart/Data tapes

  1. Combining Human Computing and Machine Learning to Make Sense of Big (Aerial) Data for Disaster Response.

    Science.gov (United States)

    Ofli, Ferda; Meier, Patrick; Imran, Muhammad; Castillo, Carlos; Tuia, Devis; Rey, Nicolas; Briant, Julien; Millet, Pauline; Reinhard, Friedrich; Parkan, Matthew; Joost, Stéphane

    2016-03-01

    Aerial imagery captured via unmanned aerial vehicles (UAVs) is playing an increasingly important role in disaster response. Unlike satellite imagery, aerial imagery can be captured and processed within hours rather than days. In addition, the spatial resolution of aerial imagery is an order of magnitude higher than the imagery produced by the most sophisticated commercial satellites today. Both the United States Federal Emergency Management Agency (FEMA) and the European Commission's Joint Research Center (JRC) have noted that aerial imagery will inevitably present a big data challenge. The purpose of this article is to get ahead of this future challenge by proposing a hybrid crowdsourcing and real-time machine learning solution to rapidly process large volumes of aerial data for disaster response in a time-sensitive manner. Crowdsourcing can be used to annotate features of interest in aerial images (such as damaged shelters and roads blocked by debris). These human-annotated features can then be used to train a supervised machine learning system to learn to recognize such features in new unseen images. In this article, we describe how this hybrid solution for image analysis can be implemented as a module (i.e., Aerial Clicker) to extend an existing platform called Artificial Intelligence for Disaster Response (AIDR), which has already been deployed to classify microblog messages during disasters using its Text Clicker module and in response to Cyclone Pam, a category 5 cyclone that devastated Vanuatu in March 2015. The hybrid solution we present can be applied to both aerial and satellite imagery and has applications beyond disaster response such as wildlife protection, human rights, and archeological exploration. As a proof of concept, we recently piloted this solution using very high-resolution aerial photographs of a wildlife reserve in Namibia to support rangers with their wildlife conservation efforts (SAVMAP project, http://lasig.epfl.ch/savmap ). The

  2. Computation of USGS Soil UHS and Comparison to NEHRP and PC 1 Seismic Response; FINAL

    International Nuclear Information System (INIS)

    Lee, R.C.

    2000-01-01

    Recently, new site-specific seismic design response spectra were developed for Savannah River Site (SRS) performance category (PC) 1, 2, 3 and 4 structures, systems and components (SSCs) (WSRC, 1997, 1998) in accordance with DOE Standards. The site-specific design basis for the lower performance categories (PC1 and PC2) was not compatible with the response spectrum generated if building code guidelines were used (National Earthquake Hazard Reduction Program Recommended Provisions for Seismic Regulations for New Buildings (NEHRP), 1997). These differences in criteria and approach should be documented and understood. Thus, Westinghouse Savannah River Company (WSRC) initiated this study to evaluate the difference between the building code hazard assessment (NEHRP) and the site-specific hazard evaluations used for SRS design.

  3. Performance Evaluation of Residential Demand Response Based on a Modified Fuzzy VIKOR and Scalable Computing Method

    Directory of Open Access Journals (Sweden)

    Jun Dong

    2018-04-01

    Full Text Available To make better use of renewable energy resources and improve the sustainability of power systems, demand response has been widely applied in China, especially in recent decades. Considering the massive potential of flexible resources in the residential sector, demand response programs are able to achieve significant benefits. This paper proposes an effective performance evaluation framework for such programs aimed at residential customers. In general, the evaluation process faces multiple criteria and some uncertain factors. Therefore, we combine the multi-criteria decision making concept and fuzzy set theory to establish the model. By introducing trapezoidal fuzzy numbers into the Vlsekriterijumska Optimizacija I Kompromisno Resenje (VIKOR) method, the evaluation model can effectively deal with the subjectivity and fuzziness of experts' opinions. Furthermore, we improve the criteria weight determination procedure of traditional models by combining the fuzzy Analytic Hierarchy Process and the Shannon entropy method, which can incorporate both objective information and subjective judgments. Finally, the proposed evaluation framework is verified by the empirical analysis of five demand response projects in Chinese residential areas. The results give a valid performance ranking of the five alternatives and indicate that more attention should be paid to the criteria associated with technology level and economic benefits. In addition, a series of sensitivity analyses is conducted to examine the validity and effectiveness of the established evaluation framework and results. The study improves the traditional multi-criteria decision making method VIKOR by introducing trapezoidal fuzzy numbers and a combination weighting technique, which can provide an effective means for the performance evaluation of residential demand response programs in a fuzzy environment.
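
    The objective (entropy) half of the combined weighting step can be sketched briefly. The following is a minimal illustration with an invented 5-project by 4-criterion score matrix; the fuzzy AHP part that supplies the subjective weights, and the fuzzy VIKOR ranking itself, are not reproduced here.

    ```python
    import numpy as np

    def entropy_weights(decision_matrix):
        """Shannon-entropy criteria weights: criteria whose scores vary more
        across the alternatives carry more information and get larger weights."""
        X = np.asarray(decision_matrix, dtype=float)
        P = X / X.sum(axis=0)                        # normalize each criterion column
        n = X.shape[0]
        entropy = -(P * np.log(P + 1e-12)).sum(axis=0) / np.log(n)
        divergence = 1.0 - entropy
        return divergence / divergence.sum()

    # Hypothetical scores of 5 demand response projects on 4 criteria
    scores = np.array([[7, 5, 8, 6],
                       [6, 6, 7, 7],
                       [8, 4, 6, 5],
                       [5, 7, 9, 6],
                       [7, 6, 7, 8]])
    print(entropy_weights(scores))
    ```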

  4. A study of the responses of neutron dose equivalent survey meters with computer codes

    International Nuclear Information System (INIS)

    Sartori, D.E.; Beer, G.P. de

    1983-01-01

    The ANISN and DOT discrete-ordinates radiation transport codes for one and two dimensions have proved to be effective and simple techniques for studying the response of dose equivalent neutron detectors. Comparisons between the results of an experimental calibration of the Harwell 95/0075 survey meter and the calculated results showed satisfactory agreement, considering the different techniques and sources of error involved. Possible improvements in the methods and designs, and causes of error, are discussed. (author)

  5. Computational modeling predicts the ionic mechanism of late-onset responses in Unipolar Brush Cells

    Directory of Open Access Journals (Sweden)

    Sathyaa Subramaniyam

    2014-08-01

    Full Text Available Unipolar Brush Cells (UBCs) have been suggested to have a strong impact on cerebellar granular layer functioning, yet the corresponding cellular mechanisms remain poorly understood. UBCs have recently been reported to generate, in addition to early-onset glutamatergic synaptic responses, a late-onset response (LOR) composed of a slow depolarizing ramp followed by a spike burst (Locatelli et al., 2013). The LOR activates as a consequence of synaptic activity and involves an intracellular cascade modulating H- and TRP-current gating. In order to assess the LOR mechanisms, we have developed a UBC multi-compartmental model (including soma, dendrite, initial segment and axon) incorporating biologically realistic representations of ionic currents and a generic coupling mechanism regulating TRP and H channel gating. The model finely reproduced UBC responses to current injection, including a low-threshold spike sustained by CaLVA currents, a persistent discharge sustained by CaHVA currents, and a rebound burst following hyperpolarization sustained by H- and CaLVA-currents. Moreover, the model predicted that H- and TRP-current regulation was necessary and sufficient to generate the LOR and its dependence on the intensity and duration of mossy fiber activity. Therefore, the model showed that, using a basic set of ionic channels, UBCs generate a rich repertoire of delayed bursts, which could take part in the formation of tunable delay-lines in the local microcircuit.

  6. Computational modeling predicts the ionic mechanism of late-onset responses in unipolar brush cells.

    Science.gov (United States)

    Subramaniyam, Sathyaa; Solinas, Sergio; Perin, Paola; Locatelli, Francesca; Masetto, Sergio; D'Angelo, Egidio

    2014-01-01

    Unipolar Brush Cells (UBCs) have been suggested to play a critical role in cerebellar functioning, yet the corresponding cellular mechanisms remain poorly understood. UBCs have recently been reported to generate, in addition to early-onset glutamate receptor-dependent synaptic responses, a late-onset response (LOR) composed of a slow depolarizing ramp followed by a spike burst (Locatelli et al., 2013). The LOR activates as a consequence of synaptic activity and involves an intracellular cascade modulating H- and TRP-current gating. In order to assess the LOR mechanisms, we have developed a UBC multi-compartmental model (including soma, dendrite, initial segment, and axon) incorporating biologically realistic representations of ionic currents and a cytoplasmic coupling mechanism regulating TRP and H channel gating. The model finely reproduced UBC responses to current injection, including a burst triggered by a low-threshold spike (LTS) sustained by CaLVA currents, a persistent discharge sustained by CaHVA currents, and a rebound burst following hyperpolarization sustained by H- and CaLVA-currents. Moreover, the model predicted that H- and TRP-current regulation was necessary and sufficient to generate the LOR and its dependence on the intensity and duration of mossy fiber activity. Therefore, the model showed that, using a basic set of ionic channels, UBCs generate a rich repertoire of bursts, which could effectively implement tunable delay-lines in the local microcircuit.

  7. Transcription-based prediction of response to IFNbeta using supervised computational methods.

    Directory of Open Access Journals (Sweden)

    Sergio E Baranzini

    2005-01-01

    Full Text Available Changes in cellular functions in response to drug therapy are mediated by specific transcriptional profiles resulting from the induction or repression in the activity of a number of genes, thereby modifying the preexisting gene activity pattern of the drug-targeted cell(s). Recombinant human interferon beta (rIFNbeta) is routinely used to control exacerbations in multiple sclerosis patients with only partial success, mainly because of adverse effects and a relatively large proportion of nonresponders. We applied advanced data-mining and predictive modeling tools to a longitudinal 70-gene expression dataset generated by kinetic reverse-transcription PCR from 52 multiple sclerosis patients treated with rIFNbeta to discover higher-order predictive patterns associated with treatment outcome and to define the molecular footprint that rIFNbeta engraves on peripheral blood mononuclear cells. We identified nine sets of gene triplets whose expression, when tested before the initiation of therapy, can predict the response to interferon beta with up to 86% accuracy. In addition, time-series analysis revealed potential key players involved in a good or poor response to interferon beta. Statistical testing of a random outcome class and tolerance to noise was carried out to establish the robustness of the predictive models. Large-scale kinetic reverse-transcription PCR, coupled with advanced data-mining efforts, can effectively reveal preexisting and drug-induced gene expression signatures associated with therapeutic effects.

  8. Sensorimotor rhythm-based brain-computer interface training: the impact on motor cortical responsiveness

    Science.gov (United States)

    Pichiorri, F.; De Vico Fallani, F.; Cincotti, F.; Babiloni, F.; Molinari, M.; Kleih, S. C.; Neuper, C.; Kübler, A.; Mattia, D.

    2011-04-01

    The main purpose of electroencephalography (EEG)-based brain-computer interface (BCI) technology is to provide an alternative channel to support communication and control when motor pathways are interrupted. Despite the considerable amount of research focused on the improvement of EEG signal detection and translation into output commands, little is known about how learning to operate a BCI device may affect brain plasticity. This study investigated if and how sensorimotor rhythm-based BCI training would induce persistent functional changes in motor cortex, as assessed with transcranial magnetic stimulation (TMS) and high-density EEG. Motor imagery (MI)-based BCI training in naïve participants led to a significant increase in motor cortical excitability, as revealed by post-training TMS mapping of the hand muscle's cortical representation; peak amplitude and volume of the motor evoked potentials recorded from the opponens pollicis muscle were significantly higher only in those subjects who develop a MI strategy based on imagination of hand grasping to successfully control a computer cursor. Furthermore, analysis of the functional brain networks constructed using a connectivity matrix between scalp electrodes revealed a significant decrease in the global efficiency index for the higher-beta frequency range (22-29 Hz), indicating that the brain network changes its topology with practice of hand grasping MI. Our findings build the neurophysiological basis for the use of non-invasive BCI technology for monitoring and guidance of motor imagery-dependent brain plasticity and thus may render BCI a viable tool for post-stroke rehabilitation.

  9. Computer program for analysis of hemodynamic response to head-up tilt test

    Science.gov (United States)

    Świątek, Eliza; Cybulski, Gerard; Koźluk, Edward; Piątkowska, Agnieszka; Niewiadomski, Wiktor

    2014-11-01

    The aim of this work was to create a computer program, written in the MATLAB environment, which enables the visualization and analysis of hemodynamic parameters recorded during a passive tilt test using the CNS Task Force Monitor System. The application was created to help in the assessment of the relationship between the values and dynamics of changes in the selected parameters and the risk of orthostatic syncope. The signal analysis included: R-R intervals (RRI), heart rate (HR), systolic blood pressure (sBP), diastolic blood pressure (dBP), mean blood pressure (mBP), stroke volume (SV), stroke index (SI), cardiac output (CO), cardiac index (CI), total peripheral resistance (TPR), total peripheral resistance index (TPRI), left ventricular ejection time (LVET) and thoracic fluid content (TFC). The program enables the user to visualize waveforms for a selected parameter and to perform smoothing with selected moving average parameters. It allows one to construct the graph of means for any range, and the Poincaré plot for a selected time range. The program automatically determines the average value of the parameter before tilt, its minimum and maximum values immediately after the change of position, and the times of their occurrence. It is possible to correct the automatically detected points manually. For the RR interval, it determines the acceleration index (AI) and the brake index (BI). It is possible to save the calculated values to an XLS file with a name specified by the user. The application has a user-friendly graphical interface and can run on a computer that has no MATLAB software.
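
    A minimal sketch of the kind of processing described above (moving-average smoothing, then locating the pre-tilt mean and the post-tilt extrema of a parameter), written in Python rather than MATLAB and using an invented heart-rate trace; the function and variable names are hypothetical and not taken from the program.

    ```python
    import numpy as np

    def moving_average(signal, window):
        """Centered moving-average smoothing applied before locating extrema."""
        kernel = np.ones(window) / window
        return np.convolve(signal, kernel, mode="same")

    def tilt_response(signal, time, tilt_time, window=5):
        """Return the pre-tilt mean and the post-tilt minimum/maximum of a
        hemodynamic parameter together with their times of occurrence."""
        smooth = moving_average(signal, window)
        pre = smooth[time < tilt_time]
        post = smooth[time >= tilt_time]
        post_t = time[time >= tilt_time]
        i_min, i_max = np.argmin(post), np.argmax(post)
        return {"pre_mean": pre.mean(),
                "post_min": (post[i_min], post_t[i_min]),
                "post_max": (post[i_max], post_t[i_max])}

    # Hypothetical heart-rate trace sampled at 1 Hz with a tilt at t = 60 s
    t = np.arange(0, 180.0)
    hr = 65 + 8*(t >= 60)*np.exp(-(t - 60)/30) + np.random.default_rng(1).normal(0, 1, t.size)
    print(tilt_response(hr, t, tilt_time=60.0))
    ```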

  10. Clinical responses to ERK inhibition in BRAFV600E-mutant colorectal cancer predicted using a computational model.

    Science.gov (United States)

    Kirouac, Daniel C; Schaefer, Gabriele; Chan, Jocelyn; Merchant, Mark; Orr, Christine; Huang, Shih-Min A; Moffat, John; Liu, Lichuan; Gadkar, Kapil; Ramanujan, Saroja

    2017-01-01

    Approximately 10% of colorectal cancers harbor BRAF V600E mutations, which constitutively activate the MAPK signaling pathway. We sought to determine whether ERK inhibitor (GDC-0994)-containing regimens may be of clinical benefit to these patients based on data from in vitro (cell line) and in vivo (cell- and patient-derived xenograft) studies of cetuximab (EGFR), vemurafenib (BRAF), cobimetinib (MEK), and GDC-0994 (ERK) combinations. Preclinical data was used to develop a mechanism-based computational model linking cell surface receptor (EGFR) activation, the MAPK signaling pathway, and tumor growth. Clinical predictions of anti-tumor activity were enabled by the use of tumor response data from three Phase 1 clinical trials testing combinations of EGFR, BRAF, and MEK inhibitors. Simulated responses to GDC-0994 monotherapy (overall response rate = 17%) accurately predicted results from a Phase 1 clinical trial regarding the number of responding patients (2/18) and the distribution of tumor size changes ("waterfall plot"). Prospective simulations were then used to evaluate potential drug combinations and predictive biomarkers for increasing responsiveness to MEK/ERK inhibitors in these patients.

  11. Quantifying fish swimming behavior in response to acute exposure of aqueous copper using computer assisted video and digital image analysis

    Science.gov (United States)

    Calfee, Robin D.; Puglis, Holly J.; Little, Edward E.; Brumbaugh, William G.; Mebane, Christopher A.

    2016-01-01

    Behavioral responses of aquatic organisms to environmental contaminants can be precursors of other effects such as survival, growth, or reproduction. However, these responses may be subtle, and measurement can be challenging. Using juvenile white sturgeon (Acipenser transmontanus) with copper exposures, this paper illustrates techniques used for quantifying behavioral responses using computer assisted video and digital image analysis. In previous studies severe impairments in swimming behavior were observed among early life stage white sturgeon during acute and chronic exposures to copper. Sturgeon behavior was rapidly impaired and to the extent that survival in the field would be jeopardized, as fish would be swept downstream, or readily captured by predators. The objectives of this investigation were to illustrate protocols to quantify swimming activity during a series of acute copper exposures to determine time to effect during early lifestage development, and to understand the significance of these responses relative to survival of these vulnerable early lifestage fish. With mortality being on a time continuum, determining when copper first affects swimming ability helps us to understand the implications for population level effects. The techniques used are readily adaptable to experimental designs with other organisms and stressors.

  12. A novel computational approach of image analysis to quantify behavioural response to heat shock in Chironomus ramosus larvae (Diptera: Chironomidae)

    Directory of Open Access Journals (Sweden)

    Bimalendu B. Nath

    2015-07-01

    Full Text Available All living cells respond to temperature stress through coordinated cellular, biochemical and molecular events known as “heat shock response” and its genetic basis has been found to be evolutionarily conserved. Despite marked advances in stress research, this ubiquitous heat shock response has never been analysed quantitatively at the whole organismal level using behavioural correlates. We have investigated behavioural response to heat shock in a tropical midge Chironomus ramosus Chaudhuri, Das and Sublette. The filter-feeding aquatic Chironomus larvae exhibit characteristic undulatory movement. This innate pattern of movement was taken as a behavioural parameter in the present study. We have developed a novel computer-aided image analysis tool “Chiro” for the quantification of behavioural responses to heat shock. Behavioural responses were quantified by recording the number of undulations performed by each larva per unit time at a given ambient temperature. Quantitative analysis of undulation frequency was carried out and this innate behavioural pattern was found to be modulated as a function of ambient temperature. Midge larvae are known to be bioindicators of aquatic environments. Therefore, the “Chiro” technique can be tested using other potential biomonitoring organisms obtained from natural aquatic habitats using undulatory motion as a behavioural parameter.

  13. Efficient sparse matrix-matrix multiplication for computing periodic responses by shooting method on Intel Xeon Phi

    Science.gov (United States)

    Stoykov, S.; Atanassov, E.; Margenov, S.

    2016-10-01

    Many scientific applications involve sparse or dense matrix operations, such as solving linear systems, matrix-matrix products, eigensolvers, etc. In structural nonlinear dynamics, the computation of periodic responses and the determination of the stability of the solution are of primary interest. The shooting method is widely used for obtaining periodic responses of nonlinear systems. The method involves simultaneous operations with sparse and dense matrices. One of the computationally expensive operations in the method is the multiplication of sparse by dense matrices. In the current work, a new algorithm for sparse matrix by dense matrix products is presented. The algorithm takes into account the structure of the sparse matrix, which is obtained by space discretization of the nonlinear Mindlin's plate equation of motion by the finite element method. The algorithm is developed to use the vector engine of Intel Xeon Phi coprocessors. It is compared with the standard sparse matrix by dense matrix algorithm and with the one developed by Intel MKL, and it is shown that better algorithms can be developed by considering the properties of the sparse matrix.
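
    As an illustration of the core operation being optimized, the following minimal Python/SciPy sketch forms a sparse-by-dense product with a generic CSR matrix. The matrix here is random rather than the Mindlin-plate finite element matrix, and nothing of the paper's Xeon Phi vectorization is reproduced; it is a conceptual sketch only.

```python
import numpy as np
from scipy.sparse import random as sparse_random

# Toy stand-in for the FEM system matrix: a random sparse matrix in CSR format.
n, m = 2000, 64                        # matrix size and number of dense columns
A = sparse_random(n, n, density=0.01, format="csr", random_state=0)
B = np.random.default_rng(0).standard_normal((n, m))

C = A @ B                              # sparse-by-dense product, shape (n, m)

# Spot-check the first rows against an explicit dense product.
assert np.allclose(C[:10], A[:10].toarray() @ B)
print(C.shape)
```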

  14. Fast neutron detection with germanium detectors: computation of response functions for the 692 keV inelastic scattering peak

    International Nuclear Information System (INIS)

    Fehrenbacher, G.; Meckbach, R.; Paretzke, H.G.

    1996-01-01

    The dependence on incident neutron energy of the shape of the right-sided broadening of the 692 keV inelastic scattering peak in the pulse-height distribution measured with a Ge detector in fast neutron fields has been analyzed. A model incorporating the processes contributing to the energy deposition that engenders the peak, including the partitioning of the energy deposition by the Ge recoils, was developed. With a Monte Carlo code based on this model, the detector response associated with this peak was computed and compared with results of measurements with quasi-monoenergetic neutrons for energies between 0.88 and 2.1 MeV. A set of 80 response functions for neutron energies in the range from the reaction threshold at 0.7 to 6 MeV was computed, which will serve as a starting point for methods that aim at obtaining information on the spectral distribution of fast neutron fields in this energy range from measurements with a Ge detector. (orig.)

  15. Computational methods for predicting the response of critical as-built infrastructure to dynamic loads (architectural surety)

    Energy Technology Data Exchange (ETDEWEB)

    Preece, D.S.; Weatherby, J.R.; Attaway, S.W.; Swegle, J.W.; Matalucci, R.V.

    1998-06-01

    Coupled blast-structural computational simulations using supercomputer capabilities will significantly advance the understanding of how complex structures respond under dynamic loads caused by explosives and earthquakes, an understanding with application to the surety of both federal and nonfederal buildings. Simulation of the effects of explosives on structures is a challenge because the explosive response can best be simulated using Eulerian computational techniques and structural behavior is best modeled using Lagrangian methods. Due to the different methodologies of the two computational techniques and code architecture requirements, they are usually implemented in different computer programs. Explosive and structure modeling in two different codes makes it difficult or next to impossible to do coupled explosive/structure interaction simulations. Sandia National Laboratories has developed two techniques for solving this problem. The first is called Smoothed Particle Hydrodynamics (SPH), a relatively new gridless method, comparable to Eulerian techniques, that is especially suited for treating liquids and gases such as those produced by an explosive. The SPH capability has been fully implemented into the transient dynamics finite element (Lagrangian) codes PRONTO-2D and -3D. A PRONTO-3D/SPH simulation of the effect of a blast on a protective-wall barrier is presented in this paper. The second technique employed at Sandia National Laboratories uses a relatively new code called ALEGRA, which is an ALE (Arbitrary Lagrangian-Eulerian) wave code with specific emphasis on large deformation and shock propagation. ALEGRA is capable of solving many shock-wave physics problems but it is especially suited for modeling problems involving the interaction of decoupled explosives with structures.
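
    For readers unfamiliar with SPH, the toy Python sketch below evaluates the standard 3-D cubic spline smoothing kernel and the SPH density summation on a small block of particles. It is a conceptual illustration only, assuming nothing about the PRONTO-3D/SPH or ALEGRA implementations.

```python
import numpy as np

def cubic_spline_kernel(r, h):
    """Standard 3-D cubic spline smoothing kernel W(r, h) used in SPH."""
    sigma = 1.0 / (np.pi * h**3)
    q = np.asarray(r, dtype=float) / h
    return sigma * np.where(q <= 1.0, 1.0 - 1.5 * q**2 + 0.75 * q**3,
                            np.where(q <= 2.0, 0.25 * (2.0 - q)**3, 0.0))

def sph_density(positions, masses, h):
    """Density at each particle from the SPH summation rho_i = sum_j m_j W_ij."""
    diff = positions[:, None, :] - positions[None, :, :]
    dist = np.linalg.norm(diff, axis=-1)
    return cubic_spline_kernel(dist, h) @ masses

# Uniform 10 x 10 x 10 block of water-like particles (1000 kg/m^3).
spacing, h = 0.1, 0.13
grid = np.arange(0.0, 1.0, spacing)
pos = np.array(np.meshgrid(grid, grid, grid)).reshape(3, -1).T
m = np.full(len(pos), 1000.0 * spacing**3)
rho = sph_density(pos, m, h)
print(f"density at an interior particle ~ {rho.max():.0f} kg/m^3 (target 1000)")
```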

  16. Carbon dioxide and climate impulse response functions for the computation of greenhouse gas metrics: a multi-model analysis

    Directory of Open Access Journals (Sweden)

    F. Joos

    2013-03-01

    The responses of carbon dioxide (CO2) and other climate variables to an emission pulse of CO2 into the atmosphere are often used to compute the Global Warming Potential (GWP) and Global Temperature change Potential (GTP), to characterize the response timescales of Earth System models, and to build reduced-form models. In this carbon cycle-climate model intercomparison project, which spans the full model hierarchy, we quantify responses to emission pulses of different magnitudes injected under different conditions. The CO2 response shows the known rapid decline in the first few decades followed by a millennium-scale tail. For a 100 Gt-C emission pulse added to a constant CO2 concentration of 389 ppm, 25 ± 9% is still found in the atmosphere after 1000 yr; the ocean has absorbed 59 ± 12% and the land the remainder (16 ± 14%). The response in global mean surface air temperature is an increase by 0.20 ± 0.12 °C within the first twenty years; thereafter and until year 1000, temperature decreases only slightly, whereas ocean heat content and sea level continue to rise. Our best estimate for the Absolute Global Warming Potential, given by the time-integrated response in CO2 at year 100 multiplied by its radiative efficiency, is 92.5 × 10−15 yr W m−2 per kg-CO2. This value very likely (5 to 95% confidence) lies within the range of (68 to 117) × 10−15 yr W m−2 per kg-CO2. Estimates for time-integrated response in CO2 published in the IPCC First, Second, and Fourth Assessment and our multi-model best estimate all agree within 15% during the first 100 yr. The integrated CO2 response, normalized by the pulse size, is lower for pre-industrial conditions, compared to present day, and lower for smaller pulses than larger pulses. In contrast, the response in temperature, sea level and ocean heat content is less sensitive to these choices. Although choices in pulse size, background concentration, and model lead to uncertainties, the most important and
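
    To make the metric concrete, the sketch below integrates an impulse response function of the form IRF(t) = a0 + sum_i a_i exp(-t/tau_i) against a constant radiative efficiency to obtain an Absolute Global Warming Potential. The fit coefficients are commonly quoted approximations to the multi-model mean and the per-kilogram radiative efficiency is an assumed round value, so the result only roughly reproduces the figure quoted above.

```python
import numpy as np

# Approximate multi-model mean CO2 impulse response fit (treat as illustrative).
a   = np.array([0.2173, 0.2240, 0.2824, 0.2763])
tau = np.array([np.inf, 394.4, 36.54, 4.304])          # years

def airborne_fraction(t):
    """Fraction of a CO2 emission pulse remaining in the atmosphere after t years."""
    return np.sum(a * np.exp(-t[:, None] / tau), axis=1)

radiative_efficiency = 1.75e-15                         # W m^-2 per kg CO2, assumed

t = np.linspace(0.0, 100.0, 10001)                      # years
agwp_100 = np.trapz(radiative_efficiency * airborne_fraction(t), t)
print(f"AGWP(100 yr) ~ {agwp_100 * 1e15:.1f} x 10^-15 yr W m^-2 per kg CO2")
```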

  17. Understanding the Value of a Computer Emergency Response Capability for Nuclear Security

    Energy Technology Data Exchange (ETDEWEB)

    Gasper, Peter Donald [Idaho National Laboratory; Rodriguez, Julio Gallardo [Idaho National Laboratory

    2015-06-01

    The international nuclear community has a great understanding of the physical security needs relating to the prevention, detection, and response to malicious acts associated with nuclear facilities and radioactive material. International Atomic Energy Agency (IAEA) Nuclear Security Recommendations (INFCIRC_225_Rev 5) outlines specific guidelines and recommendations for implementing and maintaining an organization’s nuclear security posture. An important element to include in support of Revision 5 is the establishment of a “Cyber Emergency Response Team (CERT)” focused on the international community's cybersecurity needs, in order to maintain a comprehensive nuclear security posture. Cybersecurity and the importance of nuclear cybersecurity require that there be a specific focus on developing an International Nuclear CERT (NS-CERT). States establishing contingency plans should have an understanding of the cyber threat landscape and of the potential impacts to the systems in place to protect against and mitigate malicious activities. This paper will outline the necessary components, discuss the relationships needed within the international community, and outline a process by which the NS-CERT identifies, collects, processes, and reports critical information in order to establish situational awareness (SA) and support decision-making.

  18. Computation of the Response Surface in the Tensor Train data format

    KAUST Repository

    Dolgov, Sergey; Khoromskij, Boris N.; Litvinenko, Alexander; Matthies, Hermann G.

    2014-01-01

    We apply the Tensor Train (TT) approximation to construct the Polynomial Chaos Expansion (PCE) of a random field, and solve the stochastic elliptic diffusion PDE with the stochastic Galerkin discretization. We compare two strategies of the polynomial chaos expansion: sparse and full polynomial (multi-index) sets. In the full set, the polynomial orders are chosen independently in each variable, which provides higher flexibility and accuracy. However, the total number of degrees of freedom grows exponentially with the number of stochastic coordinates. To cope with this curse of dimensionality, the data is kept compressed in the TT decomposition, a recurrent low-rank factorization. PCE computations on sparse grid sets have been extensively studied, but the TT representation for PCE is a novel approach that is investigated in this paper. We outline how to deduce the PCE from the covariance matrix, assemble the Galerkin operator, and evaluate some post-processing (mean, variance, Sobol indices), staying within the low-rank framework. Two stages are the most demanding. First, we interpolate PCE coefficients in the TT format using a small number of samples, which is performed via the block cross approximation method. Second, we solve the discretized equation (large linear system) via the alternating minimal energy algorithm. In the numerical experiments we demonstrate that the full expansion set encapsulated in the TT format is indeed preferable in cases when high accuracy and high polynomial orders are required.
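
    The low-rank idea behind the TT format can be illustrated with the textbook TT-SVD algorithm, sketched below for a small full tensor. Real PCE computations never form the full tensor (the paper uses block cross approximation instead), so this is a conceptual illustration only.

```python
import numpy as np

def tt_svd(tensor, rel_tol=1e-10):
    """Factor a full tensor into TT cores by a left-to-right sweep of truncated SVDs."""
    shape = tensor.shape
    cores, r_prev = [], 1
    unfolding = tensor.reshape(r_prev * shape[0], -1)
    for k in range(len(shape) - 1):
        u, s, vt = np.linalg.svd(unfolding, full_matrices=False)
        rank = max(1, int(np.sum(s > rel_tol * s[0])))          # truncated TT rank
        cores.append(u[:, :rank].reshape(r_prev, shape[k], rank))
        unfolding = (s[:rank, None] * vt[:rank]).reshape(rank * shape[k + 1], -1)
        r_prev = rank
    cores.append(unfolding.reshape(r_prev, shape[-1], 1))
    return cores

# Verify on a small synthetic tensor of TT rank one.
rng = np.random.default_rng(0)
x = np.einsum("i,j,k->ijk", rng.standard_normal(4),
              rng.standard_normal(5), rng.standard_normal(6))
cores = tt_svd(x)
recon = np.einsum("aib,bjc,ckd->ijk", *cores)
print([c.shape for c in cores], np.allclose(recon, x))
```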

  19. Semiquantitative dynamic computed tomography to predict response to anti-platelet therapy in acute cerebral infarction

    International Nuclear Information System (INIS)

    Chokyu, K.; Shimizu, K.; Fukumoto, M.; Mori, T.; Mokudai, T.; Mori, K.

    2002-01-01

    We investigated whether dynamic computed tomography (CT) in patients with acute cerebral infarction could identify patients likely to respond to anti-platelet therapy. Seventy patients underwent semiquantitative dynamic CT within 6 h as well as cerebral angiography. All then received anti-platelet therapy with a thromboxane A2 synthetase inhibitor. The peak value (PV) and time to peak (TP) of the time-density curves for the Sylvian fissure were extracted from the dynamic CT data and, after standardizing interpatient data, two indices, the PV/TP index and the TP index, were derived in a standard semiquantitative manner. Both the PV/TP index and the TP index were effective in discriminating between 48 responders (modified Rankin scale (mRS): 0 to 2) and 22 non-responders (mRS: 3 to 5, or death: 6; both P 1.1) and non-compensated rCBF. Intermediate PV/TP values could not predict outcome. Dynamic CT prior to therapy can identify patients with acute cerebral infarction who are treatable with anti-platelet therapy alone. (orig.)

  1. Correlation of Computed Tomography Imaging Features With Pain Response in Patients With Spine Metastases After Radiation Therapy

    International Nuclear Information System (INIS)

    Mitera, Gunita; Probyn, Linda; Ford, Michael; Donovan, Andrea; Rubenstein, Joel; Finkelstein, Joel; Christakis, Monique; Zhang, Liying; Campos, Sarah; Culleton, Shaelyn; Nguyen, Janet; Sahgal, Arjun; Barnes, Elizabeth; Tsao, May; Danjoux, Cyril; Holden, Lori; Yee, Albert; Khan, Luluel; Chow, Edward

    2011-01-01

    Purpose: To correlate computed tomography (CT) imaging features of spinal metastases with pain relief after radiotherapy (RT). Methods and Materials: Thirty-three patients receiving computed tomography (CT)-simulated RT for spinal metastases in an outpatient palliative RT clinic from January 2007 to October 2008 were retrospectively reviewed. Forty spinal metastases were evaluated. Pain response was rated using the International Bone Metastases Consensus Working Party endpoints. Three musculoskeletal radiologists and two orthopaedic surgeons evaluated CT features, including osseous and soft tissue tumor extent, presence of a pathologic fracture, severity of vertebral height loss, and presence of kyphosis. Results: The mean patient age was 69 years; 24 were men and 9 were women. The mean worst pain score was 7/10, and the mean total daily oral morphine equivalent was 77.3 mg. Treatment doses included 8 Gy in one fraction (22/33), 20 Gy in five fractions (10/33), and 20 Gy in eight fractions (1/33). The CT imaging appearance of spinal metastases included vertebral body involvement (40/40), pedicle involvement (23/40), and lamina involvement (18/40). Soft tissue component (10/40) and nerve root compression (9/40) were less common. Pathologic fractures existed in 11/40 lesions, with resultant vertebral body height loss in 10/40 and kyphosis in 2/40 lesions. At months 1, 2, and 3 after RT, 18%, 69%, and 70% of patients experienced pain relief. Pain response was observed with various CT imaging features. Conclusions: Pain response after RT did not differ in patients with and without pathologic fracture, kyphosis, or any other CT features related to extent of tumor involvement. All patients with painful spinal metastases may benefit from palliative RT.

  2. Hydrologic Response to Climate Change: Missing Precipitation Data Matters for Computed Timing Trends

    Science.gov (United States)

    Daniels, B.

    2016-12-01

    This work demonstrates the derivation of climate timing statistics and their application to determine the resulting hydroclimate impacts. Long-term daily precipitation observations from 50 California stations were used to compute climate trends of precipitation event Intensity, event Duration, and Pause between events. Each precipitation event trend was then applied as input to a PRMS hydrology model, which showed hydrology changes to recharge, baseflow, streamflow, etc. An important concern was precipitation uncertainty induced by missing observation values, causing errors in the quantification of precipitation trends. Many standard statistical techniques such as ARIMA and simple endogenous or even exogenous imputation were applied but failed to help resolve these uncertainties. What helped resolve them was the use of multiple imputation techniques, which involved fitting Weibull probability distributions to the multiply imputed values for the three precipitation trends. Permutation resampling techniques using Monte Carlo processing were then applied to the multiple imputation values to derive significance p-values for each trend. Significance at the 95% level was found for Intensity at 11 of the 50 stations, for Duration at 16 of the 50, and for Pause at 19, of which 12 were 99% significant. The significance-weighted trends for California are Intensity -4.61% per decade, Duration +3.49% per decade, and Pause +3.58% per decade. Two California basins with PRMS hydrologic models were studied: the Feather River in the northern Sierra Nevada mountains and the central coast Soquel-Aptos. Each local trend was changed without changing the other trends or the total precipitation. The Feather River Basin's critical supply to Lake Oroville and the State Water Project benefited from a total streamflow increase of 1.5%. The Soquel-Aptos Basin water supply was impacted by a total groundwater recharge decrease of -7.5% and a streamflow decrease of -3.2%.
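
    The permutation-resampling step can be illustrated with a minimal sketch: shuffle the annual series many times and compare the shuffled trend slopes with the observed one. The data below are synthetic, and the multiple-imputation handling of missing observations that the study relies on is omitted.

```python
import numpy as np

def permutation_trend_pvalue(series, n_perm=5000, seed=0):
    """Monte Carlo permutation test for a linear trend in an annual series."""
    rng = np.random.default_rng(seed)
    years = np.arange(len(series), dtype=float)
    obs_slope = np.polyfit(years, series, 1)[0]
    perm_slopes = np.array([np.polyfit(years, rng.permutation(series), 1)[0]
                            for _ in range(n_perm)])
    # Two-sided p-value: how often a shuffled series gives a slope as extreme.
    return obs_slope, float(np.mean(np.abs(perm_slopes) >= np.abs(obs_slope)))

# Synthetic example: 60 years of event intensity with a weak downward trend.
rng = np.random.default_rng(1)
intensity = 10.0 - 0.02 * np.arange(60) + rng.standard_normal(60)
slope, p = permutation_trend_pvalue(intensity)
print(f"slope = {slope:.3f} per year, permutation p-value = {p:.3f}")
```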

  3. [Positron emission tomography combined with computed tomography in the initial evaluation and response assessment in primary central nervous system lymphoma].

    Science.gov (United States)

    Mercadal, Santiago; Cortés-Romera, Montserrat; Vélez, Patricia; Climent, Fina; Gámez, Cristina; González-Barca, Eva

    2015-06-08

    To evaluate the role of positron emission tomography combined with computed tomography (PET-CT) in the initial evaluation and response assessment in primary central nervous system lymphoma (PCNSL). Fourteen patients (8 males) with a median age of 59.5 years diagnosed with PCNSL were included. Brain PET-CT and magnetic resonance imaging (MRI) were performed in the initial evaluation. In 7 patients, a PET-CT was also performed after treatment. At diagnosis, PET-CT showed 31 hypermetabolic foci and MRI showed 47 lesions, with a good degree of concordance between the two (k = 0.61; P = .005). In the response assessment, the correlation between both techniques was good, and PET-CT was helpful in the interpretation of residual MRI lesions. Overall survival at 2 years for negative vs. positive PET-CT at the end of treatment was 100% vs. 37.5%, respectively (P = .045). PET-CT can be useful in the initial evaluation of PCNSL, and especially in the assessment of response. Although PET-CT detects fewer small lesions than MRI, a good correlation between MRI and PET-CT was observed. It is effective in the evaluation of residual lesions. Prospective studies are needed to confirm its possible prognostic value. Copyright © 2014 Elsevier España, S.L.U. All rights reserved.

  4. Variation in the human ribs geometrical properties and mechanical response based on X-ray computed tomography images resolution.

    Science.gov (United States)

    Perz, Rafał; Toczyski, Jacek; Subit, Damien

    2015-01-01

    Computational models of the human body are commonly used for injury prediction in automobile safety research. To create these models, the geometry of the human body is typically obtained from segmentation of medical images such as computed tomography (CT) images, which have a resolution between 0.2 and 1 mm/pixel. While the accuracy of the geometrical and structural information obtained from these images depends greatly on their resolution, the effect of image resolution on the estimation of the geometrical properties of the ribs has yet to be established. To do so, each of the thirty-four sections of ribs obtained from a Post Mortem Human Surrogate (PMHS) was imaged using three different CT modalities: standard clinical CT (clinCT), high resolution clinical CT (HRclinCT), and microCT. The images were processed to estimate the rib cross-section geometry and mechanical properties, and the results were compared to those obtained from the microCT images by computing the 'deviation factor', a metric that quantifies the relative difference between results obtained from clinCT or HRclinCT and those obtained from microCT. Overall, clinCT images gave a deviation greater than 100%, and were therefore deemed inadequate for the purpose of this study. HRclinCT overestimated the rib cross-sectional area by 7.6%, the moments of inertia by about 50%, and the cortical shell area by 40.2%, while underestimating the trabecular area by 14.7%. Next, a parametric analysis was performed to quantify how the variations in the estimates of the geometrical properties affected the predicted mechanical response of the rib under antero-posterior loading. A variation of up to 45% for the predicted peak force and up to 50% for the predicted stiffness was observed. These results provide a quantitative estimate of the sensitivity of the response of the FE model to the resolution of the images used to generate it. They also suggest that a correction factor could be derived from the comparison between microCT and

  5. Monkeys Wait to Begin a Computer Task when Waiting Makes Their Responses More Effective

    Directory of Open Access Journals (Sweden)

    Theodore A. Evans

    2014-02-01

    Rhesus monkeys (Macaca mulatta) and capuchin monkeys (Cebus apella) performed a computerized inhibitory control task modeled after an “escalating interest task” from a recent human study (Young, Webb, & Jacobs, 2011). In the original study, which utilized a first-person shooter game, human participants learned to inhibit firing their simulated weapon long enough for the weapon's damage potential to grow in effectiveness (up to 10 seconds in duration). In the present study, monkeys earned food pellets for eliminating arrays of target objects using a digital eraser. We assessed whether monkeys could suppress trial-initiating joystick movements long enough for the eraser to grow in size and speed, thereby making their eventual responses more effective. Monkeys of both species learned to inhibit moving the eraser for as long as 10 seconds, and they allowed the eraser to grow larger for successively larger target arrays. This study demonstrates an interesting parallel in behavioral inhibition between human and nonhuman participants and provides a method for future comparative testing of human and nonhuman test groups.

  6. Advanced Computational Approaches for Characterizing Stochastic Cellular Responses to Low Dose, Low Dose Rate Exposures

    Energy Technology Data Exchange (ETDEWEB)

    Scott, Bobby, R., Ph.D.

    2003-06-27

    OAK - B135 This project final report summarizes modeling research conducted in the U.S. Department of Energy (DOE), Low Dose Radiation Research Program at the Lovelace Respiratory Research Institute from October 1998 through June 2003. The modeling research described involves critically evaluating the validity of the linear nonthreshold (LNT) risk model as it relates to stochastic effects induced in cells by low doses of ionizing radiation and genotoxic chemicals. The LNT model plays a central role in low-dose risk assessment for humans. With the LNT model, any radiation (or genotoxic chemical) exposure is assumed to increase one's risk of cancer. Based on the LNT model, others have predicted tens of thousands of cancer deaths related to environmental exposure to radioactive material from nuclear accidents (e.g., Chernobyl) and fallout from nuclear weapons testing. Our research has focused on developing biologically based models that explain the shape of dose-response curves for low-dose radiation and genotoxic chemical-induced stochastic effects in cells. Understanding the shape of the dose-response curve for radiation and genotoxic chemical-induced stochastic effects in cells helps to better understand the shape of the dose-response curve for cancer induction in humans. We have used a modeling approach that facilitated model revisions over time, allowing for timely incorporation of new knowledge gained related to the biological basis for low-dose-induced stochastic effects in cells. Both deleterious (e.g., genomic instability, mutations, and neoplastic transformation) and protective (e.g., DNA repair and apoptosis) effects have been included in our modeling. Our most advanced model, NEOTRANS2, involves differing levels of genomic instability. Persistent genomic instability is presumed to be associated with nonspecific, nonlethal mutations and to increase both the risk for neoplastic transformation and for cancer occurrence. Our research results, based on

  7. Response functions for computing absorbed dose to skeletal tissues from photon irradiation-an update

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, Perry B; Bahadori, Amir A [Nuclear and Radiological Engineering, University of Florida, Gainesville, FL 32611 (United States); Eckerman, Keith F [Life Sciences Division, Oak Ridge National Laboratory, Oak Ridge, TN 37831 (United States); Lee, Choonsik [Radiation Epidemiology Branch, National Cancer Institute, Bethesda, MD 20892 (United States); Bolch, Wesley E, E-mail: wbolch@ufl.edu [Nuclear and Radiological/Biomedical Engineering, University of Florida, Gainesville, FL 32611 (United States)

    2011-04-21

    A comprehensive set of photon fluence-to-dose response functions (DRFs) is presented for two radiosensitive skeletal tissues, active and total shallow marrow, within 15 and 32 bone sites, respectively, of the ICRP reference adult male. The functions were developed using fractional skeletal masses and associated electron-absorbed fractions as reported for the UF hybrid adult male phantom, which in turn is based upon micro-CT images of trabecular spongiosa taken from a 40 year old male cadaver. The new DRFs expand upon both the original set of seven functions produced in 1985, and a 2007 update calculated under the assumption of secondary electron escape from spongiosa. In this study, it is assumed that photon irradiation of the skeleton will yield charged particle equilibrium across all spongiosa regions at energies exceeding 200 keV. Kerma coefficients for active marrow, inactive marrow, trabecular bone and spongiosa at higher energies are calculated using the DRF algorithm setting the electron-absorbed fraction for self-irradiation to unity. By comparing kerma coefficients and DRF functions, dose enhancement factors and mass energy-absorption coefficient (MEAC) ratios for active marrow to spongiosa were derived. These MEAC ratios compared well with those provided by the NIST Physical Reference Data Library (mean difference of 0.8%), and the dose enhancement factors for active marrow compared favorably with values calculated in the well-known study published by King and Spiers (1985 Br. J. Radiol. 58 345-56) (mean absolute difference of 1.9 percentage points). Additionally, dose enhancement factors for active marrow were shown to correlate well with the shallow marrow volume fraction (R² = 0.91). Dose enhancement factors for the total shallow marrow were also calculated for 32 bone sites, representing the first such derivation for this target tissue.

  8. Responsibility Towards The Customers Of Subscription-Based Software Solutions In The Context Of Using The Cloud Computing Technology

    Directory of Open Access Journals (Sweden)

    Bogdan Ștefan Ionescu

    2003-12-01

    The continuous transformation of contemporary society and of its IT environment has led to the emergence of cloud computing technology, which provides access to infrastructure as well as to subscription-based software services. In the context of a growing number of providers of cloud software services, the paper aims to identify the perception of some current or potential users of cloud solutions, selected from among students enrolled in the accounting (professional or research) master programs organized by the Bucharest University of Economic Studies, in terms of their expectations for cloud services, as well as the extent to which the SaaS providers are responsible for the services provided.

  9. Positron emission tomography-computed tomography standardized uptake values in clinical practice and assessing response to therapy.

    Science.gov (United States)

    Kinahan, Paul E; Fletcher, James W

    2010-12-01

    The use of standardized uptake values (SUVs) is now commonplace in clinical 2-deoxy-2-[(18)F] fluoro-D-glucose (FDG) positron emission tomography-computed tomography oncology imaging and has a specific role in assessing patient response to cancer therapy. Ideally, the use of SUVs removes variability introduced by differences in patient size and the amount of injected FDG. However, in practice there are several sources of bias and variance that are introduced in the measurement of FDG uptake in tumors and also in the conversion of the image count data to SUVs. In this article the overall imaging process is reviewed and estimates of the magnitude of errors, where known, are given. Recommendations are provided for best practices in improving SUV accuracy. Copyright © 2010 Elsevier Inc. All rights reserved.
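
    For reference, the body-weight SUV is the tissue activity concentration divided by the injected dose per unit body mass, with the dose decay-corrected to the scan time. The sketch below shows that arithmetic with made-up numbers; it is illustrative only, not a recommended clinical workflow.

```python
def suv_body_weight(activity_conc_bq_per_ml, injected_dose_bq, body_weight_g,
                    uptake_time_min=60.0, half_life_min=109.77):
    """Body-weight-normalized SUV with the injected F-18 dose decay-corrected
    to the scan time (half-life ~109.8 min)."""
    decay_corrected = injected_dose_bq * 0.5 ** (uptake_time_min / half_life_min)
    return activity_conc_bq_per_ml / (decay_corrected / body_weight_g)

# Illustrative numbers: 370 MBq injected, 70 kg patient, tumour ROI activity
# concentration of 25 kBq/mL measured 60 minutes after injection.
print(f"SUVbw ~ {suv_body_weight(25_000.0, 370e6, 70_000.0):.1f}")
```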

  10. Computer program for post-flight evaluation of the control surface response for an attitude controlled missile

    Science.gov (United States)

    Knauber, R. N.

    1982-01-01

    A FORTRAN IV coded computer program is presented for post-flight analysis of a missile's control surface response. It includes preprocessing of digitized telemetry data for time lags, biases, non-linear calibration changes and filtering. Measurements include autopilot attitude rate and displacement gyro output and four control surface deflections. Simple first order lags are assumed for the pitch, yaw and roll axes of control. Each actuator is also assumed to be represented by a first order lag. Mixing of pitch, yaw and roll commands to four control surfaces is assumed. A pseudo-inverse technique is used to obtain the pitch, yaw and roll components from the four measured deflections. This program has been used for over 10 years on the NASA/SCOUT launch vehicle for post-flight analysis and was helpful in detecting incipient actuator stall due to excessive hinge moments. The program is currently set up for a CDC CYBER 175 computer system. It requires 34K words of memory and contains 675 cards. A sample problem presented herein, including the optional plotting, requires eleven (11) seconds of central processor time.
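
    The pseudo-inverse step can be sketched as an ordinary least-squares recovery of the three axis commands from the four measured deflections. The 4 x 3 mixing matrix below is an assumed placeholder, not the actual SCOUT control mixing.

```python
import numpy as np

# Assumed mixing of pitch, yaw and roll commands onto four control surfaces.
M = np.array([[1.0,  0.0,  1.0],    # surface 1
              [1.0,  0.0, -1.0],    # surface 2
              [0.0,  1.0,  1.0],    # surface 3
              [0.0,  1.0, -1.0]])   # surface 4

# One telemetry sample: deflections (deg) produced by a known command, plus noise.
true_cmd = np.array([2.0, -1.0, 0.5])                  # pitch, yaw, roll
measured = M @ true_cmd + 0.05 * np.random.default_rng(0).standard_normal(4)

# Least-squares (pseudo-inverse) recovery of the three axis components.
pitch, yaw, roll = np.linalg.pinv(M) @ measured
print(f"pitch={pitch:.2f}, yaw={yaw:.2f}, roll={roll:.2f}")
```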

  11. Computation of restoration of ligand response in the random kinetics of a prostate cancer cell signaling pathway.

    Science.gov (United States)

    Dana, Saswati; Nakakuki, Takashi; Hatakeyama, Mariko; Kimura, Shuhei; Raha, Soumyendu

    2011-01-01

    Mutation and/or dysfunction of signaling proteins in the mitogen activated protein kinase (MAPK) signal transduction pathway are frequently observed in various kinds of human cancer. Consistent with this fact, in the present study, we experimentally observe that the epidermal growth factor (EGF) induced activation profile of MAP kinase signaling is not straightforwardly dose-dependent in the PC3 prostate cancer cells. To find out what parameters and reactions in the pathway are involved in this departure from the normal dose-dependency, a model-based pathway analysis is performed. The pathway is mathematically modeled with 28 rate equations, yielding as many ordinary differential equations (ODEs) with kinetic rate constants that have been reported to take random values in the existing literature. This has led us to treat the ODE model of the pathway's kinetics as a random differential equation (RDE) system in which the parameters are random variables. We show that our RDE model captures the uncertainty in the kinetic rate constants as seen in the behavior of the experimental data and, more importantly, upon simulation exhibits the abnormal EGF dose-dependency of the activation profile of MAP kinase signaling in PC3 prostate cancer cells. The most likely set of values of the kinetic rate constants obtained from fitting the RDE model to the experimental data is then used in a direct transcription based dynamic optimization method for computing the changes needed in these kinetic rate constant values for the restoration of the normal EGF dose response. The last computation identifies the parameters, i.e., the kinetic rate constants in the RDE model, that are the most sensitive to the change in the EGF dose response behavior in the PC3 prostate cancer cells. The reactions in which these most sensitive parameters participate emerge as candidate drug targets on the signaling pathway. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
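
    The idea of treating kinetic rate constants as random variables can be sketched with a toy two-step cascade standing in for the 28-reaction PC3 model: constants are drawn from an assumed distribution and the ODEs are integrated for each draw, giving an ensemble dose response. Everything below (species, rates, doses) is invented for illustration.

```python
import numpy as np
from scipy.integrate import solve_ivp

def cascade(t, y, k1, k2, kd1, kd2, egf):
    """Toy two-step activation cascade: EGF activates X1, active X1 activates X2."""
    x1, x2 = y
    return [k1 * egf * (1.0 - x1) - kd1 * x1,
            k2 * x1 * (1.0 - x2) - kd2 * x2]

rng = np.random.default_rng(0)
for egf in (0.1, 1.0, 10.0):                       # illustrative EGF doses
    peaks = []
    for _ in range(100):
        k = 10.0 ** rng.uniform(-1, 1, size=4)     # random rate constants (log-uniform)
        sol = solve_ivp(cascade, (0.0, 60.0), [0.0, 0.0], args=(*k, egf),
                        t_eval=np.linspace(0.0, 60.0, 121))
        peaks.append(sol.y[1].max())
    print(f"EGF={egf:5.1f}: mean peak X2 activation = {np.mean(peaks):.3f}")
```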

  12. Contempt-LT: a computer program for predicting containment pressure-temperature response to a loss-of-coolant accident

    International Nuclear Information System (INIS)

    Wheat, L.L.; Wagner, R.J.; Niederauer, G.F.; Obenchain, C.F.

    1975-06-01

    CONTEMPT-LT is a digital computer program, written in FORTRAN IV, developed to describe the long-term behavior of water-cooled nuclear reactor containment systems subjected to postulated loss-of-coolant accident (LOCA) conditions. The program calculates the time variation of compartment pressures, temperatures, mass and energy inventories, heat structure temperature distributions, and energy exchange with adjacent compartments. The program is capable of describing the effects of leakage on containment response. Models are provided to describe fan cooler and cooling spray engineered safety systems. Up to four compartments can be modeled with CONTEMPT-LT, and any compartment except the reactor system may have both a liquid pool region and an air-vapor atmosphere region above the pool. Each region is assumed to have a uniform temperature, but the temperatures of the two regions may be different. CONTEMPT-LT can be used to model all current boiling water reactor pressure suppression systems, including containments with either vertical or horizontal vent systems. CONTEMPT-LT can also be used to model pressurized water reactor dry containments, subatmospheric containments, and dual volume containments with an annulus region, and can be used to describe containment responses in experimental containment systems. The program user defines which compartments are used, specifies input mass and energy additions, defines heat structure and leakage systems, and describes the time advancement and output control. CONTEMPT-LT source decks are available in double precision extended-binary-coded-decimal-interchange-code (EBCDIC) versions. Sample problems have been run on the IBM360/75 computer. (U.S.)

  13. Correlated responses in tissue weights measured in vivo by computer tomography in Dorset Down sheep selected for lean tissue growth

    International Nuclear Information System (INIS)

    Nsoso, S.J.; Young, M.J.; Beatson, P.R.

    2003-01-01

    The aim of this study was to estimate correlated responses in lean, fat and bone weights in vivo in Dorset Down sheep selected for lean tissue growth. Over the period 1986-1992 inclusive, the lean tissue growth line had been selected using two economic indices for an increased aggregate breeding value incorporating predicted lean and fat weights with positive and negative economic weightings, respectively. The control line was selected for no change in lean tissue growth each year. Animals were born and run on pasture all year round. X-ray computer tomography was used to estimate the weights of lean, fat and bone in vivo in the 1994-born sheep, aged 265-274 days and selected randomly into 12 rams and 12 ewes from the selected line and 10 rams and 9 ewes from the control line. The lean tissue growth line had significantly greater responses in lean weight (+0.65 ± 0.10 kg) and lean percentage (+1.19 ± 0.17%) and significantly lesser fat weight (-0.36 ± 0.08 kg) and fat percentage (-1.88 ± 0.20%) compared to the control line. There was a significant increase in bone weight (+0.27 ± 0.03 kg) and bone percentage (+0.69 ± 0.09%) in the lean tissue growth line compared to the control line. Responses differed significantly between sexes of the lean tissue growth line, rams having a greater response in weight of lean (+1.22 ± 0.20 vs. +0.08 ± 0.22 kg) and bone (+0.45 ± 0.06 vs. +0.09 ± 0.07 kg), and a lesser response in weight of fat (-0.03 ± 0.15 vs. -0.70 ± 0.16 kg) than the ewes. Selection led to significant changes in lean (increase) and fat weights (decrease), and bone weight increased. Although responses in the lean tissue growth line differed significantly between sexes, there were confounding factors due to differences in management and lack of comparison at equal stage of development. Therefore, to assess real genetic differences, further studies should be conducted taking these factors into consideration.

  14. Abnormal response to mental stress in patients with Takotsubo cardiomyopathy detected by gated single photon emission computed tomography

    International Nuclear Information System (INIS)

    Sciagra, Roberto; Genovese, Sabrina; Pupi, Alberto; Parodi, Guido; Bellandi, Benedetta; Antoniucci, David; Del Pace, Stefano; Zampini, Linda; Gensini, Gian Franco

    2010-01-01

    Persistent abnormalities are usually not detected in patients with Takotsubo cardiomyopathy (TTC). Since sympathetically mediated myocardial damage has been proposed as a causative mechanism of TTC, we explored whether mental stress could evoke abnormalities in these patients. One month after an acute event, 22 patients fulfilling all TTC diagnostic criteria and 11 controls underwent resting and mental stress gated single photon emission computed tomography (SPECT). Perfusion, wall motion, transient ischaemic dilation (TID) and left ventricular (LV) ejection fraction (EF) were evaluated. None of the controls showed stress-induced abnormalities. Mental stress evoked regional changes (perfusion defects and/or wall motion abnormality) in 16 TTC subjects and global abnormalities (LVEF fall >5% and/or TID >1.10) in 13; 3 had a completely negative response. TID, delta LVEF and delta wall motion score were significantly different in TTC vs control patients: 1.08 ± 0.20 vs 0.95 ± 0.11 (p < 0.05), -1.7 ± 6% vs 4 ± 5% (p < 0.02) and 2.5 (0, 4.25) vs 0 (0, 0) (p < 0.002), respectively. Mental stress may evoke regional and/or global abnormalities in most TTC patients. The abnormal response to mental stress supports the role of sympathetic stimulation in TTC. Mental stress could thus be helpful for TTC evaluation. (orig.)

  15. X-Ray Computed Tomography Reveals the Response of Root System Architecture to Soil Texture

    Science.gov (United States)

    Rogers, Eric D.; Monaenkova, Daria; Mijar, Medhavinee; Goldman, Daniel I.

    2016-01-01

    Root system architecture (RSA) impacts plant fitness and crop yield by facilitating efficient nutrient and water uptake from the soil. A better understanding of the effects of soil on RSA could improve crop productivity by matching roots to their soil environment. We used x-ray computed tomography to perform a detailed three-dimensional quantification of changes in rice (Oryza sativa) RSA in response to the physical properties of a granular substrate. We characterized the RSA of eight rice cultivars in five different growth substrates and determined that RSA is the result of interactions between genotype and growth environment. We identified cultivar-specific changes in RSA in response to changing growth substrate texture. The cultivar Azucena exhibited low RSA plasticity in all growth substrates, whereas cultivar Bala root depth was a function of soil hardness. Our imaging techniques provide a framework to study RSA in different growth environments, the results of which can be used to improve root traits with agronomic potential. PMID:27208237

  16. Identification and Validation of Novel Hedgehog-Responsive Enhancers Predicted by Computational Analysis of Ci/Gli Binding Site Density

    Science.gov (United States)

    Richards, Neil; Parker, David S.; Johnson, Lisa A.; Allen, Benjamin L.; Barolo, Scott; Gumucio, Deborah L.

    2015-01-01

    The Hedgehog (Hh) signaling pathway directs a multitude of cellular responses during embryogenesis and adult tissue homeostasis. Stimulation of the pathway results in activation of Hh target genes by the transcription factor Ci/Gli, which binds to specific motifs in genomic enhancers. In Drosophila, only a few enhancers (patched, decapentaplegic, wingless, stripe, knot, hairy, orthodenticle) have been shown by in vivo functional assays to depend on direct Ci/Gli regulation. All but one (orthodenticle) contain more than one Ci/Gli site, prompting us to directly test whether homotypic clustering of Ci/Gli binding sites is sufficient to define a Hh-regulated enhancer. We therefore developed a computational algorithm to identify Ci/Gli clusters that are enriched over random expectation, within a given region of the genome. Candidate genomic regions containing Ci/Gli clusters were functionally tested in chicken neural tube electroporation assays and in transgenic flies. Of the 22 Ci/Gli clusters tested, seven novel enhancers (and the previously known patched enhancer) were identified as Hh-responsive and Ci/Gli-dependent in one or both of these assays, including: Cuticular protein 100A (Cpr100A); invected (inv), which encodes an engrailed-related transcription factor expressed at the anterior/posterior wing disc boundary; roadkill (rdx), the fly homolog of vertebrate Spop; the segment polarity gene gooseberry (gsb); and two previously untested regions of the Hh receptor-encoding patched (ptc) gene. We conclude that homotypic Ci/Gli clustering is not sufficient information to ensure Hh-responsiveness; however, it can provide a clue for enhancer recognition within putative Hedgehog target gene loci. PMID:26710299

  17. Prediction of lung density changes after radiotherapy by cone beam computed tomography response markers and pre-treatment factors for non-small cell lung cancer patients

    DEFF Research Database (Denmark)

    Bernchou, Uffe; Hansen, Olfred; Schytte, Tine

    2015-01-01

    BACKGROUND AND PURPOSE: This study investigates the ability of pre-treatment factors and response markers extracted from standard cone-beam computed tomography (CBCT) images to predict the lung density changes induced by radiotherapy for non-small cell lung cancer (NSCLC) patients. METHODS AND MATERIALS: Density changes in follow-up computed tomography scans were evaluated for 135 NSCLC patients treated with radiotherapy. Early response markers were obtained by analysing changes in lung density in CBCT images acquired during the treatment course. The ability of pre-treatment factors and CBCT

  18. Music and natural sounds in an auditory steady-state response based brain-computer interface to increase user acceptance.

    Science.gov (United States)

    Heo, Jeong; Baek, Hyun Jae; Hong, Seunghyeok; Chang, Min Hye; Lee, Jeong Su; Park, Kwang Suk

    2017-05-01

    Patients with total locked-in syndrome are conscious; however, they cannot express themselves because most of their voluntary muscles are paralyzed, and many of these patients have lost their eyesight. To improve the quality of life of these patients, there is an increasing need for communication-supporting technologies that leverage the remaining senses of the patient along with physiological signals. The auditory steady-state response (ASSR) is an electro-physiologic response to auditory stimulation that is amplitude-modulated by a specific frequency. By leveraging the phenomenon whereby ASSR is modulated by mind concentration, a brain-computer interface paradigm was proposed to classify the selective attention of the patient. In this paper, we propose an auditory stimulation method to minimize auditory stress by replacing the monotone carrier with familiar music and natural sounds for an ergonomic system. Piano and violin instrumentals were employed in the music sessions; the sounds of water streaming and cicadas singing were used in the natural sound sessions. Six healthy subjects participated in the experiment. Electroencephalograms were recorded using four electrodes (Cz, Oz, T7 and T8). Seven sessions were performed using different stimuli. The spectral power at 38 and 42 Hz and their ratio for each electrode were extracted as features. Linear discriminant analysis was utilized to classify the selections for each subject. In offline analysis, the average classification accuracies with a modulation index of 1.0 were 89.67% and 87.67% using music and natural sounds, respectively. In online experiments, the average classification accuracies were 88.3% and 80.0% using music and natural sounds, respectively. Using the proposed method, we obtained significantly higher user-acceptance scores, while maintaining a high average classification accuracy. Copyright © 2017 Elsevier Ltd. All rights reserved.
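
    A minimal sketch of the classification step, assuming band power at the two modulation frequencies (plus their ratio) as features and linear discriminant analysis as the classifier. The EEG below is synthetic, with the attended stream's ASSR slightly boosted, and the sampling rate and trial counts are assumptions rather than the study's settings.

```python
import numpy as np
from scipy.signal import welch
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

FS = 250                          # assumed EEG sampling rate (Hz)
F_LEFT, F_RIGHT = 38.0, 42.0      # modulation frequencies of the two streams

def assr_features(segment, fs=FS):
    """Power at the two modulation frequencies plus their ratio for one channel."""
    freqs, psd = welch(segment, fs=fs, nperseg=2 * fs)
    p38 = psd[np.argmin(np.abs(freqs - F_LEFT))]
    p42 = psd[np.argmin(np.abs(freqs - F_RIGHT))]
    return [p38, p42, p38 / p42]

rng = np.random.default_rng(0)
def synth_trial(attended_freq, n_sec=6):
    """Synthetic trial: the attended stream's ASSR is slightly stronger."""
    t = np.arange(n_sec * FS) / FS
    return (0.4 * np.sin(2 * np.pi * attended_freq * t)
            + 0.2 * np.sin(2 * np.pi * (80.0 - attended_freq) * t)
            + rng.standard_normal(t.size))

labels = np.array([0, 1] * 40)    # alternate the attended stream across 80 trials
X = np.array([assr_features(synth_trial(F_LEFT if y == 0 else F_RIGHT)) for y in labels])
clf = LinearDiscriminantAnalysis().fit(X[:60], labels[:60])
print("held-out accuracy:", clf.score(X[60:], labels[60:]))
```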

  19. Waiting is the hardest part: comparison of two computational strategies for performing a compelled-response task

    Directory of Open Access Journals (Sweden)

    Emilio Salinas

    2010-12-01

    Full Text Available The neural basis of choice behavior is commonly investigated with tasks in which a subject analyzes a stimulus and reports his or her perceptual experience with an appropriate motor action. We recently developed a novel task, the compelled-saccade task, with which the influence of the sensory information on the subject's choice can be tracked through time with millisecond resolution, thus providing a new tool for correlating neuronal activity and behavior. This paradigm has a crucial feature: the signal that instructs the subject to make an eye movement is given before the cue that indicates which of two possible choices is the correct one. Previously, we found that psychophysical performance in this task could be accurately replicated by a model in which two developing oculomotor plans race to a threshold and the incoming perceptual information differentially accelerates their trajectories toward it. However, the task design suggests an alternative mechanism: instead of modifying an ongoing oculomotor plan on the fly as the sensory information becomes available, the subject could try to wait, withholding the oculomotor response until the sensory cue is revealed. Here, we use computer simulations to explore and compare the performance of these two types of models. We find that both reproduce the main features of the psychophysical data in the compelled-saccade task, but they give rise to distinct behavioral and neurophysiological predictions. Although, superficially, the waiting model is intuitively appealing, it is ultimately inconsistent with experimental results from this and other tasks.

  20. Green Computing

    Directory of Open Access Journals (Sweden)

    K. Shalini

    2013-01-01

    Green computing is all about using computers in a smarter and eco-friendly way. It is the environmentally responsible use of computers and related resources, which includes the implementation of energy-efficient central processing units, servers and peripherals as well as reduced resource consumption and proper disposal of electronic waste. Computers certainly make up a large part of many people's lives and traditionally are extremely damaging to the environment. Manufacturers of computers and their parts have been espousing the green cause to help protect the environment from computers and electronic waste in any way. Research continues into key areas such as making the use of computers as energy-efficient as possible, and designing algorithms and systems for efficiency-related computer technologies.

  1. Response

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    Neuromorphic silicon chips have been developed over the last 30 years, inspired by the design of biological nervous systems and offering an alternative paradigm for computation, with real-time massively parallel operation and potentially large power savings with respect to conventional computing architectures. I will present the general principles with a brief investigation of the design choices that have been explored, and I'll discuss how such hardware has been applied to problems such as classification.

  2. Evaluating a Computer Flash-Card Sight-Word Recognition Intervention with Self-Determined Response Intervals in Elementary Students with Intellectual Disability

    Science.gov (United States)

    Cazzell, Samantha; Skinner, Christopher H.; Ciancio, Dennis; Aspiranti, Kathleen; Watson, Tiffany; Taylor, Kala; McCurdy, Merilee; Skinner, Amy

    2017-01-01

    A concurrent multiple-baseline across-tasks design was used to evaluate the effectiveness of a computer flash-card sight-word recognition intervention with elementary-school students with intellectual disability. This intervention allowed the participants to self-determine each response interval and resulted in both participants acquiring…

  3. Computational Modeling of Hypothalamic-Pituitary-Gonadal Axis to Predict Adaptive Responses in Female Fathead Minnows Exposed to an Aromatase Inhibitor

    Science.gov (United States)

    Exposure to endocrine disrupting chemicals can affect reproduction and development in both humans and wildlife. We are developing a mechanistic computational model of the hypothalamic-pituitary-gonadal (HPG) axis in female fathead minnows to predict dose response and time-course...

  4. Improving the Reliability of Student Scores from Speeded Assessments: An Illustration of Conditional Item Response Theory Using a Computer-Administered Measure of Vocabulary

    Science.gov (United States)

    Petscher, Yaacov; Mitchell, Alison M.; Foorman, Barbara R.

    2015-01-01

    A growing body of literature suggests that response latency, the amount of time it takes an individual to respond to an item, may be an important factor to consider when using assessment data to estimate the ability of an individual. Considering that tests of passage and list fluency are being adapted to a computer administration format, it is…

  5. Early Assessment of Treatment Responses During Radiation Therapy for Lung Cancer Using Quantitative Analysis of Daily Computed Tomography

    Energy Technology Data Exchange (ETDEWEB)

    Paul, Jijo; Yang, Cungeng [Department of Radiation Oncology, Medical College of Wisconsin, Milwaukee, Wisconsin (United States); Wu, Hui [Department of Radiation Oncology, Medical College of Wisconsin, Milwaukee, Wisconsin (United States); The Affiliated Cancer Hospital of Zhengzhou University, Zhengzhou (China); Tai, An [Department of Radiation Oncology, Medical College of Wisconsin, Milwaukee, Wisconsin (United States); Dalah, Entesar [Department of Radiation Oncology, Medical College of Wisconsin, Milwaukee, Wisconsin (United States); Department of Medical Diagnostic Imaging, College of Health Science, University of Sharjah (United Arab Emirates); Zheng, Cheng [Biostatistics, Joseph. J. Zilber School of Public Health, University of Wisconsin-Milwaukee, Milwaukee, Wisconsin (United States); Johnstone, Candice [Department of Radiation Oncology, Medical College of Wisconsin, Milwaukee, Wisconsin (United States); Kong, Feng-Ming [Department of Radiation Oncology, Indiana University, Indianapolis, Indiana (United States); Gore, Elizabeth [Department of Radiation Oncology, Medical College of Wisconsin, Milwaukee, Wisconsin (United States); Li, X. Allen, E-mail: ali@mcw.edu [Department of Radiation Oncology, Medical College of Wisconsin, Milwaukee, Wisconsin (United States)

    2017-06-01

    Purpose: To investigate early tumor and normal tissue responses during the course of radiation therapy (RT) for lung cancer using quantitative analysis of daily computed tomography (CT) scans. Methods and Materials: Daily diagnostic-quality CT scans acquired using CT-on-rails during CT-guided RT for 20 lung cancer patients were quantitatively analyzed. On each daily CT set, the contours of the gross tumor volume (GTV) and lungs were generated and the radiation dose delivered was reconstructed. The changes in CT image intensity (Hounsfield unit [HU]) features in the GTV and the multiple normal lung tissue shells around the GTV were extracted from the daily CT scans. The associations between the changes in the mean HUs, GTV, accumulated dose during RT delivery, and patient survival rate were analyzed. Results: During the RT course, radiation can induce substantial changes in the HU histogram features on the daily CT scans, with reductions in the GTV mean HUs (dH) observed in the range of 11 to 48 HU (median 30). The dH is statistically related to the accumulated GTV dose (R² > 0.99) and correlates weakly with the change in GTV (R² = 0.3481). Statistically significant increases in patient survival rates (P=.038) were observed for patients with a higher dH in the GTV. In the normal lung, the 4 regions proximal to the GTV showed statistically significant (P<.001) HU reductions from the first to last fraction. Conclusion: Quantitative analysis of the daily CT scans indicated that the mean HUs in lung tumor and surrounding normal tissue were reduced during RT delivery. This reduction was observed in the early phase of the treatment, is patient specific, and correlated with the delivered dose. A larger HU reduction in the GTV correlated significantly with greater patient survival. The changes in daily CT features, such as the mean HU, can be used for early assessment of the radiation response during RT delivery for lung cancer.

  6. Collateral circulation on perfusion-computed tomography-source images predicts the response to stroke intravenous thrombolysis.

    Science.gov (United States)

    Calleja, A I; Cortijo, E; García-Bermejo, P; Gómez, R D; Pérez-Fernández, S; Del Monte, J M; Muñoz, M F; Fernández-Herranz, R; Arenillas, J F

    2013-05-01

    Perfusion-computed tomography-source images (PCT-SI) may allow a dynamic assessment of leptomeningeal collateral arteries (LMC) filling and emptying in middle cerebral artery (MCA) ischaemic stroke. We described a regional LMC scale on PCT-SI and hypothesized that a higher collateral score would predict a better response to intravenous (iv) thrombolysis. We studied consecutive ischaemic stroke patients with an acute MCA occlusion documented by transcranial Doppler/transcranial color-coded duplex, treated with iv thrombolysis, who underwent PCT prior to treatment. Readers evaluated PCT-SI in a blinded fashion to assess LMC within the hypoperfused MCA territory. LMC was scored as follows: 0, absence of vessels; 1, collateral supply filling ≤ 50%; 2, between >50% and <100%; 3, equal or more prominent when compared with the unaffected hemisphere. The scale was divided into good (scores 2-3) vs. poor (scores 0-1) collaterals. The predetermined primary end-point was a good 3-month functional outcome, while early neurological recovery, transcranial duplex-assessed 24-h MCA recanalization, 24-h hypodensity volume and hemorrhagic transformation were considered secondary end-points. Fifty-four patients were included (55.5% women, median NIHSS 10), and 4-13-23-14 patients had LMC score (LMCs) of 0-1-2-3, respectively. The probability of a good long-term outcome augmented gradually with increasing LMCs: (0) 0%; (1) 15.4%; (2) 65.2%; (3) 64.3%, P = 0.004. Good-LMCs was independently associated with a good outcome [OR 21.02 (95% CI 2.23-197.75), P = 0.008]. Patients with good LMCs had better early neurological recovery (P = 0.001), smaller hypodensity volumes (P < 0.001) and a clear trend towards a higher recanalization rate. A higher degree of LMC assessed by PCT-SI predicts good response to iv thrombolysis in MCA ischaemic stroke patients. © 2012 The Author(s) European Journal of Neurology © 2012 EFNS.

  7. Affect and the computer game player: the effect of gender, personality, and game reinforcement structure on affective responses to computer game-play.

    Science.gov (United States)

    Chumbley, Justin; Griffiths, Mark

    2006-06-01

    Previous research on computer games has tended to concentrate on their more negative effects (e.g., addiction, increased aggression). This study departs from the traditional clinical and social learning explanations for these behavioral phenomena and examines the effect of personality, in-game reinforcement characteristics, gender, and skill on the emotional state of the game-player. Results demonstrated that in-game reinforcement characteristics and skill significantly affect a number of affective measures (most notably excitement and frustration). The implications of the impact of game-play on affect are discussed with reference to the concepts of "addiction" and "aggression."

  8. Response

    Science.gov (United States)

    Higgins, Chris

    2012-01-01

    This article presents the author's response to the reviews of his book, "The Good Life of Teaching: An Ethics of Professional Practice." He begins by highlighting some of the main concerns of his book. He then offers a brief response, doing his best to address the main criticisms of his argument and noting where the four reviewers (Charlene…

  9. Calculating buoy response for a wave energy converter—A comparison of two computational methods and experimental results

    Directory of Open Access Journals (Sweden)

    Linnea Sjökvist

    2017-05-01

    When designing a wave power plant, reliable and fast simulation tools are required. Computational fluid dynamics (CFD) software provides high accuracy but with a very high computational cost, and in operational, moderate sea states, linear potential flow theories may be sufficient to model the hydrodynamics. In this paper, a model is built in COMSOL Multiphysics to solve for the hydrodynamic parameters of a point-absorbing wave energy device. The results are compared with a linear model where the hydrodynamical parameters are computed using WAMIT, and to experimental results from the Lysekil research site. The agreement with experimental data is good for both numerical models.
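
    For the linear-potential-flow side of such a comparison, the heave response of a point absorber can be sketched as a frequency-domain mass-spring-damper whose added mass, radiation damping and excitation force would in practice come from a diffraction code such as WAMIT. The sketch below uses arbitrary placeholder coefficients and treats them as frequency independent, so it is only an illustration of the response calculation, not the Lysekil model.

        import numpy as np

        def heave_rao(omega, m, added_mass, rad_damping, pto_damping, k_hydro, f_exc):
            """Heave displacement amplitude of a linear point absorber at each frequency."""
            stiffness_term = k_hydro - (m + added_mass) * omega**2
            damping_term = (rad_damping + pto_damping) * omega
            return np.abs(f_exc) / np.sqrt(stiffness_term**2 + damping_term**2)

        omega = np.linspace(0.3, 2.5, 5)   # wave angular frequencies, rad/s
        # Placeholder hydrodynamic coefficients for a small heaving buoy.
        print(heave_rao(omega, m=3000.0, added_mass=1500.0, rad_damping=800.0,
                        pto_damping=2000.0, k_hydro=6.0e4, f_exc=1.0e4))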

  10. Any realistic theory must be computationally realistic: a response to N. Gisin's definition of a Realistic Physics Theory

    OpenAIRE

    Bolotin, Arkady

    2014-01-01

    It is argued that the recent definition of a realistic physics theory by N. Gisin cannot be considered comprehensive unless it is supplemented with the requirement that any realistic theory must be computationally realistic as well.

  11. Response to Dr. Smith's Comments and Criticisms Concerning "Identification of Student Misconceptions in Genetics Problem Solving via Computer Program."

    Science.gov (United States)

    Browning, Mark; Lehman, James D.

    1991-01-01

    Authors respond to criticisms by Smith in the same issue and defend their use of the term "gene" and "misconception." Authors indicate that they did not believe that the use of computers significantly skewed their data concerning student errors. (PR)

  12. MACKLIB-IV: a library of nuclear response functions generated with the MACK-IV computer program from ENDF/B-IV

    International Nuclear Information System (INIS)

    Gohar, Y.; Abdou, M.A.

    1978-03-01

    MACKLIB-IV employs the CTR energy group structure of 171 neutron groups and 36 gamma groups. A retrieval computer program is included with the library to permit collapsing into any other energy group structure. The library is in the new format of the "MACK-Activity Table" which uses a fixed position for each specific response function. This permits the user when employing the library with present transport codes to obtain directly the nuclear responses (e.g. the total nuclear heating) summed for all isotopes and integrated over any geometrical volume. The response functions included in the library are neutron kerma factor, gamma kerma factor, gas production and tritium-breeding functions, and all important reaction cross sections. Pertinent information about the library and a graphical display of six response functions for all materials in the library are given.
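
    The group-collapsing step that the retrieval program performs amounts to flux-weighted averaging of a fine-group response function onto a coarser structure. The sketch below shows that arithmetic only; the group mapping, flux spectrum and response values are invented placeholders and the function is not the MACK-IV retrieval program.

        import numpy as np

        def collapse_response(fine_response, fine_flux, coarse_map):
            """Collapse a fine-group response r_g with flux weights phi_g.

            coarse_map[g] is the coarse-group index of fine group g; the collapsed
            value for coarse group G is sum(r_g * phi_g) / sum(phi_g) over g in G.
            """
            fine_response = np.asarray(fine_response, dtype=float)
            fine_flux = np.asarray(fine_flux, dtype=float)
            coarse_map = np.asarray(coarse_map)
            n_coarse = coarse_map.max() + 1
            num = np.zeros(n_coarse)
            den = np.zeros(n_coarse)
            np.add.at(num, coarse_map, fine_response * fine_flux)
            np.add.at(den, coarse_map, fine_flux)
            return num / den

        # Six fine groups collapsed into two coarse groups (placeholder numbers).
        kerma = [3.1, 2.4, 1.8, 1.1, 0.7, 0.2]   # e.g. a kerma-factor response
        flux = [1.0, 2.0, 4.0, 4.0, 2.0, 1.0]    # weighting spectrum
        print(collapse_response(kerma, flux, [0, 0, 0, 1, 1, 1]))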

  13. Texture analysis of advanced non-small cell lung cancer (NSCLC) on contrast-enhanced computed tomography: prediction of the response to the first-line chemotherapy

    International Nuclear Information System (INIS)

    Farina, Davide; Morassi, Mauro; Maroldi, Roberto; Roca, Elisa; Tassi, Gianfranco; Cavalleri, Giuseppe

    2013-01-01

    To assess whether tumour heterogeneity, quantified by texture analysis (TA) on contrast-enhanced computed tomography (CECT), can predict response to chemotherapy in advanced non-small cell lung cancer (NSCLC). Fifty-three CECT studies of patients with advanced NSCLC who had undergone first-line chemotherapy were retrospectively reviewed. Response to chemotherapy was evaluated according to RECIST1.1. Tumour uniformity was assessed by a TA method based on Laplacian of Gaussian filtering. The resulting parameters were correlated with treatment response and overall survival by multivariate analysis. Thirty-one out of 53 patients were non-responders and 22 were responders. Average overall survival was 13 months (4-35), minimum follow-up was 12 months. In the adenocarcinoma group (n = 31), the product of tumour uniformity and grey level (GL*U) was the unique independent variable correlating with treatment response. Dividing the GL*U (range 8.5-46.6) into tertiles, lesions belonging to the second and the third tertiles had an 8.3-fold higher probability of treatment response compared with those in the first tertile. No association between texture features and response to treatment was observed in the non-adenocarcinoma group (n = 22). GL*U did not correlate with overall survival. TA on CECT images in advanced lung adenocarcinoma provides an independent predictive indicator of response to first-line chemotherapy. (orig.)
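
    A filtration-histogram texture step of the kind described, Laplacian-of-Gaussian filtering followed by simple histogram statistics, can be sketched with SciPy as below. The filter scale, the bin count and the expression used for the grey-level/uniformity product (gl_times_u) are assumptions made for illustration, not the study's TA software.

        import numpy as np
        from scipy.ndimage import gaussian_laplace

        def texture_features(roi_hu, sigma=2.0, bins=64):
            """LoG-filter a tumour ROI, then summarise the filtered-value histogram."""
            filtered = gaussian_laplace(roi_hu.astype(float), sigma=sigma)
            hist, _ = np.histogram(filtered, bins=bins)
            p = hist / hist.sum()
            uniformity = float(np.sum(p ** 2))   # higher = more homogeneous texture
            mean_gl = float(filtered.mean())
            return {"mean_gl": mean_gl, "uniformity": uniformity,
                    "gl_times_u": mean_gl * uniformity}

        roi = np.random.default_rng(1).normal(40.0, 15.0, size=(32, 32, 8))  # toy CECT ROI
        print(texture_features(roi))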

  14. A BENCHMARK PROGRAM FOR EVALUATION OF METHODS FOR COMPUTING SEISMIC RESPONSE OF COUPLED BUILDING-PIPING/EQUIPMENT WITH NON-CLASSICAL DAMPING

    International Nuclear Information System (INIS)

    Xu, J.; Degrassi, G.; Chokshi, N.

    2001-01-01

    Under the auspices of the US Nuclear Regulatory Commission (NRC), Brookhaven National Laboratory (BNL) developed a comprehensive program to evaluate state-of-the-art methods and computer programs for seismic analysis of typical coupled nuclear power plant (NPP) systems with nonclassical damping. In this program, four benchmark models of coupled building-piping/equipment systems with different damping characteristics were analyzed for a suite of earthquakes by program participants applying their uniquely developed methods and computer programs. This paper presents the results of their analyses, and their comparison to the benchmark solutions generated by BNL using time domain direct integration methods. The participants' analysis results, established using complex modal time history methods, showed good agreement with the BNL solutions, while the analyses produced with either complex-mode response spectrum methods or the classical normal-mode response spectrum method, in general, produced more conservative results, when averaged over a suite of earthquakes. However, when coupling due to damping is significant, complex-mode response spectrum methods performed better than the classical normal-mode response spectrum method. Furthermore, as part of the program objectives, a parametric assessment is also presented in this paper, aimed at evaluation of the applicability of various analysis methods to problems with different dynamic characteristics unique to coupled NPP systems. It is believed that the findings and insights learned from this program will be useful in developing new acceptance criteria and providing guidance for future regulatory activities involving licensing applications of these alternate methods to coupled systems.

  15. Collecting Sensitive Self-Report Data with Laptop Computers: Impact on the Response Tendencies of Adolescents in a Home Interview.

    Science.gov (United States)

    Supple, Andrew J.; Aquilino, William S.; Wright, Debra L.

    1999-01-01

    Explored effects of computerized, self-administered data collection techniques in research on adolescents' self-reported substance use and psychological well-being. Adolescents completing sensitive questions on only laptop computers reported higher levels of substance use and indicated higher levels of depression and irritability; they perceived…

  16. Do Interviewers' Health Beliefs and Habits Modify Responses to Sensitive Questions? A study using Data Collected from Pregnant women by Means of Computer-assisted Telephone Interviews

    DEFF Research Database (Denmark)

    Andersen, Anne-Marie Nybo; Olsen, Jørn

    2002-01-01

    If interviewers' personal habits or attitudes influence respondents' answers to given questions, this may lead to bias, which should be taken into consideration when analyzing data. The authors examined a potential interviewer effect in a study of pregnant women in which exposure data were obtained...... through computer-assisted telephone interviews. The authors compared interviewer characteristics for 34 interviewers with the responses they obtained in 12,910 interviews carried out for the Danish National Birth Cohort Study. Response data on smoking and alcohol consumption in the first trimester...... of pregnancy were collected during the time period October 1, 1997-February 1, 1999. Overall, the authors found little evidence to suggest that interviewers' personal habits or attitudes toward smoking and alcohol consumption during pregnancy had consequences for the responses they obtained; neither did...

  17. ARAC: a centralized computer-assisted emergency planning, response, and assessment system for atmospheric releases of toxic material

    International Nuclear Information System (INIS)

    Dickerson, M.H.; Knox, J.B.

    1987-01-01

    The Atmospheric Release Advisory Capability (ARAC) is an emergency planning, response, and assessment service, developed by the US Departments of Energy and Defense, and focused, thus far, on atmospheric releases of nuclear material. For the past 14 years ARAC has responded to over 150 accidents, potential accidents, and major exercises. The most notable accident responses are the COSMOS 954 reentry, the Three Mile Island (TMI-2) accident and subsequent purge of 85Kr from the containment vessel, the recent UF6 accident at the Kerr-McGee Plant, Gore, Oklahoma, and the Chernobyl nuclear reactor accident in the Soviet Union. Based on experience in the area of emergency response, developed during the past 14 years, this paper describes the cost effectiveness and other advantages of a centralized emergency planning, response, and assessment service for atmospheric releases of nuclear material.

  18. ARAC: a centralized computer assisted emergency planning, response, and assessment system for atmospheric releases of toxic material

    International Nuclear Information System (INIS)

    Dickerson, M.H.; Knox, J.B.

    1986-10-01

    The Atmospheric Release Advisory Capability (ARAC) is an emergency planning, response, and assessment service, developed by the US Departments of Energy and Defense, and focused, thus far, on atmospheric releases of nuclear material. For the past 14 years ARAC has responded to over 150 accidents, potential accidents, and major exercises. The most notable accident responses are the COSMOS 954 reentry, the Three Mile Island (TMI-2) accident and subsequent purge of 85Kr from the containment vessel, the recent UF6 accident at the Kerr-McGee Plant, Gore, Oklahoma, and the Chernobyl nuclear reactor accident in the Soviet Union. Based on experience in the area of emergency response, developed during the past 14 years, this paper describes the cost effectiveness and other advantages of a centralized emergency planning, response, and assessment service for atmospheric releases of nuclear material.

  19. Elaboration of a computer code for the solution of a two-dimensional two-energy group diffusion problem using the matrix response method

    International Nuclear Information System (INIS)

    Alvarenga, M.A.B.

    1980-12-01

    An analytical procedure to solve the neutron diffusion equation in two dimensions and two energy groups was developed. The response matrix method was used coupled with an expansion of the neutron flux in finite Fourier series. A computer code 'MRF2D' was elaborated to implement the above mentioned procedure for PWR reactor core calculations. Different core symmetry options are allowed by the code, which is also flexible enough to allow for improvements by means of algorithm optimization. The code performance was compared with a corner mesh finite difference code named TVEDIM by using an International Atomic Energy Agency (IAEA) standard problem. The MRF2D code requires 12.7% less computer processing time to reach the same precision on the criticality eigenvalue. (Author)

  20. Prediction of lung density changes after radiotherapy by cone beam computed tomography response markers and pre-treatment factors for non-small cell lung cancer patients.

    Science.gov (United States)

    Bernchou, Uffe; Hansen, Olfred; Schytte, Tine; Bertelsen, Anders; Hope, Andrew; Moseley, Douglas; Brink, Carsten

    2015-10-01

    This study investigates the ability of pre-treatment factors and response markers extracted from standard cone-beam computed tomography (CBCT) images to predict the lung density changes induced by radiotherapy for non-small cell lung cancer (NSCLC) patients. Density changes in follow-up computed tomography scans were evaluated for 135 NSCLC patients treated with radiotherapy. Early response markers were obtained by analysing changes in lung density in CBCT images acquired during the treatment course. The ability of pre-treatment factors and CBCT markers to predict lung density changes induced by radiotherapy was investigated. Age and CBCT markers extracted at 10th, 20th, and 30th treatment fraction significantly predicted lung density changes in a multivariable analysis, and a set of response models based on these parameters were established. The correlation coefficient for the models was 0.35, 0.35, and 0.39, when based on the markers obtained at the 10th, 20th, and 30th fraction, respectively. The study indicates that younger patients without lung tissue reactions early into their treatment course may have minimal radiation induced lung density increase at follow-up. Further investigations are needed to examine the ability of the models to identify patients with low risk of symptomatic toxicity. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  1. Step responses of a torsional system with multiple clearances: Study of vibro-impact phenomenon using experimental and computational methods

    Science.gov (United States)

    Oruganti, Pradeep Sharma; Krak, Michael D.; Singh, Rajendra

    2018-01-01

    Recently Krak and Singh (2017) proposed a scientific experiment that examined vibro-impacts in a torsional system under a step down excitation and provided preliminary measurements and limited non-linear model studies. A major goal of this article is to extend the prior work with a focus on the examination of vibro-impact phenomena observed under step responses in a torsional system with one, two or three controlled clearances. First, new measurements are made at several locations with a higher sampling frequency. Measured angular accelerations are examined in both time and time-frequency domains. Minimal order non-linear models of the experiment are successfully constructed, using piecewise linear stiffness and Coulomb friction elements; eight cases of the generic system are examined, though only three are experimentally studied. Measured and predicted responses for single and dual clearance configurations exhibit double-sided impacts, and time-varying periods suggest softening trends under the step-down torque. Non-linear models are experimentally validated by comparing results with new measurements and with those previously reported. Several metrics are utilized to quantify and compare the measured and predicted responses (including peak to peak accelerations). Eigensolutions and step responses of the corresponding linearized models are utilized to better understand the nature of the non-linear dynamic system. Finally, the effect of step amplitude on the non-linear responses is examined for several configurations, and hardening trends are observed in the torsional system with three clearances.
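
    The modelling ingredient named above, a torsional spring with a controlled clearance whose restoring torque vanishes inside the gap and is linear outside it, can be sketched and integrated for a step-down input as follows. All parameter values are arbitrary and the Coulomb element is reduced to a signum term, so this is an illustrative minimal-order model rather than the authors' validated one.

        import numpy as np
        from scipy.integrate import solve_ivp

        def clearance_torque(theta, k, gap):
            """Piecewise-linear stiffness: no torque inside +/- gap, linear outside."""
            if theta > gap:
                return k * (theta - gap)
            if theta < -gap:
                return k * (theta + gap)
            return 0.0

        def rhs(t, y, J, k, gap, c, T_step):
            theta, omega = y
            torque_in = T_step if t < 0.05 else 0.0      # step-down excitation
            friction = -c * np.sign(omega)               # Coulomb-type friction element
            return [omega, (torque_in + friction - clearance_torque(theta, k, gap)) / J]

        sol = solve_ivp(rhs, (0.0, 0.5), [0.0, 0.0], max_step=1e-4,
                        args=(0.01, 500.0, 0.02, 0.05, 2.0))
        print(sol.y[0, -1])   # terminal angular displacement (rad)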

  2. Computer-Mediated Communication in Intimate Relationships: Associations of Boundary Crossing, Intrusion, Relationship Satisfaction, and Partner Responsiveness.

    Science.gov (United States)

    Norton, Aaron M; Baptist, Joyce; Hogan, Bernie

    2018-01-01

    This study examined the impact of technology on couples in committed relationships through the lens of the couple and technology framework. Specifically, we used data from 6,756 European couples to examine associations between online boundary crossing, online intrusion, relationship satisfaction, and partner responsiveness. The results suggest that participants' reports of online boundary crossing were linked with lower relationship satisfaction and partner responsiveness. Also, lower relationship satisfaction and partner responsiveness were associated with increased online boundary crossing. The results suggest that men, but not women, who reported greater acceptability for online boundary crossing were more likely to have partners who reported lower relationship satisfaction in their relationships. Implications for clinicians, relationship educators, and researchers are discussed. © 2017 American Association for Marriage and Family Therapy.

  3. Computational prediction and experimental verification of HVA1-like abscisic acid responsive promoters in rice (Oryza sativa).

    Science.gov (United States)

    Ross, Christian; Shen, Qingxi J

    2006-09-01

    Abscisic acid (ABA) is one of the central plant hormones, responsible for controlling both maturation and germination in seeds, as well as mediating adaptive responses to desiccation, injury, and pathogen infection in vegetative tissues. Thorough analyses of two barley genes, HVA1 and HVA22, indicate that their response to ABA relies on the interaction of two cis-acting elements in their promoters, an ABA response element (ABRE) and a coupling element (CE). Together, they form an ABA response promoter complex (ABRC). Comparison of promoters of barley HVA1 and its rice orthologue indicates that the structures and sequences of their ABRCs are highly similar. Prediction of ABA responsive genes in the rice genome is then tractable to a bioinformatics approach based on the structures of the well-defined barley ABRCs. Here we describe a model developed based on the consensus, inter-element spacing and orientations of experimentally determined ABREs and CEs. Our search of the rice promoter database for promoters that fit the model has generated a partial list of genes in rice that have a high likelihood of being involved in the ABA signaling network. The ABA inducibility of some of the rice genes identified was validated with quantitative reverse transcription PCR (QPCR). By limiting our input data to known enhancer modules and experimentally derived rules, we have generated a high confidence subset of ABA-regulated genes. The results suggest that the pathways by which cereals respond to biotic and abiotic stresses overlap significantly, and that regulation is not confined to the level of transcription. The large fraction of putative regulatory genes carrying HVA1-like enhancer modules in their promoters suggests the ABA signal enters at multiple points into a complex regulatory network that remains largely unmapped.
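
    A toy version of such a promoter search, scanning a sequence for an ABRE-like motif and a coupling-element-like motif and applying a simple spacing rule, is sketched below. The consensus patterns, the spacing limit and the example sequence are illustrative placeholders and do not reproduce the study's actual consensus sequences or rules.

        import re

        # Illustrative consensus patterns (placeholders, not the paper's model).
        ABRE_PATTERN = re.compile(r"ACGTG[GT]C")
        CE_PATTERN = re.compile(r"TGCCACCGG")
        MAX_SPACING = 60   # assumed maximum ABRE-CE spacing in bp

        def has_abrc(promoter):
            """True if an ABRE-like and a CE-like hit occur within MAX_SPACING bp."""
            abre_hits = [m.start() for m in ABRE_PATTERN.finditer(promoter)]
            ce_hits = [m.start() for m in CE_PATTERN.finditer(promoter)]
            return any(abs(a - c) <= MAX_SPACING for a in abre_hits for c in ce_hits)

        toy_promoter = "ATATACGTGGC" + "A" * 20 + "TGCCACCGG" + "TATATA"
        print(has_abrc(toy_promoter))   # True for this constructed example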

  4. Developing Predictive Approaches to Characterize Adaptive Responses of the Reproductive Endocrine Axis to Aromatase Inhibition: Computational Modeling

    Science.gov (United States)

    Exposure to endocrine disrupting chemicals can affect reproduction and development in both humans and wildlife. We developed a mechanistic mathematical model of the hypothalamic-pituitary-gonadal (HPG) axis in female fathead minnows to predict dose-response and time-course (DRTC)...

  5. Predicting Adaptive Response to Fadrozole Exposure: Computational Model of the Fathead Minnow Hypothalamic-Pituitary-Gonadal Axis

    Science.gov (United States)

    Exposure to endocrine disrupting chemicals can affect reproduction and development in both humans and wildlife. We are developing a mechanistic mathematical model of the hypothalamic-pituitary-gonadal (HPG) axis in female fathead minnows to predict dose-response and time-course (...

  6. Computer adaptive practice of Maths ability using a new item response model for on the fly ability and difficulty estimation

    NARCIS (Netherlands)

    Klinkenberg, S.; Straatemeier, M.; van der Maas, H.L.J.

    2011-01-01

    In this paper we present a model for computerized adaptive practice and monitoring. This model is used in the Maths Garden, a web-based monitoring system, which includes a challenging web environment for children to practice arithmetic. Using a new item response model based on the Elo (1978) rating
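
    The on-the-fly estimation idea can be illustrated with an Elo-style update in which a correct answer raises the child's ability rating and lowers the item's difficulty rating by the same amount, and an error does the opposite. This is a generic sketch of that rating logic, not the Maths Garden implementation; the K-factor and starting ratings are arbitrary.

        import math

        def expected_correct(ability, difficulty):
            """Logistic expectation that a player of given ability solves the item."""
            return 1.0 / (1.0 + math.exp(difficulty - ability))

        def elo_update(ability, difficulty, correct, k=0.4):
            """Update ability and difficulty after one response (correct: 0 or 1)."""
            surprise = correct - expected_correct(ability, difficulty)
            return ability + k * surprise, difficulty - k * surprise

        ability, difficulty = 0.0, 0.5
        for answer in [1, 1, 0, 1]:            # one child's toy response sequence
            ability, difficulty = elo_update(ability, difficulty, answer)
        print(round(ability, 3), round(difficulty, 3))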

  7. Numerical Differentiation Methods for Computing Error Covariance Matrices in Item Response Theory Modeling: An Evaluation and a New Proposal

    Science.gov (United States)

    Tian, Wei; Cai, Li; Thissen, David; Xin, Tao

    2013-01-01

    In item response theory (IRT) modeling, the item parameter error covariance matrix plays a critical role in statistical inference procedures. When item parameters are estimated using the EM algorithm, the parameter error covariance matrix is not an automatic by-product of item calibration. Cai proposed the use of Supplemented EM algorithm for…
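
    The numerical-differentiation idea under evaluation can be sketched as follows: approximate the observed information matrix by central differences of the log-likelihood around the parameter estimates, then invert it to obtain an error covariance matrix. The quadratic toy log-likelihood below stands in for an IRT marginal likelihood and is chosen only so that the answer is known in advance.

        import numpy as np

        def numerical_covariance(loglik, theta_hat, h=1e-4):
            """Error covariance from a central-difference Hessian of the log-likelihood."""
            p = len(theta_hat)
            hess = np.zeros((p, p))
            for i in range(p):
                for j in range(p):
                    def shifted(si, sj):
                        t = np.array(theta_hat, dtype=float)
                        t[i] += si * h
                        t[j] += sj * h
                        return loglik(t)
                    hess[i, j] = (shifted(+1, +1) - shifted(+1, -1)
                                  - shifted(-1, +1) + shifted(-1, -1)) / (4 * h * h)
            return np.linalg.inv(-hess)          # covariance = inverse information

        # Toy "log-likelihood": a Gaussian shape whose true covariance is diag(0.5, 2.0).
        toy_loglik = lambda t: -0.5 * (t[0] ** 2 / 0.5 + t[1] ** 2 / 2.0)
        print(numerical_covariance(toy_loglik, np.array([0.0, 0.0])))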

  8. Simplified response monitoring criteria for multiple myeloma in patients undergoing therapy with novel agents using computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Schabel, Christoph; Horger, Marius; Kum, Sara [Department of Diagnostic and Interventional Radiology, Eberhard-Karls-University Tuebingen, Hoppe-Seyler-Str. 3, 72076 Tuebingen (Germany); Weisel, Katja [Department of Internal Medicine II – Hematology & Oncology, Eberhard-Karls-University Tuebingen, Otfried-Müller-Str. 5, 72076 Tuebingen (Germany); Fritz, Jan [Russell H. Morgan Department of Radiology and Radiological Science, The Johns Hopkins University School of Medicine, 600 N Wolfe St., Baltimore, MD 21287 (United States); Ioanoviciu, Sorin D. [Department of Internal Medicine, Clinical Municipal Hospital Timisoara, Gheorghe Dima Str. 5, 300079 Timisoara (Romania); Bier, Georg, E-mail: georg.bier@med.uni-tuebingen.de [Department of Neuroradiology, Eberhard-Karls-University Tuebingen, Hoppe-Seyler-Str. 3, 72076 Tuebingen (Germany)

    2016-12-15

    Highlights: • A simplified method for response monitoring of multiple myeloma is proposed. • Medullary bone lesions of all limbs were included and analysed. • Diameters of ≥2 medullary bone lesions are sufficient for therapy monitoring. - Abstract: Introduction: Multiple myeloma is a malignant hematological disorder of the mature B-cell lymphocytes originating in the bone marrow. While therapy monitoring is still mainly based on laboratory biomarkers, the additional use of imaging has been advocated due to inaccuracies of serological biomarkers or in a-secretory myelomas. Non-enhanced CT and MRI have similar sensitivities for lesions in yellow marrow-rich bone marrow cavities with a favourable risk and cost-effectiveness profile of CT. Nevertheless, these methods are still limited by frequently high numbers of medullary lesions and its time consumption for proper evaluation. Objective: To establish simplified response criteria by correlating size and CT attenuation changes of medullary multiple myeloma lesions in the appendicular skeleton with the course of lytic bone lesions in the entire skeleton. Furthermore to evaluate these criteria with respect to established hematological myeloma-specific parameters for the prediction of treatment response to bortezomib or lenalidomide. Materials and methods: Non-enhanced reduced-dose whole-body CT examinations of 78 consecutive patients (43 male, 35 female, mean age 63.69 ± 9.2 years) with stage III multiple myeloma were retrospectively re-evaluated. On per patient basis, size and mean CT attenuation of 2–4 representative lesions in the limbs were measured at baseline and at a follow-up after a mean of 8 months. Results were compared with the course of lytical bone lesions as well with that of specific hematological biomarkers. Myeloma response was assessed according to the International Myeloma Working Group (IMWG) uniform response criteria. Testing for correlation between response of medullary lesions (Resp

  9. Large-Scale Sentinel-1 Processing for Solid Earth Science and Urgent Response using Cloud Computing and Machine Learning

    Science.gov (United States)

    Hua, H.; Owen, S. E.; Yun, S. H.; Agram, P. S.; Manipon, G.; Starch, M.; Sacco, G. F.; Bue, B. D.; Dang, L. B.; Linick, J. P.; Malarout, N.; Rosen, P. A.; Fielding, E. J.; Lundgren, P.; Moore, A. W.; Liu, Z.; Farr, T.; Webb, F.; Simons, M.; Gurrola, E. M.

    2017-12-01

    With the increased availability of open SAR data (e.g. Sentinel-1 A/B), new challenges are being faced with processing and analyzing the voluminous SAR datasets to make geodetic measurements. Upcoming SAR missions such as NISAR are expected to generate close to 100TB per day. The Advanced Rapid Imaging and Analysis (ARIA) project can now generate geocoded unwrapped phase and coherence products from Sentinel-1 TOPS mode data in an automated fashion, using the ISCE software. This capability is currently being exercised on various study sites across the United States and around the globe, including Hawaii, Central California, Iceland and South America. The automated and large-scale SAR data processing and analysis capabilities use cloud computing techniques to speed the computations and provide scalable processing power and storage. Aspects such as how to process these voluminous SLCs and interferograms at global scales, how to keep up with the large daily SAR data volumes, and how to handle the voluminous data rates are being explored. Scene-partitioning approaches in the processing pipeline help in handling global-scale processing up to unwrapped interferograms with stitching done at a late stage. We have built an advanced science data system with rapid search functions to enable access to the derived data products. Rapid image processing of Sentinel-1 data to interferograms and time series is already being applied to natural hazards including earthquakes, floods, volcanic eruptions, and land subsidence due to fluid withdrawal. We will present the status of the ARIA science data system for generating science-ready data products and challenges that arise from being able to process SAR datasets to derived time series data products at large scales. For example, how do we perform large-scale data quality screening on interferograms? What approaches can be used to minimize compute, storage, and data movement costs for time series analysis in the cloud? We will also

  10. Dynamic Response of the Skull with Sinuses under Blunt Frontal Impact: A Three-Dimensional Computational Study

    Directory of Open Access Journals (Sweden)

    Xuewei Song

    2015-01-01

    The objective of this study is to analyze the biomechanical effects of sinuses in the skull on the facial impact response. Two models were built, where one had sinuses and the other had none. The models were verified using cadaver test data, including impacts to frontal bone, zygomatic bone, and maxillae. In the maxilla and zygoma impact, sinuses were found to have no significant effect on the global distribution of stress or stiffness of facial bones, and the influence was limited to a local area. In forehead impact, the sinuses significantly affected the distribution of stress and strain in the skull due to their location in facial bones. The result shows that if the sinus is far away from the location of impact, its effect on the overall response of the skull could be ignored. In addition, the distance between the region of interest and sinuses is another important parameter when studying the local effect of sinuses.

  11. New advances in the forced response computation of periodic structures using the wave finite element (WFE) method

    OpenAIRE

    Mencik , Jean-Mathieu

    2014-01-01

    The wave finite element (WFE) method is investigated to describe the harmonic forced response of one-dimensional periodic structures like those composed of complex substructures and encountered in engineering applications. The dynamic behavior of these periodic structures is analyzed over wide frequency bands where complex spatial dynamics, inside the substructures, are likely to occur. Within the WFE framework, the dynamic behavior of periodic structures is described in ...

  12. Genetic and Computational Approaches for Studying Plant Development and Abiotic Stress Responses Using Image-Based Phenotyping

    Science.gov (United States)

    Campbell, M. T.; Walia, H.; Grondin, A.; Knecht, A.

    2017-12-01

    The development of abiotic stress tolerant crops (i.e. drought, salinity, or heat stress) requires the discovery of DNA sequence variants associated with stress tolerance-related traits. However, many traits underlying adaptation to abiotic stress involve a suite of physiological pathways that may be induced at different times throughout the duration of stress. Conventional single-point phenotyping approaches fail to fully capture these temporal responses, and thus downstream genetic analysis may only identify a subset of the genetic variants that are important for adaptation to sub-optimal environments. Although genomic resources for crops have advanced tremendously, the collection of phenotypic data for morphological and physiological traits is laborious and remains a significant bottleneck in bridging the phenotype-genotype gap. In recent years, the availability of automated, image-based phenotyping platforms has provided researchers with an opportunity to collect morphological and physiological traits non-destructively in a highly controlled environment. Moreover, these platforms allow abiotic stress responses to be recorded throughout the duration of the experiment, and have facilitated the use of function-valued traits for genetic analyses in major crops. We will present our approaches for addressing abiotic stress tolerance in cereals. This talk will focus on novel open-source software to process and extract biological meaningful data from images generated from these phenomics platforms. In addition, we will discuss the statistical approaches to model longitudinal phenotypes and dissect the genetic basis of dynamic responses to these abiotic stresses throughout development.

  13. A second-generation computational modeling of cardiac electrophysiology: response of action potential to ionic concentration changes and metabolic inhibition.

    Science.gov (United States)

    Alaa, Nour Eddine; Lefraich, Hamid; El Malki, Imane

    2014-10-21

    Cardiac arrhythmias are becoming one of the major health care problems in the world, causing numerous serious disease conditions including stroke and sudden cardiac death. Furthermore, cardiac arrhythmias are intimately related to the signaling ability of cardiac cells, and are caused by signaling defects. Consequently, modeling the electrical activity of the heart, and the complex signaling models that subtend dangerous arrhythmias such as tachycardia and fibrillation, necessitates a quantitative model of action potential (AP) propagation. Yet, many electrophysiological models, which accurately reproduce dynamical characteristics of the action potential in cells, have been introduced. However, these models are very complex and are very time consuming computationally. Consequently, a large amount of research is devoted to designing models with less computational complexity. This paper presents a new model for analyzing the propagation of ionic concentrations and electrical potential in space and time. In this model, the transport of ions is governed by the Nernst-Planck flux equation (NP), and the electrical interaction of the species is described by a new cable equation. This set of equations forms a system of coupled partial nonlinear differential equations that is solved numerically. First, we describe the mathematical model. To realize the numerical simulation of our model, we proceed by a finite element discretization and then we choose an appropriate resolution algorithm. We give numerical simulations obtained for different input scenarios in the case of suicide substrate reaction which were compared to those obtained in literature. These input scenarios have been chosen so as to provide an intuitive understanding of dynamics of the model. By accessing time and space domains, it is shown that interpreting the electrical potential of the cell membrane at steady state is incorrect. This model is general and applies to ions of any charge in space and time.
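
    As a heavily simplified illustration of the kind of discretisation involved, the sketch below steps a passive one-dimensional cable equation forward with explicit finite differences. It omits the Nernst-Planck ion transport and uses finite differences instead of the paper's finite elements, and every constant is a placeholder.

        import numpy as np

        def step_cable(v, dt, dx, lam2=0.02, tau=10.0):
            """One explicit time step of tau*dV/dt = lam2*d2V/dx2 - V (sealed ends)."""
            d2v = np.zeros_like(v)
            d2v[1:-1] = (v[2:] - 2 * v[1:-1] + v[:-2]) / dx**2
            d2v[0], d2v[-1] = d2v[1], d2v[-2]          # crude no-flux boundaries
            return v + dt / tau * (lam2 * d2v - v)

        x = np.linspace(0.0, 1.0, 101)                  # cm, toy cable
        v = np.exp(-((x - 0.5) / 0.05) ** 2) * 20.0     # mV, initial local depolarisation
        for _ in range(2000):
            v = step_cable(v, dt=0.01, dx=x[1] - x[0])  # ms, explicit Euler
        print(round(float(v.max()), 3))                 # peak decays and spreads out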

  14. How do trees grow? Response from the graphical and quantitative analyses of computed tomography scanning data collected on stem sections.

    Science.gov (United States)

    Dutilleul, Pierre; Han, Li Wen; Beaulieu, Jean

    2014-06-01

    Tree growth, as measured via the width of annual rings, is used for environmental impact assessment and climate back-forecasting. This fascinating natural process has been studied at various scales in the stem (from cell and fiber within a growth ring, to ring and entire stem) in one, two, and three dimensions. A new approach is presented to study tree growth in 3D from stem sections, at a scale sufficiently small to allow the delineation of reliable limits for annual rings and large enough to capture directional variation in growth rates. The technology applied is computed tomography scanning, which provides - for one stem section - millions of data (indirect measures of wood density) that can be mapped, together with a companion measure of dispersion and growth ring limits in filigree. Graphical and quantitative analyses are reported for white spruce trees with circular vs non-circular growth. Implications for dendroclimatological research are discussed. Copyright © 2014 Académie des sciences. Published by Elsevier SAS. All rights reserved.

  15. Warfarin resistance associated with genetic polymorphism of VKORC1: linking clinical response to molecular mechanism using computational modeling.

    Science.gov (United States)

    Lewis, Benjamin C; Nair, Pramod C; Heran, Subash S; Somogyi, Andrew A; Bowden, Jeffrey J; Doogue, Matthew P; Miners, John O

    2016-01-01

    The variable response to warfarin treatment often has a genetic basis. A protein homology model of human vitamin K epoxide reductase, subunit 1 (VKORC1), was generated to elucidate the mechanism of warfarin resistance observed in a patient with the Val66Met mutation. The VKORC1 homology model comprises four transmembrane (TM) helical domains and a half helical lid domain. Cys132 and Cys135, located in the N-terminal end of TM-4, are linked through a disulfide bond. Two distinct binding sites for warfarin were identified. Site-1, which binds vitamin K epoxide (KO) in a catalytically favorable orientation, shows higher affinity for S-warfarin compared with R-warfarin. Site-2, positioned in the domain occupied by the hydrophobic tail of KO, binds both warfarin enantiomers with similar affinity. Displacement of Arg37 occurs in the Val66Met mutant, blocking access of warfarin (but not KO) to Site-1, consistent with clinical observation of warfarin resistance.

  16. Viable tumor volume: Volume of interest within segmented metastatic lesions, a pilot study of proposed computed tomography response criteria for urothelial cancer

    International Nuclear Information System (INIS)

    Folio, Les Roger; Turkbey, Evrim B.; Steinberg, Seth M.; Apolo, Andrea B.

    2015-01-01

    Highlights: • It is clear that 2D axial measurements are incomplete assessments in metastatic disease; especially in light of evolving antiangiogenic therapies that can result in tumor necrosis. • Our pilot study demonstrates that taking volumetric density into account can better predict overall survival when compared to RECIST, volumetric size, MASS and Choi. • Although volumetric segmentation and further density analysis may not yet be feasible within routine workflows, the authors believe that technology advances may soon make this possible. - Abstract: Objectives: To evaluate the ability of new computed tomography (CT) response criteria for solid tumors such as urothelial cancer (VTV; viable tumor volume) to predict overall survival (OS) in patients with metastatic bladder cancer treated with cabozantinib. Materials and methods: We compared the relative capabilities of VTV, RECIST, MASS (morphology, attenuation, size, and structure), and Choi criteria, as well as volume measurements, to predict OS using serial follow-up contrast-enhanced CT exams in patients with metastatic urothelial carcinoma. Kaplan–Meier curves and 2-tailed log-rank tests compared OS based on early RECIST 1.1 response against each of the other criteria. A Cox proportional hazards model assessed response at follow-up exams as a time-varying covariate for OS. Results: We assessed 141 lesions in 55CT scans from 17 patients with urothelial metastasis, comparing VTV, RECIST, MASS, and Choi criteria, and volumetric measurements, for response assessment. Median follow-up was 4.5 months, range was 2–14 months. Only the VTV criteria demonstrated a statistical association with OS (p = 0.019; median OS 9.7 vs. 3.5 months). Conclusion: This pilot study suggests that VTV is a promising tool for assessing tumor response and predicting OS, using criteria that incorporate tumor volume and density in patients receiving antiangiogenic therapy for urothelial cancer. Larger studies are warranted to

  17. The Anatomical Biological Value on Pretreatment (18)F-fluorodeoxyglucose Positron Emission Tomography Computed Tomography Predicts Response and Survival in Locally Advanced Head and Neck Cancer.

    Science.gov (United States)

    Ashamalla, Hani; Mattes, Malcolm; Guirguis, Adel; Zaidi, Arifa; Mokhtar, Bahaa; Tejwani, Ajay

    2014-05-01

    (18)F-fluorodeoxyglucose positron emission tomography/computed tomography (PET/CT) has become increasingly relevant in the staging of head and neck cancers, but its prognostic value is controversial. The objective of this study was to evaluate different PET/CT parameters for their ability to predict response to therapy and survival in patients treated for head and neck cancer. A total of 28 consecutive patients with a variety of newly diagnosed head and neck cancers underwent PET/CT scanning at our institution before initiating definitive radiation therapy. All underwent a posttreatment PET/CT to gauge tumor response. Pretreatment PET/CT parameters calculated include the standardized uptake value (SUV) and the anatomical biological value (ABV), which is the product of SUV and greatest tumor diameter. Maximum and mean values were studied for both SUV and ABV, and correlated with response rate and survival. The mean pretreatment tumor ABVmax decreased from 35.5 to 7.9 (P = 0.0001). Of the parameters tested, only pretreatment ABVmax was significantly different among those patients with a complete response (CR) and incomplete response (22.8 vs. 65, respectively, P = 0.021). This difference was maximized at a cut-off ABVmax of 30 and those patients with ABVmax < 30 were significantly more likely to have a CR compared to those with ABVmax of ≥ 30 (93.8% vs. 50%, respectively, P = 0.023). The 5-year overall survival was 80% compared to 36%, respectively, (P = 0.028). Multivariate analysis confirmed that ABVmax was an independent prognostic factor. Our data supports the use of PET/CT, and specifically ABVmax, as a prognostic factor in head and neck cancer. Patients who have an ABVmax ≥ 30 were more likely to have a poor outcome with chemoradiation alone, and a more aggressive trimodality approach may be indicated in these patients.
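
    Because the ABV parameter above is just the product of SUVmax and the greatest tumour diameter, its use as a decision rule fits in a few lines. The cut-off of 30 is the one reported in the abstract; the example numbers below are invented.

        def abv(suv_max, greatest_diameter_cm):
            """Anatomical biological value: SUVmax times greatest tumour diameter."""
            return suv_max * greatest_diameter_cm

        def risk_group(abv_max, cutoff=30.0):
            """Dichotomise by the ABVmax cut-off reported in the study."""
            return "favourable (<30)" if abv_max < cutoff else "unfavourable (>=30)"

        print(risk_group(abv(suv_max=8.2, greatest_diameter_cm=2.5)))   # 20.5 -> favourable
        print(risk_group(abv(suv_max=14.0, greatest_diameter_cm=3.1)))  # 43.4 -> unfavourable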

  18. Computational models can predict response to HIV therapy without a genotype and may reduce treatment failure in different resource-limited settings.

    Science.gov (United States)

    Revell, A D; Wang, D; Wood, R; Morrow, C; Tempelman, H; Hamers, R L; Alvarez-Uria, G; Streinu-Cercel, A; Ene, L; Wensing, A M J; DeWolf, F; Nelson, M; Montaner, J S; Lane, H C; Larder, B A

    2013-06-01

    Genotypic HIV drug-resistance testing is typically 60%-65% predictive of response to combination antiretroviral therapy (ART) and is valuable for guiding treatment changes. Genotyping is unavailable in many resource-limited settings (RLSs). We aimed to develop models that can predict response to ART without a genotype and evaluated their potential as a treatment support tool in RLSs. Random forest models were trained to predict the probability of response to ART (≤400 copies HIV RNA/mL) using the following data from 14 891 treatment change episodes (TCEs) after virological failure, from well-resourced countries: viral load and CD4 count prior to treatment change, treatment history, drugs in the new regimen, time to follow-up and follow-up viral load. Models were assessed by cross-validation during development, with an independent set of 800 cases from well-resourced countries, plus 231 cases from Southern Africa, 206 from India and 375 from Romania. The area under the receiver operating characteristic curve (AUC) was the main outcome measure. The models achieved an AUC of 0.74-0.81 during cross-validation and 0.76-0.77 with the 800 test TCEs. They achieved AUCs of 0.58-0.65 (Southern Africa), 0.63 (India) and 0.70 (Romania). Models were more accurate for data from the well-resourced countries than for cases from Southern Africa and India (P < 0.001), but not Romania. The models identified alternative, available drug regimens predicted to result in virological response for 94% of virological failures in Southern Africa, 99% of those in India and 93% of those in Romania. We developed computational models that predict virological response to ART without a genotype with comparable accuracy to genotyping with rule-based interpretation. These models have the potential to help optimize antiretroviral therapy for patients in RLSs where genotyping is not generally available.
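
    A minimal scikit-learn analogue of this modelling approach, a random forest classifier predicting virological response and evaluated by AUC, is sketched below on synthetic data. The feature names, the data-generating rule and the sample size are invented; the real models were trained on thousands of treatment change episodes with a much richer feature set.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.metrics import roc_auc_score
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(42)
        n = 2000
        # Invented stand-ins for baseline viral load (log10), CD4 count and drug count.
        X = np.column_stack([rng.normal(4.5, 1.0, n),
                             rng.normal(250, 120, n),
                             rng.integers(2, 5, n)])
        logit = -1.2 * (X[:, 0] - 4.5) + 0.004 * (X[:, 1] - 250) + 0.5 * (X[:, 2] - 3)
        y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)   # 1 = response

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
        model = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_tr, y_tr)
        print("AUC:", round(roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]), 3))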

  19. Computational Studies of the Active and Inactive Regulatory Domains of Response Regulator PhoP Using Molecular Dynamics Simulations.

    Science.gov (United States)

    Qing, Xiao-Yu; Steenackers, Hans; Venken, Tom; De Maeyer, Marc; Voet, Arnout

    2017-11-01

    The response regulator PhoP is part of the PhoP/PhoQ two-component system, which is responsible for regulating the expression of multiple genes involved in controlling virulence, biofilm formation, and resistance to antimicrobial peptides. Therefore, modulating the transcriptional function of the PhoP protein is a promising strategy for developing new antimicrobial agents. There is evidence suggesting that phosphorylation-mediated dimerization in the regulatory domain of PhoP is essential for its transcriptional function. Disruption or stabilization of protein-protein interactions at the dimerization interface may inhibit or enhance the expression of PhoP-dependent genes. In this study, we performed molecular dynamics simulations on the active and inactive dimers and monomers of the PhoP regulatory domains, followed by pocket-detecting screenings and a quantitative hot-spot analysis in order to assess the druggability of the protein. Consistent with prior hypothesis, the calculation of the binding free energy shows that phosphorylation enhances dimerization of PhoP. Furthermore, we have identified two different putative binding sites at the dimerization active site (the α4-β5-α5 face) with energetic "hot-spot" areas, which could be used to search for modulators of protein-protein interactions. This study delivers insight into the dynamics and druggability of the dimerization interface of the PhoP regulatory domain, and may serve as a basis for the rational identification of new antimicrobial drugs. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  20. Cone-beam computed tomography for lung cancer - validation with CT and monitoring tumour response during chemo-radiation therapy.

    Science.gov (United States)

    Michienzi, Alissa; Kron, Tomas; Callahan, Jason; Plumridge, Nikki; Ball, David; Everitt, Sarah

    2017-04-01

    Cone-beam computed tomography (CBCT) is a valuable image-guidance tool in radiation therapy (RT). This study was initiated to assess the accuracy of CBCT for quantifying non-small cell lung cancer (NSCLC) tumour volumes compared to the anatomical 'gold standard', CT. Tumour regression or progression on CBCT was also analysed. Patients with Stage I-III NSCLC, prescribed 60 Gy in 30 fractions RT with concurrent platinum-based chemotherapy, routine CBCT and enrolled in a prospective study of serial PET/CT (baseline, weeks two and four) were eligible. Time-matched CBCT and CT gross tumour volumes (GTVs) were manually delineated by a single observer on MIM software, and were analysed descriptively and using Pearson's correlation coefficient (r) and linear regression (R²). Of 94 CT/CBCT pairs, 30 patients were eligible for inclusion. The mean (± SD) CT GTV vs CBCT GTV on the four time-matched pairs were 95 (±182) vs 98.8 (±160.3), 73.6 (±132.4) vs 70.7 (±96.6), 54.7 (±92.9) vs 61.0 (±98.8) and 61.3 (±53.3) vs 62.1 (±47.9) respectively. Pearson's correlation coefficient (r) was 0.98 (95% CI 0.97-0.99, p < 0.001). The mean (±SD) CT/CBCT Dice's similarity coefficient was 0.66 (±0.16). Of 289 CBCT scans, tumours in 27 (90%) patients regressed by a mean (±SD) rate of 1.5% (±0.75) per fraction. The mean (±SD) GTV regression was 43.1% (±23.1) from the first to final CBCT. Primary lung tumour volumes observed on CBCT and time-matched CT are highly correlated (although not identical), thereby validating observations of GTV regression on CBCT in NSCLC. © 2016 The Royal Australian and New Zealand College of Radiologists.
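
    The volume-overlap statistic reported above, Dice's similarity coefficient, is straightforward to compute once both GTVs are available as binary masks on the same grid. The sketch below is a generic implementation on synthetic masks, not the MIM software's.

        import numpy as np

        def dice_coefficient(mask_a, mask_b):
            """Dice similarity: 2|A and B| / (|A| + |B|) for two boolean volumes."""
            mask_a = np.asarray(mask_a, dtype=bool)
            mask_b = np.asarray(mask_b, dtype=bool)
            intersection = np.logical_and(mask_a, mask_b).sum()
            total = mask_a.sum() + mask_b.sum()
            return 2.0 * intersection / total if total else 1.0

        ct_gtv = np.zeros((50, 50, 20), dtype=bool)
        cbct_gtv = np.zeros_like(ct_gtv)
        ct_gtv[10:30, 10:30, 5:15] = True
        cbct_gtv[12:32, 10:30, 5:15] = True        # shifted contour on the CBCT
        print(round(dice_coefficient(ct_gtv, cbct_gtv), 3))   # about 0.9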

  1. Impact of multidetector computed tomography on the diagnosis and treatment of patients with systemic inflammatory response syndrome or sepsis

    Energy Technology Data Exchange (ETDEWEB)

    Schleder, S.; Luerken, L.; Dendl, L.M.; Stroszczynski, C.; Schreyer, A.G. [University Medical Centre Regensburg, Department of Radiology, Regensburg (Germany); Redel, A. [University Medical Centre Regensburg, Department of Anaesthesiology, Regensburg (Germany); Selgrad, M. [University Medical Centre Regensburg, Department of Internal Medicine I, Regensburg (Germany); Renner, P. [University Medical Centre Regensburg, Department of Surgery, Regensburg (Germany)

    2017-11-15

    To evaluate the impact of CT scans on diagnosis or change of therapy in patients with systemic inflammatory response syndrome (SIRS) or sepsis and obscure clinical infection. CT records of patients with obscure clinical infection and SIRS or sepsis were retrospectively evaluated. Both confirmation of and changes in the diagnosis or therapy based on CT findings were analysed by means of the hospital information system and radiological information system. A sub-group analysis included differences with regard to anatomical region, medical history and referring department. Of 525 consecutive patients evaluated, 59% had been referred from internal medicine and 41% from surgery. CT examination had confirmed the suspected diagnosis in 26% and had resulted in a different diagnosis in 33% and a change of therapy in 32%. Abdominal scans yielded a significantly higher (p=0.013) change of therapy rate (42%) than thoracic scans (22%). Therapy was changed significantly more often (p=0.016) in surgical patients (38%) than in patients referred from internal medicine (28%). CT examination for detecting an unknown infection focus in patients with SIRS or sepsis is highly beneficial and should be conducted in patients with obscure clinical infection. (orig.)

  2. CONTEMPT-LT/028: a computer program for predicting containment pressure-temperature response to a loss-of-coolant accident

    International Nuclear Information System (INIS)

    Hargroves, D.W.; Metcalfe, L.J.; Wheat, L.L.; Niederauer, G.F.; Obenchain, C.F.

    1979-03-01

    CONTEMPT-LT is a digital computer program, written in FORTRAN IV, developed to describe the long-term behavior of water-cooled nuclear reactor containment systems subjected to postulated loss-of-coolant accident (LOCA) conditions. The program calculates the time variation of compartment pressures, temperatures, mass and energy inventories, heat structure temperature distributions, and energy exchange with adjacent compartments. The program is capable of describing the effects of leakage on containment response. Models are provided to describe fan cooler and cooling spray engineered safety systems. An annular fan model is also provided to model pressure control in the annular region of dual containment systems. Up to four compartments can be modeled with CONTEMPT-LT, and any compartment except the reactor system may have both a liquid pool region and an air--vapor atmosphere region above the pool. Each region is assumed to have a uniform temperature, but the temperatures of the two regions may be different

  3. Babcock and Wilcox revisions to CONTEMPT, computer program for predicting containment pressure-temperature response to a loss-of-coolant accident

    International Nuclear Information System (INIS)

    Hsii, Y.H.

    1975-01-01

    The CONTEMPT computer program predicts the pressure-temperature response of a single-volume reactor building to a loss-of-coolant accident. The analytical model used for the program is described. CONTEMPT assumes that the loss-of-coolant accident can be separated into two phases; the primary system blowdown and reactor building pressurization. The results of the blowdown analysis serve as the boundary conditions and are input to the CONTEMPT program. Thus, the containment model is only concerned with the pressure and temperature in the reactor building and the temperature distribution through the reactor building structures. The program also calculates building leakage and the effects of engineered safety features such as reactor building sprays, decay heat coolers, sump coolers, etc. 11 references. (U.S.)

  4. NUMEL: a computer aided design suite for the assessment of the steady state, static/dynamic stability and transient responses of nuclear steam generators

    International Nuclear Information System (INIS)

    Rowe, D.; Lightfoot, P.

    1988-02-01

    NUMEL is a computer aided design suite for the assessment of the steady state, static/dynamic stability and transient responses of nuclear steam generators. The equations solved are those of a monotube coflow or counterflow heat exchanger. The advantages of NUMEL are its fast execution speed, robustness, extensive validation and flexibility coupled with ease of use. The code can simultaneously model up to four separate sections (e.g. reheater, HP boiler). This document is a user manual and describes in detail the running of the NUMEL suite. In addition, a discussion is presented of the necessary approximations involved in representing a serpentine or helical AGR boiler as a monotube counterflow heat exchanger. To date, NUMEL has been applied to the modelling of AGR, Fast Reactor and once through Magnox and conventional boilers. Other versions of the code are available for specialist applications, e.g. Magnox and conventional recirculation boilers. (author)

  5. Sensitivity of gas filter correlation instrument to variations in optical balance. [computer program simulated the response of the GFCR to changing pollutant levels

    Science.gov (United States)

    Orr, H. D., III; Campbell, S. A.

    1975-01-01

    A computer program was used to simulate the response of the Gas Filter Correlation Radiometer (GFCR) to changing pollutant levels of CO, SO2, CH4, and NH3 in two model atmospheres. Positive and negative deviations of τ_α of magnitudes 0.01, 0.1, and 1 percent were imposed upon the simulation and the resulting deviations in inferred concentrations were determined. For the CO, CH4, and the higher pressure cell of the NH3 channel, the deviations are less than ±12 percent for deviations in τ_α of ±0.1 percent, but increase to significantly higher values for larger deviations. For the lower pressure cell of NH3 and for SO2, the deviations in inferred concentration begin to rise sharply between 0.01 and 0.1 percent deviation in τ_α, suggesting that a tighter control on τ_α may be required for these channels.

  6. Assessment of health and economic effects by PM2.5 pollution in Beijing: a combined exposure-response and computable general equilibrium analysis.

    Science.gov (United States)

    Wang, Guizhi; Gu, SaiJu; Chen, Jibo; Wu, Xianhua; Yu, Jun

    2016-12-01

    Assessment of the health and economic impacts of PM2.5 pollution is of great importance for urban air pollution prevention and control. In this study, we evaluate the damage of PM2.5 pollution using Beijing as an example. First, we use exposure-response functions to estimate the adverse health effects due to PM2.5 pollution. Then, the corresponding labour loss and excess medical expenditure are computed as two conducting variables. Finally, different from the conventional valuation methods, this paper introduces the two conducting variables into the computable general equilibrium (CGE) model to assess the impacts on sectors and the whole economic system caused by PM2.5 pollution. The results show that substantial health effects from PM2.5 pollution occurred among Beijing residents in 2013, including 20,043 premature deaths and about one million other related medical cases. Correspondingly, using the 2010 social accounting data, Beijing gross domestic product loss due to the health impact of PM2.5 pollution is estimated as 1286.97 (95% CI: 488.58-1936.33) million RMB. This demonstrates that PM2.5 pollution not only has adverse health effects, but also brings huge economic loss.
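
    The first step described, turning a PM2.5 concentration above a reference level into attributable cases through a log-linear exposure-response function, can be sketched as below. The relative-risk coefficient, threshold, baseline incidence and population are placeholder values, not the study's calibrated inputs, and the CGE step is not shown.

        import math

        def attributable_cases(conc, threshold, beta, baseline_rate, population):
            """Cases attributable to PM2.5 above a threshold, log-linear ER function.

            RR = exp(beta * (conc - threshold)); attributable fraction = 1 - 1/RR.
            """
            excess = max(conc - threshold, 0.0)
            rr = math.exp(beta * excess)
            attributable_fraction = 1.0 - 1.0 / rr
            return attributable_fraction * baseline_rate * population

        # Placeholder inputs: annual mean 89 ug/m3 against a 10 ug/m3 reference level.
        print(round(attributable_cases(conc=89.0, threshold=10.0, beta=0.0004,
                                       baseline_rate=0.006, population=2.0e7)))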

  7. Non-linear least squares curve fitting of a simple theoretical model to radioimmunoassay dose-response data using a mini-computer

    International Nuclear Information System (INIS)

    Wilkins, T.A.; Chadney, D.C.; Bryant, J.; Palmstroem, S.H.; Winder, R.L.

    1977-01-01

    Using the simple univalent antigen univalent-antibody equilibrium model the dose-response curve of a radioimmunoassay (RIA) may be expressed as a function of Y, X and the four physical parameters of the idealised system. A compact but powerful mini-computer program has been written in BASIC for rapid iterative non-linear least squares curve fitting and dose interpolation with this function. In its simplest form the program can be operated in an 8K byte mini-computer. The program has been extensively tested with data from 10 different assay systems (RIA and CPBA) for measurement of drugs and hormones ranging in molecular size from thyroxine to insulin. For each assay system the results have been analysed in terms of (a) curve fitting biases and (b) direct comparison with manual fitting. In all cases the quality of fitting was remarkably good in spite of the fact that the chemistry of each system departed significantly from one or more of the assumptions implicit in the model used. A mathematical analysis of departures from the model's principal assumption has provided an explanation for this somewhat unexpected observation. The essential features of this analysis are presented in this paper together with the statistical analyses of the performance of the program. From these and the results obtained to date in the routine quality control of these 10 assays, it is concluded that the method of curve fitting and dose interpolation presented in this paper is likely to be of general applicability. (orig.)
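
    A present-day analogue of that fitting step, iterative non-linear least squares of a saturable binding curve to counts versus dose followed by inversion of the fitted curve for dose interpolation, can be written with SciPy as below. The binding-plus-background function and the synthetic standards are illustrative only and do not reproduce the original BASIC program or its four-parameter physical model.

        import numpy as np
        from scipy.optimize import curve_fit

        def bound_counts(dose, b_max, k_d, nsb):
            """Simple saturable tracer-binding model plus non-specific binding."""
            return b_max * k_d / (k_d + dose) + nsb

        dose = np.array([0.0, 0.5, 1.0, 2.0, 5.0, 10.0, 20.0, 50.0])          # ng/mL standards
        counts = np.array([9800, 8600, 7600, 6100, 4100, 2900, 2100, 1500])   # cpm

        params, _ = curve_fit(bound_counts, dose, counts, p0=(9000.0, 2.0, 1000.0))
        b_max, k_d, nsb = params

        def interpolate_dose(sample_counts):
            """Invert the fitted curve to read a dose from measured counts."""
            return k_d * (b_max / (sample_counts - nsb) - 1.0)

        print(np.round(params, 2), round(float(interpolate_dose(5000.0)), 2))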

  8. Integrin-Targeted Hybrid Fluorescence Molecular Tomography/X-ray Computed Tomography for Imaging Tumor Progression and Early Response in Non-Small Cell Lung Cancer

    Directory of Open Access Journals (Sweden)

    Xiaopeng Ma

    2017-01-01

    Integrins play an important role in tumor progression, invasion and metastasis. Therefore we aimed to evaluate a preclinical imaging approach applying αvβ3 integrin targeted hybrid Fluorescence Molecular Tomography/X-ray Computed Tomography (FMT-XCT) for monitoring tumor progression as well as early therapy response in a syngeneic murine Non-Small Cell Lung Cancer (NSCLC) model. Lewis Lung Carcinomas were grown orthotopically in C57BL/6 J mice and imaged in-vivo using an αvβ3 targeted near-infrared fluorescence (NIRF) probe. αvβ3-targeted FMT-XCT was able to track tumor progression. Cilengitide was able to substantially block the binding of the NIRF probe and suppress the imaging signal. Additionally, mice were treated with an established chemotherapy regimen of Cisplatin and Bevacizumab or with a novel MEK inhibitor (Refametinib) for 2 weeks. While μCT revealed only a moderate slowdown of tumor growth, the αvβ3-dependent signal decreased significantly compared to non-treated mice already at one week post treatment. αvβ3-targeted imaging might therefore become a promising tool for assessment of early therapy response in the future.

  9. Morphologic and Metabolic Comparison of Treatment Responsiveness with 18Fludeoxyglucose-Positron Emission Tomography/Computed Tomography According to Lung Cancer Type

    Directory of Open Access Journals (Sweden)

    Mehmet Fatih Börksüz

    2016-06-01

    Full Text Available Objective: The aim of the present study was to evaluate the response to treatment by histopathologic type in patients with lung cancer under follow-up with 18F-fluoro-2-deoxy-glucose positron emission tomography/computed tomography (18F-FDG PET/CT) imaging, using the Response Evaluation Criteria in Solid Tumors (RECIST) and European Organisation for Research and Treatment of Cancer (EORTC) criteria, which evaluate morphologic and metabolic parameters, respectively. Methods: On two separate (pre- and post-treatment) 18F-FDG PET/CT images, the longest dimension of the primary tumor as well as of secondary lesions was measured, and the sum of these two measurements was recorded as the total dimension in 40 patients. PET parameters such as maximum standardized uptake value (SUVmax), metabolic volume and total lesion glycolysis (TLG) were also recorded for these target lesions on the two separate 18F-FDG PET/CT images. The percent (%) change was calculated for all these parameters. Morphologic evaluation was based on RECIST 1.1 and metabolic evaluation was based on EORTC. Results: When evaluated before and after treatment, in spite of the statistically significant change (p0.05. In histopathologic typing, when comparing the post-treatment change with the treatment responses by the RECIST 1.1 and EORTC criteria: for RECIST 1.1, in the squamous cell lung cancer group, progression was observed in sixteen patients (57%), stability in seven patients (25%) and partial response in five patients (18%); for EORTC, progression was detected in four patients (14%), stability in thirteen patients (47%) and partial response in eleven patients (39%), with an increase in stage in 12 of these patients (43%), a decrease in stage in 4 (14%) and stability in stage in 12 (43%). In adenocancer patients (n=7), for RECIST 1.1, progression was determined in four patients (57%), stability in two patients (29%) and partial response in one patient (14%); for EORTC, progression in one patient (14

  10. Positron emission tomography response criteria in solid tumours criteria for quantitative analysis of [18F]-fluorodeoxyglucose positron emission tomography with integrated computed tomography for treatment response assessment in metastasised solid tumours: All that glitters is not gold.

    Science.gov (United States)

    Willemsen, Annelieke E C A B; Vlenterie, Myrella; van Herpen, Carla M L; van Erp, Nielka P; van der Graaf, Winette T A; de Geus-Oei, Lioe-Fee; Oyen, Wim J G

    2016-03-01

    For solid tumours, quantitative analysis of [(18)F]-fluorodeoxyglucose positron emission tomography with integrated computed tomography can potentially have significant value in early response assessment and thereby in the discrimination between responders and non-responders at an early stage of treatment. Standardised strategies for this analysis have been proposed, and the positron emission tomography response criteria in solid tumours (PERCIST) can be regarded as the current standard for performing quantitative analysis in a research setting, yet they are not implemented in daily practice. However, several exceptions and caveats limit the feasibility of the PERCIST criteria. In this article, we point out dilemmas that arise when applying proposed criteria such as PERCIST to an expansive set of patients with metastasised solid tumours. Clinicians and scientists should be aware of these limitations to prevent methodological issues from impeding the successful introduction of research data into clinical practice. Therefore, to deliver on the high potential of quantitative imaging, consensus should be reached on a standardised, feasible and clinically useful analysis methodology. This methodology should be applicable to the majority of patients, tumour types and treatments. Copyright © 2015 Elsevier Ltd. All rights reserved.

  11. Computational Pipeline for NIRS-EEG Joint Imaging of tDCS-Evoked Cerebral Responses-An Application in Ischemic Stroke.

    Science.gov (United States)

    Guhathakurta, Debarpan; Dutta, Anirban

    2016-01-01

    Transcranial direct current stimulation (tDCS) modulates cortical neural activity and hemodynamics. Electrophysiological methods (electroencephalography-EEG) measure neural activity while optical methods (near-infrared spectroscopy-NIRS) measure hemodynamics coupled through neurovascular coupling (NVC). Assessment of NVC requires development of NIRS-EEG joint-imaging sensor montages that are sensitive to the tDCS affected brain areas. In this methods paper, we present a software pipeline incorporating freely available software tools that can be used to target vascular territories with tDCS and develop a NIRS-EEG probe for joint imaging of tDCS-evoked responses. We apply this software pipeline to target primarily the outer convexity of the brain territory (superficial divisions) of the middle cerebral artery (MCA). We then present a computational method based on Empirical Mode Decomposition of NIRS and EEG time series into a set of intrinsic mode functions (IMFs), and then perform a cross-correlation analysis on those IMFs from NIRS and EEG signals to model NVC at the lesional and contralesional hemispheres of an ischemic stroke patient. For the contralesional hemisphere, a strong positive correlation between IMFs of regional cerebral hemoglobin oxygen saturation and the log-transformed mean-power time-series of IMFs for EEG with a lag of about -15 s was found after a cumulative 550 s stimulation of anodal tDCS. It is postulated that system identification, for example using a continuous-time autoregressive model, of this coupling relation under tDCS perturbation may provide spatiotemporal discriminatory features for the identification of ischemia. Furthermore, portable NIRS-EEG joint imaging can be incorporated into brain computer interfaces to monitor tDCS-facilitated neurointervention as well as cortical reorganization.
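
    A minimal sketch of the analysis idea described above (decompose NIRS and EEG time series into intrinsic mode functions, then cross-correlate matching IMFs over a range of lags) is given below. It uses the third-party PyEMD package and synthetic signals; the package choice, sampling rate and lag window are assumptions made for illustration only.

```python
import numpy as np
from PyEMD import EMD  # third-party 'EMD-signal' package; an assumption, not named in the paper

fs = 4.0                                   # sampling rate in Hz (illustrative)
t = np.arange(0, 600, 1.0 / fs)            # 10 minutes of synthetic data
nirs = np.sin(2 * np.pi * 0.02 * t) + 0.3 * np.random.randn(t.size)
eeg_power = np.sin(2 * np.pi * 0.02 * (t + 15)) + 0.3 * np.random.randn(t.size)

emd = EMD()
imfs_nirs = emd(nirs)        # rows are intrinsic mode functions
imfs_eeg = emd(eeg_power)

def lagged_xcorr(a, b, fs, max_lag_s=60):
    """Normalized cross-correlation of two IMFs over +/- max_lag_s seconds."""
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    max_lag = int(max_lag_s * fs)
    lags = np.arange(-max_lag, max_lag + 1)
    r = [np.corrcoef(a[max(0, -l):a.size - max(0, l)],
                     b[max(0, l):b.size - max(0, -l)])[0, 1] for l in lags]
    best = int(np.argmax(np.abs(r)))
    return lags[best] / fs, r[best]   # lag (s) and correlation at that lag

print(lagged_xcorr(imfs_nirs[0], imfs_eeg[0], fs))
```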

  12. Lymphocyte density determined by computational pathology validated as a predictor of response to neoadjuvant chemotherapy in breast cancer: secondary analysis of the ARTemis trial.

    Science.gov (United States)

    Ali, H R; Dariush, A; Thomas, J; Provenzano, E; Dunn, J; Hiller, L; Vallier, A-L; Abraham, J; Piper, T; Bartlett, J M S; Cameron, D A; Hayward, L; Brenton, J D; Pharoah, P D P; Irwin, M J; Walton, N A; Earl, H M; Caldas, C

    2017-08-01

    We have previously shown that lymphocyte density, measured using computational pathology, is associated with pathological complete response (pCR) in breast cancer. The clinical validity of this finding in independent studies, among patients receiving different chemotherapy, is unknown. The ARTemis trial randomly assigned 800 women with early stage breast cancer between May 2009 and January 2013 to three cycles of docetaxel, followed by three cycles of fluorouracil, epirubicin and cyclophosphamide once every 21 days, with or without four cycles of bevacizumab. The primary endpoint was pCR (absence of invasive cancer in the breast and lymph nodes). We quantified lymphocyte density within haematoxylin and eosin (H&E) whole slide images using our previously described computational pathology approach: for every detected lymphocyte the average distance to the nearest 50 lymphocytes was calculated and the density derived from this statistic. We analyzed both pre-treatment biopsies and post-treatment surgical samples of the tumour bed. Of the 781 patients originally included in the primary endpoint analysis of the trial, 609 (78%) were included for baseline lymphocyte density analyses and a subset of 383 (49% of 781) for analyses of change in lymphocyte density. The main reason for loss of patients was the limited availability of digitized whole slide images. Pre-treatment lymphocyte density modelled as a continuous variable was associated with pCR on univariate analysis (odds ratio [OR], 2.92; 95% CI, 1.78-4.85; P < 0.001) and after adjustment for clinical covariates (OR, 2.13; 95% CI, 1.24-3.67; P = 0.006). Increased pre- to post-treatment lymphocyte density showed an independent inverse association with pCR (adjusted OR, 0.1; 95% CI, 0.033-0.31; P < 0.001). Lymphocyte density in pre-treatment biopsies was validated as an independent predictor of pCR in breast cancer. Computational pathology is emerging as a viable and objective means of identifying predictive biomarkers
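
    The lymphocyte-density statistic described above (the average distance from each detected lymphocyte to its 50 nearest neighbours) can be sketched with a k-d tree as below; the abstract only paraphrases how the final score is derived from that distance, so the inversion into a density value here is an assumption.

```python
import numpy as np
from scipy.spatial import cKDTree

def lymphocyte_density(coords_um, k=50):
    """Per-cell average distance (micrometres) to the k nearest lymphocytes,
    following the statistic described in the abstract; converting that distance
    into a 'density' score below is an illustrative assumption."""
    coords = np.asarray(coords_um, dtype=float)
    tree = cKDTree(coords)
    # k+1 because the nearest neighbour of each point is the point itself (distance 0).
    dists, _ = tree.query(coords, k=k + 1)
    mean_dist = dists[:, 1:].mean(axis=1)
    return 1.0 / mean_dist          # larger values = denser lymphocyte infiltrate

# Hypothetical detected lymphocyte centroids from one whole-slide image tile.
rng = np.random.default_rng(0)
centroids = rng.uniform(0, 2000, size=(500, 2))
print(lymphocyte_density(centroids).mean())
```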

  13. An Idle-State Detection Algorithm for SSVEP-Based Brain-Computer Interfaces Using a Maximum Evoked Response Spatial Filter.

    Science.gov (United States)

    Zhang, Dan; Huang, Bisheng; Wu, Wei; Li, Siliang

    2015-11-01

    Although accurate recognition of the idle state is essential for the application of brain-computer interfaces (BCIs) in real-world situations, it remains a challenging task due to the variability of the idle state. In this study, a novel algorithm was proposed for idle state detection in a steady-state visual evoked potential (SSVEP)-based BCI. The proposed algorithm aims to solve the idle state detection problem by constructing a better model of the control states. For feature extraction, a maximum evoked response (MER) spatial filter was developed to extract neurophysiologically plausible SSVEP responses, by finding the combination of multi-channel electroencephalogram (EEG) signals that maximized the evoked responses while suppressing the unrelated background EEG. The extracted SSVEP responses at the frequencies of both the attended and the unattended stimuli were then used to form feature vectors, and a series of binary classifiers for recognition of each control state and the idle state were constructed. EEG data from nine subjects in a three-target SSVEP BCI experiment with a variety of idle state conditions were used to evaluate the proposed algorithm. The proposed algorithm outperformed the most popular canonical correlation analysis-based algorithm and the conventional power spectrum-based algorithm, achieving an offline control state classification accuracy of 88.0 ± 11.1% and idle state false positive rates (FPRs) ranging from 7.4 ± 5.6% to 14.2 ± 10.1%, depending on the specific idle state conditions. Moreover, the online simulation reported BCI performance close to practical use: 22.0 ± 2.9 out of the 24 control commands were correctly recognized, and FPRs as low as approximately 0.5 events/min in the idle state conditions with eyes open and 0.05 events/min in the idle state condition with eyes closed were achieved. These results demonstrate the potential of the proposed algorithm for implementing practical SSVEP BCI systems.
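
    One common way to realise a "maximise the evoked response while suppressing background EEG" spatial filter is as a Rayleigh-quotient (generalised eigenvalue) problem. The sketch below follows that generic formulation and is not necessarily the exact MER derivation used in the paper; the data shapes and the 12 Hz target are illustrative.

```python
import numpy as np
from scipy.linalg import eigh

def mer_like_filter(trials):
    """Spatial filter w maximizing the evoked-to-background power ratio.

    trials : array (n_trials, n_channels, n_samples) of epoched EEG.
    Formulated here as a Rayleigh-quotient / generalized eigenvalue problem,
    which is an assumption about how an MER-style filter can be obtained.
    """
    evoked = trials.mean(axis=0)                         # phase-locked SSVEP response
    s_evoked = evoked @ evoked.T / evoked.shape[1]       # evoked covariance
    resid = trials - evoked                              # non-phase-locked background EEG
    s_noise = np.mean([r @ r.T / r.shape[1] for r in resid], axis=0)
    eigvals, eigvecs = eigh(s_evoked, s_noise)           # generalized eigendecomposition
    w = eigvecs[:, -1]                                   # filter with the largest ratio
    return w / np.linalg.norm(w)

# Hypothetical data: 30 trials, 9 channels, 2 s at 250 Hz, 12 Hz SSVEP on one channel.
rng = np.random.default_rng(1)
x = rng.standard_normal((30, 9, 500))
x[:, 2, :] += np.sin(2 * np.pi * 12 * np.arange(500) / 250)
print(mer_like_filter(x))
```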

  14. Man and computer

    International Nuclear Information System (INIS)

    Fischbach, K.F.

    1981-01-01

    The discussion of cultural and sociological consequences of computer evolution is hindered by human prejudice. For example, the sentence 'a computer is at best as intelligent as its programmer' veils actual developments. Theoretical limits of computer intelligence are the limits of intelligence in general. Modern computer systems replace not only human labour, but also human decision making and thereby human responsibility. The historical situation is unique. Human head-work is being automated and man is losing function. (orig.) [de

  15. Babcock and Wilcox revisions to CONTEMPT, computer program for predicting containment pressure-temperature response to a loss-of-coolant accident

    International Nuclear Information System (INIS)

    Hsii, Y.H.

    1976-06-01

    The CONTEMPT computer program predicts the pressure-temperature response of a single-volume reactor building to a loss-of-coolant accident. The report describes the analytical model used for the program. CONTEMPT assumes that the loss-of-coolant accident can be separated into two phases: the primary system blowdown and the reactor building pressurization. The results of the blowdown analysis serve as the boundary conditions and are input to the CONTEMPT program. Thus, the containment model is only concerned with the pressure and temperature in the reactor building and the temperature distribution through the reactor building structures. The user is required to input the description of the discharge of coolant, the boiling of residual water by reactor decay heat, the superheating of steam passing through the core, and metal-water reactions. The reactor building is separated into liquid and vapor regions. Each region is internally in thermal equilibrium, but the two regions may not be in equilibrium with each other; the liquid and gaseous regions may have different temperatures. The reactor building is represented as consisting of several heat-conducting structures whose thermal behavior can be described by the one-dimensional multi-region heat conduction equation. The program also calculates building leakage and the effects of engineered safety features such as reactor building sprays, decay heat coolers, sump coolers, etc.
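
    The wall structures in such containment models are described by the one-dimensional heat conduction equation; a minimal explicit finite-difference sketch of that sub-problem is given below, with illustrative concrete-like properties and fixed boundary temperatures rather than CONTEMPT's actual numerics or boundary conditions.

```python
import numpy as np

def wall_temperature_history(t_inside, t_outside, thickness=0.3, nodes=21,
                             alpha=7.0e-7, t_end=3600.0):
    """Explicit 1-D heat conduction through a containment wall slab.

    alpha is a concrete-like thermal diffusivity (m^2/s); boundary temperatures
    are held fixed. This is a generic sketch, not the CONTEMPT solution scheme.
    """
    dx = thickness / (nodes - 1)
    dt = 0.4 * dx * dx / alpha                # stable explicit time step
    temp = np.full(nodes, t_outside, dtype=float)
    temp[0] = t_inside
    for _ in range(int(t_end / dt)):
        temp[1:-1] += alpha * dt / dx**2 * (temp[2:] - 2 * temp[1:-1] + temp[:-2])
    return temp

# Reactor-building atmosphere suddenly at 130 C, outside ambient at 20 C (illustrative).
print(wall_temperature_history(130.0, 20.0))
```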

  16. Primary pulmonary lymphoma-role of fluoro-deoxy-glucose positron emission tomography-computed tomography in the initial staging and evaluating response to treatment - case reports and review of literature

    International Nuclear Information System (INIS)

    Agarwal, Krishan Kant; Dhanapathi, Halanaik; Nazar, Aftab Hasan; Kumar, Rakesh

    2016-01-01

    Primary pulmonary lymphoma (PPL) is an uncommon entity of non-Hodgkin lymphoma, which accounts for <1% of all cases of lymphoma. We present two rare cases of PPL of the diffuse large B-cell type in which 18F fluoro-deoxy-glucose positron emission tomography-computed tomography was performed for initial staging and for response evaluation after chemotherapy.

  17. Shallow landslide stability computation using a distributed transient response model for susceptibility assessment and validation. A case study from Ribeira Quente valley (S. Miguel island, Azores)

    Science.gov (United States)

    Amaral, P.; Marques, R.; Zêzere, J. L.; Marques, F.; Queiroz, G.

    2009-04-01

    based system that will publish the FS values to a WebGIS platform, based on near real time ground-based rainfall monitoring. This application will allow the evaluation of scenarios considering the variation of the pressure head response, related to transient rainfall regime. The resultant computational platform combined with regional empirical rainfall triggered landslides threshold (Marques et al. 2008) can be incorporated in a common server with the Regional Civil Protection for emergency planning purposes. This work is part of the project VOLCSOILRISK (Volcanic Soils Geotechnical Characterization for Landslide Risk Mitigation), supported by Direcção Regional da Ciência e Tecnologia do Governo Regional dos Açores. References: IVERSON, R.M. (2000) - Landslide triggering by rain infiltration. Water Resources Research 36, 1897-1910. MARQUES, R., ZÊZERE, J.L., TRIGO, R., GASPAR, J.L., TRIGO, I. (2008) - Rainfall patterns and critical values associated with landslides in Povoação County (São Miguel Island, Azores): relationships with the North Atlantic Oscillation. Hydrol. Process. 22, 478-494. DOI: 10.1002/hyp.6879.

  18. Your Brain on the Movies: A Computational Approach for Predicting Box-office Performance from Viewer’s Brain Responses to Movie Trailers

    Science.gov (United States)

    Christoforou, Christoforos; Papadopoulos, Timothy C.; Constantinidou, Fofi; Theodorou, Maria

    2017-01-01

    The ability to anticipate the population-wide response of a target audience to a new movie or TV series, before its release, is critical to the film industry. Equally important is the ability to understand the underlying factors that drive or characterize viewer’s decision to watch a movie. Traditional approaches (which involve pilot test-screenings, questionnaires, and focus groups) have reached a plateau in their ability to predict the population-wide responses to new movies. In this study, we develop a novel computational approach for extracting neurophysiological electroencephalography (EEG) and eye-gaze based metrics to predict the population-wide behavior of movie goers. We further, explore the connection of the derived metrics to the underlying cognitive processes that might drive moviegoers’ decision to watch a movie. Towards that, we recorded neural activity—through the use of EEG—and eye-gaze activity from a group of naive individuals while watching movie trailers of pre-selected movies for which the population-wide preference is captured by the movie’s market performance (i.e., box-office ticket sales in the US). Our findings show that the neural based metrics, derived using the proposed methodology, carry predictive information about the broader audience decisions to watch a movie, above and beyond traditional methods. In particular, neural metrics are shown to predict up to 72% of the variance of the films’ performance at their premiere and up to 67% of the variance at following weekends; which corresponds to a 23-fold increase in prediction accuracy compared to current neurophysiological or traditional methods. We discuss our findings in the context of existing literature and hypothesize on the possible connection of the derived neurophysiological metrics to cognitive states of focused attention, the encoding of long-term memory, and the synchronization of different components of the brain’s rewards network. Beyond the practical

  19. Your Brain on the Movies: A Computational Approach for Predicting Box-office Performance from Viewer's Brain Responses to Movie Trailers.

    Science.gov (United States)

    Christoforou, Christoforos; Papadopoulos, Timothy C; Constantinidou, Fofi; Theodorou, Maria

    2017-01-01

    The ability to anticipate the population-wide response of a target audience to a new movie or TV series, before its release, is critical to the film industry. Equally important is the ability to understand the underlying factors that drive or characterize viewer's decision to watch a movie. Traditional approaches (which involve pilot test-screenings, questionnaires, and focus groups) have reached a plateau in their ability to predict the population-wide responses to new movies. In this study, we develop a novel computational approach for extracting neurophysiological electroencephalography (EEG) and eye-gaze based metrics to predict the population-wide behavior of movie goers. We further, explore the connection of the derived metrics to the underlying cognitive processes that might drive moviegoers' decision to watch a movie. Towards that, we recorded neural activity-through the use of EEG-and eye-gaze activity from a group of naive individuals while watching movie trailers of pre-selected movies for which the population-wide preference is captured by the movie's market performance (i.e., box-office ticket sales in the US). Our findings show that the neural based metrics, derived using the proposed methodology, carry predictive information about the broader audience decisions to watch a movie, above and beyond traditional methods. In particular, neural metrics are shown to predict up to 72% of the variance of the films' performance at their premiere and up to 67% of the variance at following weekends; which corresponds to a 23-fold increase in prediction accuracy compared to current neurophysiological or traditional methods. We discuss our findings in the context of existing literature and hypothesize on the possible connection of the derived neurophysiological metrics to cognitive states of focused attention, the encoding of long-term memory, and the synchronization of different components of the brain's rewards network. Beyond the practical implication in

  20. Your Brain on the Movies: A Computational Approach for Predicting Box-office Performance from Viewer’s Brain Responses to Movie Trailers

    Directory of Open Access Journals (Sweden)

    Christoforos Christoforou

    2017-12-01

    Full Text Available The ability to anticipate the population-wide response of a target audience to a new movie or TV series, before its release, is critical to the film industry. Equally important is the ability to understand the underlying factors that drive or characterize viewer’s decision to watch a movie. Traditional approaches (which involve pilot test-screenings, questionnaires, and focus groups) have reached a plateau in their ability to predict the population-wide responses to new movies. In this study, we develop a novel computational approach for extracting neurophysiological electroencephalography (EEG) and eye-gaze based metrics to predict the population-wide behavior of movie goers. We further explore the connection of the derived metrics to the underlying cognitive processes that might drive moviegoers’ decision to watch a movie. Towards that, we recorded neural activity—through the use of EEG—and eye-gaze activity from a group of naive individuals while watching movie trailers of pre-selected movies for which the population-wide preference is captured by the movie’s market performance (i.e., box-office ticket sales in the US). Our findings show that the neural based metrics, derived using the proposed methodology, carry predictive information about the broader audience decisions to watch a movie, above and beyond traditional methods. In particular, neural metrics are shown to predict up to 72% of the variance of the films’ performance at their premiere and up to 67% of the variance at following weekends, which corresponds to a 23-fold increase in prediction accuracy compared to current neurophysiological or traditional methods. We discuss our findings in the context of existing literature and hypothesize on the possible connection of the derived neurophysiological metrics to cognitive states of focused attention, the encoding of long-term memory, and the synchronization of different components of the brain’s rewards network. Beyond the

  1. A New Computational Model for Neuro-Glio-Vascular Coupling: Astrocyte Activation Can Explain Cerebral Blood Flow Nonlinear Response to Interictal Events.

    Directory of Open Access Journals (Sweden)

    Solenna Blanchard

    Full Text Available Developing a clear understanding of the relationship between cerebral blood flow (CBF) response and neuronal activity is of significant importance because CBF increase is essential to the health of neurons, for instance through oxygen supply. This relationship can be investigated by analyzing multimodal (fMRI, PET, laser Doppler…) recordings. However, the large number of intermediate (non-observable) variables involved in the underlying neurovascular coupling makes the discovery of mechanisms from multimodal data alone all the more difficult. We present a new computational model developed at the population scale (voxel) with physiologically relevant but simple equations to facilitate the interpretation of regional multimodal recordings. This model links neuronal activity to regional CBF dynamics through neuro-glio-vascular coupling. This coupling involves a population of glial cells called astrocytes via their role in neurotransmitter (glutamate and GABA) recycling and their impact on neighboring vessels. In epilepsy, neuronal networks generate epileptiform discharges, leading to variations in astrocytic and CBF dynamics. In this study, we took advantage of these large variations in neuronal activity magnitude to test the capacity of our model to reproduce experimental data. We compared simulations from our model with isolated epileptiform events, which were obtained in vivo by simultaneous local field potential and laser Doppler recordings in rats after local bicuculline injection. We showed a predominant neuronal contribution for low level discharges and a significant astrocytic contribution for higher level discharges. Moreover, the neuronal contribution to CBF was linear while the astrocytic contribution was nonlinear. Results thus indicate that the relationship between neuronal activity and CBF magnitudes can be nonlinear for isolated events and that this nonlinearity is due to astrocytic activity, highlighting the importance of astrocytes in

  2. An Agent-Based Model of a Hepatic Inflammatory Response to Salmonella: A Computational Study under a Large Set of Experimental Data.

    Science.gov (United States)

    Shi, Zhenzhen; Chapes, Stephen K; Ben-Arieh, David; Wu, Chih-Hang

    2016-01-01

    We present an agent-based model (ABM) to simulate a hepatic inflammatory response (HIR) in a mouse infected by Salmonella, a response that sometimes progressed to problematic proportions, known as "sepsis". Based on over 200 published studies, this ABM describes interactions among 21 cells or cytokines and incorporates 226 experimental data sets and/or data estimates from those reports to simulate a mouse HIR in silico. Our simulated results reproduced dynamic patterns of HIR reported in the literature. As shown in vivo, our model also demonstrated that sepsis was highly related to the initial Salmonella dose and the presence of components of the adaptive immune system. We determined that high mobility group box-1, C-reactive protein, the interleukin-10:tumor necrosis factor-α ratio, and the CD4+ T cell:CD8+ T cell ratio, all recognized as biomarkers during HIR, significantly correlated with outcomes of HIR. During therapy-directed in silico simulations, our results demonstrated that anti-agent intervention impacted the survival rates of septic individuals in a time-dependent manner. By specifying the infected species, source of infection, and site of infection, this ABM enabled us to reproduce the kinetics of several essential indicators during a HIR, observe distinct dynamic patterns that are manifested during HIR, and allowed us to test proposed therapy-directed treatments. Although limitations still exist, this ABM is a step forward because it links underlying biological processes to computational simulation and was validated through a series of comparisons between the simulated results and experimental studies.

  3. Evaluation of tumor response to intra-arterial chemoembolization of hepatocellular carcinoma: Comparison of contrast-enhanced ultrasound with multiphase computed tomography.

    Science.gov (United States)

    Paul, S B; Dhamija, E; Gamanagatti, S R; Sreenivas, V; Yadav, D P; Jain, S; Shalimar; Acharya, S K

    2017-03-01

    To compare the diagnostic accuracy of contrast-enhanced ultrasound (CEUS) with that of multiphase computed tomography (CT) in the evaluation of tumor response to transarterial chemoembolization (TACE) of hepatocellular carcinoma (HCC). Fifty patients (41 men, 9 women; mean age, 53 years ± 12.5 [SD]) with a total of 70 HCCs (mean size, 5 cm ± 3 [SD]) were evaluated. Post-TACE therapeutic assessment of HCC was done at 4 weeks. Patients who had undergone TACE earlier and presented with suspicion of recurrence were also included. Patients with hepatic masses seen on ultrasound were enrolled and underwent CEUS, multiphase CT and magnetic resonance imaging (MRI). A hyperenhancing area at the tumor site on the arterial phase of CEUS/multiphase CT/MRI was termed residual disease (RD), the patterns of which were described on CEUS. Diagnostic accuracies of CEUS and multiphase CT were compared with that of MRI, which was used as the reference standard. CEUS detected RD in 43/70 HCCs (61%). RD had a heterogeneous pattern in 22/43 HCCs (51%). Sensitivities of CEUS and multiphase CT were 94% (34/36; 95% CI: 81-99%) and 50% (18/36; 95% CI: 33-67%), respectively. A significant difference in sensitivity was found between CEUS and multiphase CT (P=0.0001). CEUS and multiphase CT had 100% specificity (95% CI: 83-100%). CEUS is a useful technique for detecting RD in HCC after TACE. For long-term surveillance, CEUS should be complemented with multiphase CT/MRI for a comprehensive evaluation. Copyright © 2016 Éditions françaises de radiologie. Published by Elsevier Masson SAS. All rights reserved.
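
    As a small worked check of the reported figures, the sensitivities can be recomputed from the stated counts (34/36 for CEUS, 18/36 for multiphase CT); Wilson score intervals are used below as one common choice of 95% CI, since the abstract does not state the exact interval method.

```python
import math

def wilson_ci(successes, n, z=1.96):
    """Wilson score 95% interval for a proportion (one common CI choice;
    the paper's exact method is not stated in the abstract)."""
    p = successes / n
    denom = 1 + z * z / n
    centre = (p + z * z / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return p, centre - half, centre + half

# Counts reported in the abstract: CEUS detected 34/36, multiphase CT 18/36.
print("CEUS sensitivity:", wilson_ci(34, 36))
print("CT sensitivity:  ", wilson_ci(18, 36))
```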

  4. The Effect of Device When Using Smartphones and Computers to Answer Multiple-Choice and Open-Response Questions in Distance Education

    Science.gov (United States)

    Wilson, Thomas Royce

    2017-01-01

    Traditionally in higher education, online courses have been designed for computer users. However, the advent of mobile learning (m-learning) and the proliferation of smartphones have created two challenges for online students and instructional designers. First, instruction designed for a larger computer screen often loses its effectiveness when…

  5. Optical Computing

    OpenAIRE

    Woods, Damien; Naughton, Thomas J.

    2008-01-01

    We consider optical computers that encode data using images and compute by transforming such images. We give an overview of a number of such optical computing architectures, including descriptions of the type of hardware commonly used in optical computing, as well as some of the computational efficiencies of optical devices. We go on to discuss optical computing from the point of view of computational complexity theory, with the aim of putting some old, and some very recent, re...

  6. Response Across the Health-Literacy Spectrum of Kidney Transplant Recipients to a Sun-Protection Education Program Delivered on Tablet Computers: Randomized Controlled Trial.

    Science.gov (United States)

    Robinson, June K; Friedewald, John J; Desai, Amishi; Gordon, Elisa J

    2015-08-18

    Sun protection can reduce skin cancer development in kidney transplant recipients, who have a greater risk of developing squamous cell carcinoma than the general population. A culturally sensitive sun-protection program (SunProtect) was created in English and Spanish with the option of choosing audio narration provided by the tablet computer (Samsung Galaxy Tab 2 10.1). The intervention, which showed skin cancer on patients with various skin tones, explained the following scenarios: skin cancer risk, the ability of sun protection to reduce this risk, as well as offered sun-protection choices. The length of the intervention was limited to the time usually spent waiting during a visit to the nephrologist. The development of this culturally sensitive, electronic, interactive sun-protection educational program, SunProtect, was guided by the "transtheoretical model," which focuses on decision making influenced by perceptions of personal risk or vulnerability to a health threat, importance (severity) of the disease, and benefit of sun-protection behavior. Transportation theory, which holds that narratives can have uniquely persuasive effects in overcoming preconceived beliefs and cognitive biases because people transported into a narrative world will alter their beliefs based on information, claims, or events depicted, guided the use of testimonials. Participant tablet use was self-directed. Self-reported responses to surveys were entered into the database through the tablet. Usability was tested through interviews. A randomized controlled pilot trial with 170 kidney transplant recipients was conducted, where the educational program (SunProtect) was delivered through a touch-screen tablet to 84 participants. The study involved 62 non-Hispanic white, 60 non-Hispanic black, and 48 Hispanic/Latino kidney transplant recipients. The demographic survey data showed no significant mean differences between the intervention and control groups in age, sex, income, or time since

  7. SCINFUL: A Monte Carlo based computer program to determine a scintillator full energy response to neutron detection for E/sub n/ between 0. 1 and 80 MeV: Program development and comparisons of program predictions with experimental data

    Energy Technology Data Exchange (ETDEWEB)

    Dickens, J.K.

    1988-04-01

    This document provides a discussion of the development of the FORTRAN Monte Carlo program SCINFUL (for scintillator full response), a program designed to provide a calculated full response anticipated for either an NE-213 (liquid) scintillator or an NE-110 (solid) scintillator. The program may also be used to compute angle-integrated spectra of charged particles (p, d, t, 3He, and α) following neutron interactions with 12C. Extensive comparisons with a variety of experimental data are given. There is generally overall good agreement (<10% differences) of results from SCINFUL calculations with measured detector responses; i.e., calculations of N(Er) vs Er, where Er is the response pulse height, reproduce measured detector responses with an accuracy which, at least partly, depends upon how well the experimental configuration is known. For En < 16 MeV and for Er > 15% of the maximum pulse height response, calculated spectra are within ±5% of experiment on the average. For En up to 50 MeV similar good agreement with experiment is obtained for Er > 30% of maximum response. For En up to 75 MeV the calculated shape of the response agrees with measurements, but the calculations underpredict the measured response by up to 30%. 65 refs., 64 figs., 3 tabs.

  8. Routine Self-administered, Touch-Screen Computer Based Suicidal Ideation Assessment Linked to Automated Response Team Notification in an HIV Primary Care Setting

    Science.gov (United States)

    Lawrence, Sarah T.; Willig, James H.; Crane, Heidi M.; Ye, Jiatao; Aban, Inmaculada; Lober, William; Nevin, Christa R.; Batey, D. Scott; Mugavero, Michael J.; McCullumsmith, Cheryl; Wright, Charles; Kitahata, Mari; Raper, James L.; Saag, Micheal S.; Schumacher, Joseph E.

    2010-01-01

    Summary The implementation of routine computer-based screening for suicidal ideation and other psychosocial domains through standardized patient reported outcome instruments in two high volume urban HIV clinics is described. Factors associated with an increased risk of self-reported suicidal ideation were determined. Background HIV/AIDS continues to be associated with an under-recognized risk for suicidal ideation, attempted as well as completed suicide. Suicidal ideation represents an important predictor for subsequent attempted and completed suicide. We sought to implement routine screening of suicidal ideation and associated conditions using computerized patient reported outcome (PRO) assessments. Methods Two geographically distinct academic HIV primary care clinics enrolled patients attending scheduled visits from 12/2005 to 2/2009. Touch-screen-based, computerized PRO assessments were implemented into routine clinical care. Substance abuse (ASSIST), alcohol consumption (AUDIT-C), depression (PHQ-9) and anxiety (PHQ-A) were assessed. The PHQ-9 assesses the frequency of suicidal ideation in the preceding two weeks. A response of “nearly every day” triggered an automated page to pre-determined clinic personnel who completed more detailed self-harm assessments. Results Overall 1,216 (UAB= 740; UW= 476) patients completed initial PRO assessment during the study period. Patients were white (53%; n=646), predominantly males (79%; n=959) with a mean age of 44 (± 10). Among surveyed patients, 170 (14%) endorsed some level of suicidal ideation, while 33 (3%) admitted suicidal ideation nearly every day. In multivariable analysis, suicidal ideation risk was lower with advancing age (OR=0.74 per 10 years;95%CI=0.58-0.96) and was increased with current substance abuse (OR=1.88;95%CI=1.03-3.44) and more severe depression (OR=3.91 moderate;95%CI=2.12-7.22; OR=25.55 severe;95%CI=12.73-51.30). Discussion Suicidal ideation was associated with current substance abuse and

  9. Dual-energy computed tomography to assess tumor response to hepatic radiofrequency ablation: potential diagnostic value of virtual noncontrast images and iodine maps.

    Science.gov (United States)

    Lee, Su Hyun; Lee, Jeong Min; Kim, Kyung Won; Klotz, Ernst; Kim, Se Hyung; Lee, Jae Young; Han, Joon Koo; Choi, Byung Ihn

    2011-02-01

    to determine the value of dual-energy (DE) scanning with virtual noncontrast (VNC) images and iodine maps in the evaluation of therapeutic response to radiofrequency ablation (RFA) for hepatic tumors. a total of 75 patients with hepatic tumors and who underwent DE computed tomography (CT) after RFA, were enrolled in this study. Our DE CT protocol included precontrast, arterial, and portal phase scans. VNC images and iodine maps were created from 80 to 140 kVp images during the arterial and portal phases. VNC images were then compared with true, noncontrast (TNC) images, and iodine maps were compared with linearly blended images, both qualitatively and quantitatively. For the former comparison, image quality and acceptability of the VNC images as a replacement for TNC images were both rated. The CT numbers of the hepatic parenchyma, ablation zone, and image noise were measured. For the latter comparison, lesion conspicuity of the ablation zone and the additional benefit of integrating the iodine map into the routine protocol, were assessed. Contrast-to-noise ratios (CNR) of the ablation zone-to-liver and aorta-to-liver as well as the CT number differences between the center and the periphery of the ablation zone were calculated. The image quality of the VNC images was rated as good (mean grading score, 1.88) and the level of acceptance was 90% (68/75). The mean CT numbers of the hepatic parenchyma and ablation zone did not differ significantly between the TNC and the VNC images (P > 0.05). The lesion conspicuity of the ablation zone was rated as excellent or good in 97% of the iodine map (73/75), and the additional benefits of the iodine maps were positively rated as better to the same (mean 1.5). The CNR of the aorta-to-liver parenchyma was significantly higher on the iodine map (P = 0.002), and the CT number differences between the center and the periphery of the ablation zone were significantly lower on the iodine map (P VNC images can be an alternative to TNC
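
    The contrast-to-noise ratio used to compare the iodine maps and blended images reduces to a simple expression; a sketch with illustrative Hounsfield-unit values (not measurements from the study) is shown below.

```python
def contrast_to_noise_ratio(roi_mean_hu, background_mean_hu, noise_sd_hu):
    """Contrast-to-noise ratio between a region of interest and background,
    e.g. ablation zone vs. liver parenchyma on an iodine map or blended image."""
    return abs(roi_mean_hu - background_mean_hu) / noise_sd_hu

# Illustrative HU values only, not measurements from the study.
print(contrast_to_noise_ratio(roi_mean_hu=35.0, background_mean_hu=95.0, noise_sd_hu=12.0))
```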

  10. Computer group

    International Nuclear Information System (INIS)

    Bauer, H.; Black, I.; Heusler, A.; Hoeptner, G.; Krafft, F.; Lang, R.; Moellenkamp, R.; Mueller, W.; Mueller, W.F.; Schati, C.; Schmidt, A.; Schwind, D.; Weber, G.

    1983-01-01

    The computer group has been reorganized to take charge of the general purpose computers DEC10 and VAX and the computer network (Dataswitch, DECnet, IBM - connections to GSI and IPP, preparation for Datex-P). (orig.)

  11. Computer Engineers.

    Science.gov (United States)

    Moncarz, Roger

    2000-01-01

    Looks at computer engineers and describes their job, employment outlook, earnings, and training and qualifications. Provides a list of resources related to computer engineering careers and the computer industry. (JOW)

  12. Computer Music

    Science.gov (United States)

    Cook, Perry R.

    This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and the study of perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.).

  13. Analog computing

    CERN Document Server

    Ulmann, Bernd

    2013-01-01

    This book is a comprehensive introduction to analog computing. As most textbooks about this powerful computing paradigm date back to the 1960s and 1970s, it fills a void and forges a bridge from the early days of analog computing to future applications. The idea of analog computing is not new. In fact, this computing paradigm is nearly forgotten, although it offers a path to both high-speed and low-power computing, which are in even more demand now than they were back in the heyday of electronic analog computers.

  14. Computational composites

    DEFF Research Database (Denmark)

    Vallgårda, Anna K. A.; Redström, Johan

    2007-01-01

    Computational composite is introduced as a new type of composite material. Arguing that this is not just a metaphorical maneuver, we provide an analysis of computational technology as material in design, which shows how computers share important characteristics with other materials used in design...... and architecture. We argue that the notion of computational composites provides a precise understanding of the computer as material, and of how computations need to be combined with other materials to come to expression as material. Besides working as an analysis of computers from a designer’s point of view......, the notion of computational composites may also provide a link for computer science and human-computer interaction to an increasingly rapid development and use of new materials in design and architecture....

  15. Quantum Computing

    OpenAIRE

    Scarani, Valerio

    1998-01-01

    The aim of this thesis was to explain what quantum computing is. The information for the thesis was gathered from books, scientific publications, and news articles. The analysis of the information revealed that quantum computing can be broken down to three areas: theories behind quantum computing explaining the structure of a quantum computer, known quantum algorithms, and the actual physical realizations of a quantum computer. The thesis reveals that moving from classical memor...

  16. Computational models can predict response to HIV therapy without a genotype and may reduce treatment failure in different resource-limited settings

    NARCIS (Netherlands)

    Revell, A. D.; Wang, D.; Wood, R.; Morrow, C.; Tempelman, H.; Hamers, R. L.; Alvarez-Uria, G.; Streinu-Cercel, A.; Ene, L.; Wensing, A. M. J.; DeWolf, F.; Nelson, M.; Montaner, J. S.; Lane, H. C.; Larder, B. A.

    2013-01-01

    Genotypic HIV drug-resistance testing is typically 60-65% predictive of response to combination antiretroviral therapy (ART) and is valuable for guiding treatment changes. Genotyping is unavailable in many resource-limited settings (RLSs). We aimed to develop models that can predict response to ART

  17. Computational response study of personal and albedo neutron dosemeters composed of solid state track detectors based on (n,α) reaction

    International Nuclear Information System (INIS)

    Palfalvi, J.

    1984-03-01

    The combined effect of incident and albedo neutrons on the response of several fission and (n,α) track detectors was investigated by calculations for monoenergetic neutrons and for neutrons from different energetic sources. The response functions are presented in tables and plots. (author)

  18. Computational Model of the Fathead Minnow Hypothalamic-Pituitary-Gonadal Axis: Incorporating Protein Synthesis in Improving Predictability of Responses to Endocrine Active Chemicals

    Science.gov (United States)

    There is international concern about chemicals that alter endocrine system function in humans and/or wildlife and subsequently cause adverse effects. We previously developed a mechanistic computational model of the hypothalamic-pituitary-gonadal (HPG) axis in female fathead minno...

  19. A computer simulation of the transient response of a 4 cylinder Stirling engine with burner and air preheater in a vehicle

    Science.gov (United States)

    Martini, W. R.

    1981-01-01

    A series of computer programs is presented, with full documentation, which simulate the transient behavior of a modern 4 cylinder Siemens arrangement Stirling engine with burner and air preheater. Cold start, cranking, idling, acceleration through 3 gear changes and steady speed operation are simulated. Sample results and complete operating instructions are given. A full source code listing of all programs is included.

  20. Computed Tomography (CT) Perfusion as an Early Predictive Marker for Treatment Response to Neoadjuvant Chemotherapy in Gastroesophageal Junction Cancer and Gastric Cancer - A Prospective Study

    DEFF Research Database (Denmark)

    Lundsgaard Hansen, Martin; Fallentin, Eva; Lauridsen, Carsten

    2014-01-01

    OBJECTIVES: To evaluate whether early reductions in CT perfusion parameters predict response to pre-operative chemotherapy prior to surgery for gastroesophageal junction (GEJ) and gastric cancer. MATERIALS AND METHODS: Twenty-eight patients with adenocarcinoma of the gastro-esophageal junction (GEJ......-operative chemotherapy in GEJ and gastric cancer. As a single diagnostic test, CT Perfusion only has moderate sensitivity and specificity in response assessment of pre-operative chemotherapy making it insufficient for clinical decision purposes....

  1. Comparison of RECIST version 1.0 and 1.1 in assessment of tumor response by computed tomography in advanced gastric cancer.

    Science.gov (United States)

    Jang, Gil-Su; Kim, Min-Jeong; Ha, Hong-Il; Kim, Jung Han; Kim, Hyeong Su; Ju, Sung Bae; Zang, Dae Young

    2013-12-01

    Response Evaluation Criteria in Solid Tumors (RECIST) guideline version 1.0 (RECIST 1.0) was proposed as a new guideline for evaluating tumor response and has been widely accepted as a standardized measure. With a number of issues being raised on RECIST 1.0, however, a revised RECIST guideline version 1.1 (RECIST 1.1) was proposed by the RECIST Working Group in 2009. This study was conducted to compare CT tumor response based on RECIST 1.1 vs. RECIST 1.0 in patients with advanced gastric cancer (AGC). We reviewed 61 AGC patients with measurable diseases by RECIST 1.0 who were enrolled in other clinical trials between 2008 and 2010. These patients were retrospectively re-analyzed to determine the concordance between the two response criteria using the κ statistic. The number and sum of tumor diameters of the target lesions by RECIST 1.1 were significantly lower than those by RECIST 1.0 (P<0.0001). However, there was excellent agreement in tumor response between RECIST 1.1 and RECIST 1.0 (κ=0.844). The overall response rates (ORRs) according to RECIST 1.0 and RECIST 1.1 were 32.7% (20/61) and 34.5% (20/58), respectively. One patient with partial response (PR) based on RECIST 1.0 was reclassified as stable disease (SD) by RECIST 1.1. Of two patients with SD by RECIST 1.0, one was downgraded to progressive disease and the other was upgraded to PR by RECIST 1.1. RECIST 1.1 provided almost perfect agreement with RECIST 1.0 in the CT assessment of tumor response of AGC.
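
    Agreement between the two criteria is summarised with the κ statistic; a minimal sketch of computing Cohen's kappa for paired per-patient response categories is shown below, using hypothetical labels rather than the trial data (the study itself reports κ = 0.844).

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical per-patient best responses under the two criteria (CR/PR/SD/PD).
recist_1_0 = ["PR", "PR", "SD", "SD", "PD", "PR", "SD", "PD", "PR", "SD"]
recist_1_1 = ["PR", "PR", "SD", "PD", "PD", "PR", "PR", "PD", "PR", "SD"]

kappa = cohen_kappa_score(recist_1_0, recist_1_1)
print(f"Cohen's kappa = {kappa:.3f}")   # the study reports kappa = 0.844 over 58 patients
```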

  2. Computational Medicine

    DEFF Research Database (Denmark)

    Nygaard, Jens Vinge

    2017-01-01

    The Health Technology Program at Aarhus University applies computational biology to investigate the heterogeneity of tumours......The Health Technology Program at Aarhus University applies computational biology to investigate the heterogeneity of tumours...

  3. Grid Computing

    Indian Academy of Sciences (India)

    A computing grid interconnects resources such as high performancecomputers, scientific databases, and computercontrolledscientific instruments of cooperating organizationseach of which is autonomous. It precedes and is quitedifferent from cloud computing, which provides computingresources by vendors to customers ...

  4. Quantum computers and quantum computations

    International Nuclear Information System (INIS)

    Valiev, Kamil' A

    2005-01-01

    This review outlines the principles of operation of quantum computers and their elements. The theory of ideal computers that do not interact with the environment and are immune to quantum decohering processes is presented. Decohering processes in quantum computers are investigated. The review considers methods for correcting quantum computing errors arising from the decoherence of the state of the quantum computer, as well as possible methods for the suppression of the decohering processes. A brief enumeration of proposed quantum computer realizations concludes the review. (reviews of topical problems)

  5. Quantum Computing for Computer Architects

    CERN Document Server

    Metodi, Tzvetan

    2011-01-01

    Quantum computers can (in theory) solve certain problems far faster than a classical computer running any known classical algorithm. While existing technologies for building quantum computers are in their infancy, it is not too early to consider their scalability and reliability in the context of the design of large-scale quantum computers. To architect such systems, one must understand what it takes to design and model a balanced, fault-tolerant quantum computer architecture. The goal of this lecture is to provide architectural abstractions for the design of a quantum computer and to explore

  6. Pervasive Computing

    NARCIS (Netherlands)

    Silvis-Cividjian, N.

    This book provides a concise introduction to Pervasive Computing, otherwise known as Internet of Things (IoT) and Ubiquitous Computing (Ubicomp) which addresses the seamless integration of computing systems within everyday objects. By introducing the core topics and exploring assistive pervasive

  7. Computational vision

    CERN Document Server

    Wechsler, Harry

    1990-01-01

    The book is suitable for advanced courses in computer vision and image processing. In addition to providing an overall view of computational vision, it contains extensive material on topics that are not usually covered in computer vision texts (including parallel distributed processing and neural networks) and considers many real applications.

  8. Spatial Computation

    Science.gov (United States)

    2003-12-01

    Computation and today’s microprocessors with the approach to operating system architecture, and the controversy between microkernels and monolithic kernels... Both Spatial Computation and microkernels break away a relatively monolithic architecture into individual lightweight pieces, well specialized... for their particular functionality. Spatial Computation removes global signals and control, in the same way microkernels remove the global address

  9. Parallel computations

    CERN Document Server

    1982-01-01

    Parallel Computations focuses on parallel computation, with emphasis on algorithms used in a variety of numerical and physical applications and for many different types of parallel computers. Topics covered range from vectorization of fast Fourier transforms (FFTs) and of the incomplete Cholesky conjugate gradient (ICCG) algorithm on the Cray-1 to calculation of table lookups and piecewise functions. Single tridiagonal linear systems and vectorized computation of reactive flow are also discussed.Comprised of 13 chapters, this volume begins by classifying parallel computers and describing techn

  10. Human Computation

    CERN Multimedia

    CERN. Geneva

    2008-01-01

    What if people could play computer games and accomplish work without even realizing it? What if billions of people collaborated to solve important problems for humanity or generate training data for computers? My work aims at a general paradigm for doing exactly that: utilizing human processing power to solve computational problems in a distributed manner. In particular, I focus on harnessing human time and energy for addressing problems that computers cannot yet solve. Although computers have advanced dramatically in many respects over the last 50 years, they still do not possess the basic conceptual intelligence or perceptual capabilities...

  11. Quantum computation

    International Nuclear Information System (INIS)

    Deutsch, D.

    1992-01-01

    As computers become ever more complex, they inevitably become smaller. This leads to a need for components which are fabricated and operate on increasingly smaller size scales. Quantum theory is already taken into account in microelectronics design. This article explores how quantum theory will need to be incorporated into computers in the future in order to give their components functionality. Computation tasks which depend on quantum effects will become possible. Physicists may have to reconsider their perspective on computation in the light of understanding developed in connection with universal quantum computers. (UK)

  12. Computer science handbook

    CERN Document Server

    Tucker, Allen B

    2004-01-01

    Due to the great response to the famous Computer Science Handbook edited by Allen B. Tucker, … in 2004 Chapman & Hall/CRC published a second edition of this comprehensive reference book. Within more than 70 chapters, every one new or significantly revised, one can find any kind of information and references about computer science one can imagine. … All in all, there is absolutely nothing about computer science that cannot be found in the encyclopedia with its 110 survey articles …-Christoph Meinel, Zentralblatt MATH

  13. Computer software.

    Science.gov (United States)

    Rosenthal, L E

    1986-10-01

    Software is the component in a computer system that permits the hardware to perform the various functions that a computer system is capable of doing. The history of software and its development can be traced to the early nineteenth century. All computer systems are designed to utilize the "stored program concept" as first developed by Charles Babbage in the 1850s. The concept was lost until the mid-1940s, when modern computers made their appearance. Today, because of the complex and myriad tasks that a computer system can perform, there has been a differentiation of types of software. There is software designed to perform specific business applications. There is software that controls the overall operation of a computer system. And there is software that is designed to carry out specialized tasks. Regardless of type, software is the most critical component of any computer system. Without it, all one has is a collection of circuits, transistors, and silicon chips.

  14. Computer sciences

    Science.gov (United States)

    Smith, Paul H.

    1988-01-01

    The Computer Science Program provides advanced concepts, techniques, system architectures, algorithms, and software for both space and aeronautics information sciences and computer systems. The overall goal is to provide the technical foundation within NASA for the advancement of computing technology in aerospace applications. The research program is improving the state of knowledge of fundamental aerospace computing principles and advancing computing technology in space applications such as software engineering and information extraction from data collected by scientific instruments in space. The program includes the development of special algorithms and techniques to exploit the computing power provided by high performance parallel processors and special purpose architectures. Research is being conducted in the fundamentals of data base logic and improvement techniques for producing reliable computing systems.

  15. A computational methodology for a micro launcher engine test bench using a combined linear static and dynamic in frequency response analysis

    Directory of Open Access Journals (Sweden)

    Ion DIMA

    2017-03-01

    Full Text Available This article aims to provide a quick methodology to determine the critical values of the forces, displacements and stresses as functions of frequency, under a combined linear static (101 Solution - Linear Static) and dynamic load in frequency response (108 Solution - Frequency Response, Direct Method), applied to a micro launcher engine test bench, using NASTRAN 400 Solution - Implicit Nonlinear. NASTRAN/PATRAN software is used. In practice, in the PATRAN preprocessor one has to define a linear or nonlinear static load at step 1 and a dynamic frequency-response load (time dependent) at step 2. In Analyze the following options are chosen: for Solution Type, Implicit Nonlinear Solution (SOL 400) is selected; for Subcases, Static Load and Transient Dynamic are chosen; and for Subcase Select, the two cases, static and dynamic, are selected. The NASTRAN solver will overlap the results from the static analysis with the dynamic analysis. The running time is reduced by a factor of three when using the Krylov solver. The NASTRAN SYSTEM(387) = -1 instruction is used in order to activate the Krylov option. Also, in Analysis the OP2 Output Format shall be selected, meaning that in the bdf NASTRAN input file the PARAM POST 1 instruction shall be written. The structural damping can be defined in two different ways: either on the material card or using the PARAM, G, 0.05 instruction (in this example a damping coefficient of 5% was used). The SDAMPING instruction paired with TABDMP1 works only for dynamic frequency response with the modal method, or with the direct method with viscoelastic material, not for dynamic frequency response with the direct method (DFREQ) and linear elastic material. The Direct method – DFREQ used in this example is more accurate. A set of boundary conditions constraining translation was used, defined at the base of the test bench.

  16. Hemodynamic response to exercise and head-up tilt of patients implanted with a rotary blood pump: a computational modeling study.

    Science.gov (United States)

    Lim, Einly; Salamonsen, Robert Francis; Mansouri, Mahdi; Gaddum, Nicholas; Mason, David Glen; Timms, Daniel L; Stevens, Michael Charles; Fraser, John; Akmeliawati, Rini; Lovell, Nigel Hamilton

    2015-02-01

    The present study investigates the response of implantable rotary blood pump (IRBP)-assisted patients to exercise and head-up tilt (HUT), as well as the effect of alterations in the model parameter values on this response, using validated numerical models. Furthermore, we comparatively evaluate the performance of a number of previously proposed physiologically responsive controllers, including constant speed, constant flow pulsatility index (PI), constant average pressure difference between the aorta and the left atrium, constant average differential pump pressure, constant ratio between mean pump flow and pump flow pulsatility (ratioPI, or linear Starling-like control), as well as constant mean left atrial pressure (Pla) control, with regard to their ability to increase cardiac output during exercise while maintaining circulatory stability upon HUT. Although native cardiac output increases automatically during exercise, increasing pump speed was able to further improve total cardiac output and reduce elevated filling pressures. At the same time, reduced venous return associated with upright posture was not shown to induce left ventricular (LV) suction. Although Pla control outperformed other control modes in its ability to increase cardiac output during exercise, it caused a fall in the mean arterial pressure upon HUT, which may cause postural hypotension or patient discomfort. To the contrary, maintaining a constant average pressure difference between the aorta and the left atrium demonstrated superior performance in both exercise and HUT scenarios. Due to their strong dependence on the pump operating point, PI and ratioPI control performed poorly during exercise and HUT. Our simulation results also highlighted the importance of the baroreflex mechanism in determining the response of the IRBP-assisted patients to exercise and postural changes, where a desensitized reflex response attenuated the percentage increase in cardiac output during exercise and
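    The constant left atrial pressure control mode described above can be pictured, under simplifying assumptions, as a feedback update of pump speed toward a pressure target. The sketch below is not one of the study's validated controllers; the gains, speed limits and sampling interval are hypothetical placeholders.

```python
# Minimal sketch of a constant left-atrial-pressure (Pla) control idea:
# raise pump speed when measured Pla exceeds the target (to unload the
# ventricle) and lower it when Pla falls below the target (to guard against
# suction, e.g. on head-up tilt). All numbers are hypothetical placeholders.

def pla_control_step(p_la_measured, p_la_target, speed, integral,
                     kp=50.0, ki=5.0, dt=0.1,
                     speed_min=1800.0, speed_max=4000.0):
    """Return an updated pump speed (rpm) and integral term."""
    error = p_la_measured - p_la_target          # mmHg
    integral += error * dt
    new_speed = speed + kp * error + ki * integral
    new_speed = max(speed_min, min(speed_max, new_speed))
    return new_speed, integral

# Example: one control step during simulated exercise (elevated filling pressure)
speed, integral = pla_control_step(p_la_measured=14.0, p_la_target=10.0,
                                   speed=2400.0, integral=0.0)
print(f"updated pump speed: {speed:.0f} rpm")
```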

  17. Computer programming and computer systems

    CERN Document Server

    Hassitt, Anthony

    1966-01-01

    Computer Programming and Computer Systems imparts a "reading knowledge" of computer systems. This book describes the aspects of machine-language programming, monitor systems, computer hardware, and advanced programming that every thorough programmer should be acquainted with. This text discusses automatic electronic digital computers, symbolic language, Reverse Polish Notation, and the translation of Fortran into assembly language. The routine for reading blocked tapes, dimension statements in subroutines, a general-purpose input routine, and efficient use of memory are also elaborated. This publication is inten

  18. Development of a dual-energy computed tomography quality control program: Characterization of scanner response and definition of relevant parameters for a fast-kVp switching dual-energy computed tomography system.

    Science.gov (United States)

    Nute, Jessica L; Jacobsen, Megan C; Stefan, Wolfgang; Wei, Wei; Cody, Dianna D

    2018-04-01

    A prototype QC phantom system and analysis process were developed to characterize the spectral capabilities of a fast kV-switching dual-energy computed tomography (DECT) scanner. This work addresses the current lack of quantitative oversight for this technology, with the goal of identifying relevant scan parameters and test metrics instrumental to the development of a dual-energy quality control (DEQC) program. A prototype elliptical phantom (effective diameter: 35 cm) was designed with multiple material inserts for DECT imaging. Inserts included tissue-equivalent and material rods (including iodine and calcium at varying concentrations). The phantom was scanned on a fast kV-switching DECT system using 16 dual-energy acquisitions (CTDIvol range: 10.3-62 mGy) with varying pitch, rotation time, and tube current. A circular head phantom (22 cm diameter) was scanned using a similar protocol (12 acquisitions; CTDIvol range: 36.7-132.6 mGy). All acquisitions were reconstructed at 50, 70, 110, and 140 keV and using a water-iodine material basis pair. The images were evaluated for iodine quantification accuracy, stability of monoenergetic reconstruction CT number, noise, and positional constancy. Variance component analysis was used to identify technique parameters that drove deviations in the test metrics. Variances were compared to thresholds derived from manufacturer tolerances to determine which technique parameters had a nominally significant effect on the test metrics. Iodine quantification error was largely unaffected by any of the technique parameters investigated. Monoenergetic HU stability was found to be affected by mAs, with a threshold under which spectral separation was unsuccessful, diminishing the utility of DECT imaging. Noise was found to be affected by CTDIvol in the DEQC body phantom, and by CTDIvol and mA in the DEQC head phantom. Positional constancy was found to be affected by mAs in the DEQC body phantom and by mA in the DEQC head phantom. A streamlined scan protocol
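    The following sketch shows how test metrics of the kind listed above (iodine quantification error, monoenergetic CT-number stability and noise) can be summarised from region-of-interest statistics; the ROI values and nominal concentration are synthetic, and the functions are illustrative rather than the study's analysis pipeline.

```python
# Hypothetical sketch of summarising DEQC-style test metrics from region-of-
# interest (ROI) statistics. Values and thresholds are placeholders, not the
# manufacturer tolerances used in the study.
import numpy as np

def iodine_error_pct(measured_mg_ml, nominal_mg_ml):
    """Percent error of a measured iodine concentration in an insert."""
    return 100.0 * (measured_mg_ml - nominal_mg_ml) / nominal_mg_ml

def roi_metrics(roi_pixels):
    """Mean CT number (HU stability) and noise (standard deviation) of a ROI."""
    roi = np.asarray(roi_pixels, dtype=float)
    return roi.mean(), roi.std(ddof=1)

# Example with synthetic HU samples from a 70 keV monoenergetic reconstruction
roi = np.random.default_rng(1).normal(loc=120.0, scale=9.0, size=500)
mean_hu, noise_hu = roi_metrics(roi)
print(f"iodine error: {iodine_error_pct(9.7, 10.0):+.1f}%")
print(f"mean CT number: {mean_hu:.1f} HU, noise: {noise_hu:.1f} HU")
```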

  19. JAC2D: A two-dimensional finite element computer program for the nonlinear quasi-static response of solids with the conjugate gradient method

    International Nuclear Information System (INIS)

    Biffle, J.H.; Blanford, M.L.

    1994-05-01

    JAC2D is a two-dimensional finite element program designed to solve quasi-static nonlinear mechanics problems. A set of continuum equations describes the nonlinear mechanics involving large rotation and strain. A nonlinear conjugate gradient method is used to solve the equations. The method is implemented in a two-dimensional setting with various methods for accelerating convergence. Sliding interface logic is also implemented. A four-node Lagrangian uniform strain element is used with hourglass stiffness to control the zero-energy modes. This report documents the elastic and isothermal elastic/plastic material model. Other material models, documented elsewhere, are also available. The program is vectorized for efficient performance on Cray computers. Sample problems described are the bending of a thin beam, the rotation of a unit cube, and the pressurization and thermal loading of a hollow sphere
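    For orientation, a generic nonlinear conjugate gradient loop (Polak-Ribière directions with a backtracking line search) is sketched below. It only illustrates the solution strategy named above on a small test problem; it is not the JAC2D implementation, whose uniform strain elements, hourglass control and convergence acceleration are considerably more involved.

```python
# Generic nonlinear conjugate gradient sketch (Polak-Ribiere+ with backtracking
# line search), shown only to illustrate the solution strategy; not JAC2D code.
import numpy as np

def nonlinear_cg(f, grad, x0, max_iter=200, tol=1e-8):
    """Minimise f given its gradient grad, starting from x0."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                  # initial search direction
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        t, fx = 1.0, f(x)                   # backtracking (Armijo) line search
        while f(x + t * d) > fx + 1e-4 * t * g.dot(d) and t > 1e-12:
            t *= 0.5
        x = x + t * d
        g_new = grad(x)
        beta = max(0.0, g_new.dot(g_new - g) / g.dot(g))   # PR+ restart
        d = -g_new + beta * d
        g = g_new
    return x

# Example: quadratic test problem 0.5*x'Ax - b'x, whose minimiser solves A x = b
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_min = nonlinear_cg(lambda x: 0.5 * x @ A @ x - b @ x,
                     lambda x: A @ x - b, x0=np.zeros(2))
print(x_min)   # close to [0.2, 0.4]
```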

  20. JAC3D -- A three-dimensional finite element computer program for the nonlinear quasi-static response of solids with the conjugate gradient method

    International Nuclear Information System (INIS)

    Biffle, J.H.

    1993-02-01

    JAC3D is a three-dimensional finite element program designed to solve quasi-static nonlinear mechanics problems. A set of continuum equations describes the nonlinear mechanics involving large rotation and strain. A nonlinear conjugate gradient method is used to solve the equation. The method is implemented in a three-dimensional setting with various methods for accelerating convergence. Sliding interface logic is also implemented. An eight-node Lagrangian uniform strain element is used with hourglass stiffness to control the zero-energy modes. This report documents the elastic and isothermal elastic-plastic material model. Other material models, documented elsewhere, are also available. The program is vectorized for efficient performance on Cray computers. Sample problems described are the bending of a thin beam, the rotation of a unit cube, and the pressurization and thermal loading of a hollow sphere

  1. The influence of tube voltage and phantom size in computed tomography on the dose-response relationship of dicentrics in human blood samples

    International Nuclear Information System (INIS)

    Jost, G; Pietsch, H; Lengsfeld, P; Voth, M; Schmid, E

    2010-01-01

    The aim of this study was to investigate the dose response relationship of dicentrics in human lymphocytes after CT scans at tube voltages of 80 and 140 kV. Blood samples from a healthy donor placed in tissue equivalent abdomen phantoms of standard, pediatric and adipose sizes were exposed at dose levels up to 0.1 Gy using a 64-slice CT scanner. It was found that both the tube voltage and the phantom size significantly influenced the CT scan-induced linear dose-response relationship of dicentrics in human lymphocytes. Using the same phantom (standard abdomen), 80 kV CT x-rays were biologically more effective than 140 kV CT x-rays. However, it could also be determined that the applied phantom size had much more influence on the biological effectiveness. Obviously, the increasing slopes of the CT scan-induced dose response relationships of dicentrics in human lymphocytes obtained in a pediatric, a standard and an adipose abdomen have been induced by scattering effects of photons, which strongly increase with increasing phantom size.
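    The linear dose-response relationship discussed above amounts to estimating a slope (dicentric yield per unit dose) and a background term. The sketch below shows such a fit on synthetic placeholder data; the numbers are not the study's measurements.

```python
# Sketch of fitting a linear dose-response curve for dicentric yields
# (dicentrics per cell vs. absorbed dose). The data points are synthetic
# placeholders, not values from the study.
import numpy as np

dose_gy = np.array([0.0, 0.025, 0.05, 0.075, 0.1])                 # absorbed dose (Gy)
dic_per_cell = np.array([0.0005, 0.0011, 0.0018, 0.0024, 0.0031])  # dicentric yield

slope, background = np.polyfit(dose_gy, dic_per_cell, 1)
print(f"slope     : {slope:.4f} dicentrics per cell per Gy")
print(f"background: {background:.4f} dicentrics per cell")
```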

  2. The influence of tube voltage and phantom size in computed tomography on the dose-response relationship of dicentrics in human blood samples

    Energy Technology Data Exchange (ETDEWEB)

    Jost, G; Pietsch, H [TRG Diagnostic Imaging, Bayer Schering Pharma AG, Berlin (Germany); Lengsfeld, P; Voth, M [Global Medical Affairs Diagnostic Imaging, Bayer Schering Pharma AG, Berlin (Germany); Schmid, E, E-mail: Ernst.Schmid@lrz.uni-muenchen.d [Institute for Cell Biology, Center for Integrated Protein Science, University of Munich (Germany)

    2010-06-07

    The aim of this study was to investigate the dose response relationship of dicentrics in human lymphocytes after CT scans at tube voltages of 80 and 140 kV. Blood samples from a healthy donor placed in tissue equivalent abdomen phantoms of standard, pediatric and adipose sizes were exposed at dose levels up to 0.1 Gy using a 64-slice CT scanner. It was found that both the tube voltage and the phantom size significantly influenced the CT scan-induced linear dose-response relationship of dicentrics in human lymphocytes. Using the same phantom (standard abdomen), 80 kV CT x-rays were biologically more effective than 140 kV CT x-rays. However, it could also be determined that the applied phantom size had much more influence on the biological effectiveness. Obviously, the increasing slopes of the CT scan-induced dose response relationships of dicentrics in human lymphocytes obtained in a pediatric, a standard and an adipose abdomen have been induced by scattering effects of photons, which strongly increase with increasing phantom size.

  3. Organic Computing

    CERN Document Server

    Würtz, Rolf P

    2008-01-01

    Organic Computing is a research field emerging around the conviction that problems of organization in complex systems in computer science, telecommunications, neurobiology, molecular biology, ethology, and possibly even sociology can be tackled scientifically in a unified way. From the computer science point of view, the apparent ease in which living systems solve computationally difficult problems makes it inevitable to adopt strategies observed in nature for creating information processing machinery. In this book, the major ideas behind Organic Computing are delineated, together with a sparse sample of computational projects undertaken in this new field. Biological metaphors include evolution, neural networks, gene-regulatory networks, networks of brain modules, hormone system, insect swarms, and ant colonies. Applications are as diverse as system design, optimization, artificial growth, task allocation, clustering, routing, face recognition, and sign language understanding.

  4. SCINFUL-QMD: Monte Carlo based computer code to calculate response function and detection efficiency of a liquid organic scintillator for neutron energies up to 3 GeV

    International Nuclear Information System (INIS)

    Satoh, Daiki; Sato, Tatsuhiko; Shigyo, Nobuhiro; Ishibashi, Kenji

    2006-11-01

    The Monte Carlo based computer code SCINFUL-QMD has been developed to evaluate the response function and detection efficiency of a liquid organic scintillator for neutrons from 0.1 MeV to 3 GeV. This code is a modified version of SCINFUL, which was developed at Oak Ridge National Laboratory in 1988 to provide a calculated full response anticipated for neutron interactions in a scintillator. The upper limit of the applicable energy was extended from 80 MeV to 3 GeV by introducing the quantum molecular dynamics model incorporated with the statistical decay model (QMD+SDM) in the high-energy nuclear reaction part. The particles generated in QMD+SDM are the neutron, proton, deuteron, triton, 3He nucleus, alpha particle, and charged pion. Secondary reactions by neutrons, protons, and pions inside the scintillator are also taken into account. With the extension of the applicable energy, the database of total cross sections for hydrogen and carbon nuclei was upgraded. This report describes the physical model, the computational flow, and how to use the code. (author)

  5. Computational biomechanics

    International Nuclear Information System (INIS)

    Ethier, C.R.

    2004-01-01

    Computational biomechanics is a fast-growing field that integrates modern biological techniques and computer modelling to solve problems of medical and biological interest. Modelling of blood flow in the large arteries is the best-known application of computational biomechanics, but there are many others. Described here is work being carried out in the laboratory on the modelling of blood flow in the coronary arteries and on the transport of viral particles in the eye. (author)

  6. Computational Composites

    DEFF Research Database (Denmark)

    Vallgårda, Anna K. A.

    to understand the computer as a material like any other material we would use for design, like wood, aluminum, or plastic. That as soon as the computer forms a composition with other materials it becomes just as approachable and inspiring as other smart materials. I present a series of investigations of what...... Computational Composite, and Telltale). Through the investigations, I show how the computer can be understood as a material and how it partakes in a new strand of materials whose expressions come to be in context. I uncover some of their essential material properties and potential expressions. I develop a way...

  7. Computation of interactive effects and optimization of process parameters for alkaline lipase production by mutant strain of Pseudomonas aeruginosa using response surface methodology

    Directory of Open Access Journals (Sweden)

    Deepali Bisht

    2013-01-01

    Full Text Available Alkaline lipase production by a mutant strain of Pseudomonas aeruginosa MTCC 10,055 was optimized in shake flask batch fermentation using response surface methodology. An empirical model was developed through a Box-Behnken experimental design to describe the relationship among the tested variables (pH, temperature, castor oil, starch and triton-X-100). The second-order quadratic model determined the optimum conditions as castor oil, 1.77 mL.L-1; starch, 15.0 g.L-1; triton-X-100, 0.93 mL.L-1; incubation temperature, 34.12 °C and pH 8.1, resulting in maximum alkaline lipase production (3142.57 U.mL-1). The quadratic model was in satisfactory adjustment with the experimental data as evidenced by a high coefficient of determination (R²) value (0.9987). The RSM facilitated the analysis and interpretation of experimental data to ascertain the optimum conditions of the variables for the process and recognized the contribution of individual variables to assess the response under optimal conditions. Hence the Box-Behnken approach could fruitfully be applied for process optimization.

  8. Computation of interactive effects and optimization of process parameters for alkaline lipase production by mutant strain of Pseudomonas aeruginosa using response surface methodology

    Science.gov (United States)

    Bisht, Deepali; Yadav, Santosh Kumar; Darmwal, Nandan Singh

    2013-01-01

    Alkaline lipase production by a mutant strain of Pseudomonas aeruginosa MTCC 10,055 was optimized in shake flask batch fermentation using response surface methodology. An empirical model was developed through a Box-Behnken experimental design to describe the relationship among the tested variables (pH, temperature, castor oil, starch and triton-X-100). The second-order quadratic model determined the optimum conditions as castor oil, 1.77 mL.L−1; starch, 15.0 g.L−1; triton-X-100, 0.93 mL.L−1; incubation temperature, 34.12 °C and pH 8.1, resulting in maximum alkaline lipase production (3142.57 U.mL−1). The quadratic model was in satisfactory adjustment with the experimental data as evidenced by a high coefficient of determination (R2) value (0.9987). The RSM facilitated the analysis and interpretation of experimental data to ascertain the optimum conditions of the variables for the process and recognized the contribution of individual variables to assess the response under optimal conditions. Hence the Box-Behnken approach could fruitfully be applied for process optimization. PMID:24159311
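    A second-order quadratic response-surface model of the kind fitted above can be estimated by ordinary least squares. The sketch below does this for just two coded factors on synthetic placeholder data; the design points and responses are chosen only for illustration.

```python
# Sketch of fitting a second-order (quadratic) response-surface model for two
# coded factors x1 and x2 by least squares. Design points and responses are
# synthetic placeholders, not the Box-Behnken data of the study.
import numpy as np

X = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1],
              [0, 0], [0, 0], [0, 0],
              [-1, 0], [1, 0], [0, -1], [0, 1]], dtype=float)
y = np.array([2100, 2500, 2300, 2600, 3100, 3050, 3120,
              2700, 2900, 2650, 2750], dtype=float)   # e.g. lipase activity (U/mL)

x1, x2 = X[:, 0], X[:, 1]
# Model: y = b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2
design = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
coef, *_ = np.linalg.lstsq(design, y, rcond=None)

y_hat = design @ coef
r2 = 1.0 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
print("coefficients:", np.round(coef, 1))
print(f"R^2 = {r2:.3f}")
```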

  9. A Multi-scale Computational Platform to Mechanistically Assess the Effect of Genetic Variation on Drug Responses in Human Erythrocyte Metabolism.

    Science.gov (United States)

    Mih, Nathan; Brunk, Elizabeth; Bordbar, Aarash; Palsson, Bernhard O

    2016-07-01

    Progress in systems medicine brings promise to addressing patient heterogeneity and individualized therapies. Recently, genome-scale models of metabolism have been shown to provide insight into the mechanistic link between drug therapies and systems-level off-target effects while being expanded to explicitly include the three-dimensional structure of proteins. The integration of these molecular-level details, such as the physical, structural, and dynamical properties of proteins, notably expands the computational description of biochemical network-level properties and the possibility of understanding and predicting whole cell phenotypes. In this study, we present a multi-scale modeling framework that describes biological processes which range in scale from atomistic details to an entire metabolic network. Using this approach, we can understand how genetic variation, which impacts the structure and reactivity of a protein, influences both native and drug-induced metabolic states. As a proof-of-concept, we study three enzymes (catechol-O-methyltransferase, glucose-6-phosphate dehydrogenase, and glyceraldehyde-3-phosphate dehydrogenase) and their respective genetic variants which have clinically relevant associations. Using all-atom molecular dynamics simulations enables the sampling of long timescale conformational dynamics of the proteins (and their mutant variants) in complex with their respective native metabolites or drug molecules. We find that changes in a protein's structure due to a mutation influence protein binding affinity to metabolites and/or drug molecules, and inflict large-scale changes in metabolism.

  10. A Multi-scale Computational Platform to Mechanistically Assess the Effect of Genetic Variation on Drug Responses in Human Erythrocyte Metabolism.

    Directory of Open Access Journals (Sweden)

    Nathan Mih

    2016-07-01

    Full Text Available Progress in systems medicine brings promise to addressing patient heterogeneity and individualized therapies. Recently, genome-scale models of metabolism have been shown to provide insight into the mechanistic link between drug therapies and systems-level off-target effects while being expanded to explicitly include the three-dimensional structure of proteins. The integration of these molecular-level details, such as the physical, structural, and dynamical properties of proteins, notably expands the computational description of biochemical network-level properties and the possibility of understanding and predicting whole cell phenotypes. In this study, we present a multi-scale modeling framework that describes biological processes which range in scale from atomistic details to an entire metabolic network. Using this approach, we can understand how genetic variation, which impacts the structure and reactivity of a protein, influences both native and drug-induced metabolic states. As a proof-of-concept, we study three enzymes (catechol-O-methyltransferase, glucose-6-phosphate dehydrogenase, and glyceraldehyde-3-phosphate dehydrogenase) and their respective genetic variants which have clinically relevant associations. Using all-atom molecular dynamics simulations enables the sampling of long timescale conformational dynamics of the proteins (and their mutant variants) in complex with their respective native metabolites or drug molecules. We find that changes in a protein's structure due to a mutation influence protein binding affinity to metabolites and/or drug molecules, and inflict large-scale changes in metabolism.

  11. Monte Carlo Study of the Effect of Collimator Thickness on Tc-99m Source Response in Single Photon Emission Computed Tomography

    International Nuclear Information System (INIS)

    Islamian, Jalil Pirayesh; Toossi, Mohammad Taghi Bahreyni; Momennezhad, Mahdi; Zakavi, Seyyed Rasoul; Sadeghi, Ramin; Ljungberg, Michael

    2012-01-01

    In single photon emission computed tomography (SPECT), the collimator is a crucial element of the imaging chain and controls the noise-resolution tradeoff of the collected data. The current study is an evaluation of the effects of different thicknesses of a low-energy high-resolution (LEHR) collimator on tomographic spatial resolution in SPECT. In the present study, the SIMIND Monte Carlo program was used to simulate a SPECT system equipped with an LEHR collimator. A point source of 99mTc and an acrylic cylindrical Jaszczak phantom, with cold spheres and rods, and a human anthropomorphic torso phantom (4D-NCAT phantom) were used. Simulated planar images and reconstructed tomographic images were evaluated both qualitatively and quantitatively. Based on the tabulated calculated detector parameters, the contributions of Compton scattering and photoelectric interactions, and the peak-to-Compton (P/C) area in the energy spectra obtained from scanning the sources with 11 collimator thicknesses (ranging from 2.400 to 2.410 cm), we concluded that 2.405 cm is the proper thickness for the LEHR parallel-hole collimator. The image quality analyses by the structural similarity index (SSIM) algorithm and also by visual inspection showed suitable quality of the images obtained with a collimator thickness of 2.405 cm. The projections and reconstructed images prepared with a 2.405 cm LEHR collimator thickness also showed suitable results in the performance-parameter analysis compared with the other collimator thicknesses.
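    The structural similarity index (SSIM) comparison mentioned above can be reproduced in outline with scikit-image, assuming that package is available; the reference and test images below are synthetic placeholders rather than SIMIND reconstructions.

```python
# Sketch of scoring image similarity with SSIM, assuming scikit-image is
# installed. The reference and test images are synthetic placeholders.
import numpy as np
from skimage.metrics import structural_similarity

rng = np.random.default_rng(0)
reference = rng.normal(100.0, 10.0, size=(128, 128))      # e.g. baseline reconstruction
test = reference + rng.normal(0.0, 5.0, size=(128, 128))  # e.g. other collimator thickness

score = structural_similarity(reference, test,
                              data_range=reference.max() - reference.min())
print(f"SSIM = {score:.3f}")   # values closer to 1.0 indicate more similar images
```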

  12. Intermediality between Games and Fiction: The “Ludology vs. Narratology” Debate in Computer Game Studies: A Response to Gonzalo Frasca

    Directory of Open Access Journals (Sweden)

    Kokonis Michalis

    2014-12-01

    Full Text Available In the last ten or fourteen years there has been a debate among the so-called ludologists and narratologists in Computer Game Studies as to what the best methodological approach is for the academic study of electronic games. The aim of this paper is to propose a way out of the dilemma, suggesting that both ludology and narratology can be helpful methodologically. However, there is need for a wider theoretical perspective, that of semiotics, in which both approaches can be operative. The semiotic perspective proposed allows research in the field to focus on the similarities between games and traditional narrative forms (since they share narrativity to a greater or lesser extent) as well as on their differences (they have different degrees of interaction); it will facilitate communication among theorists if we want to understand each other when talking about games and stories, and it will lead to a better understanding of the hybrid nature of the medium of the game. In this sense the present paper aims to complement Gonzalo Frasca’s reconciliatory attempt made a few years back and expand on his proposal.

  13. GPGPU COMPUTING

    Directory of Open Access Journals (Sweden)

    BOGDAN OANCEA

    2012-05-01

    Full Text Available Since the first idea of using GPUs for general purpose computing, things have evolved over the years and now there are several approaches to GPU programming. GPU computing practically began with the introduction of CUDA (Compute Unified Device Architecture) by NVIDIA and Stream by AMD. These are APIs designed by the GPU vendors to be used together with the hardware that they provide. A new emerging standard, OpenCL (Open Computing Language), tries to unify different GPU general computing API implementations and provides a framework for writing programs executed across heterogeneous platforms consisting of both CPUs and GPUs. OpenCL provides parallel computing using task-based and data-based parallelism. In this paper we will focus on the CUDA parallel computing architecture and programming model introduced by NVIDIA. We will present the benefits of the CUDA programming model. We will also compare the two main approaches, CUDA and AMD APP (STREAM), and the new framework, OpenCL, that tries to unify the GPGPU computing models.
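    To make the data-parallel kernel-launch model concrete, here is a minimal vector-addition example written with Numba's CUDA bindings (kept in Python for consistency with the other sketches in this document). It assumes a CUDA-capable GPU and the numba package; the grid and block sizes are illustrative.

```python
# Minimal data-parallel vector addition in the CUDA model, using Numba's CUDA
# bindings. Requires a CUDA-capable GPU and the numba package; block and grid
# sizes below are illustrative.
import numpy as np
from numba import cuda

@cuda.jit
def vector_add(a, b, out):
    i = cuda.grid(1)          # global thread index
    if i < out.size:          # guard threads past the end of the arrays
        out[i] = a[i] + b[i]

n = 1_000_000
a = np.random.rand(n).astype(np.float32)
b = np.random.rand(n).astype(np.float32)
out = np.zeros_like(a)

threads_per_block = 256
blocks = (n + threads_per_block - 1) // threads_per_block
vector_add[blocks, threads_per_block](a, b, out)   # implicit host<->device copies

print(np.allclose(out, a + b))
```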

  14. Quantum Computing

    Indian Academy of Sciences (India)

    Quantum Computing - Building Blocks of a Quantum Computer. C S Vijay and Vishal Gupta. General Article, Resonance – Journal of Science Education, Volume 5, Issue 9, September 2000, pp. 69-81.

  15. Platform computing

    CERN Multimedia

    2002-01-01

    "Platform Computing releases first grid-enabled workload management solution for IBM eServer Intel and UNIX high performance computing clusters. This Out-of-the-box solution maximizes the performance and capability of applications on IBM HPC clusters" (1/2 page) .

  16. Quantum Computing

    Indian Academy of Sciences (India)

    In the first part of this article, we had looked at how quantum physics can be harnessed to make the building blocks of a quantum computer. In this concluding part, we look at algorithms which can exploit the power of this computational device, and some practical difficulties in building such a device. Quantum Algorithms.

  17. Quantum computing

    OpenAIRE

    Burba, M.; Lapitskaya, T.

    2017-01-01

    This article gives an elementary introduction to quantum computing. It is a draft for a book chapter of the "Handbook of Nature-Inspired and Innovative Computing", Eds. A. Zomaya, G.J. Milburn, J. Dongarra, D. Bader, R. Brent, M. Eshaghian-Wilner, F. Seredynski (Springer, Berlin Heidelberg New York, 2006).

  18. Computational Pathology

    Science.gov (United States)

    Louis, David N.; Feldman, Michael; Carter, Alexis B.; Dighe, Anand S.; Pfeifer, John D.; Bry, Lynn; Almeida, Jonas S.; Saltz, Joel; Braun, Jonathan; Tomaszewski, John E.; Gilbertson, John R.; Sinard, John H.; Gerber, Georg K.; Galli, Stephen J.; Golden, Jeffrey A.; Becich, Michael J.

    2016-01-01

    Context: We define the scope and needs within the new discipline of computational pathology, a discipline critical to the future of both the practice of pathology and, more broadly, medical practice in general. Objective: To define the scope and needs of computational pathology. Data Sources: A meeting was convened in Boston, Massachusetts, in July 2014 prior to the annual Association of Pathology Chairs meeting, and it was attended by a variety of pathologists, including individuals highly invested in pathology informatics as well as chairs of pathology departments. Conclusions: The meeting made recommendations to promote computational pathology, including clearly defining the field and articulating its value propositions; asserting that the value propositions for health care systems must include means to incorporate robust computational approaches to implement data-driven methods that aid in guiding individual and population health care; leveraging computational pathology as a center for data interpretation in modern health care systems; stating that realizing the value proposition will require working with institutional administrations, other departments, and pathology colleagues; declaring that a robust pipeline should be fostered that trains and develops future computational pathologists, for those with both pathology and non-pathology backgrounds; and deciding that computational pathology should serve as a hub for data-related research in health care systems. The dissemination of these recommendations to pathology and bioinformatics departments should help facilitate the development of computational pathology. PMID:26098131

  19. Cloud Computing

    DEFF Research Database (Denmark)

    Krogh, Simon

    2013-01-01

    with technological changes, the paradigmatic pendulum has swung between increased centralization on one side and a focus on distributed computing that pushes IT power out to end users on the other. With the introduction of outsourcing and cloud computing, centralization in large data centers is again dominating the IT scene. In line with the views presented by Nicolas Carr in 2003 (Carr, 2003), it is a popular assumption that cloud computing will be the next utility (like water, electricity and gas) (Buyya, Yeo, Venugopal, Broberg, & Brandic, 2009). However, this assumption disregards the fact that most IT production......), for instance, in establishing and maintaining trust between the involved parties (Sabherwal, 1999). So far, research in cloud computing has neglected this perspective and focused entirely on aspects relating to technology, economy, security and legal questions. While the core technologies of cloud computing (e

  20. Computability theory

    CERN Document Server

    Weber, Rebecca

    2012-01-01

    What can we compute--even with unlimited resources? Is everything within reach? Or are computations necessarily drastically limited, not just in practice, but theoretically? These questions are at the heart of computability theory. The goal of this book is to give the reader a firm grounding in the fundamentals of computability theory and an overview of currently active areas of research, such as reverse mathematics and algorithmic randomness. Turing machines and partial recursive functions are explored in detail, and vital tools and concepts including coding, uniformity, and diagonalization are described explicitly. From there the material continues with universal machines, the halting problem, parametrization and the recursion theorem, and thence to computability for sets, enumerability, and Turing reduction and degrees. A few more advanced topics round out the book before the chapter on areas of research. The text is designed to be self-contained, with an entire chapter of preliminary material including re...

  1. Computational Streetscapes

    Directory of Open Access Journals (Sweden)

    Paul M. Torrens

    2016-09-01

    Full Text Available Streetscapes have presented a long-standing interest in many fields. Recently, there has been a resurgence of attention on streetscape issues, catalyzed in large part by computing. Because of computing, there is more understanding, vistas, data, and analysis of and on streetscape phenomena than ever before. This diversity of lenses trained on streetscapes permits us to address long-standing questions, such as how people use information while mobile, how interactions with people and things occur on streets, how we might safeguard crowds, how we can design services to assist pedestrians, and how we could better support special populations as they traverse cities. Amid each of these avenues of inquiry, computing is facilitating new ways of posing these questions, particularly by expanding the scope of what-if exploration that is possible. With assistance from computing, consideration of streetscapes now reaches across scales, from the neurological interactions that form among place cells in the brain up to informatics that afford real-time views of activity over whole urban spaces. For some streetscape phenomena, computing allows us to build realistic but synthetic facsimiles in computation, which can function as artificial laboratories for testing ideas. In this paper, I review the domain science for studying streetscapes from vantages in physics, urban studies, animation and the visual arts, psychology, biology, and behavioral geography. I also review the computational developments shaping streetscape science, with particular emphasis on modeling and simulation as informed by data acquisition and generation, data models, path-planning heuristics, artificial intelligence for navigation and way-finding, timing, synthetic vision, steering routines, kinematics, and geometrical treatment of collision detection and avoidance. I also discuss the implications that the advances in computing streetscapes might have on emerging developments in cyber

  2. Computer security engineering management

    International Nuclear Information System (INIS)

    McDonald, G.W.

    1988-01-01

    For best results, computer security should be engineered into a system during its development rather than appended later on. This paper addresses the implementation of computer security in eight stages through the life cycle of the system, starting with the definition of security policies and ending with continuing support for the security aspects of the system throughout its operational life cycle. Security policy is addressed relative to the successive decomposition of security objectives (through policy, standard, and control stages) into system security requirements. This is followed by a discussion of computer security organization and responsibilities. Next the paper directs itself to the analysis and management of security-related risks, followed by discussion of the design and development of the system itself. Discussion of security test and evaluation preparations, and approval to operate (certification and accreditation), is followed by discussion of computer security training for users, and finally by coverage of life cycle support for the security of the system

  3. Synthesis and DPPH scavenging assay of reserpine analogues, computational studies and in silico docking studies in AChE and BChE responsible for Alzheimer's disease

    Directory of Open Access Journals (Sweden)

    Muhammad Yar

    2015-03-01

    Full Text Available Alzheimer's disease (AD) is a fast-growing neurodegenerative disorder of the central nervous system, and anti-oxidants can be used to help suppress the oxidative stress caused by the free radicals that are responsible for AD. A series of selected synthetic indole derivatives were biologically evaluated to identify potent new antioxidants. Most of the evaluated compounds showed significant to modest antioxidant properties (IC50 value 399.07 140.0±50 µM). Density Functional Theory (DFT) studies were carried out on the compounds and their corresponding free radicals. Differences in the energy of the parent compounds and their corresponding free radicals provided a good justification for the trend found in their IC50 values. In silico docking of the compounds into the proteins acetylcholinesterase (AChE) and butyrylcholinesterase (BChE), which are well known for contributing to AD, was also performed to predict anti-AD potential.

  4. Computers in radiology. The sedation, analgesia, and contrast media computerized simulator: a new approach to train and evaluate radiologists' responses to critical incidents

    International Nuclear Information System (INIS)

    Medina, L.S.; Racadio, J.M.; Schwid, H.A.

    2000-01-01

    Background. Awareness and preparedness to handle sedation, analgesia, and contrast-media complications are key in the daily radiology practice. Objective. The purpose is to create a computerized simulator (PC-Windows-based) that uses a graphical interface to reproduce critical incidents in pediatric and adult patients undergoing a wide spectrum of radiologic sedation, analgesia and contrast media complications. Materials and methods. The computerized simulator has a comprehensive set of physiologic and pharmacologic models that predict patient response to management of sedation, analgesia, and contrast-media complications. Photorealistic images, real-time monitors, and mouse-driven information demonstrate in a virtual-reality fashion the behavior of the patient in crisis. Results. Thirteen pediatric and adult radiology scenarios are illustrated encompassing areas such as pediatric radiology, neuroradiology, interventional radiology, and body imaging. The multiple case scenarios evaluate randomly the diagnostic and management performance of the radiologist in critical incidents such as oversedation, anaphylaxis, aspiration, airway obstruction, apnea, agitation, bronchospasm, hypotension, hypertension, cardiac arrest, bradycardia, tachycardia, and myocardial ischemia. The user must control the airway, breathing and circulation, and administer medications in a timely manner to save the simulated patient. On-line help is available in the program to suggest diagnostic and treatment steps to save the patient, and provide information about the medications. A printout of the case management can be obtained for evaluation or educational purposes. Conclusion. The interactive computerized simulator is a new approach to train and evaluate radiologists' responses to critical incidents encountered during radiologic sedation, analgesia, and contrast-media administration. (orig.)

  5. Analyses of insulin-potentiating fragments of human growth hormone by computative simulation; essential unit for insulin-involved biological responses.

    Science.gov (United States)

    Ohkura, K; Hori, H

    2000-07-01

    We analyzed the structural features of insulin-potentiating fragments of human growth hormone by computative simulations. The peptides were designated from the N-terminus sequences of the hormone at positions 1-15 (hGH(1-15); H2N-Phe1-Pro2-Thr3-Ile4-Pro5-Leu6-Ser7-Arg8-Leu9-Phe10-Asp11-Asn12-Ala13-Met14-Leu15-COOH), 6-13 (hGH(6-13)), 7-13 (hGH(7-13)) and 8-13 (hGH(8-13)), which enhanced insulin-producing hypoglycemia. In these peptide molecules, ionic bonds were predicted to form between the 8th arginyl residue and the 11th aspartic residue, and this intramolecular interaction caused the formation of a macrocyclic structure containing the tetrapeptide Arg8-Leu9-Phe10-Asp11. The peptides at positions 6-10 (hGH(6-10)), 9-13 (hGH(9-13)) and 10-13 (hGH(10-13)) did not lead to a macrocyclic formation in the molecules and had no effect on the insulin action. Although beta-Ala13hGH(1-15), in which the 13th alanine was replaced by a beta-alanyl residue, had no effect on insulin-producing hypoglycemia, the macrocyclic region (Arg8-Leu9-Phe10-Asp11) was observed in the computative simulation. An isothermal vibration analysis of both beta-Ala13hGH(1-15) and the hGH(1-15) peptide suggested that the beta-Ala13hGH(1-15) molecule was more flexible than hGH(1-15); the C-terminal carboxyl group of Leu15 easily accessed Arg8 and inhibited the ionic bond formation between Arg8 and Asp11 in beta-Ala13hGH(1-15). The peptide hGH(8-13) dose-dependently enhanced the insulin-involved fatty acid synthesis in rat white adipocytes, and stabilized the C6-NBD-PC (1-acyl-2-[6-[(7-nitro-2,1,3-benzoxadiazol-4-yl)amino]-caproyl]-sn-glycero-3-phosphatidylcholine) model membranes. In contrast, hGH(9-13) had no effect on either the fatty acid synthesis or the membrane stability. In the same culture conditions as the fatty acid synthesis assay, hGH(8-13) had no effect on the transcript levels of glucose transporter isoforms (GLUT 1, 4) and hexokinase isozymes (HK I, II) in rat white adipocytes. Judging from

  6. COMPUTATIONAL THINKING

    Directory of Open Access Journals (Sweden)

    Evgeniy K. Khenner

    2016-01-01

    Full Text Available Abstract. The aim of the research is to draw the attention of the educational community to the phenomenon of computational thinking, which has been actively discussed in the last decade in the foreign scientific and educational literature, and to substantiate its importance, practical utility and its right to affirmation in Russian education. Methods. The research is based on the analysis of foreign studies of the phenomenon of computational thinking and the ways of its formation in the process of education, and on comparing the notion of «computational thinking» with related concepts used in the Russian scientific and pedagogical literature. Results. The concept of «computational thinking» is analyzed from the point of view of intuitive understanding and of scientific and applied aspects. It is shown how computational thinking has evolved along with the development of computer hardware and software. The practice-oriented interpretation of computational thinking which is dominant among educators is described, along with some ways of its formation. It is shown that computational thinking is a metasubject result of general education as well as its tool. From the point of view of the author, purposeful development of computational thinking should be one of the tasks of Russian education. Scientific novelty. The author gives a theoretical justification of the role of computational thinking schemes as metasubject results of learning. The dynamics of the development of this concept are described. This process is connected with the evolution of computer and information technologies as well as with the growing number of tasks whose effective solution requires computational thinking. The author substantiates the claim that including «computational thinking» in the set of pedagogical concepts used in the national education system fills an existing gap. Practical significance. New metasubject result of education associated with

  7. Community Cloud Computing

    Science.gov (United States)

    Marinos, Alexandros; Briscoe, Gerard

    Cloud Computing is rising fast, with its data centres growing at an unprecedented rate. However, this has come with concerns over privacy, efficiency at the expense of resilience, and environmental sustainability, because of the dependence on Cloud vendors such as Google, Amazon and Microsoft. Our response is an alternative model for the Cloud conceptualisation, providing a paradigm for Clouds in the community, utilising networked personal computers for liberation from the centralised vendor model. Community Cloud Computing (C3) offers an alternative architecture, created by combining the Cloud with paradigms from Grid Computing, principles from Digital Ecosystems, and sustainability from Green Computing, while remaining true to the original vision of the Internet. It is more technically challenging than Cloud Computing, having to deal with distributed computing issues, including heterogeneous nodes, varying quality of service, and additional security constraints. However, these are not insurmountable challenges, and with the need to retain control over our digital lives and the potential environmental consequences, it is a challenge we must pursue.

  8. Coping with distributed computing

    International Nuclear Information System (INIS)

    Cormell, L.

    1992-09-01

    The rapid increase in the availability of high performance, cost-effective RISC/UNIX workstations has been both a blessing and a curse. The blessing of having extremely powerful computing engines available on the desk top is well-known to many users. The user has tremendous freedom, flexibility, and control of his environment. That freedom can, however, become the curse of distributed computing. The user must become a system manager to some extent, he must worry about backups, maintenance, upgrades, etc. Traditionally these activities have been the responsibility of a central computing group. The central computing group, however, may find that it can no longer provide all of the traditional services. With the plethora of workstations now found on so many desktops throughout the entire campus or lab, the central computing group may be swamped by support requests. This talk will address several of these computer support and management issues by providing some examples of the approaches taken at various HEP institutions. In addition, a brief review of commercial directions or products for distributed computing and management will be given

  9. New ATLAS Software & Computing Organization

    CERN Multimedia

    Barberis, D

    Following the election by the ATLAS Collaboration Board of Dario Barberis (Genoa University/INFN) as Computing Coordinator and David Quarrie (LBNL) as Software Project Leader, it was considered necessary to modify the organization of the ATLAS Software & Computing ("S&C") project. The new organization is based upon the following principles: separation of the responsibilities for computing management from those of software development, with the appointment of a Computing Coordinator and a Software Project Leader who are both members of the Executive Board; hierarchical structure of responsibilities and reporting lines; coordination at all levels between TDAQ, S&C and Physics working groups; integration of the subdetector software development groups with the central S&C organization. A schematic diagram of the new organization can be seen in Fig.1. Figure 1: new ATLAS Software & Computing organization. Two Management Boards will help the Computing Coordinator and the Software Project...

  10. Cone-beam computed tomography for lung cancer – validation with CT and monitoring tumour response during chemo-radiation therapy

    International Nuclear Information System (INIS)

    Michienzi, Alissa; Kron, Tomas; Callahan, Jason; Plumridge, Nikki; Ball, David; Everitt, Sarah

    2017-01-01

    Cone-beam computed tomography (CBCT) is a valuable image-guidance tool in radiation therapy (RT). This study was initiated to assess the accuracy of CBCT for quantifying non-small cell lung cancer (NSCLC) tumour volumes compared to the anatomical ‘gold standard’, CT. Tumour regression or progression on CBCT was also analysed. Patients with Stage I-III NSCLC, prescribed 60 Gy in 30 fractions RT with concurrent platinum-based chemotherapy, routine CBCT and enrolled in a prospective study of serial PET/CT (baseline, weeks two and four) were eligible. Time-matched CBCT and CT gross tumour volumes (GTVs) were manually delineated by a single observer on MIM software, and were analysed descriptively and using Pearson's correlation coefficient (r) and linear regression (R 2 ). Of 94 CT/CBCT pairs, 30 patients were eligible for inclusion. The mean (± SD) CT GTV vs CBCT GTV on the four time-matched pairs were 95 (±182) vs 98.8 (±160.3), 73.6 (±132.4) vs 70.7 (±96.6), 54.7 (±92.9) vs 61.0 (±98.8) and 61.3 (±53.3) vs 62.1 (±47.9) respectively. Pearson's correlation coefficient (r) was 0.98 (95% CI 0.97–0.99, ρ < 0.001). The mean (±SD) CT/CBCT Dice's similarity coefficient was 0.66 (±0.16). Of 289 CBCT scans, tumours in 27 (90%) patients regressed by a mean (±SD) rate of 1.5% (±0.75) per fraction. The mean (±SD) GTV regression was 43.1% (±23.1) from the first to final CBCT. Primary lung tumour volumes observed on CBCT and time-matched CT are highly correlated (although not identical), thereby validating observations of GTV regression on CBCT in NSCLC.
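    The two agreement measures reported above, Dice's similarity coefficient between delineated volumes and Pearson's correlation between paired GTV measurements, can be sketched as follows. The masks are synthetic, and the paired values simply reuse the mean GTVs quoted in this record for illustration.

```python
# Sketch of the two agreement measures used above: Dice's similarity
# coefficient between binary GTV masks, and Pearson's r between paired volume
# measurements. Masks are synthetic; the paired values reuse the mean GTVs
# quoted in the record purely for illustration.
import numpy as np

def dice_coefficient(mask_a, mask_b):
    """Dice similarity coefficient of two boolean masks."""
    a, b = np.asarray(mask_a, bool), np.asarray(mask_b, bool)
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

def pearson_r(x, y):
    """Pearson correlation coefficient of two paired samples."""
    return np.corrcoef(x, y)[0, 1]

# Example: two overlapping square "tumours" on a small grid
grid_a = np.zeros((50, 50), bool); grid_a[10:30, 10:30] = True
grid_b = np.zeros((50, 50), bool); grid_b[12:32, 12:32] = True

ct_gtv   = np.array([95.0, 73.6, 54.7, 61.3])   # mean CT GTVs quoted above
cbct_gtv = np.array([98.8, 70.7, 61.0, 62.1])   # mean CBCT GTVs quoted above

print(f"Dice coefficient = {dice_coefficient(grid_a, grid_b):.2f}")
print(f"Pearson r        = {pearson_r(ct_gtv, cbct_gtv):.3f}")
```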

  11. Computer interfacing

    CERN Document Server

    Dixey, Graham

    1994-01-01

    This book explains how computers interact with the world around them and therefore how to make them a useful tool. Topics covered include descriptions of all the components that make up a computer, principles of data exchange, interaction with peripherals, serial communication, input devices, recording methods, computer-controlled motors, and printers.In an informative and straightforward manner, Graham Dixey describes how to turn what might seem an incomprehensible 'black box' PC into a powerful and enjoyable tool that can help you in all areas of your work and leisure. With plenty of handy

  12. Computational physics

    CERN Document Server

    Newman, Mark

    2013-01-01

    A complete introduction to the field of computational physics, with examples and exercises in the Python programming language. Computers play a central role in virtually every major physics discovery today, from astrophysics and particle physics to biophysics and condensed matter. This book explains the fundamentals of computational physics and describes in simple terms the techniques that every physicist should know, such as finite difference methods, numerical quadrature, and the fast Fourier transform. The book offers a complete introduction to the topic at the undergraduate level, and is also suitable for the advanced student or researcher who wants to learn the foundational elements of this important field.

  13. Computational physics

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    1987-01-15

    Computers have for many years played a vital role in the acquisition and treatment of experimental data, but they have more recently taken up a much more extended role in physics research. The numerical and algebraic calculations now performed on modern computers make it possible to explore consequences of basic theories in a way which goes beyond the limits of both analytic insight and experimental investigation. This was brought out clearly at the Conference on Perspectives in Computational Physics, held at the International Centre for Theoretical Physics, Trieste, Italy, from 29-31 October.

  14. Cloud Computing

    CERN Document Server

    Baun, Christian; Nimis, Jens; Tai, Stefan

    2011-01-01

    Cloud computing is a buzz-word in today's information technology (IT) that nobody can escape. But what is really behind it? There are many interpretations of this term, but no standardized or even uniform definition. Instead, as a result of the multi-faceted viewpoints and the diverse interests expressed by the various stakeholders, cloud computing is perceived as a rather fuzzy concept. With this book, the authors deliver an overview of cloud computing architecture, services, and applications. Their aim is to bring readers up to date on this technology and thus to provide a common basis for d

  15. Computational Viscoelasticity

    CERN Document Server

    Marques, Severino P C

    2012-01-01

    This text is a guide to solving problems in which viscoelasticity is present using existing commercial computational codes. The book gives information on the codes' structure and use, data preparation, and output interpretation and verification. The first part of the book introduces the reader to the subject and provides the models, equations and notation to be used in the computational applications. The second part presents the most important computational techniques: the finite element formulation, the boundary element formulation, and the solution of viscoelastic problems with Abaqus.

  16. Optical computing.

    Science.gov (United States)

    Stroke, G. W.

    1972-01-01

    Applications of the optical computer include an approach for increasing the sharpness of images obtained from the most powerful electron microscopes and fingerprint/credit card identification. The information-handling capability of the various optical computing processes is very great. Modern synthetic-aperture radars scan upward of 100,000 resolvable elements per second. Fields which have assumed major importance on the basis of optical computing principles are optical image deblurring, coherent side-looking synthetic-aperture radar, and correlative pattern recognition. Some examples of the most dramatic image deblurring results are shown.

  17. Computational physics

    International Nuclear Information System (INIS)

    Anon.

    1987-01-01

    Computers have for many years played a vital role in the acquisition and treatment of experimental data, but they have more recently taken up a much more extended role in physics research. The numerical and algebraic calculations now performed on modern computers make it possible to explore consequences of basic theories in a way which goes beyond the limits of both analytic insight and experimental investigation. This was brought out clearly at the Conference on Perspectives in Computational Physics, held at the International Centre for Theoretical Physics, Trieste, Italy, from 29-31 October

  18. Phenomenological Computation?

    DEFF Research Database (Denmark)

    Brier, Søren

    2014-01-01

    Open peer commentary on the article “Info-computational Constructivism and Cognition” by Gordana Dodig-Crnkovic. Upshot: The main problems with info-computationalism are: (1) Its basic concept of natural computing has neither been defined theoretically nor implemented practically. (2) It cannot encompass human concepts of subjective experience and intersubjective meaningful communication, which prevents it from being genuinely transdisciplinary. (3) Philosophically, it does not sufficiently accept the deep ontological differences between various paradigms such as von Foerster’s second-order...

  19. Essentials of cloud computing

    CERN Document Server

    Chandrasekaran, K

    2014-01-01

    Foreword; Preface; Computing Paradigms; Learning Objectives; Preamble; High-Performance Computing; Parallel Computing; Distributed Computing; Cluster Computing; Grid Computing; Cloud Computing; Biocomputing; Mobile Computing; Quantum Computing; Optical Computing; Nanocomputing; Network Computing; Summary; Review Points; Review Questions; Further Reading; Cloud Computing Fundamentals; Learning Objectives; Preamble; Motivation for Cloud Computing; The Need for Cloud Computing; Defining Cloud Computing; NIST Definition of Cloud Computing; Cloud Computing Is a Service; Cloud Computing Is a Platform; 5-4-3 Principles of Cloud Computing; Five Essential Charact

  20. Personal Computers.

    Science.gov (United States)

    Toong, Hoo-min D.; Gupta, Amar

    1982-01-01

    Describes the hardware, software, applications, and current proliferation of personal computers (microcomputers). Includes discussions of microprocessors, memory, output (including printers), application programs, the microcomputer industry, and major microcomputer manufacturers (Apple, Radio Shack, Commodore, and IBM). (JN)

  1. Computational Literacy

    DEFF Research Database (Denmark)

    Chongtay, Rocio; Robering, Klaus

    2016-01-01

    In recent years, there has been a growing interest in and recognition of the importance of Computational Literacy, a skill generally considered to be necessary for success in the 21st century. While much research has concentrated on requirements, tools, and teaching methodologies for the acquisition of Computational Literacy at basic educational levels, focus on higher levels of education has been much less prominent. The present paper considers the case of courses for higher education programs within the Humanities. A model is proposed which conceives of Computational Literacy as a layered...

  2. Computing Religion

    DEFF Research Database (Denmark)

    Nielbo, Kristoffer Laigaard; Braxton, Donald M.; Upal, Afzal

    2012-01-01

    The computational approach has become an invaluable tool in many fields that are directly relevant to research in religious phenomena. Yet the use of computational tools is almost absent in the study of religion. Given that religion is a cluster of interrelated phenomena and that research concerning these phenomena should strive for multilevel analysis, this article argues that the computational approach offers new methodological and theoretical opportunities to the study of religion. We argue that the computational approach offers (1) an intermediary step between any theoretical construct and its targeted empirical space and (2) a new kind of data which allows the researcher to observe abstract constructs, estimate likely outcomes, and optimize empirical designs. Because sophisticated multilevel research is a collaborative project, we also seek to introduce to scholars of religion some

  3. Computational Controversy

    NARCIS (Netherlands)

    Timmermans, Benjamin; Kuhn, Tobias; Beelen, Kaspar; Aroyo, Lora

    2017-01-01

    Climate change, vaccination, abortion, Trump: Many topics are surrounded by fierce controversies. The nature of such heated debates and their elements have been studied extensively in the social science literature. More recently, various computational approaches to controversy analysis have appeared, using new data sources such as Wikipedia, which help us now better understand these phenomena.

  4. Grid Computing

    Indian Academy of Sciences (India)

    IAS Admin

    ... emergence of supercomputers led to the use of computer simulation as an ... Scientific and engineering applications (e.g., TeraGrid secure gateway). Collaborative ... Encryption, privacy, protection from malicious software. Physical Layer.

  5. On teaching computer ethics within a computer science department.

    Science.gov (United States)

    Quinn, Michael J

    2006-04-01

    The author has surveyed a quarter of the accredited undergraduate computer science programs in the United States. More than half of these programs offer a 'social and ethical implications of computing' course taught by a computer science faculty member, and there appears to be a trend toward teaching ethics classes within computer science departments. Although the decision to create an 'in house' computer ethics course may sometimes be a pragmatic response to pressure from the accreditation agency, this paper argues that teaching ethics within a computer science department can provide students and faculty members with numerous benefits. The paper lists topics that can be covered in a computer ethics course and offers some practical suggestions for making the course successful.

  6. Computer tomographs

    International Nuclear Information System (INIS)

    Niedzwiedzki, M.

    1982-01-01

    Physical foundations and developments in transmission and emission computer tomography are presented. On the basis of the available literature and private communications, a comparison is made of the various transmission tomographs. A new technique of computer emission tomography (ECT), unknown in Poland, is described. An evaluation of two methods of ECT, namely positron and single photon emission tomography, is made. (author)

  7. Computational sustainability

    CERN Document Server

    Kersting, Kristian; Morik, Katharina

    2016-01-01

    The book at hand gives an overview of the state of the art research in Computational Sustainability as well as case studies of different application scenarios. This covers topics such as renewable energy supply, energy storage and e-mobility, efficiency in data centers and networks, sustainable food and water supply, sustainable health, industrial production and quality, etc. The book describes computational methods and possible application scenarios.

  8. Computing farms

    International Nuclear Information System (INIS)

    Yeh, G.P.

    2000-01-01

    High-energy physics, nuclear physics, space sciences, and many other fields have large challenges in computing. In recent years, PCs have achieved performance comparable to the high-end UNIX workstations, at a small fraction of the price. We review the development and broad applications of commodity PCs as the solution to CPU needs, and look forward to the important and exciting future of large-scale PC computing

  9. Computational chemistry

    Science.gov (United States)

    Arnold, J. O.

    1987-01-01

    With the advent of supercomputers and modern computational chemistry algorithms and codes, a powerful tool was created to help fill NASA's continuing need for information on the properties of matter in hostile or unusual environments. Computational resources provided under the National Aerodynamics Simulator (NAS) program were a cornerstone for recent advancements in this field. Properties of gases, materials, and their interactions can be determined from solutions of the governing equations. In the case of gases, for example, radiative transition probabilities per particle, bond-dissociation energies, and rates of simple chemical reactions can be determined computationally as reliably as from experiment. The data are proving to be quite valuable in providing inputs to real-gas flow simulation codes used to compute aerothermodynamic loads on NASA's aeroassist orbital transfer vehicles and a host of problems related to the National Aerospace Plane Program. Although more approximate, similar solutions can be obtained for ensembles of atoms simulating small particles of materials with and without the presence of gases. Computational chemistry has applications in studying catalysis and the properties of polymers, both of interest to various NASA missions, including those previously mentioned. In addition to discussing these applications of computational chemistry within NASA, the governing equations and the need for supercomputers for their solution are outlined.

  10. Computational screening of Six Antigens for potential MHC class II restricted epitopes and evaluating its CD4+ T-Cell Responsiveness against Visceral Leishmaniasis

    Directory of Open Access Journals (Sweden)

    Manas Ranjan

    2017-12-01

    Full Text Available Visceral leishmaniasis is one of the most neglected tropical diseases for which no vaccine exists. In spite of extensive efforts, no successful vaccine is available against this dreadful infectious disease. To support vaccine development, an immunoinformatics approach was applied to search for potential MHC class II restricted epitopes that can activate the immune cells. Initially, a total of 37 epitopes derived from six stage-dependently overexpressed antigens were predicted, which were presented by at least 26 diverse MHC class II alleles including: DRB10101, DRB10301, DRB10401, DRB10404, DRB10405, DRB10701, DRB10802, DRB10901, DRB11101, DRB11302, DRB11501, DRB30101, DRB40101, DRB50101, DPA10103-DPB10401, DPA10103-DPB10201, DPA10201-DPB10101, DPA10103-DPB10301_DPB10401, DPA10301-DPB10402, DPA10201-DPB105021, DQA10102-DQB10602, DQA10401-DQB10402, DQA10501-QB10201, DQA10501-DQB10301, DQA10301-DQB10302 and DQA10101-DQB10501. Based on the population coverage analysis and HLA cross-presentation ability, six epitopes, namely FDLFLFSNGAVVWWG (P1), YPVYPFLASNAALLN (P2), VYPFLASNAALLNLI (P3), LALLIMLYALIATQF (P4), LIMLYALIATQFSDD (P5) and IMLYALIATQFSDDA (P6), were selected for further analysis. Stimulation with synthetic peptide alone or as a cocktail triggered intracellular IFN-γ production. Moreover, a specific IgG class of antibodies was detected in the serum of active VL cases against P1, P4, P and P6 in order to evaluate the peptide effect on the humoral immune response. Additionally, most of the peptides, except P2, were found to be non-inducers of CD4+ IL-10 against both active VL as well as treated VL subjects. Peptide immunogenicity was validated in BALB/c mice immunized with a cocktail of synthetic peptides emulsified in complete Freund’s adjuvant/incomplete Freund’s adjuvant. The immunized splenocytes induced strong spleen cell proliferation upon parasite re-stimulation. Furthermore, an increased IFN-γ, IL-12, IL-17 and IL-22 production augmented with

  11. Computational creativity

    Directory of Open Access Journals (Sweden)

    López de Mántaras Badia, Ramon

    2013-12-01

    Full Text Available New technologies, and in particular artificial intelligence, are drastically changing the nature of creative processes. Computers are playing very significant roles in creative activities such as music, architecture, fine arts, and science. Indeed, the computer is already a canvas, a brush, a musical instrument, and so on. However, we believe that we must aim at more ambitious relations between computers and creativity. Rather than just seeing the computer as a tool to help human creators, we could see it as a creative entity in its own right. This view has triggered a new subfield of Artificial Intelligence called Computational Creativity. This article addresses the question of the possibility of achieving computational creativity through some examples of computer programs capable of replicating some aspects of creative behavior in the fields of music and science.

  12. Quantum computing

    International Nuclear Information System (INIS)

    Steane, Andrew

    1998-01-01

    The subject of quantum computing brings together ideas from classical information theory, computer science, and quantum physics. This review aims to summarize not just quantum computing, but the whole subject of quantum information theory. Information can be identified as the most general thing which must propagate from a cause to an effect. It therefore has a fundamentally important role in the science of physics. However, the mathematical treatment of information, especially information processing, is quite recent, dating from the mid-20th century. This has meant that the full significance of information as a basic concept in physics is only now being discovered. This is especially true in quantum mechanics. The theory of quantum information and computing puts this significance on a firm footing, and has led to some profound and exciting new insights into the natural world. Among these are the use of quantum states to permit the secure transmission of classical information (quantum cryptography), the use of quantum entanglement to permit reliable transmission of quantum states (teleportation), the possibility of preserving quantum coherence in the presence of irreversible noise processes (quantum error correction), and the use of controlled quantum evolution for efficient computation (quantum computation). The common theme of all these insights is the use of quantum entanglement as a computational resource. It turns out that information theory and quantum mechanics fit together very well. In order to explain their relationship, this review begins with an introduction to classical information theory and computer science, including Shannon's theorem, error correcting codes, Turing machines and computational complexity. The principles of quantum mechanics are then outlined, and the Einstein, Podolsky and Rosen (EPR) experiment described. The EPR-Bell correlations, and quantum entanglement in general, form the essential new ingredient which distinguishes quantum from

  13. Quantum computing

    Energy Technology Data Exchange (ETDEWEB)

    Steane, Andrew [Department of Atomic and Laser Physics, University of Oxford, Clarendon Laboratory, Oxford (United Kingdom)

    1998-02-01

    The subject of quantum computing brings together ideas from classical information theory, computer science, and quantum physics. This review aims to summarize not just quantum computing, but the whole subject of quantum information theory. Information can be identified as the most general thing which must propagate from a cause to an effect. It therefore has a fundamentally important role in the science of physics. However, the mathematical treatment of information, especially information processing, is quite recent, dating from the mid-20th century. This has meant that the full significance of information as a basic concept in physics is only now being discovered. This is especially true in quantum mechanics. The theory of quantum information and computing puts this significance on a firm footing, and has led to some profound and exciting new insights into the natural world. Among these are the use of quantum states to permit the secure transmission of classical information (quantum cryptography), the use of quantum entanglement to permit reliable transmission of quantum states (teleportation), the possibility of preserving quantum coherence in the presence of irreversible noise processes (quantum error correction), and the use of controlled quantum evolution for efficient computation (quantum computation). The common theme of all these insights is the use of quantum entanglement as a computational resource. It turns out that information theory and quantum mechanics fit together very well. In order to explain their relationship, this review begins with an introduction to classical information theory and computer science, including Shannon's theorem, error correcting codes, Turing machines and computational complexity. The principles of quantum mechanics are then outlined, and the Einstein, Podolsky and Rosen (EPR) experiment described. The EPR-Bell correlations, and quantum entanglement in general, form the essential new ingredient which distinguishes quantum from

  14. Multiparty Computations

    DEFF Research Database (Denmark)

    Dziembowski, Stefan

    In this thesis we study the problem of doing Verifiable Secret Sharing (VSS) and Multiparty Computations in a model where private channels between the players and a broadcast channel are available. The adversary is active, adaptive and has an unbounded computing power. The thesis is based on two ... to a polynomial time black-box reduction, the complexity of adaptively secure VSS is the same as that of ordinary secret sharing (SS), where security is only required against a passive, static adversary. Previously, such a connection was only known for linear secret sharing and VSS schemes. We then show ... here and discuss other problems caused by the adaptiveness. All protocols in the thesis are formally specified and the proofs of their security are given. [1] Ronald Cramer, Ivan Damgård, Stefan Dziembowski, Martin Hirt, and Tal Rabin. Efficient multiparty computations with dishonest minority.

  15. Scientific computing

    CERN Document Server

    Trangenstein, John A

    2017-01-01

    This is the third of three volumes providing a comprehensive presentation of the fundamentals of scientific computing. This volume discusses topics that depend more on calculus than linear algebra, in order to prepare the reader for solving differential equations. This book and its companions show how to determine the quality of computational results, and how to measure the relative efficiency of competing methods. Readers learn how to determine the maximum attainable accuracy of algorithms, and how to select the best method for computing problems. This book also discusses programming in several languages, including C++, Fortran and MATLAB. There are 90 examples, 200 exercises, 36 algorithms, 40 interactive JavaScript programs, 91 references to software programs and 1 case study. Topics are introduced with goals, literature references and links to public software. There are descriptions of the current algorithms in GSLIB and MATLAB. This book could be used for a second course in numerical methods, for either ...

  16. Computational Psychiatry

    Science.gov (United States)

    Wang, Xiao-Jing; Krystal, John H.

    2014-01-01

    Psychiatric disorders such as autism and schizophrenia arise from abnormalities in brain systems that underlie cognitive, emotional and social functions. The brain is enormously complex and its abundant feedback loops on multiple scales preclude intuitive explication of circuit functions. In close interplay with experiments, theory and computational modeling are essential for understanding how, precisely, neural circuits generate flexible behaviors and how their impairments give rise to psychiatric symptoms. This Perspective highlights recent progress in applying computational neuroscience to the study of mental disorders. We outline basic approaches, including identification of core deficits that cut across disease categories, biologically realistic modeling bridging cellular and synaptic mechanisms with behavior, and model-aided diagnosis. The need for new research strategies in psychiatry is urgent. Computational psychiatry potentially provides powerful tools for elucidating pathophysiology that may inform both diagnosis and treatment. To achieve this promise will require investment in cross-disciplinary training and research in this nascent field. PMID:25442941

  17. Computational artifacts

    DEFF Research Database (Denmark)

    Schmidt, Kjeld; Bansler, Jørgen P.

    2016-01-01

    The key concern of CSCW research is that of understanding computing technologies in the social context of their use, that is, as integral features of our practices and our lives, and to think of their design and implementation under that perspective. However, the question of the nature of that which is actually integrated in our practices is often discussed in confusing ways, if at all. The article aims to clarify the issue and, in doing so, revisits and reconsiders the notion of ‘computational artifact’.

  18. Computer security

    CERN Document Server

    Gollmann, Dieter

    2011-01-01

    A completely up-to-date resource on computer security Assuming no previous experience in the field of computer security, this must-have book walks you through the many essential aspects of this vast topic, from the newest advances in software and technology to the most recent information on Web applications security. This new edition includes sections on Windows NT, CORBA, and Java and discusses cross-site scripting and JavaScript hacking as well as SQL injection. Serving as a helpful introduction, this self-study guide is a wonderful starting point for examining the variety of competing sec

  19. Cloud Computing

    CERN Document Server

    Antonopoulos, Nick

    2010-01-01

    Cloud computing has recently emerged as a subject of substantial industrial and academic interest, though its meaning and scope is hotly debated. For some researchers, clouds are a natural evolution towards the full commercialisation of grid systems, while others dismiss the term as a mere re-branding of existing pay-per-use technologies. From either perspective, 'cloud' is now the label of choice for accountable pay-per-use access to third party applications and computational resources on a massive scale. Clouds support patterns of less predictable resource use for applications and services a

  20. Computational Logistics

    DEFF Research Database (Denmark)

    Pacino, Dario; Voss, Stefan; Jensen, Rune Møller

    2013-01-01

    This book constitutes the refereed proceedings of the 4th International Conference on Computational Logistics, ICCL 2013, held in Copenhagen, Denmark, in September 2013. The 19 papers presented in this volume were carefully reviewed and selected for inclusion in the book. They are organized in topical sections named: maritime shipping, road transport, vehicle routing problems, aviation applications, and logistics and supply chain management.

  1. Computational Logistics

    DEFF Research Database (Denmark)

    This book constitutes the refereed proceedings of the 4th International Conference on Computational Logistics, ICCL 2013, held in Copenhagen, Denmark, in September 2013. The 19 papers presented in this volume were carefully reviewed and selected for inclusion in the book. They are organized in topical sections named: maritime shipping, road transport, vehicle routing problems, aviation applications, and logistics and supply chain management.

  2. Computational engineering

    CERN Document Server

    2014-01-01

    The book presents state-of-the-art works in computational engineering. Focus is on mathematical modeling, numerical simulation, experimental validation and visualization in engineering sciences. In particular, the following topics are presented: constitutive models and their implementation into finite element codes, numerical models in nonlinear elasto-dynamics including seismic excitations, multiphase models in structural engineering and multiscale models of materials systems, sensitivity and reliability analysis of engineering structures, the application of scientific computing in urban water management and hydraulic engineering, and the application of genetic algorithms for the registration of laser scanner point clouds.

  3. Computer busses

    CERN Document Server

    Buchanan, William

    2000-01-01

    As more and more equipment is interface or 'bus' driven, either by the use of controllers or directly from PCs, the question of which bus to use is becoming increasingly important both in industry and in the office. 'Computer Busses' has been designed to help choose the best type of bus for the particular application. There are several books which cover individual busses, but none which provide a complete guide to computer busses. The author provides a basic theory of busses and draws examples and applications from real bus case studies. Busses are analysed using a top-down approach, helping ...

  4. Reconfigurable Computing

    CERN Document Server

    Cardoso, Joao MP

    2011-01-01

    As the complexity of modern embedded systems increases, it becomes less practical to design monolithic processing platforms. As a result, reconfigurable computing is being adopted widely for more flexible design. Reconfigurable Computers offer the spatial parallelism and fine-grained customizability of application-specific circuits with the postfabrication programmability of software. To make the most of this unique combination of performance and flexibility, designers need to be aware of both hardware and software issues. FPGA users must think not only about the gates needed to perform a comp

  5. 90Y microsphere (TheraSphere) treatment for unresectable colorectal cancer metastases of the liver: response to treatment at targeted doses of 135-150 Gy as measured by [18F]fluorodeoxyglucose positron emission tomography and computed tomographic imaging.

    Science.gov (United States)

    Lewandowski, Robert J; Thurston, Kenneth G; Goin, James E; Wong, Ching-Yee O; Gates, Vanessa L; Van Buskirk, Mark; Geschwind, Jean-Francois H; Salem, Riad

    2005-12-01

    The purpose of this phase II study was to determine the safety and efficacy of TheraSphere treatment (90Y microspheres) in patients with liver-dominant colorectal metastases in whom standard therapies had failed or were judged to be inappropriate. Twenty-seven patients with unresectable hepatic colorectal metastases were treated at a targeted absorbed dose of 135-150 Gy. Safety and toxicity were assessed according to the National Cancer Institute's Common Toxicity Criteria, version 3.0. Response was assessed with use of computed tomography (CT) and was correlated with response on [18F]fluorodeoxyglucose (FDG) positron emission tomography (PET). Survival from first treatment was estimated with use of the Kaplan-Meier method. Tumor response measured by FDG PET imaging exceeded that measured by CT imaging for the first (88% vs 35%) and second (73% vs 36%) treated lobes. Tumor replacement of 25% or less (vs >25%) was associated with a statistically significant increase in median survival (339 days vs 162 days; P = .002). Treatment-related toxicities included mild fatigue (n = 13; 48%), nausea (n = 4; 15%), and vague abdominal pain (n = 5; 19%). There was one case of radiation-induced gastritis from inadvertent deposition of microspheres to the gastrointestinal tract (n = 1; 4%). Three patients (11%) experienced ascites/pleural effusion after treatment with TheraSphere as a consequence of liver failure in advanced-stage metastatic disease. With the exception of these three patients whose sequelae were not considered to be related to treatment, all observed toxicities were transient and resolved without medical intervention. TheraSphere administration appears to provide stabilization of liver disease with minimal toxicity in patients in whom standard systemic chemotherapy regimens have failed.

  6. Brain-computer interfaces

    DEFF Research Database (Denmark)

    Treder, Matthias S.; Miklody, Daniel; Blankertz, Benjamin

    ... of perceptual and cognitive biases. Furthermore, subjects can only report on stimuli if they have a clear percept of them. On the other hand, the electroencephalogram (EEG), the electrical brain activity measured with electrodes on the scalp, is a more direct measure. It allows us to tap into the ongoing neural auditory processing stream. In particular, it can tap brain processes that are pre-conscious or even unconscious, such as the earliest brain responses to sound stimuli in primary auditory cortex. In a series of studies, we used a machine learning approach to show that the EEG can accurately reflect ... 'quality measure'. We were able to show that for stimuli close to the perceptual threshold, there was sometimes a discrepancy between overt responses and brain responses, shedding light on subjects using different response criteria (e.g., more liberal or more conservative). To conclude, brain-computer...

  7. A physics computing bureau

    CERN Document Server

    Laurikainen, P

    1975-01-01

    The author first reviews the services offered by the Bureau to the user community scattered over three separate physics departments and a theory research institute. Limited services are also offered to non-physics research in the University, in collaboration with the University Computing Center. The personnel is divided into operations sections, responsible for terminal and data archive management, punching and document services, etc., and into analysts' sections with half a dozen full-time scientific programmers recruited among promising graduate-level physics students, rather than computer scientists or mathematicians. Analysts are thus able not only to communicate with physicists but also to participate in research to some extent. Only the more demanding program development tasks can be handled by the Bureau; most of the routine data processing is the users' responsibility.

  8. Riemannian computing in computer vision

    CERN Document Server

    Srivastava, Anuj

    2016-01-01

    This book presents a comprehensive treatise on Riemannian geometric computations and related statistical inferences in several computer vision problems. This edited volume includes chapter contributions from leading figures in the field of computer vision who are applying Riemannian geometric approaches in problems such as face recognition, activity recognition, object detection, biomedical image analysis, and structure-from-motion. Some of the mathematical entities that necessitate a geometric analysis include rotation matrices (e.g. in modeling camera motion), stick figures (e.g. for activity recognition), subspace comparisons (e.g. in face recognition), symmetric positive-definite matrices (e.g. in diffusion tensor imaging), and function-spaces (e.g. in studying shapes of closed contours). The book illustrates Riemannian computing theory on applications in computer vision, machine learning, and robotics, with emphasis on algorithmic advances that will allow re-application in other...

  9. IGMtransmission: Transmission curve computation

    Science.gov (United States)

    Harrison, Christopher M.; Meiksin, Avery; Stock, David

    2015-04-01

    IGMtransmission is a Java graphical user interface that implements Monte Carlo simulations to compute the corrections to colors of high-redshift galaxies due to intergalactic attenuation based on current models of the Intergalactic Medium. The effects of absorption due to neutral hydrogen are considered, with particular attention to the stochastic effects of Lyman Limit Systems. Attenuation curves are produced, as well as colors for a wide range of filter responses and model galaxy spectra. Photometric filters are included for the Hubble Space Telescope, the Keck telescope, the Mt. Palomar 200-inch, the SUBARU telescope and UKIRT; alternative filter response curves and spectra may be readily uploaded.
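
    The record above describes the basic operation: an IGM transmission curve is applied to a model galaxy spectrum, and the attenuated colours are measured through filter response curves. The Python sketch below illustrates only that final step under toy assumptions (a flat spectrum, a crude step-like transmission curve and two Gaussian filters); it is not the Monte Carlo model or the Java interface of IGMtransmission itself.

    import numpy as np

    def band_flux(flux, filter_response):
        """Filter-weighted mean flux on a uniform wavelength grid."""
        return float(np.sum(flux * filter_response) / np.sum(filter_response))

    def color_shift(flux, transmission, filt_a, filt_b):
        """Change in the (A - B) colour, in magnitudes, caused by IGM attenuation."""
        def color(f):
            return -2.5 * np.log10(band_flux(f, filt_a) / band_flux(f, filt_b))
        return color(flux * transmission) - color(flux)

    # Toy inputs: flat spectrum and a step-like attenuation blueward of Lyman-alpha at z = 3.
    z = 3.0
    wl = np.linspace(3000.0, 9000.0, 2000)                      # observed wavelength [Angstrom]
    spectrum = np.ones_like(wl)                                 # flat model spectrum (placeholder)
    transmission = np.where(wl < 1216.0 * (1.0 + z), 0.3, 1.0)  # crude transmission curve (placeholder)
    filt_a = np.exp(-0.5 * ((wl - 4800.0) / 400.0) ** 2)        # hypothetical blue filter response
    filt_b = np.exp(-0.5 * ((wl - 6200.0) / 400.0) ** 2)        # hypothetical red filter response
    print(f"colour shift due to attenuation: {color_shift(spectrum, transmission, filt_a, filt_b):+.3f} mag")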

  10. Statistical Computing

    Indian Academy of Sciences (India)

    Elements of statistical computing are discussed in this series (Sudhakar Kunte): inference and finite population sampling ... which captain gets the option to decide whether to field first or bat first ... may of course not be fair, in the sense that the team which wins ... describe two methods of drawing a random number between 0 ...

  11. Computational biology

    DEFF Research Database (Denmark)

    Hartmann, Lars Røeboe; Jones, Neil; Simonsen, Jakob Grue

    2011-01-01

    Computation via biological devices has been the subject of close scrutiny since von Neumann’s early work some 60 years ago. In spite of the many relevant works in this field, the notion of programming biological devices seems to be, at best, ill-defined. While many devices are claimed or proved t...

  12. Computing News

    CERN Multimedia

    McCubbin, N

    2001-01-01

    We are still five years from the first LHC data, so we have plenty of time to get the computing into shape, don't we? Well, yes and no: there is time, but there's an awful lot to do! The recently-completed CERN Review of LHC Computing gives the flavour of the LHC computing challenge. The hardware scale for each of the LHC experiments is millions of 'SpecInt95' (SI95) units of cpu power and tens of PetaBytes of data storage. PCs today are about 20-30SI95, and expected to be about 100 SI95 by 2005, so it's a lot of PCs. This hardware will be distributed across several 'Regional Centres' of various sizes, connected by high-speed networks. How to realise this in an orderly and timely fashion is now being discussed in earnest by CERN, Funding Agencies, and the LHC experiments. Mixed in with this is, of course, the GRID concept...but that's a topic for another day! Of course hardware, networks and the GRID constitute just one part of the computing. Most of the ATLAS effort is spent on software development. What we ...

  13. Quantum Computation

    Indian Academy of Sciences (India)

    Quantum Computation - Particle and Wave Aspects of Algorithms. Apoorva Patel. General Article, Resonance – Journal of Science Education, Volume 16, Issue 9, September 2011, pp. 821-835.

  14. Cloud computing.

    Science.gov (United States)

    Wink, Diane M

    2012-01-01

    In this bimonthly series, the author examines how nurse educators can use Internet and Web-based technologies such as search, communication, and collaborative writing tools; social networking and social bookmarking sites; virtual worlds; and Web-based teaching and learning programs. This article describes how cloud computing can be used in nursing education.

  15. Computer Recreations.

    Science.gov (United States)

    Dewdney, A. K.

    1988-01-01

    Describes the creation of the computer program "BOUNCE," designed to simulate a weighted piston coming into equilibrium with a cloud of bouncing balls. The model follows the ideal gas law. Utilizes the critical event technique to create the model. Discusses another program, "BOOM," which simulates a chain reaction. (CW)

  16. [Grid computing

    CERN Multimedia

    Wolinsky, H

    2003-01-01

    "Turn on a water spigot, and it's like tapping a bottomless barrel of water. Ditto for electricity: Flip the switch, and the supply is endless. But computing is another matter. Even with the Internet revolution enabling us to connect in new ways, we are still limited to self-contained systems running locally stored software, limited by corporate, institutional and geographic boundaries" (1 page).

  17. Computational Finance

    DEFF Research Database (Denmark)

    Rasmussen, Lykke

    One of the major challenges in today's post-crisis finance environment is calculating the sensitivities of complex products for hedging and risk management. Historically, these derivatives have been determined using bump-and-revalue, but the increasing magnitude of these computations does...

  18. Optical Computing

    Indian Academy of Sciences (India)

    Optical computing technology is, in general, developing in two directions. One approach is ... current support in many places, with private companies as well as governments in several countries encouraging such research work. For example, much ... which enables more information to be carried and data to be processed.

  19. Performing stencil computations

    Energy Technology Data Exchange (ETDEWEB)

    Donofrio, David

    2018-01-16

    A method and apparatus for performing stencil computations efficiently are disclosed. In one embodiment, a processor receives an offset, and in response, retrieves a value from a memory via a single instruction, where the retrieving comprises: identifying, based on the offset, one of a plurality of registers of the processor; loading an address stored in the identified register; and retrieving from the memory the value at the address.
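
    As context for the abstract above, the Python sketch below shows what a plain stencil computation looks like in software: each output point is a fixed-weight sum of neighbours at given offsets. It illustrates only the offset-based access pattern that such hardware targets and makes no attempt to model the patented register/address mechanism; the three-point heat-equation stencil and all sizes are illustrative assumptions.

    import numpy as np

    def apply_stencil_1d(u, offsets, weights):
        """Apply a fixed 1-D stencil to the interior points of u."""
        halo = max(abs(o) for o in offsets)
        out = u.copy()
        for i in range(halo, len(u) - halo):
            out[i] = sum(w * u[i + o] for o, w in zip(offsets, weights))
        return out

    # One explicit step of the 1-D heat equation: u_new[i] = u[i] + r*(u[i-1] - 2*u[i] + u[i+1]).
    r = 0.25
    offsets = (-1, 0, +1)
    weights = (r, 1.0 - 2.0 * r, r)
    u = np.zeros(16)
    u[8] = 1.0                          # initial heat spike in the middle
    print(np.round(apply_stencil_1d(u, offsets, weights), 3))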

  20. Computable Frames in Computable Banach Spaces

    Directory of Open Access Journals (Sweden)

    S.K. Kaushik

    2016-06-01

    Full Text Available We develop some parts of frame theory in Banach spaces from the point of view of Computable Analysis. We define a computable M-basis and use it to construct a computable Banach space of scalar-valued sequences. Computable Xd frames and computable Banach frames are also defined, and computable versions of sufficient conditions for their existence are obtained.

  1. Predictive ability of 18F-fluorodeoxyglucose positron emission tomography/computed tomography for pathological complete response and prognosis after neoadjuvant chemotherapy in triple-negative breast cancer patients

    Directory of Open Access Journals (Sweden)

    Sachiko Kiyoto

    2016-01-01

    Full Text Available Objective: The mortality of patients with locally advanced triple-negative breast cancer (TNBC) is high, and pathological complete response (pCR) to neoadjuvant chemotherapy (NAC) is associated with improved prognosis. This retrospective study was designed and powered to investigate the ability of 18F-fluorodeoxyglucose positron emission tomography/computed tomography (FDG-PET/CT) to predict pathological response to NAC and prognosis after NAC. Methods: The data of 32 consecutive women with clinical stage II or III TNBC from January 2006 to December 2013 in our institution who underwent FDG-PET/CT at baseline and after NAC were retrospectively analyzed. The maximum standardized uptake value (SUVmax) in the primary tumor at each examination and the change in SUVmax (ΔSUVmax) between the two scans were measured. Correlations between PET parameters and pathological response, and correlations between PET parameters and disease-free survival (DFS), were examined. Results: At the completion of NAC, surgery showed pCR in 7 patients, while 25 had residual tumor, so-called non-pCR. Median follow-up was 39.0 months. Of the non-pCR patients, 9 relapsed at 3 years. Of all assessed clinical, biological, and PET parameters, N-stage, clinical stage, and ΔSUVmax were predictors of pathological response (p=0.0288, 0.0068, 0.0068; Fisher's exact test). The cut-off value of ΔSUVmax to differentiate pCR, evaluated by receiver operating characteristic (ROC) curve analysis, was 81.3%. Three-year disease-free survival (DFS) was lower in patients with non-pCR than in patients with pCR (p=0.328, log-rank test). The cut-off value of ΔSUVmax to differentiate 3-year DFS, evaluated by the ROC analysis, was 15.9%. In all cases, 3-year DFS was lower in patients with ΔSUVmax
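
    The key quantity in the study above is ΔSUVmax, the percentage change in SUVmax between the baseline and post-NAC scans, compared against ROC-derived cut-offs (81.3% for pCR, 15.9% for 3-year DFS). The Python sketch below computes that percentage change and applies the two cut-offs; the SUV values are hypothetical, and treating a reduction at or above a cut-off as the favourable class is an assumption made for illustration.

    def delta_suvmax(suv_baseline, suv_post_nac):
        """Percentage reduction in SUVmax between the baseline and post-NAC scans."""
        return 100.0 * (suv_baseline - suv_post_nac) / suv_baseline

    PCR_CUTOFF = 81.3   # % reduction differentiating pCR, as quoted in the abstract
    DFS_CUTOFF = 15.9   # % reduction differentiating 3-year DFS, as quoted in the abstract

    reduction = delta_suvmax(suv_baseline=12.4, suv_post_nac=1.8)   # hypothetical SUV values
    print(f"dSUVmax = {reduction:.1f}%")
    print("predicted pCR" if reduction >= PCR_CUTOFF else "predicted non-pCR")
    print("favourable 3-year DFS group" if reduction >= DFS_CUTOFF else "unfavourable 3-year DFS group")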

  2. Algebraic computing

    International Nuclear Information System (INIS)

    MacCallum, M.A.H.

    1990-01-01

    The implementation of a new computer algebra system is time consuming: designers of general purpose algebra systems usually say it takes about 50 man-years to create a mature and fully functional system. Hence the range of available systems and their capabilities changes little between one general relativity meeting and the next, despite which there have been significant changes in the period since the last report. The introductory remarks aim to give a brief survey of the capabilities of the principal available systems and to highlight one or two trends. A reference to the most recent full survey of computer algebra in relativity is given, along with brief descriptions of Maple, REDUCE, SHEEP and other applications. (author)

  3. Computational Controversy

    OpenAIRE

    Timmermans, Benjamin; Kuhn, Tobias; Beelen, Kaspar; Aroyo, Lora

    2017-01-01

    Climate change, vaccination, abortion, Trump: Many topics are surrounded by fierce controversies. The nature of such heated debates and their elements have been studied extensively in the social science literature. More recently, various computational approaches to controversy analysis have appeared, using new data sources such as Wikipedia, which help us now better understand these phenomena. However, compared to what social sciences have discovered about such debates, the existing computati...

  4. Computed tomography

    International Nuclear Information System (INIS)

    Andre, M.; Resnick, D.

    1988-01-01

    Computed tomography (CT) has matured into a reliable and prominent tool for study of the musculoskeletal system. When it was introduced in 1973, it was unique in many ways and posed a challenge to interpretation. It is in these unique features, however, that its advantages lie in comparison with conventional techniques. These advantages will be described in a spectrum of important applications in orthopedics and rheumatology

  5. Computed radiography

    International Nuclear Information System (INIS)

    Pupchek, G.

    2004-01-01

    Computed radiography (CR) is an image acquisition process that is used to create digital, 2-dimensional radiographs. CR employs a photostimulable phosphor-based imaging plate, replacing the standard x-ray film and intensifying screen combination. Conventional radiographic exposure equipment is used with no modification required to the existing system. CR can transform an analog x-ray department into a digital one and eliminates the need for chemicals, water, darkrooms and film processor headaches. (author)

  6. Computational universes

    International Nuclear Information System (INIS)

    Svozil, Karl

    2005-01-01

    Suspicions that the world might be some sort of a machine or algorithm existing 'in the mind' of some symbolic number cruncher have lingered from antiquity. Although popular at times, the most radical forms of this idea never reached mainstream. Modern developments in physics and computer science have lent support to the thesis, but empirical evidence is needed before it can begin to replace our contemporary world view

  7. Future computing needs for Fermilab

    International Nuclear Information System (INIS)

    1983-12-01

    The following recommendations are made: (1) Significant additional computing capacity and capability beyond the present procurement should be provided by 1986. A working group with representation from the principal computer user community should be formed to begin immediately to develop the technical specifications. High priority should be assigned to providing a large user memory, software portability and a productive computing environment. (2) A networked system of VAX-equivalent super-mini computers should be established with at least one such computer dedicated to each reasonably large experiment for both online and offline analysis. The laboratory staff responsible for mini computers should be augmented in order to handle the additional work of establishing, maintaining and coordinating this system. (3) The laboratory should move decisively to a more fully interactive environment. (4) A plan for networking both inside and outside the laboratory should be developed over the next year. (5) The laboratory resources devoted to computing, including manpower, should be increased over the next two to five years. A reasonable increase would be 50% over the next two years increasing thereafter to a level of about twice the present one. (6) A standing computer coordinating group, with membership of experts from all the principal computer user constituents of the laboratory, should be appointed by and report to the director. This group should meet on a regularly scheduled basis and be charged with continually reviewing all aspects of the laboratory computing environment

  8. Customizable computing

    CERN Document Server

    Chen, Yu-Ting; Gill, Michael; Reinman, Glenn; Xiao, Bingjun

    2015-01-01

    Since the end of Dennard scaling in the early 2000s, improving the energy efficiency of computation has been the main concern of the research community and industry. The large energy efficiency gap between general-purpose processors and application-specific integrated circuits (ASICs) motivates the exploration of customizable architectures, where one can adapt the architecture to the workload. In this Synthesis lecture, we present an overview and introduction of the recent developments on energy-efficient customizable architectures, including customizable cores and accelerators, on-chip memory

  9. Privacy and legal issues in cloud computing

    CERN Document Server

    Weber, Rolf H

    2015-01-01

    Adopting a multi-disciplinary and comparative approach, this book focuses on emerging and innovative attempts to tackle privacy and legal issues in cloud computing, such as personal data privacy, security and intellectual property protection. Leading international academics and practitioners in the fields of law and computer science examine the specific legal implications of cloud computing pertaining to jurisdiction, biomedical practice and information ownership. This collection offers original and critical responses to the rising challenges posed by cloud computing.

  10. Computed tomography

    International Nuclear Information System (INIS)

    Wells, P.; Davis, J.; Morgan, M.

    1994-01-01

    X-ray or gamma-ray transmission computed tomography (CT) is a powerful non-destructive evaluation (NDE) technique that produces two-dimensional cross-sectional images of an object without the need to physically section it. CT is also known by the acronym CAT, for computerised axial tomography. This review article presents a brief historical perspective on CT, its current status and the underlying physics. The mathematical fundamentals of computed tomography are developed for the simplest transmission CT modality. A description of CT scanner instrumentation is provided with an emphasis on radiation sources and systems. Examples of CT images are shown indicating the range of materials that can be scanned and the spatial and contrast resolutions that may be achieved. Attention is also given to the occurrence, interpretation and minimisation of various image artefacts that may arise. A final brief section is devoted to the principles and potential of a range of more recently developed tomographic modalities including diffraction CT, positron emission CT and seismic tomography. 57 refs., 2 tabs., 14 figs

  11. Quantum Computing and Second Quantization

    International Nuclear Information System (INIS)

    Makaruk, Hanna Ewa

    2017-01-01

    Quantum computers are by their nature many particle quantum systems. Both the many-particle arrangement and being quantum are necessary for the existence of the entangled states, which are responsible for the parallelism of the quantum computers. Second quantization is a very important approximate method of describing such systems. This lecture will present the general idea of the second quantization, and discuss shortly some of the most important formulations of second quantization.

  12. Fast computation of Krawtchouk moments

    Czech Academy of Sciences Publication Activity Database

    Honarvar Shakibaei Asli, B.; Flusser, Jan

    2014-01-01

    Roč. 288, č. 1 (2014), s. 73-86 ISSN 0020-0255 R&D Projects: GA ČR GAP103/11/1552 Institutional support: RVO:67985556 Keywords : Krawtchouk polynomial * Krawtchouk moment * Geometric moment * Impulse response * Fast computation * Digital filter Subject RIV: JD - Computer Applications, Robotics Impact factor: 4.038, year: 2014 http://library.utia.cas.cz/separaty/2014/ZOI/flusser-0432452.pdf

  13. Advances in photonic reservoir computing

    Directory of Open Access Journals (Sweden)

    Van der Sande Guy

    2017-05-01

    Full Text Available We review a novel paradigm that has emerged in analogue neuromorphic optical computing. The goal is to implement a reservoir computer in optics, where information is encoded in the intensity and phase of the optical field. Reservoir computing is a bio-inspired approach especially suited for processing time-dependent information. The reservoir’s complex and high-dimensional transient response to the input signal is capable of universal computation. The reservoir does not need to be trained, which makes it very well suited for optics. As such, much of the promise of photonic reservoirs lies in their minimal hardware requirements, a tremendous advantage over other hardware-intensive neural network models. We review the two main approaches to optical reservoir computing: networks implemented with multiple discrete optical nodes and the continuous system of a single nonlinear device coupled to delayed feedback.

  14. Advances in photonic reservoir computing

    Science.gov (United States)

    Van der Sande, Guy; Brunner, Daniel; Soriano, Miguel C.

    2017-05-01

    We review a novel paradigm that has emerged in analogue neuromorphic optical computing. The goal is to implement a reservoir computer in optics, where information is encoded in the intensity and phase of the optical field. Reservoir computing is a bio-inspired approach especially suited for processing time-dependent information. The reservoir's complex and high-dimensional transient response to the input signal is capable of universal computation. The reservoir does not need to be trained, which makes it very well suited for optics. As such, much of the promise of photonic reservoirs lies in their minimal hardware requirements, a tremendous advantage over other hardware-intensive neural network models. We review the two main approaches to optical reservoir computing: networks implemented with multiple discrete optical nodes and the continuous system of a single nonlinear device coupled to delayed feedback.
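
    Both records above describe the same idea: a fixed, untrained reservoir produces a complex, high-dimensional transient response to the input, and only a simple readout is trained. The Python sketch below is a minimal software echo-state-network analogue of that idea, not an optical or delay-feedback implementation; the reservoir size, spectral-radius scaling, ridge-regression readout and the delayed-copy task are all illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(0)
    n_res, n_steps, delay = 100, 2000, 5

    # Fixed random input and reservoir weights; the reservoir itself is never trained.
    w_in = rng.uniform(-0.5, 0.5, size=n_res)
    w_res = rng.normal(size=(n_res, n_res))
    w_res *= 0.9 / max(abs(np.linalg.eigvals(w_res)))    # keep the spectral radius below 1

    u = rng.uniform(-1.0, 1.0, size=n_steps)             # input signal
    states = np.zeros((n_steps, n_res))
    x = np.zeros(n_res)
    for t in range(n_steps):
        x = np.tanh(w_in * u[t] + w_res @ x)             # nonlinear transient response of the reservoir
        states[t] = x

    # Train only the linear readout (ridge regression) to reproduce the input delayed by `delay` steps.
    X, y = states[delay:], u[:-delay]
    w_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(n_res), X.T @ y)
    print("readout mean squared error:", float(np.mean((X @ w_out - y) ** 2)))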

  15. Computing Services and Assured Computing

    Science.gov (United States)

    2006-05-01

    ... fighters’ ability to execute the mission.” Computing Services: we run IT systems that provide medical care, pay the warfighters, and manage maintenance ... users • 1,400 applications • 18 facilities • 180 software vendors • 18,000+ copies of executive software products • virtually every type of mainframe and ...

  16. The social impact of computers

    CERN Document Server

    Rosenberg, Richard S

    1992-01-01

    The Social Impact of Computers should be read as a guide to the social implications of current and future applications of computers. Among the basic themes presented are the following: the changing nature of work in response to technological innovation as well as the threat to jobs; personal freedom in the machine age as manifested by challenges to privacy, dignity, and work; the relationship between advances in computer and communications technology and the possibility of increased centralization of authority; and the emergence and influence of artificial intelligence and its role in decision

  17. Computer assisted procedure maintenance

    International Nuclear Information System (INIS)

    Bisio, R.; Hulsund, J. E.; Nilsen, S.

    2004-04-01

    The maintenance of operating procedures in an NPP is a tedious and complicated task. Through the whole life cycle of the procedures they will be dynamic, 'living' documents. Several aspects of the procedure must be considered in a revision process. Pertinent details and attributes of the procedure must be checked. An organizational structure must be created and responsibilities allotted for drafting, revising, reviewing and publishing procedures. Available powerful computer technology provides solutions within document management and computerisation of procedures. These solutions can also support the maintenance of procedures. Not all parts of the procedure life cycle are equally amenable to computerized support. This report looks at the procedure life cycle in today's NPPs and discusses the possibilities associated with the introduction of computer technology to assist the maintenance of procedures. (Author)

  18. Computational neuroscience

    CERN Document Server

    Blackwell, Kim L

    2014-01-01

    Progress in Molecular Biology and Translational Science provides a forum for discussion of new discoveries, approaches, and ideas in molecular biology. It contains contributions from leaders in their fields and abundant references. This volume brings together different aspects of, and approaches to, molecular and multi-scale modeling, with applications to a diverse range of neurological diseases. Mathematical and computational modeling offers a powerful approach for examining the interaction between molecular pathways and ionic channels in producing neuron electrical activity. It is well accepted that non-linear interactions among diverse ionic channels can produce unexpected neuron behavior and hinder a deep understanding of how ion channel mutations bring about abnormal behavior and disease. Interactions with the diverse signaling pathways activated by G protein coupled receptors or calcium influx adds an additional level of complexity. Modeling is an approach to integrate myriad data sources into a cohesiv...

  19. Social Computing

    CERN Multimedia

    CERN. Geneva

    2011-01-01

    The past decade has witnessed a momentous transformation in the way people interact with each other. Content is now co-produced, shared, classified, and rated by millions of people, while attention has become the ephemeral and valuable resource that everyone seeks to acquire. This talk will describe how social attention determines the production and consumption of content within both the scientific community and social media, how its dynamics can be used to predict the future and the role that social media plays in setting the public agenda. About the speaker Bernardo Huberman is a Senior HP Fellow and Director of the Social Computing Lab at Hewlett Packard Laboratories. He received his Ph.D. in Physics from the University of Pennsylvania, and is currently a Consulting Professor in the Department of Applied Physics at Stanford University. He originally worked in condensed matter physics, ranging from superionic conductors to two-dimensional superfluids, and made contributions to the theory of critical p...

  20. computer networks

    Directory of Open Access Journals (Sweden)

    N. U. Ahmed

    2002-01-01

    Full Text Available In this paper, we construct a new dynamic model for the Token Bucket (TB) algorithm used in computer networks and use a systems approach for its analysis. This model is then augmented by adding a dynamic model for a multiplexor at an access node where the TB exercises a policing function. In the model, traffic policing, multiplexing and network utilization are formally defined. Based on the model, we study such issues as quality of service (QoS), traffic sizing and network dimensioning. Also, we propose an algorithm using feedback control to improve QoS and network utilization. Applying MPEG video traces as the input traffic to the model, we verify the usefulness and effectiveness of our model.
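
    For readers unfamiliar with the algorithm being modelled, the Python sketch below implements the classic Token Bucket policer: tokens accumulate at a fixed rate up to a bucket depth, and a packet conforms only if enough tokens are available on arrival. It illustrates the basic policing function only; it is not the paper's dynamic model, multiplexor model or feedback-control algorithm, and the rates and packet sizes are illustrative assumptions.

    class TokenBucket:
        def __init__(self, rate_tokens_per_s, depth_tokens):
            self.rate = rate_tokens_per_s
            self.depth = depth_tokens
            self.tokens = depth_tokens
            self.last_time = 0.0

        def conforms(self, arrival_time, packet_size):
            # Refill tokens for the elapsed time, capped at the bucket depth.
            self.tokens = min(self.depth, self.tokens + self.rate * (arrival_time - self.last_time))
            self.last_time = arrival_time
            if packet_size <= self.tokens:
                self.tokens -= packet_size       # conforming packet consumes tokens
                return True
            return False                         # non-conforming packet: drop or mark it

    # Example: 1000 tokens/s with a bucket depth of 1500; a burst larger than the depth is policed.
    tb = TokenBucket(rate_tokens_per_s=1000.0, depth_tokens=1500.0)
    for t, size in [(0.0, 500), (0.1, 1200), (0.11, 400), (1.5, 800)]:
        print(f"t={t:4.2f}s size={size:4d} -> {'pass' if tb.conforms(t, size) else 'police'}")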

  1. Computer Tree

    Directory of Open Access Journals (Sweden)

    Onur AĞAOĞLU

    2014-12-01

    Full Text Available It is crucial that gifted and talented students be supported by different educational methods according to their interests and skills. The science and arts centres (gifted centres) provide the Supportive Education Program for these students with an interdisciplinary perspective. In line with the program, an ICT lesson entitled “Computer Tree” serves for identifying learner readiness levels and defining the basic conceptual framework. A language teacher also contributes to the process, since it caters for the creative function of the basic linguistic skills. The teaching technique is applied to students aged 9-11. The lesson introduces an evaluation process including basic information, skills, and interests of the target group. Furthermore, it includes an observation process by way of peer assessment. The lesson is considered to be a good sample of planning for any subject, for the unpredicted convergence of visual and technical abilities with linguistic abilities.

  2. Computed tomography

    International Nuclear Information System (INIS)

    Boyd, D.P.

    1989-01-01

    This paper reports on computed tomographic (CT) scanning, which has improved computer-assisted imaging modalities for radiologic diagnosis. The advantage of this modality is its ability to image thin cross-sectional planes of the body, thus uncovering density information in three dimensions without tissue superposition problems. Because this enables vastly superior imaging of soft tissues in the brain and body, CT scanning was immediately successful and continues to grow in importance as improvements are made in speed, resolution, and cost efficiency. CT scanners are used for general purposes, and the more advanced machines are generally preferred in large hospitals, where volume and variety of usage justifies the cost. For imaging in the abdomen, a scanner with a rapid speed is preferred because peristalsis, involuntary motion of the diaphragm, and even cardiac motion are present and can significantly degrade image quality. When contrast media is used in imaging to demonstrate ... scanner, immediate review of images, and multiformat hardcopy production. A second console is reserved for the radiologist to read images and perform the several types of image analysis that are available. Since CT images contain quantitative information in terms of density values and contours of organs, quantitation of volumes, areas, and masses is possible. This is accomplished with region-of-interest methods, which involve the electronic outlining of the selected region on the television display monitor with a trackball-controlled cursor. In addition, various image-processing options, such as edge enhancement (for viewing fine details of edges) or smoothing filters (for enhancing the detectability of low-contrast lesions) are useful tools
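
    The region-of-interest quantitation mentioned above amounts to simple arithmetic over the pixels inside an outlined region. The Python sketch below computes the area and mean density (in Hounsfield units) of a binary ROI mask over a toy CT slice; the pixel spacing, image values and circular 'lesion' are illustrative assumptions rather than data from any scanner described in the record.

    import numpy as np

    def roi_stats(hu_slice, roi_mask, pixel_spacing_mm=(0.7, 0.7)):
        """Return (area in mm^2, mean Hounsfield units) for the pixels inside the ROI mask."""
        pixel_area = pixel_spacing_mm[0] * pixel_spacing_mm[1]
        values = hu_slice[roi_mask]
        return values.size * pixel_area, float(values.mean())

    # Toy 64x64 slice: soft-tissue background (~40 HU) with a denser circular region (~200 HU).
    yy, xx = np.mgrid[:64, :64]
    roi = (yy - 32) ** 2 + (xx - 32) ** 2 < 8 ** 2       # stands in for the cursor-drawn outline
    hu = np.full((64, 64), 40.0)
    hu[roi] = 200.0
    area_mm2, mean_hu = roi_stats(hu, roi)
    print(f"ROI area = {area_mm2:.1f} mm^2, mean density = {mean_hu:.0f} HU")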

  3. Cloud Computing: The Future of Computing

    OpenAIRE

    Aggarwal, Kanika

    2013-01-01

    Cloud computing has recently emerged as a new paradigm for hosting and delivering services over the Internet. Cloud computing is attractive to business owners as it eliminates the requirement for users to plan ahead for provisioning, and allows enterprises to start small and increase resources only when there is a rise in service demand. The basic principle of cloud computing is to distribute the computing across a great number of distributed computers, rather than local computers ...

  4. The Diffraction Response Interpolation Method

    DEFF Research Database (Denmark)

    Jespersen, Søren Kragh; Wilhjelm, Jens Erik; Pedersen, Peder C.

    1998-01-01

Computer modeling of the output voltage in a pulse-echo system is computationally very demanding, particularly when considering reflector surfaces of arbitrary geometry. A new, efficient computational tool, the diffraction response interpolation method (DRIM), for modeling of reflectors in a fluid medium, is presented. The DRIM is based on the velocity potential impulse response method, adapted to pulse-echo applications by the use of acoustical reciprocity. Specifically, the DRIM operates by dividing the reflector surface into planar elements, finding the diffraction response at the corners...
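The record is truncated, but the pulse-echo modeling it builds on (the velocity potential impulse-response method combined with acoustic reciprocity) is usually summarized as a chain of convolutions: the received signal is, up to constants and time derivatives, the excitation convolved with the transmit and receive spatial impulse responses. A minimal numerical sketch with toy waveforms, which is not the DRIM itself:

```python
import numpy as np

# Minimal pulse-echo sketch: by acoustic reciprocity the received voltage is
# (up to derivatives and constants) the excitation convolved with the spatial
# impulse response on transmit and again on receive.  The waveforms below are
# placeholders, not outputs of the DRIM.
fs = 50e6                                    # assumed sampling frequency [Hz]
t = np.arange(0, 4e-6, 1 / fs)

excitation = np.sin(2 * np.pi * 5e6 * t) * np.exp(-((t - 0.5e-6) ** 2) / (0.1e-6) ** 2)
h_spatial = np.zeros_like(t)
h_spatial[100:140] = 1.0                     # toy spatial impulse response (boxcar)

# Transmit-receive chain: excitation * h_tx * h_rx (here h_tx == h_rx).
received = np.convolve(np.convolve(excitation, h_spatial), h_spatial)
print("peak of simulated pulse-echo response:", received.max())
```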

  5. Computer Refurbishment

    International Nuclear Information System (INIS)

    Ichiyen, Norman; Chan, Dominic; Thompson, Paul

    2004-01-01

The major activity for the 18-month refurbishment outage at the Point Lepreau Generating Station is the replacement of all 380 fuel channel and calandria tube assemblies and the lower portion of the connecting feeder pipes. New Brunswick Power would also take advantage of this outage to conduct a number of repairs, replacements, inspections and upgrades (such as rewinding or replacing the generator, replacement of shutdown system trip computers, replacement of certain valves and expansion joints, inspection of systems not normally accessible, etc.). This would allow for an additional 25 to 30 years of operation. Among the systems to be replaced are the PDCs for both shutdown systems. Assessments have been completed for both the SDS1 and SDS2 PDCs, and it has been decided to replace the SDS2 PDCs with the same hardware and software approach that has been used successfully for the Wolsong 2, 3, and 4 and the Qinshan 1 and 2 SDS2 PDCs. For SDS1, it has been decided to use the same software development methodology that was used successfully for Wolsong and Qinshan, called the I A, and to use a new hardware platform in order to ensure successful operation for the 25-30 year station operating life. The selected supplier is Triconex, which uses a triple modular redundant architecture that will enhance the robustness/fault tolerance of the design with respect to equipment failures

  6. Computed Tomography (CT) -- Sinuses

    Medline Plus

Full Text Available Computed tomography (CT) of the sinuses ...

  7. Illustrated computer tomography

    International Nuclear Information System (INIS)

    Takahashi, S.

    1983-01-01

    This book provides the following information: basic aspects of computed tomography; atlas of computed tomography of the normal adult; clinical application of computed tomography; and radiotherapy planning and computed tomography

  8. Analog and hybrid computing

    CERN Document Server

    Hyndman, D E

    2013-01-01

    Analog and Hybrid Computing focuses on the operations of analog and hybrid computers. The book first outlines the history of computing devices that influenced the creation of analog and digital computers. The types of problems to be solved on computers, computing systems, and digital computers are discussed. The text looks at the theory and operation of electronic analog computers, including linear and non-linear computing units and use of analog computers as operational amplifiers. The monograph examines the preparation of problems to be deciphered on computers. Flow diagrams, methods of ampl

  9. Cloud Computing Fundamentals

    Science.gov (United States)

    Furht, Borko

    In the introductory chapter we define the concept of cloud computing and cloud services, and we introduce layers and types of cloud computing. We discuss the differences between cloud computing and cloud services. New technologies that enabled cloud computing are presented next. We also discuss cloud computing features, standards, and security issues. We introduce the key cloud computing platforms, their vendors, and their offerings. We discuss cloud computing challenges and the future of cloud computing.

  10. Unconventional Quantum Computing Devices

    OpenAIRE

    Lloyd, Seth

    2000-01-01

    This paper investigates a variety of unconventional quantum computation devices, including fermionic quantum computers and computers that exploit nonlinear quantum mechanics. It is shown that unconventional quantum computing devices can in principle compute some quantities more rapidly than `conventional' quantum computers.

  11. Computing handbook computer science and software engineering

    CERN Document Server

    Gonzalez, Teofilo; Tucker, Allen

    2014-01-01

Overview of Computer Science: Structure and Organization of Computing (Peter J. Denning); Computational Thinking (Valerie Barr). Algorithms and Complexity: Data Structures (Mark Weiss); Basic Techniques for Design and Analysis of Algorithms (Edward Reingold); Graph and Network Algorithms (Samir Khuller and Balaji Raghavachari); Computational Geometry (Marc van Kreveld); Complexity Theory (Eric Allender, Michael Loui, and Kenneth Regan); Formal Models and Computability (Tao Jiang, Ming Li, and Bala

  12. Pacing a data transfer operation between compute nodes on a parallel computer

    Science.gov (United States)

    Blocksome, Michael A [Rochester, MN

    2011-09-13

    Methods, systems, and products are disclosed for pacing a data transfer between compute nodes on a parallel computer that include: transferring, by an origin compute node, a chunk of an application message to a target compute node; sending, by the origin compute node, a pacing request to a target direct memory access (`DMA`) engine on the target compute node using a remote get DMA operation; determining, by the origin compute node, whether a pacing response to the pacing request has been received from the target DMA engine; and transferring, by the origin compute node, a next chunk of the application message if the pacing response to the pacing request has been received from the target DMA engine.
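The pacing scheme claimed above is essentially a flow-control loop: send a chunk, issue a pacing request to the target DMA engine, and wait for the pacing response before sending the next chunk. The sketch below only mimics that control flow, with Python queues standing in for DMA channels; all names and data are invented for illustration:

```python
import queue
import threading

# Toy model of the pacing protocol: the "origin" sends one chunk at a time and
# will not send the next chunk until the "target DMA engine" has acknowledged a
# pacing request.  Queues stand in for the DMA channels; all names are invented.
chunks = [f"chunk-{i}" for i in range(5)]
data_channel = queue.Queue()
pacing_channel = queue.Queue()

def target_dma_engine():
    for _ in chunks:
        chunk = data_channel.get()             # receive a chunk of the message
        print(f"target received {chunk}")
        pacing_channel.put("pacing-response")  # reply to the pacing request

def origin_node():
    for chunk in chunks:
        data_channel.put(chunk)                # transfer the current chunk
        pacing_channel.get()                   # block until the pacing response arrives
    print("origin: all chunks paced out")

t = threading.Thread(target=target_dma_engine)
t.start()
origin_node()
t.join()
```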

  13. Specialized computer architectures for computational aerodynamics

    Science.gov (United States)

    Stevenson, D. K.

    1978-01-01

    In recent years, computational fluid dynamics has made significant progress in modelling aerodynamic phenomena. Currently, one of the major barriers to future development lies in the compute-intensive nature of the numerical formulations and the relative high cost of performing these computations on commercially available general purpose computers, a cost high with respect to dollar expenditure and/or elapsed time. Today's computing technology will support a program designed to create specialized computing facilities to be dedicated to the important problems of computational aerodynamics. One of the still unresolved questions is the organization of the computing components in such a facility. The characteristics of fluid dynamic problems which will have significant impact on the choice of computer architecture for a specialized facility are reviewed.

  14. Distributed computing at the SSCL

    International Nuclear Information System (INIS)

    Cormell, L.; White, R.

    1993-05-01

The rapid increase in the availability of high performance, cost-effective RISC/UNIX workstations has been both a blessing and a curse. The blessing of having extremely powerful computing engines available on the desk top is well-known to many users. The user has tremendous freedom, flexibility, and control of his environment. That freedom can, however, become the curse of distributed computing. The user must become a system manager to some extent; he must worry about backups, maintenance, upgrades, etc. Traditionally these activities have been the responsibility of a central computing group. The central computing group, however, may find that it can no longer provide all of the traditional services. With the plethora of workstations now found on so many desktops throughout the entire campus or lab, the central computing group may be swamped by support requests. This talk will address several of these computer support and management issues by discussing the approach taken at the Superconducting Super Collider Laboratory. In addition, a brief review of the future directions of commercial products for distributed computing and management will be given

  15. Distributed computing at the SSCL

    International Nuclear Information System (INIS)

    Cormell, L.R.; White, R.C.

    1994-01-01

The rapid increase in the availability of high performance, cost-effective RISC/UNIX workstations has been both a blessing and a curse. The blessing of having extremely powerful computing engines available on the desk top is well-known to many users. The user has tremendous freedom, flexibility, and control of his environment. That freedom can, however, become the curse of distributed computing. The user must become a system manager to some extent; he must worry about backups, maintenance, upgrades, etc. Traditionally these activities have been the responsibility of a central computing group. The central computing group, however, may find that it can no longer provide all of the traditional services. With the plethora of workstations now found on so many desktops throughout the entire campus or lab, the central computing group may be swamped by support requests. This talk will address several of these computer support and management issues by discussing the approach taken at the Superconducting Super Collider Laboratory (SSCL). In addition, a brief review of the future directions of commercial products for distributed computing and management will be given

  16. A Revolution in Information Technology - Cloud Computing

    OpenAIRE

    Divya BHATT

    2012-01-01

What is the Internet? It is a collection of “interconnected networks” represented as a Cloud in network diagrams, and Cloud Computing is a metaphor for certain parts of the Internet. IT enterprises and individuals are searching for a way to reduce the cost of computation, storage and communication. Cloud Computing is an Internet-based technology providing “On-Demand” solutions for these scenarios that are flexible enough to adapt and respond to requirements. The hug...

  17. Applied Parallel Computing Industrial Computation and Optimization

    DEFF Research Database (Denmark)

Madsen, Kaj; Olesen, Dorte

Proceedings of the Third International Workshop on Applied Parallel Computing in Industrial Problems and Optimization (PARA96)

  18. Further computer appreciation

    CERN Document Server

    Fry, T F

    2014-01-01

    Further Computer Appreciation is a comprehensive cover of the principles and aspects in computer appreciation. The book starts by describing the development of computers from the first to the third computer generations, to the development of processors and storage systems, up to the present position of computers and future trends. The text tackles the basic elements, concepts and functions of digital computers, computer arithmetic, input media and devices, and computer output. The basic central processor functions, data storage and the organization of data by classification of computer files,

  19. Architectural analysis for wirelessly powered computing platforms

    NARCIS (Netherlands)

    Kapoor, A.; Pineda de Gyvez, J.

    2013-01-01

    We present a design framework for wirelessly powered generic computing platforms that takes into account various system parameters in response to a time-varying energy source. These parameters are the charging profile of the energy source, computing speed (fclk), digital supply voltage (VDD), energy

  20. Computer-Aided Instruction in Automated Instrumentation.

    Science.gov (United States)

    Stephenson, David T.

    1986-01-01

    Discusses functions of automated instrumentation systems, i.e., systems which combine electrical measuring instruments and a controlling computer to measure responses of a unit under test. The computer-assisted tutorial then described is programmed for use on such a system--a modern microwave spectrum analyzer--to introduce engineering students to…

  1. BONFIRE: benchmarking computers and computer networks

    OpenAIRE

    Bouckaert, Stefan; Vanhie-Van Gerwen, Jono; Moerman, Ingrid; Phillips, Stephen; Wilander, Jerker

    2011-01-01

    The benchmarking concept is not new in the field of computing or computer networking. With “benchmarking tools”, one usually refers to a program or set of programs, used to evaluate the performance of a solution under certain reference conditions, relative to the performance of another solution. Since the 1970s, benchmarking techniques have been used to measure the performance of computers and computer networks. Benchmarking of applications and virtual machines in an Infrastructure-as-a-Servi...

  2. Democratizing Computer Science

    Science.gov (United States)

    Margolis, Jane; Goode, Joanna; Ryoo, Jean J.

    2015-01-01

    Computer science programs are too often identified with a narrow stratum of the student population, often white or Asian boys who have access to computers at home. But because computers play such a huge role in our world today, all students can benefit from the study of computer science and the opportunity to build skills related to computing. The…

  3. Computing at Stanford.

    Science.gov (United States)

    Feigenbaum, Edward A.; Nielsen, Norman R.

    1969-01-01

    This article provides a current status report on the computing and computer science activities at Stanford University, focusing on the Computer Science Department, the Stanford Computation Center, the recently established regional computing network, and the Institute for Mathematical Studies in the Social Sciences. Also considered are such topics…

  4. Hyperswitch Network For Hypercube Computer

    Science.gov (United States)

    Chow, Edward; Madan, Herbert; Peterson, John

    1989-01-01

    Data-driven dynamic switching enables high speed data transfer. Proposed hyperswitch network based on mixed static and dynamic topologies. Routing header modified in response to congestion or faults encountered as path established. Static topology meets requirement if nodes have switching elements that perform necessary routing header revisions dynamically. Hypercube topology now being implemented with switching element in each computer node aimed at designing very-richly-interconnected multicomputer system. Interconnection network connects great number of small computer nodes, using fixed hypercube topology, characterized by point-to-point links between nodes.
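For the fixed hypercube topology mentioned above, a point-to-point route between two nodes follows from the bit positions in which their addresses differ (dimension-order, or e-cube, routing). A small sketch of that routing rule, independent of the hyperswitch hardware itself:

```python
def hypercube_route(src: int, dst: int, dim: int) -> list:
    """Dimension-order route in a `dim`-dimensional hypercube.

    Successively flips each differing address bit, yielding the sequence of
    intermediate node addresses visited between src and dst.
    """
    path = [src]
    node = src
    for bit in range(dim):
        if (node ^ dst) & (1 << bit):   # addresses differ in this dimension
            node ^= (1 << bit)          # traverse the corresponding link
            path.append(node)
    return path

# Example: route between nodes 0b0000 and 0b1011 in a 4-cube.
print(hypercube_route(0b0000, 0b1011, dim=4))   # [0, 1, 3, 11]
```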

  5. Responsibility and Responsiveness

    DEFF Research Database (Denmark)

    Nissen, Ulrik Becker

    2011-01-01

The debate on the role and identity of Christian social ethics in liberal democracy touches upon the question about the relationship between universality and specificity. Rather than argue for the difference between these approaches, it can be argued that they are to be understood in a different ... contemporary positions of communicative ethics, H. Richard Niebuhr’s understanding of responsibility as responsiveness, and Dietrich Bonhoeffer’s Christological concept of responsibility in a constructive dialogue with each other; the article has attempted to outline main tenets of a responsive concept...

  6. Evaluation of a computer model to simulate water table response to subirrigation Avaliação de um modelo computacional para simular a resposta do lençol freático à subirrigação

    Directory of Open Access Journals (Sweden)

    Jadir Aparecido Rosa

    2002-12-01

Full Text Available The objective of this work was to evaluate the water flow computer model, WATABLE, using experimental field observations on water table management plots from a site located near Hastings, FL, USA. The experimental field had scale drainage systems with provisions for subirrigation with buried microirrigation and conventional seepage irrigation systems. Potato (Solanum tuberosum L.) growing seasons from years 1996 and 1997 were used to simulate the hydrology of the area. Water table levels, precipitation, irrigation and runoff volumes were continuously monitored. The model simulated the water movement from a buried microirrigation line source and the response of the water table to irrigation, precipitation, evapotranspiration, and deep percolation. The model was calibrated and verified by comparing simulated results with experimental field observations. The model performed very well in simulating seasonal runoff, irrigation volumes, and water table levels during crop growth. The two-dimensional model can be used to investigate different irrigation strategies involving water table management control. Applications of the model include optimization of the water table depth for each growth stage, and duration, frequency, and rate of irrigation.

  7. Soft computing in computer and information science

    CERN Document Server

    Fray, Imed; Pejaś, Jerzy

    2015-01-01

This book presents a carefully selected and reviewed collection of papers presented during the 19th Advanced Computer Systems conference ACS-2014. The Advanced Computer Systems conference concentrated from its beginning on methods and algorithms of artificial intelligence. Later years brought new areas of interest concerning technical informatics related to soft computing and more technological aspects of computer science, such as multimedia and computer graphics, software engineering, web systems, information security and safety, and project management. These topics are represented in the present book under the categories Artificial Intelligence, Design of Information and Multimedia Systems, Information Technology Security and Software Technologies.

  8. Computational Intelligence, Cyber Security and Computational Models

    CERN Document Server

    Anitha, R; Lekshmi, R; Kumar, M; Bonato, Anthony; Graña, Manuel

    2014-01-01

    This book contains cutting-edge research material presented by researchers, engineers, developers, and practitioners from academia and industry at the International Conference on Computational Intelligence, Cyber Security and Computational Models (ICC3) organized by PSG College of Technology, Coimbatore, India during December 19–21, 2013. The materials in the book include theory and applications for design, analysis, and modeling of computational intelligence and security. The book will be useful material for students, researchers, professionals, and academicians. It will help in understanding current research trends and findings and future scope of research in computational intelligence, cyber security, and computational models.

  9. Computed Tomography (CT) -- Head

    Medline Plus

Full Text Available ... When the image slices are reassembled by computer software, the result is a very detailed multidimensional view ...

  10. Computers: Instruments of Change.

    Science.gov (United States)

    Barkume, Megan

    1993-01-01

    Discusses the impact of computers in the home, the school, and the workplace. Looks at changes in computer use by occupations and by industry. Provides information on new job titles in computer occupations. (JOW)

  11. DNA computing models

    CERN Document Server

    Ignatova, Zoya; Zimmermann, Karl-Heinz

    2008-01-01

    In this excellent text, the reader is given a comprehensive introduction to the field of DNA computing. The book emphasizes computational methods to tackle central problems of DNA computing, such as controlling living cells, building patterns, and generating nanomachines.

  12. Distributed multiscale computing

    NARCIS (Netherlands)

    Borgdorff, J.

    2014-01-01

    Multiscale models combine knowledge, data, and hypotheses from different scales. Simulating a multiscale model often requires extensive computation. This thesis evaluates distributing these computations, an approach termed distributed multiscale computing (DMC). First, the process of multiscale

  13. Computational Modeling | Bioenergy | NREL

    Science.gov (United States)

NREL uses computational modeling to increase the ... Cell walls are the source of biofuels and biomaterials, and our modeling investigates their properties. Quantum Mechanical Models: NREL studies chemical and electronic properties and processes to reduce barriers ...

  14. Computer Viruses: An Overview.

    Science.gov (United States)

    Marmion, Dan

    1990-01-01

    Discusses the early history and current proliferation of computer viruses that occur on Macintosh and DOS personal computers, mentions virus detection programs, and offers suggestions for how libraries can protect themselves and their users from damage by computer viruses. (LRW)

  15. Computer Virus and Trends

    OpenAIRE

    Tutut Handayani; Soenarto Usna,Drs.MMSI

    2004-01-01

Since its first appearance in the mid-1980s, the computer virus has invited various controversies that persist to this day. Along with the development of computer systems technology, computer viruses have found new ways to spread themselves through a variety of existing communications media. This paper discusses several topics related to computer viruses, namely: the definition and history of computer viruses; the basics of computer viruses; the current state of computer viruses; and ...

  16. MELCOR computer code manuals

    Energy Technology Data Exchange (ETDEWEB)

    Summers, R.M.; Cole, R.K. Jr.; Smith, R.C.; Stuart, D.S.; Thompson, S.L. [Sandia National Labs., Albuquerque, NM (United States); Hodge, S.A.; Hyman, C.R.; Sanders, R.L. [Oak Ridge National Lab., TN (United States)

    1995-03-01

    MELCOR is a fully integrated, engineering-level computer code that models the progression of severe accidents in light water reactor nuclear power plants. MELCOR is being developed at Sandia National Laboratories for the U.S. Nuclear Regulatory Commission as a second-generation plant risk assessment tool and the successor to the Source Term Code Package. A broad spectrum of severe accident phenomena in both boiling and pressurized water reactors is treated in MELCOR in a unified framework. These include: thermal-hydraulic response in the reactor coolant system, reactor cavity, containment, and confinement buildings; core heatup, degradation, and relocation; core-concrete attack; hydrogen production, transport, and combustion; fission product release and transport; and the impact of engineered safety features on thermal-hydraulic and radionuclide behavior. Current uses of MELCOR include estimation of severe accident source terms and their sensitivities and uncertainties in a variety of applications. This publication of the MELCOR computer code manuals corresponds to MELCOR 1.8.3, released to users in August, 1994. Volume 1 contains a primer that describes MELCOR`s phenomenological scope, organization (by package), and documentation. The remainder of Volume 1 contains the MELCOR Users Guides, which provide the input instructions and guidelines for each package. Volume 2 contains the MELCOR Reference Manuals, which describe the phenomenological models that have been implemented in each package.

  17. MELCOR computer code manuals

    International Nuclear Information System (INIS)

    Summers, R.M.; Cole, R.K. Jr.; Smith, R.C.; Stuart, D.S.; Thompson, S.L.; Hodge, S.A.; Hyman, C.R.; Sanders, R.L.

    1995-03-01

    MELCOR is a fully integrated, engineering-level computer code that models the progression of severe accidents in light water reactor nuclear power plants. MELCOR is being developed at Sandia National Laboratories for the U.S. Nuclear Regulatory Commission as a second-generation plant risk assessment tool and the successor to the Source Term Code Package. A broad spectrum of severe accident phenomena in both boiling and pressurized water reactors is treated in MELCOR in a unified framework. These include: thermal-hydraulic response in the reactor coolant system, reactor cavity, containment, and confinement buildings; core heatup, degradation, and relocation; core-concrete attack; hydrogen production, transport, and combustion; fission product release and transport; and the impact of engineered safety features on thermal-hydraulic and radionuclide behavior. Current uses of MELCOR include estimation of severe accident source terms and their sensitivities and uncertainties in a variety of applications. This publication of the MELCOR computer code manuals corresponds to MELCOR 1.8.3, released to users in August, 1994. Volume 1 contains a primer that describes MELCOR's phenomenological scope, organization (by package), and documentation. The remainder of Volume 1 contains the MELCOR Users Guides, which provide the input instructions and guidelines for each package. Volume 2 contains the MELCOR Reference Manuals, which describe the phenomenological models that have been implemented in each package

  18. Plasticity: modeling & computation

    National Research Council Canada - National Science Library

    Borja, Ronaldo Israel

    2013-01-01

    .... "Plasticity Modeling & Computation" is a textbook written specifically for students who want to learn the theoretical, mathematical, and computational aspects of inelastic deformation in solids...

  19. Cloud Computing Quality

    Directory of Open Access Journals (Sweden)

    Anamaria Şiclovan

    2013-02-01

Full Text Available Cloud computing is, and will remain, a new way of providing Internet services and computing. The approach builds on many existing technologies, such as the Internet, grid computing, and Web services. As a system, cloud computing aims to provide on-demand services that are more acceptable in price and infrastructure; it is the transition from the computer to a service offered to consumers as a product delivered online. This paper describes the quality of cloud computing services, analyzing the advantages and characteristics it offers. It is a theoretical paper. Keywords: Cloud computing, QoS, quality of cloud computing

  20. Computer hardware fault administration

    Science.gov (United States)

    Archer, Charles J.; Megerian, Mark G.; Ratterman, Joseph D.; Smith, Brian E.

    2010-09-14

    Computer hardware fault administration carried out in a parallel computer, where the parallel computer includes a plurality of compute nodes. The compute nodes are coupled for data communications by at least two independent data communications networks, where each data communications network includes data communications links connected to the compute nodes. Typical embodiments carry out hardware fault administration by identifying a location of a defective link in the first data communications network of the parallel computer and routing communications data around the defective link through the second data communications network of the parallel computer.
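The two-network fail-over described in the claim can be illustrated abstractly: compute a route on the first network and, if it would traverse the identified defective link, route the data through the second network instead. A toy sketch with an invented topology and link names:

```python
from collections import deque

def bfs_path(adj, src, dst):
    """Breadth-first path search on an adjacency dict; returns a node list or None."""
    prev, frontier = {src: None}, deque([src])
    while frontier:
        node = frontier.popleft()
        if node == dst:
            path = []
            while node is not None:
                path.append(node)
                node = prev[node]
            return path[::-1]
        for nxt in adj.get(node, ()):
            if nxt not in prev:
                prev[nxt] = node
                frontier.append(nxt)
    return None

# Two independent networks over the same compute nodes (toy topology).
network_a = {"n0": ["n1"], "n1": ["n0", "n2"], "n2": ["n1", "n3"], "n3": ["n2"]}
network_b = {"n0": ["n2"], "n2": ["n0", "n3"], "n3": ["n2"]}

defective_link = ("n1", "n2")               # fault identified in network A

def route(src, dst):
    path = bfs_path(network_a, src, dst)
    uses_bad_link = path and any(
        {path[i], path[i + 1]} == set(defective_link) for i in range(len(path) - 1)
    )
    if path and not uses_bad_link:
        return "A", path
    return "B", bfs_path(network_b, src, dst)   # route around the fault

print(route("n0", "n3"))                        # falls back to network B
```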

  1. Computer jargon explained

    CERN Document Server

    Enticknap, Nicholas

    2014-01-01

Computer Jargon Explained is a feature in Computer Weekly publications that discusses 68 of the most commonly used technical computing terms. The book explains what the terms mean and why the terms are important to computer professionals. The text also discusses how the terms relate to the trends and developments that are driving the information technology industry. Computer jargon irritates non-computer people and in turn causes problems for computer people. The technology and the industry are changing so rapidly that it is very hard even for professionals to keep updated. Computer people do not

  2. Computers and data processing

    CERN Document Server

    Deitel, Harvey M

    1985-01-01

    Computers and Data Processing provides information pertinent to the advances in the computer field. This book covers a variety of topics, including the computer hardware, computer programs or software, and computer applications systems.Organized into five parts encompassing 19 chapters, this book begins with an overview of some of the fundamental computing concepts. This text then explores the evolution of modern computing systems from the earliest mechanical calculating devices to microchips. Other chapters consider how computers present their results and explain the storage and retrieval of

  3. Computers in nuclear medicine

    International Nuclear Information System (INIS)

    Giannone, Carlos A.

    1999-01-01

This chapter covers: the capture and display of images on computers; the hardware and software used, including personal computers, networks and workstations. The use of special filters determines image quality

  4. Computation Directorate 2008 Annual Report

    Energy Technology Data Exchange (ETDEWEB)

    Crawford, D L

    2009-03-25

    Whether a computer is simulating the aging and performance of a nuclear weapon, the folding of a protein, or the probability of rainfall over a particular mountain range, the necessary calculations can be enormous. Our computers help researchers answer these and other complex problems, and each new generation of system hardware and software widens the realm of possibilities. Building on Livermore's historical excellence and leadership in high-performance computing, Computation added more than 331 trillion floating-point operations per second (teraFLOPS) of power to LLNL's computer room floors in 2008. In addition, Livermore's next big supercomputer, Sequoia, advanced ever closer to its 2011-2012 delivery date, as architecture plans and the procurement contract were finalized. Hyperion, an advanced technology cluster test bed that teams Livermore with 10 industry leaders, made a big splash when it was announced during Michael Dell's keynote speech at the 2008 Supercomputing Conference. The Wall Street Journal touted Hyperion as a 'bright spot amid turmoil' in the computer industry. Computation continues to measure and improve the costs of operating LLNL's high-performance computing systems by moving hardware support in-house, by measuring causes of outages to apply resources asymmetrically, and by automating most of the account and access authorization and management processes. These improvements enable more dollars to go toward fielding the best supercomputers for science, while operating them at less cost and greater responsiveness to the customers.

  5. Computation for LHC experiments: a worldwide computing grid

    International Nuclear Information System (INIS)

    Fairouz, Malek

    2010-01-01

Under normal operating conditions the LHC detectors are expected to record about 10^10 collisions each year. Processing all of the resulting experimental data is a real computing challenge in terms of equipment, software and organization: it requires sustaining data flows of a few 10^9 octets per second and a recording capacity of a few tens of 10^15 octets each year. In order to meet this challenge, a computing network involving the dispatch and sharing of tasks has been set up. The W-LCG grid (Worldwide LHC Computing Grid) is made up of four tiers. Tier 0 is the computer centre at CERN; it is responsible for collecting and recording the raw data from the LHC detectors and dispatching it to the 11 Tier 1 centres. A Tier 1 is typically a national centre; it is responsible for keeping a copy of the raw data and for processing it in order to recover physically meaningful data, and for transferring the results to the 150 Tier 2 centres. A Tier 2 operates at the level of an institute or laboratory and is in charge of the final analysis of the data and of the production of simulations. Tier 3 sites, at the laboratory level, provide a complementary, local resource to Tier 2 for data analysis. (A.C.)

  6. Advances in unconventional computing

    CERN Document Server

    2017-01-01

    The unconventional computing is a niche for interdisciplinary science, cross-bred of computer science, physics, mathematics, chemistry, electronic engineering, biology, material science and nanotechnology. The aims of this book are to uncover and exploit principles and mechanisms of information processing in and functional properties of physical, chemical and living systems to develop efficient algorithms, design optimal architectures and manufacture working prototypes of future and emergent computing devices. This first volume presents theoretical foundations of the future and emergent computing paradigms and architectures. The topics covered are computability, (non-)universality and complexity of computation; physics of computation, analog and quantum computing; reversible and asynchronous devices; cellular automata and other mathematical machines; P-systems and cellular computing; infinity and spatial computation; chemical and reservoir computing. The book is the encyclopedia, the first ever complete autho...

  7. Computer performance evaluation of FACOM 230-75 computer system, (2)

    International Nuclear Information System (INIS)

    Fujii, Minoru; Asai, Kiyoshi

    1980-08-01

In this report, computer performance evaluations for the FACOM 230-75 computers at JAERI are described. The evaluations are performed on the following items: (1) cost/benefit analysis of timesharing terminals, (2) analysis of the response time of timesharing terminals, (3) analysis of turnaround time for batch job processing, (4) estimation of current potential demands for computer time, (5) determination of the appropriate number of card readers and line printers. These evaluations are done mainly from the standpoint of cost reduction of computing facilities. The techniques adopted are very practical ones. This report will be useful for those people who are concerned with the management of a computing installation. (author)

  8. Computer access security code system

    Science.gov (United States)

    Collins, Earl R., Jr. (Inventor)

    1990-01-01

    A security code system for controlling access to computer and computer-controlled entry situations comprises a plurality of subsets of alpha-numeric characters disposed in random order in matrices of at least two dimensions forming theoretical rectangles, cubes, etc., such that when access is desired, at least one pair of previously unused character subsets not found in the same row or column of the matrix is chosen at random and transmitted by the computer. The proper response to gain access is transmittal of subsets which complete the rectangle, and/or a parallelepiped whose opposite corners were defined by first groups of code. Once used, subsets are not used again to absolutely defeat unauthorized access by eavesdropping, and the like.
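The rectangle-completion idea can be mocked up with a small character matrix: the computer names two cells that share neither a row nor a column, and the correct reply consists of the characters at the opposite corners of the implied rectangle. The sketch below illustrates only that geometry and is not a description of the patented system:

```python
import random

# A small random character matrix standing in for the stored code matrix.
rows, cols = 4, 4
alphabet = "ABCDEFGHJKLMNPQRSTUVWXYZ23456789"
matrix = [[random.choice(alphabet) for _ in range(cols)] for _ in range(rows)]

def challenge():
    """Pick two cells that share neither a row nor a column."""
    r1, r2 = random.sample(range(rows), 2)
    c1, c2 = random.sample(range(cols), 2)
    return (r1, c1), (r2, c2)

def correct_response(cell_a, cell_b):
    """The characters at the opposite corners of the rectangle complete the code."""
    (r1, c1), (r2, c2) = cell_a, cell_b
    return matrix[r1][c2], matrix[r2][c1]

a, b = challenge()
print("challenge cells:", a, b, "->", matrix[a[0]][a[1]], matrix[b[0]][b[1]])
print("expected response characters:", correct_response(a, b))
```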

  9. Experimental quantum computing without entanglement.

    Science.gov (United States)

    Lanyon, B P; Barbieri, M; Almeida, M P; White, A G

    2008-11-14

Deterministic quantum computation with one pure qubit (DQC1) is an efficient model of computation that uses highly mixed states. Unlike pure-state models, its power is not derived from the generation of a large amount of entanglement. Instead it has been proposed that other nonclassical correlations are responsible for the computational speedup, and that these can be captured by the quantum discord. In this Letter we implement DQC1 in an all-optical architecture, and experimentally observe the generated correlations. We find no entanglement, but large amounts of quantum discord, except in three cases where an efficient classical simulation is always possible. Our results show that even fully separable, highly mixed, states can contain intrinsically quantum mechanical correlations and that these could offer a valuable resource for quantum information technologies.

  10. SCINFUL: A Monte Carlo based computer program to determine a scintillator full energy response to neutron detection for E/sub n/ between 0.1 and 80 MeV: Program development and comparisons of program predictions with experimental data

    International Nuclear Information System (INIS)

    Dickens, J.K.

    1988-04-01

This document provides a discussion of the development of the FORTRAN Monte Carlo program SCINFUL (for scintillator full response), a program designed to provide a calculated full response anticipated for either an NE-213 (liquid) scintillator or an NE-110 (solid) scintillator. The program may also be used to compute angle-integrated spectra of charged particles (p, d, t, 3 He, and α) following neutron interactions with 12 C. Extensive comparisons with a variety of experimental data are given. There is generally overall good agreement; for E/sub r/ above 15% of the maximum pulse-height response, calculated spectra are within ±5% of experiment on the average. For E/sub n/ up to 50 MeV similar good agreement is obtained with experiment for E/sub r/ > 30% of maximum response. For E/sub n/ up to 75 MeV the calculated shape of the response agrees with measurements, but the calculations underpredict the measured response by up to 30%. 65 refs., 64 figs., 3 tabs

  11. Optically Controlled Quantum Dot Spins for Scaleable Quantum Computing

    National Research Council Canada - National Science Library

    Steel, Duncan G

    2006-01-01

.... Sham is responsible for theoretical support & concept development. The group at Michigan, along with this QuaCGR student, is responsible for key experimental demonstrations for quantum computing...

  12. Computability and unsolvability

    CERN Document Server

    Davis, Martin

    1985-01-01

    ""A clearly written, well-presented survey of an intriguing subject."" - Scientific American. Classic text considers general theory of computability, computable functions, operations on computable functions, Turing machines self-applied, unsolvable decision problems, applications of general theory, mathematical logic, Kleene hierarchy, computable functionals, classification of unsolvable decision problems and more.

  13. Mathematics for computer graphics

    CERN Document Server

    Vince, John

    2006-01-01

    Helps you understand the mathematical ideas used in computer animation, virtual reality, CAD, and other areas of computer graphics. This work also helps you to rediscover the mathematical techniques required to solve problems and design computer programs for computer graphic applications

  14. Computations and interaction

    NARCIS (Netherlands)

    Baeten, J.C.M.; Luttik, S.P.; Tilburg, van P.J.A.; Natarajan, R.; Ojo, A.

    2011-01-01

    We enhance the notion of a computation of the classical theory of computing with the notion of interaction. In this way, we enhance a Turing machine as a model of computation to a Reactive Turing Machine that is an abstract model of a computer as it is used nowadays, always interacting with the user

  15. Symbiotic Cognitive Computing

    OpenAIRE

Farrell, Robert G.; Lenchner, Jonathan; Kephart, Jeffrey O.; Webb, Alan M.; Muller, Michael J.; Erikson, Thomas D.; Melville, David O.; Bellamy, Rachel K.E.; Gruen, Daniel M.; Connell, Jonathan H.; Soroker, Danny; Aaron, Andy; Trewin, Shari M.; Ashoori, Maryam; Ellis, Jason B.

    2016-01-01

    IBM Research is engaged in a research program in symbiotic cognitive computing to investigate how to embed cognitive computing in physical spaces. This article proposes 5 key principles of symbiotic cognitive computing.  We describe how these principles are applied in a particular symbiotic cognitive computing environment and in an illustrative application.  

  16. Volunteered Cloud Computing for Disaster Management

    Science.gov (United States)

    Evans, J. D.; Hao, W.; Chettri, S. R.

    2014-12-01

    Disaster management relies increasingly on interpreting earth observations and running numerical models; which require significant computing capacity - usually on short notice and at irregular intervals. Peak computing demand during event detection, hazard assessment, or incident response may exceed agency budgets; however some of it can be met through volunteered computing, which distributes subtasks to participating computers via the Internet. This approach has enabled large projects in mathematics, basic science, and climate research to harness the slack computing capacity of thousands of desktop computers. This capacity is likely to diminish as desktops give way to battery-powered mobile devices (laptops, smartphones, tablets) in the consumer market; but as cloud computing becomes commonplace, it may offer significant slack capacity -- if its users are given an easy, trustworthy mechanism for participating. Such a "volunteered cloud computing" mechanism would also offer several advantages over traditional volunteered computing: tasks distributed within a cloud have fewer bandwidth limitations; granular billing mechanisms allow small slices of "interstitial" computing at no marginal cost; and virtual storage volumes allow in-depth, reversible machine reconfiguration. Volunteered cloud computing is especially suitable for "embarrassingly parallel" tasks, including ones requiring large data volumes: examples in disaster management include near-real-time image interpretation, pattern / trend detection, or large model ensembles. In the context of a major disaster, we estimate that cloud users (if suitably informed) might volunteer hundreds to thousands of CPU cores across a large provider such as Amazon Web Services. To explore this potential, we are building a volunteered cloud computing platform and targeting it to a disaster management context. Using a lightweight, fault-tolerant network protocol, this platform helps cloud users join parallel computing projects

  17. Electrodermal Response in Gaming

    Directory of Open Access Journals (Sweden)

    J. Christopher Westland

    2011-01-01

Full Text Available Steady improvements in technologies that measure human emotional response offer new possibilities for making computer games more immersive. This paper reviews the history of designs of a particular branch of affective technologies that acquire electrodermal response readings from human subjects. Electrodermal response meters have gone through continual improvements to better measure these nervous responses, but still fall short of the capabilities of today's technology. Electrodermal response measurements have traditionally been labor intensive: protocols and transcriptions of subject responses were recorded on separate documents, forcing constant shifts of attention between scripts, electrodermal measuring devices, observations, and subject responses. These problems can be resolved by collecting more information and integrating it in a computer interface, that is, by adding relevant sensors in addition to the basic electrodermal resistance reading to untangle (1) body resistance, (2) skin resistance, (3) grip movements, and (4) other factors affecting the neural processing for regulation of the body. A device that solves these problems is presented and discussed. It is argued that electrodermal response datastreams can be enriched through added sensors and digital acquisition and processing of information, which should further experimentation with and use of the technology.

  18. Computer scientist looks at reliability computations

    International Nuclear Information System (INIS)

    Rosenthal, A.

    1975-01-01

    Results from the theory of computational complexity are applied to reliability computations on fault trees and networks. A well known class of problems which almost certainly have no fast solution algorithms is presented. It is shown that even approximately computing the reliability of many systems is difficult enough to be in this class. In the face of this result, which indicates that for general systems the computation time will be exponential in the size of the system, decomposition techniques which can greatly reduce the effective size of a wide variety of realistic systems are explored
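As a tiny illustration of why decomposition helps, independent series and parallel subsystems can be collapsed into single equivalent components before any exhaustive analysis is attempted; this is a generic reliability identity, not the paper's specific method:

```python
from functools import reduce

def series(*r):
    """All components must work: reliabilities multiply."""
    return reduce(lambda a, b: a * b, r, 1.0)

def parallel(*r):
    """At least one component must work: complement of all components failing."""
    return 1.0 - reduce(lambda a, b: a * (1.0 - b), r, 1.0)

# A small system: two redundant pumps in parallel, in series with a valve.
pump, valve = 0.9, 0.99
print(series(parallel(pump, pump), valve))   # 0.9801
```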

  19. Roadmap to greener computing

    CERN Document Server

    Nguemaleu, Raoul-Abelin Choumin

    2014-01-01

    A concise and accessible introduction to green computing and green IT, this book addresses how computer science and the computer infrastructure affect the environment and presents the main challenges in making computing more environmentally friendly. The authors review the methodologies, designs, frameworks, and software development tools that can be used in computer science to reduce energy consumption and still compute efficiently. They also focus on Computer Aided Design (CAD) and describe what design engineers and CAD software applications can do to support new streamlined business directi

  20. Brief: Managing computing technology

    International Nuclear Information System (INIS)

    Startzman, R.A.

    1994-01-01

    While computing is applied widely in the production segment of the petroleum industry, its effective application is the primary goal of computing management. Computing technology has changed significantly since the 1950's, when computers first began to influence petroleum technology. The ability to accomplish traditional tasks faster and more economically probably is the most important effect that computing has had on the industry. While speed and lower cost are important, are they enough? Can computing change the basic functions of the industry? When new computing technology is introduced improperly, it can clash with traditional petroleum technology. This paper examines the role of management in merging these technologies

  1. Computer mathematics for programmers

    CERN Document Server

    Abney, Darrell H; Sibrel, Donald W

    1985-01-01

    Computer Mathematics for Programmers presents the Mathematics that is essential to the computer programmer.The book is comprised of 10 chapters. The first chapter introduces several computer number systems. Chapter 2 shows how to perform arithmetic operations using the number systems introduced in Chapter 1. The third chapter covers the way numbers are stored in computers, how the computer performs arithmetic on real numbers and integers, and how round-off errors are generated in computer programs. Chapter 4 details the use of algorithms and flowcharting as problem-solving tools for computer p

  2. SICOEM: emergency response data system

    International Nuclear Information System (INIS)

    Martin, A.; Villota, C.; Francia, L.

    1993-01-01

The main characteristics of the SICOEM emergency response system are: direct electronic redundant transmission of certain operational parameters and plant status information from the plant process computer to a computer at the Regulatory Body site; the system will be used in emergency situations; SICOEM is not considered a safety class system. 1 fig

  3. SICOEM: emergency response data system

    Energy Technology Data Exchange (ETDEWEB)

    Martin, A.; Villota, C.; Francia, L. (UNESA, Madrid (Spain))

    1993-01-01

The main characteristics of the SICOEM emergency response system are: direct electronic redundant transmission of certain operational parameters and plant status information from the plant process computer to a computer at the Regulatory Body site; the system will be used in emergency situations; SICOEM is not considered a safety class system. 1 fig.

  4. Voice Response Systems Technology.

    Science.gov (United States)

    Gerald, Jeanette

    1984-01-01

    Examines two methods of generating synthetic speech in voice response systems, which allow computers to communicate in human terms (speech), using human interface devices (ears): phoneme and reconstructed voice systems. Considerations prior to implementation, current and potential applications, glossary, directory, and introduction to Input Output…

  5. Parallel computing works

    Energy Technology Data Exchange (ETDEWEB)

    1991-10-23

An account of the Caltech Concurrent Computation Program (C{sup 3}P), a five year project that focused on answering the question: Can parallel computers be used to do large-scale scientific computations? As the title indicates, the question is answered in the affirmative, by implementing numerous scientific applications on real parallel computers and doing computations that produced new scientific results. In the process of doing so, C{sup 3}P helped design and build several new computers, designed and implemented basic system software, developed algorithms for frequently used mathematical computations on massively parallel machines, devised performance models and measured the performance of many computers, and created a high performance computing facility based exclusively on parallel computers. While the initial focus of C{sup 3}P was the hypercube architecture developed by C. Seitz, many of the methods developed and lessons learned have been applied successfully on other massively parallel architectures.

  6. The digital computer

    CERN Document Server

    Parton, K C

    2014-01-01

    The Digital Computer focuses on the principles, methodologies, and applications of the digital computer. The publication takes a look at the basic concepts involved in using a digital computer, simple autocode examples, and examples of working advanced design programs. Discussions focus on transformer design synthesis program, machine design analysis program, solution of standard quadratic equations, harmonic analysis, elementary wage calculation, and scientific calculations. The manuscript then examines commercial and automatic programming, how computers work, and the components of a computer

  7. Cloud computing for radiologists

    OpenAIRE

    Amit T Kharat; Amjad Safvi; S S Thind; Amarjit Singh

    2012-01-01

    Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources such as computer software, hardware, on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as...

  8. Toward Cloud Computing Evolution

    OpenAIRE

    Susanto, Heru; Almunawar, Mohammad Nabil; Kang, Chen Chin

    2012-01-01

Information Technology (IT) shaped the success of organizations, giving them a solid foundation that increases both their level of efficiency and their productivity. The computing industry is witnessing a paradigm shift in the way computing is performed worldwide. There is a growing awareness among consumers and enterprises of accessing their IT resources extensively through a "utility" model known as "cloud computing." Cloud computing was initially rooted in distributed grid-based computing. ...

  9. Algorithmically specialized parallel computers

    CERN Document Server

    Snyder, Lawrence; Gannon, Dennis B

    1985-01-01

    Algorithmically Specialized Parallel Computers focuses on the concept and characteristics of an algorithmically specialized computer.This book discusses the algorithmically specialized computers, algorithmic specialization using VLSI, and innovative architectures. The architectures and algorithms for digital signal, speech, and image processing and specialized architectures for numerical computations are also elaborated. Other topics include the model for analyzing generalized inter-processor, pipelined architecture for search tree maintenance, and specialized computer organization for raster

  10. Cloud computing development in Armenia

    Directory of Open Access Journals (Sweden)

    Vazgen Ghazaryan

    2014-10-01

Full Text Available Purpose – The purpose of the research is to clarify the benefits and risks, with regard to data protection and cost, that a business can gain from using these new technologies for the implementation and management of an organization’s information systems. Design/methodology/approach – Qualitative case study of the results obtained via interviews. Three research questions were raised: Q1: How can a company benefit from using Cloud Computing compared to other solutions? Q2: What are possible issues that occur with Cloud Computing? Q3: How would Cloud Computing change an organization’s IT infrastructure? Findings – The calculations provided in the interview section prove the financial advantages, even though the precise degree of flexibility and performance has not been assessed. Cloud Computing offers great scalability. Another benefit that Cloud Computing offers, in addition to better performance and flexibility, is reliable and simple backup data storage, physically distributed and so almost invulnerable to damage. Although the advantages of Cloud Computing more than compensate for the difficulties associated with it, the latter must be carefully considered. Since the cloud architecture is relatively new, so far the best guarantee against all the risks it entails, from a single company's perspective, is a well-formulated service-level agreement, where the terms of service and the shared responsibility and security roles between the client and the provider are defined. Research limitations/implications – The study was carried out on the basis of two companies, which gives a deeper view, but for more widely applicable results a wider analysis is necessary. Practical implications: Originality/Value – The novelty of the research lies in the fact that existing approaches to this problem mainly focus on the technical side of computing. Research type: case study

  11. Synthetic Computation: Chaos Computing, Logical Stochastic Resonance, and Adaptive Computing

    Science.gov (United States)

    Kia, Behnam; Murali, K.; Jahed Motlagh, Mohammad-Reza; Sinha, Sudeshna; Ditto, William L.

    Nonlinearity and chaos can illustrate numerous behaviors and patterns, and one can select different patterns from this rich library of patterns. In this paper we focus on synthetic computing, a field that engineers and synthesizes nonlinear systems to obtain computation. We explain the importance of nonlinearity, and describe how nonlinear systems can be engineered to perform computation. More specifically, we provide an overview of chaos computing, a field that manually programs chaotic systems to build different types of digital functions. Also we briefly describe logical stochastic resonance (LSR), and then extend the approach of LSR to realize combinational digital logic systems via suitable concatenation of existing logical stochastic resonance blocks. Finally we demonstrate how a chaotic system can be engineered and mated with different machine learning techniques, such as artificial neural networks, random searching, and genetic algorithm, to design different autonomous systems that can adapt and respond to environmental conditions.
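One widely cited flavor of chaos computing encodes the logic inputs as shifts of a chaotic map's initial condition and reads the output by thresholding a single iterate. A minimal sketch using the logistic map; the offset (0.25) and threshold (0.75) are the textbook choices that realize an AND gate, and the whole example is illustrative rather than a description of the systems surveyed in this paper:

```python
def logistic(x: float) -> float:
    """Fully chaotic logistic map x -> 4x(1 - x)."""
    return 4.0 * x * (1.0 - x)

def chaotic_and(i1: int, i2: int, delta: float = 0.25, threshold: float = 0.75) -> int:
    """AND gate: encode the two binary inputs as initial-condition offsets,
    iterate the map once, and threshold the result."""
    x0 = i1 * delta + i2 * delta
    return int(logistic(x0) > threshold)

# Truth table check: only (1, 1) exceeds the threshold.
for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", chaotic_and(a, b))
```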

  12. Minimal models of multidimensional computations.

    Directory of Open Access Journals (Sweden)

    Jeffrey D Fitzgerald

    2011-03-01

    Full Text Available The multidimensional computations performed by many biological systems are often characterized with limited information about the correlations between inputs and outputs. Given this limitation, our approach is to construct the maximum noise entropy response function of the system, leading to a closed-form and minimally biased model consistent with a given set of constraints on the input/output moments; the result is equivalent to conditional random field models from machine learning. For systems with binary outputs, such as neurons encoding sensory stimuli, the maximum noise entropy models are logistic functions whose arguments depend on the constraints. A constraint on the average output turns the binary maximum noise entropy models into minimum mutual information models, allowing for the calculation of the information content of the constraints and an information theoretic characterization of the system's computations. We use this approach to analyze the nonlinear input/output functions in macaque retina and thalamus; although these systems have been previously shown to be responsive to two input dimensions, the functional form of the response function in this reduced space had not been unambiguously identified. A second order model based on the logistic function is found to be both necessary and sufficient to accurately describe the neural responses to naturalistic stimuli, accounting for an average of 93% of the mutual information with a small number of parameters. Thus, despite the fact that the stimulus is highly non-Gaussian, the vast majority of the information in the neural responses is related to first and second order correlations. Our results suggest a principled and unbiased way to model multidimensional computations and determine the statistics of the inputs that are being encoded in the outputs.
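The "second order model based on the logistic function" mentioned above can be written down directly: the response probability is a logistic function of a constant plus a linear term plus a quadratic form in the stimulus. The sketch below evaluates such a model with made-up parameters; fitting those parameters to the measured input/output moments (the maximum-noise-entropy step) is not shown:

```python
import numpy as np

def second_order_logistic(stimulus, a0, h, J):
    """P(spike | s) = sigmoid(a0 + h.s + s.J.s), a second-order logistic
    (maximum-noise-entropy) response model."""
    z = a0 + stimulus @ h + np.einsum("...i,ij,...j->...", stimulus, J, stimulus)
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(1)
dim = 2                                   # two relevant stimulus dimensions
a0 = -1.0                                 # sets the baseline response probability
h = rng.normal(size=dim)                  # first-order (linear) kernel
J = 0.5 * np.eye(dim)                     # second-order (quadratic) kernel

stimuli = rng.normal(size=(5, dim))       # stand-in for naturalistic stimuli
print(second_order_logistic(stimuli, a0, h, J))
```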

  13. Future Computer Requirements for Computational Aerodynamics

    Science.gov (United States)

    1978-01-01

    Recent advances in computational aerodynamics are discussed as well as motivations for and potential benefits of a National Aerodynamic Simulation Facility having the capability to solve fluid dynamic equations at speeds two to three orders of magnitude faster than presently possible with general computers. Two contracted efforts to define processor architectures for such a facility are summarized.

  14. Computers and Computation. Readings from Scientific American.

    Science.gov (United States)

    Fenichel, Robert R.; Weizenbaum, Joseph

    A collection of articles from "Scientific American" magazine has been put together at this time because the current period in computer science is one of consolidation rather than innovation. A few years ago, computer science was moving so swiftly that even the professional journals were more archival than informative; but today it is…

  15. Know Your Personal Computer Introduction to Computers

    Indian Academy of Sciences (India)

    Siddhartha Kumar Ghoshal. Know Your Personal Computer: Introduction to Computers. Series Article. Resonance – Journal of Science Education, Volume 1, Issue 1, January 1996, pp. 48–55.

  16. Activity-based computing: computational management of activities reflecting human intention

    DEFF Research Database (Denmark)

    Bardram, Jakob E; Jeuris, Steven; Houben, Steven

    2015-01-01

    Activity-based computing (ABC) is a paradigm that has been applied in personal information management applications as well as in ubiquitous, multidevice, and interactive surface computing. ABC has emerged as a response to the traditional application- and file-centered computing paradigm, which is oblivious to a notion of a user’s activity...

  17. Heterotic computing: exploiting hybrid computational devices.

    Science.gov (United States)

    Kendon, Viv; Sebald, Angelika; Stepney, Susan

    2015-07-28

    Current computational theory deals almost exclusively with single models: classical, neural, analogue, quantum, etc. In practice, researchers use ad hoc combinations, realizing only recently that they can be fundamentally more powerful than the individual parts. A Theo Murphy meeting brought together theorists and practitioners of various types of computing, to engage in combining the individual strengths to produce powerful new heterotic devices. 'Heterotic computing' is defined as a combination of two or more computational systems such that they provide an advantage over either substrate used separately. This post-meeting collection of articles provides a wide-ranging survey of the state of the art in diverse computational paradigms, together with reflections on their future combination into powerful and practical applications. © 2015 The Author(s) Published by the Royal Society. All rights reserved.

  18. Cloud Computing for radiologists.

    Science.gov (United States)

    Kharat, Amit T; Safvi, Amjad; Thind, Ss; Singh, Amarjit

    2012-07-01

    Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources, such as computer software and hardware, on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as applications, clients, infrastructure, storage, services, and processing power, Cloud computing can help imaging units rapidly scale and descale operations and avoid huge spending on the maintenance of costly applications and storage. Cloud computing allows flexibility in imaging. It frees radiology from the confines of a hospital and creates a virtual mobile office. The downsides to Cloud computing involve security and privacy issues, which need to be addressed to ensure the success of Cloud computing in the future.

  19. Cloud Computing for radiologists

    International Nuclear Information System (INIS)

    Kharat, Amit T; Safvi, Amjad; Thind, SS; Singh, Amarjit

    2012-01-01

    Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources, such as computer software and hardware, on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as applications, clients, infrastructure, storage, services, and processing power, Cloud computing can help imaging units rapidly scale and descale operations and avoid huge spending on the maintenance of costly applications and storage. Cloud computing allows flexibility in imaging. It frees radiology from the confines of a hospital and creates a virtual mobile office. The downsides to Cloud computing involve security and privacy issues, which need to be addressed to ensure the success of Cloud computing in the future.

  20. Cloud computing for radiologists

    Directory of Open Access Journals (Sweden)

    Amit T Kharat

    2012-01-01

    Full Text Available Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources, such as computer software and hardware, on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as applications, clients, infrastructure, storage, services, and processing power, Cloud computing can help imaging units rapidly scale and descale operations and avoid huge spending on the maintenance of costly applications and storage. Cloud computing allows flexibility in imaging. It frees radiology from the confines of a hospital and creates a virtual mobile office. The downsides to Cloud computing involve security and privacy issues, which need to be addressed to ensure the success of Cloud computing in the future.

  1. Computer Use by School Teachers in Teaching-Learning Process

    Science.gov (United States)

    Bhalla, Jyoti

    2013-01-01

    Developing countries have a responsibility not merely to provide computers for schools, but also to foster among the end users of these tools a habit of integrating computers into teaching and learning in a variety of ways. Earlier research lacked a systematic study of the manner and extent of computer use by teachers. The…

  2. Reach and get capability in a computing environment

    Science.gov (United States)

    Bouchard, Ann M [Albuquerque, NM]; Osbourn, Gordon C [Albuquerque, NM]

    2012-06-05

    A reach and get technique includes invoking a reach command from a reach location within a computing environment. A user can then navigate to an object within the computing environment and invoke a get command on the object. In response to invoking the get command, the computing environment is automatically navigated back to the reach location and the object copied into the reach location.
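
    A conceptual sketch of the reach-and-get interaction described above, modeled as a tiny state machine: "reach" remembers the current location, and "get" copies the selected object back to that location and returns the user there. The class and method names are hypothetical; the abstract does not specify an API.

        class Workspace:
            """Toy model of the reach-and-get interaction pattern (names are hypothetical)."""

            def __init__(self):
                self.locations = {}         # location name -> list of objects
                self.current = None         # where the user currently is
                self.reach_origin = None    # location remembered by the reach command

            def navigate(self, location):
                self.locations.setdefault(location, [])
                self.current = location

            def reach(self):
                # Invoked at the location where the object should end up.
                self.reach_origin = self.current

            def get(self, obj):
                # Copy the object into the remembered location, then jump back to it.
                if self.reach_origin is None:
                    raise RuntimeError("reach must be invoked before get")
                self.locations[self.reach_origin].append(obj)
                self.current = self.reach_origin
                self.reach_origin = None

        ws = Workspace()
        ws.navigate("report.doc"); ws.reach()        # reach from the document being edited
        ws.navigate("figures/");   ws.get("fig1.png")
        print(ws.current, ws.locations["report.doc"])  # back at report.doc with fig1.png copied in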

  3. Computer Games for the Math Achievement of Diverse Students

    Science.gov (United States)

    Kim, Sunha; Chang, Mido

    2010-01-01

    Although computer games as a way to improve students' learning have received attention from many educational researchers, no consensus has been reached on the effects of computer games on student achievement. Moreover, there is a lack of empirical research on the differential effects of computer games on diverse learners. In response, this study…

  4. Computational fluid mechanics

    Science.gov (United States)

    Hassan, H. A.

    1993-01-01

    Two papers are included in this progress report. In the first, the compressible Navier-Stokes equations have been used to compute leading edge receptivity of boundary layers over parabolic cylinders. Natural receptivity at the leading edge was simulated and Tollmien-Schlichting waves were observed to develop in response to an acoustic disturbance, applied through the farfield boundary conditions. To facilitate comparison with previous work, all computations were carried out at a free stream Mach number of 0.3. The spatial and temporal behavior of the flowfields are calculated through the use of finite volume algorithms and Runge-Kutta integration. The results are dominated by strong decay of the Tollmien-Schlichting wave due to the presence of the mean flow favorable pressure gradient. The effects of numerical dissipation, forcing frequency, and nose radius are studied. The Strouhal number is shown to have the greatest effect on the unsteady results. In the second paper, a transition model for low-speed flows, previously developed by Young et al., which incorporates first-mode (Tollmien-Schlichting) disturbance information from linear stability theory has been extended to high-speed flow by incorporating the effects of second mode disturbances. The transition model is incorporated into a Reynolds-averaged Navier-Stokes solver with a one-equation turbulence model. Results using a variable turbulent Prandtl number approach demonstrate that the current model accurately reproduces available experimental data for first and second-mode dominated transitional flows. The performance of the present model shows significant improvement over previous transition modeling attempts.
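
    The abstract refers to finite-volume spatial discretization combined with Runge-Kutta time integration. Below is a minimal sketch of the time-integration half only: the classical fourth-order Runge-Kutta step applied to a generic semi-discretized system du/dt = f(t, u). The right-hand side used here is a trivial linear-decay stand-in, not a Navier-Stokes residual.

        import numpy as np

        def rk4_step(f, u, t, dt):
            """One classical fourth-order Runge-Kutta step for du/dt = f(t, u)."""
            k1 = f(t, u)
            k2 = f(t + 0.5 * dt, u + 0.5 * dt * k1)
            k3 = f(t + 0.5 * dt, u + 0.5 * dt * k2)
            k4 = f(t + dt, u + dt * k3)
            return u + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

        # Stand-in right-hand side: in a real solver this would be the finite-volume
        # residual of the governing equations evaluated on the mesh.
        def rhs(t, u):
            return -u

        u = np.ones(4)
        t, dt = 0.0, 0.1
        for _ in range(10):
            u = rk4_step(rhs, u, t, dt)
            t += dt
        print(t, u)   # each component should be close to exp(-1) ~ 0.368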

  5. Review of quantum computation

    International Nuclear Information System (INIS)

    Lloyd, S.

    1992-01-01

    Digital computers are machines that can be programmed to perform logical and arithmetical operations. Contemporary digital computers are ''universal,'' in the sense that a program that runs on one computer can, if properly compiled, run on any other computer that has access to enough memory space and time. Any one universal computer can simulate the operation of any other; and the set of tasks that any such machine can perform is common to all universal machines. Since Bennett's discovery that computation can be carried out in a non-dissipative fashion, a number of Hamiltonian quantum-mechanical systems have been proposed whose time-evolutions over discrete intervals are equivalent to those of specific universal computers. The first quantum-mechanical treatment of computers was given by Benioff, who exhibited a Hamiltonian system with a basis whose members corresponded to the logical states of a Turing machine. In order to make the Hamiltonian local, in the sense that its structure depended only on the part of the computation being performed at that time, Benioff found it necessary to make the Hamiltonian time-dependent. Feynman discovered a way to make the computational Hamiltonian both local and time-independent by incorporating the direction of computation in the initial condition. In Feynman's quantum computer, the program is a carefully prepared wave packet that propagates through different computational states. Deutsch presented a quantum computer that exploits the possibility of existing in a superposition of computational states to perform tasks that a classical computer cannot, such as generating purely random numbers, and carrying out superpositions of computations as a method of parallel processing. In this paper, we show that such computers, by virtue of their common function, possess a common form for their quantum dynamics

  6. Computers for imagemaking

    CERN Document Server

    Clark, D

    1981-01-01

    Computers for Image-Making tells the computer non-expert all he needs to know about Computer Animation. In the hands of expert computer engineers, computer picture-drawing systems have, since the earliest days of computing, produced interesting and useful images. As a result of major technological developments since then, it no longer requires the expert's skill to draw pictures; anyone can do it, provided they know how to use the appropriate machinery. This collection of specially commissioned articles reflects the diversity of user applications in this expanding field

  7. Computer Lexis and Terminology

    Directory of Open Access Journals (Sweden)

    Gintautas Grigas

    2011-04-01

    Full Text Available The computer has become a widely used tool in everyday work and at home. Every computer user sees texts on its screen containing many words that name new concepts. Those words come from the terminology used by specialists, and a vocabulary shared between computer terminology and the lexis of everyday language comes into existence. The article deals with the part of computer terminology that passes into everyday usage and with the influence of ordinary language on computer terminology. The relation between English and Lithuanian computer terminology and the construction and pronunciation of acronyms are discussed as well.

  8. Computations in plasma physics

    International Nuclear Information System (INIS)

    Cohen, B.I.; Killeen, J.

    1984-01-01

    A review of computer applications in plasma physics is presented. The contribution of computers to the investigation of magnetic and inertial confinement of a plasma and of charged-particle beam propagation is described. Typical uses of computers for the simulation and control of laboratory and space plasma experiments and for data accumulation in these experiments are considered. Basic computational methods applied in plasma physics are discussed. Future trends in computer utilization in plasma research are considered in terms of the increasing role of microprocessors and high-speed data plotters and the need for more powerful computers.

  9. Quantum computer science

    CERN Document Server

    Lanzagorta, Marco

    2009-01-01

    In this text we present a technical overview of the emerging field of quantum computation along with new research results by the authors. What distinguishes our presentation from that of others is our focus on the relationship between quantum computation and computer science. Specifically, our emphasis is on the computational model of quantum computing rather than on the engineering issues associated with its physical implementation. We adopt this approach for the same reason that a book on computer programming doesn't cover the theory and physical realization of semiconductors. Another distin

  10. Explorations in quantum computing

    CERN Document Server

    Williams, Colin P

    2011-01-01

    By the year 2020, the basic memory components of a computer will be the size of individual atoms. At such scales, the current theory of computation will become invalid. "Quantum computing" is reinventing the foundations of computer science and information theory in a way that is consistent with quantum physics - the most accurate model of reality currently known. Remarkably, this theory predicts that quantum computers can perform certain tasks breathtakingly faster than classical computers -- and, better yet, can accomplish mind-boggling feats such as teleporting information, breaking suppos

  11. Physics vs. computer science

    International Nuclear Information System (INIS)

    Pike, R.

    1982-01-01

    With computers becoming more frequently used in theoretical and experimental physics, physicists can no longer afford to be ignorant of the basic techniques and results of computer science. Computing principles belong in a physicist's tool box, along with experimental methods and applied mathematics, and the easiest way to educate physicists in computing is to provide, as part of the undergraduate curriculum, a computing course designed specifically for physicists. As well, the working physicist should interact with computer scientists, giving them challenging problems in return for their expertise. (orig.)

  12. Polymorphous computing fabric

    Science.gov (United States)

    Wolinski, Christophe Czeslaw [Los Alamos, NM]; Gokhale, Maya B [Los Alamos, NM]; McCabe, Kevin Peter [Los Alamos, NM]

    2011-01-18

    Fabric-based computing systems and methods are disclosed. A fabric-based computing system can include a polymorphous computing fabric that can be customized on a per application basis and a host processor in communication with said polymorphous computing fabric. The polymorphous computing fabric includes a cellular architecture that can be highly parameterized to enable a customized synthesis of fabric instances for a variety of enhanced application performances thereof. A global memory concept can also be included that provides the host processor random access to all variables and instructions associated with the polymorphous computing fabric.

  13. Computer ray tracing speeds.

    Science.gov (United States)

    Robb, P; Pawlowski, B

    1990-05-01

    The results of measuring the ray-trace speed and compilation speed of thirty-nine computers in fifty-seven configurations, ranging from personal computers to supercomputers, are described. The ray-trace speed has been correlated with the LINPACK benchmark, which allows the ray-trace speed to be estimated from LINPACK performance data. The results indicate that the latest generation of workstations, using CPUs based on RISC (Reduced Instruction Set Computer) technology, is as fast as or faster than mainframe computers in compute-bound situations.
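
    A sketch of the kind of estimate such a correlation enables: fit a simple relation between measured LINPACK performance and measured ray-trace speed, then predict the ray-trace speed of a new machine from its LINPACK figure alone. The log-log (power-law) form is an assumption made for this sketch, since the abstract only states that a correlation was established, and the fitting data must come from actual benchmark runs (none is supplied here).

        import numpy as np

        def fit_loglog(linpack_mflops, rays_per_second):
            """Least-squares fit of log(rays/s) = slope * log(MFLOPS) + intercept.
            Both arrays must contain measured benchmark results."""
            slope, intercept = np.polyfit(np.log(linpack_mflops),
                                          np.log(rays_per_second), 1)
            return slope, intercept

        def estimate_raytrace_speed(linpack_mflops, slope, intercept):
            """Predict ray-trace speed from a machine's LINPACK score using the fit."""
            return np.exp(intercept) * linpack_mflops ** slope

    With measured pairs in hand, fit_loglog returns the coefficients and estimate_raytrace_speed applies them to an unmeasured machine.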

  14. A Look at Computer-Assisted Testing Operations. The Illinois Series on Educational Application of Computers, No. 12e.

    Science.gov (United States)

    Muiznieks, Viktors; Dennis, J. Richard

    In computer assisted test construction (CATC) systems, the computer is used to perform the mechanical aspects of testing while the teacher retains control over question content. Advantages of CATC systems include question banks, decreased importance of test item security, computer analysis and response to student test answers, item analysis…

  15. Corporate Responsibility

    DEFF Research Database (Denmark)

    Waddock, Sandra; Rasche, Andreas

    2015-01-01

    We define and discuss the concept of corporate responsibility. We suggest that corporate responsibility has some unique characteristics, which make it different from earlier conceptions of corporate social responsibility. Our discussion further shows commonalities and differences between corporate responsibility and related concepts, such as corporate citizenship and business ethics. We also outline some ways in which corporations have implemented corporate responsibility in practice.

  16. Computing networks from cluster to cloud computing

    CERN Document Server

    Vicat-Blanc, Pascale; Guillier, Romaric; Soudan, Sebastien

    2013-01-01

    "Computing Networks" explores the core of the new distributed computing infrastructures we are using today:  the networking systems of clusters, grids and clouds. It helps network designers and distributed-application developers and users to better understand the technologies, specificities, constraints and benefits of these different infrastructures' communication systems. Cloud Computing will give the possibility for millions of users to process data anytime, anywhere, while being eco-friendly. In order to deliver this emerging traffic in a timely, cost-efficient, energy-efficient, and

  17. Computing Nash equilibria through computational intelligence methods

    Science.gov (United States)

    Pavlidis, N. G.; Parsopoulos, K. E.; Vrahatis, M. N.

    2005-03-01

    Nash equilibrium constitutes a central solution concept in game theory. The task of detecting the Nash equilibria of a finite strategic game remains a challenging problem to date. This paper investigates the effectiveness of three computational intelligence techniques, namely covariance matrix adaptation evolution strategies, particle swarm optimization, and differential evolution, in computing Nash equilibria of finite strategic games as global minima of a real-valued, nonnegative function. An issue of particular interest is detecting more than one Nash equilibrium of a game. The performance of the considered computational intelligence methods on this problem is investigated using multistart and deflection.
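
    A minimal sketch of the reformulation the abstract describes, for a two-player bimatrix game: a nonnegative function of the mixed-strategy profile that vanishes exactly at Nash equilibria, minimized here with SciPy's differential evolution (one of the three methods listed). The particular objective (sum of squared positive unilateral-deviation gains), the softmax parameterization of the simplices, and the matching-pennies example are choices made for this sketch, not necessarily those of the paper.

        import numpy as np
        from scipy.optimize import differential_evolution

        A = np.array([[1.0, -1.0], [-1.0, 1.0]])   # payoff to player 1 (matching pennies)
        B = -A                                      # payoff to player 2 (zero-sum)

        def softmax(z):
            e = np.exp(z - z.max())
            return e / e.sum()

        def regret(params):
            """Nonnegative function that is zero iff (x, y) is a Nash equilibrium:
            sum of squared positive gains from unilateral pure-strategy deviations."""
            x, y = softmax(params[:2]), softmax(params[2:])
            g1 = np.maximum(A @ y - x @ A @ y, 0.0)   # player 1 deviation gains
            g2 = np.maximum(x @ B - x @ B @ y, 0.0)   # player 2 deviation gains
            return np.sum(g1 ** 2) + np.sum(g2 ** 2)

        result = differential_evolution(regret, bounds=[(-5, 5)] * 4, seed=0, tol=1e-12)
        x, y = softmax(result.x[:2]), softmax(result.x[2:])
        print("equilibrium strategies:", x, y)   # should approach (0.5, 0.5) for each player
        print("residual:", result.fun)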

  18. Reversible computing fundamentals, quantum computing, and applications

    CERN Document Server

    De Vos, Alexis

    2010-01-01

    Written by one of the few top internationally recognized experts in the field, this book concentrates on those topics that will remain fundamental, such as low power computing, reversible programming languages, and applications in thermodynamics. It describes reversible computing from various points of view: Boolean algebra, group theory, logic circuits, low-power electronics, communication, software, quantum computing. It is this multidisciplinary approach that makes it unique. Backed by numerous examples, this is useful for all levels of the scientific and academic community, from undergr

  19. Computing in high energy physics

    Energy Technology Data Exchange (ETDEWEB)

    Watase, Yoshiyuki

    1991-09-15

    The increasingly important role played by computing and computers in high energy physics is displayed in the 'Computing in High Energy Physics' series of conferences, bringing together experts in different aspects of computing - physicists, computer scientists, and vendors.

  20. Searching with Quantum Computers

    OpenAIRE

    Grover, Lov K.

    2000-01-01

    This article introduces quantum computation by analogy with probabilistic computation. A basic description of the quantum search algorithm is given by representing the algorithm as a C program in a novel way.
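
    The article represents the search algorithm as a C program; the sketch below is instead a Python state-vector simulation written for this document, not the article's program. It illustrates the structure of Grover's algorithm on a small unstructured search space: an oracle phase flip of the marked item followed by inversion about the mean, repeated roughly (pi/4) * sqrt(N) times.

        import numpy as np

        def grover_search(n_qubits, marked):
            """Classically simulate Grover's algorithm over N = 2**n_qubits items,
            searching for the single index `marked`."""
            N = 2 ** n_qubits
            amp = np.full(N, 1.0 / np.sqrt(N))        # uniform superposition
            iterations = int(np.floor(np.pi / 4 * np.sqrt(N)))
            for _ in range(iterations):
                amp[marked] *= -1.0                   # oracle: phase-flip the marked item
                amp = 2.0 * amp.mean() - amp          # diffusion: inversion about the mean
            return np.argmax(amp ** 2), np.max(amp ** 2)

        index, probability = grover_search(n_qubits=6, marked=42)
        print(index, probability)   # finds item 42 with probability close to 1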