WorldWideScience

Sample records for computing spatial impulse responses

  1. Computer incident response and forensics team management: conducting a successful incident response

    CERN Document Server

    Johnson, Leighton

    2013-01-01

    Computer Incident Response and Forensics Team Management provides security professionals with a complete handbook of computer incident response from the perspective of forensics team management. This unique approach teaches readers the concepts and principles they need to conduct a successful incident response investigation, ensuring that proven policies and procedures are established and followed by all team members. Leighton R. Johnson III describes the processes within an incident response event and shows the crucial importance of skillful forensics team management, including when and where the transition to forensics investigation should occur during an incident response event. The book also provides discussions of key incident response components. It provides readers with a complete handbook on computer incident response from the perspective of forensics team management; identifies the key steps to completing a successful computer incident response investigation; and defines the qualities necessary to become a succ...

  2. Computer Security Incident Response Planning at Nuclear Facilities

    International Nuclear Information System (INIS)

    2016-06-01

    The purpose of this publication is to assist Member States in developing comprehensive contingency plans for computer security incidents with the potential to impact nuclear security and/or nuclear safety. It provides an outline and recommendations for establishing a computer security incident response capability as part of a computer security programme, and considers the roles and responsibilities of the system owner, operator, competent authority, and national technical authority in responding to a computer security incident with possible nuclear security repercussions

  3. Stimulus-response compatibility and affective computing: A review

    NARCIS (Netherlands)

    Lemmens, P.M.C.; Haan, A. de; Galen, G.P. van; Meulenbroek, R.G.J.

    2007-01-01

    Affective computing, a human-factors effort to investigate the merits of emotions while people are working with human-computer interfaces, is gaining momentum. Measures to quantify affect (or its influences) range from EEG, to measurements of autonomic-nervous-system responses (e.g., heart rate,

  4. Ethics and computing: living responsibly in a computerized world

    CERN Document Server

    2001-01-01

    "Ethics and Computing, Second Edition promotes awareness of major issues and accepted procedures and policies in the area of ethics and computing using real-world companies, incidents, products and people." "Ethics and Computing, Second Edition is for topical undergraduate courses with chapters and assignments designed to encourage critical thinking and informed ethical decisions. Furthermore, this book will keep abreast computer science, computer engineering, and information systems professionals and their colleagues of current ethical issues and responsibilities."--Jacket.

  5. A response-modeling alternative to surrogate models for support in computational analyses

    International Nuclear Information System (INIS)

    Rutherford, Brian

    2006-01-01

    Often, the objectives in a computational analysis involve characterization of system performance based on some function of the computed response. In general, this characterization includes (at least) an estimate or prediction for some performance measure and an estimate of the associated uncertainty. Surrogate models can be used to approximate the response in regions where simulations were not performed. For most surrogate modeling approaches, however, (1) estimates are based on smoothing of available data and (2) uncertainty in the response is specified in a point-wise (in the input space) fashion. These aspects of the surrogate model construction might limit their capabilities. One alternative is to construct a probability measure, $G(r)$, for the computer response, $r$, based on available data. This 'response-modeling' approach will permit probability estimation for an arbitrary event, $E(r)$, based on the computer response. In this general setting, event probabilities can be computed as $\mathrm{prob}(E)=\int_r I(E(r))\,dG(r)$, where $I$ is the indicator function. Furthermore, one can use $G(r)$ to calculate an induced distribution on a performance measure, $pm$. For prediction problems where the performance measure is a scalar, its distribution $F_{pm}$ is determined by $F_{pm}(z)=\int_r I(pm(r)\le z)\,dG(r)$. We introduce response models for scalar computer output and then generalize the approach to more complicated responses that utilize multiple response models.
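
    The event-probability and induced-distribution integrals above reduce to simple averages when $G(r)$ is approximated by the empirical distribution of sampled responses. Below is a minimal numpy sketch of that idea; the lognormal sample, threshold event, and performance measure are invented stand-ins for real simulation output, not data from the paper.

```python
import numpy as np

# Hypothetical sample of scalar computer responses r_1..r_n; in practice
# these would come from simulation runs (values here are illustrative).
rng = np.random.default_rng(0)
responses = rng.lognormal(mean=1.0, sigma=0.5, size=1000)

# Approximating G(r) by the empirical distribution of the sample turns
# prob(E) = integral of I(E(r)) dG(r) into a mean of the indicator function.
def event_probability(r, event):
    return np.mean(event(r))

# Example event E(r): the response exceeds a threshold of 5.0 (assumed).
p_exceed = event_probability(responses, lambda r: r > 5.0)

# Induced CDF of a scalar performance measure pm(r):
# F_pm(z) = integral of I(pm(r) <= z) dG(r).
def induced_cdf(r, pm, z):
    return np.mean(pm(r) <= z)

F_at_10 = induced_cdf(responses, lambda r: r**2, 10.0)
print(p_exceed, F_at_10)
```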

  6. Computer security incident response team effectiveness: A needs assessment

    NARCIS (Netherlands)

    Kleij, R. van der; Kleinhuis, G.; Young, H.J.

    2017-01-01

    Computer security incident response teams (CSIRTs) respond to a computer security incident when the need arises. Failure of these teams can have far-reaching effects for the economy and national security. CSIRTs often have to work on an ad-hoc basis, in close cooperation with other teams, and in

  7. Analytical predictions of SGEMP response and comparisons with computer calculations

    International Nuclear Information System (INIS)

    de Plomb, E.P.

    1976-01-01

    An analytical formulation for the prediction of SGEMP surface current response is presented. Only two independent dimensionless parameters are required to predict the peak magnitude and rise time of SGEMP-induced surface currents. The analysis applies to limited (high fluence) emission as well as unlimited (low fluence) emission. Cause-effect relationships for SGEMP response are treated quantitatively and yield simple power-law dependencies between several physical variables. Analytical predictions for a large matrix of SGEMP cases are compared with an array of about thirty-five computer solutions of similar SGEMP problems, which were collected from three independent research groups. The theoretical solutions generally agree with the computer solutions about as well as the computer solutions agree with one another. Such comparisons typically show variations of less than a factor of two.

  8. Prerequisites for building a computer security incident response capability

    CSIR Research Space (South Africa)

    Mooi, M

    2015-08-01

    Full Text Available . 1]. 2) Handbook for Computer Security Incident Response Teams (CSIRTs) [18] (CMU-SEI): Providing guidance on building and running a CSIRT, this handbook has a particular focus on the incident handling service [18, p. xv]. In addition, a basic CSIRT... Prerequisites for building a computer...

  9. Computations of nuclear response functions with MACK-IV

    International Nuclear Information System (INIS)

    Abdou, M.A.; Gohar, Y.

    1978-01-01

    The MACK computer program calculates energy pointwise and multigroup nuclear response functions from basic nuclear data in ENDF/B format. The new version of the program, MACK-IV, incorporates major developments and improvements aimed at maximizing the utilization of available nuclear data and ensuring energy conservation in nuclear heating calculations. A new library, MACKLIB-IV, of nuclear response functions was generated in the CTR energy group structure of 171 neutron groups and 36 gamma groups. The library was prepared using MACK-IV and ENDF/B-IV and is suitable for fusion, fusion-fission hybrids, and fission applications

  10. Computations of nuclear response functions with MACK-IV

    Energy Technology Data Exchange (ETDEWEB)

    Abdou, M A; Gohar, Y

    1978-01-01

    The MACK computer program calculates energy pointwise and multigroup nuclear response functions from basic nuclear data in ENDF/B format. The new version of the program, MACK-IV, incorporates major developments and improvements aimed at maximizing the utilization of available nuclear data and ensuring energy conservation in nuclear heating calculations. A new library, MACKLIB-IV, of nuclear response functions was generated in the CTR energy group structure of 171 neutron groups and 36 gamma groups. The library was prepared using MACK-IV and ENDF/B-IV and is suitable for fusion, fusion-fission hybrids, and fission applications.

  11. Computed tomography assessment of early response to neoadjuvant therapy in colon cancer

    DEFF Research Database (Denmark)

    Dam, Claus; Lund-Rasmussen, Vera; Pløen, John

    2015-01-01

    INTRODUCTION: Using multidetector computed tomography, we aimed to assess the early response of neoadjuvant drug therapy for locally advanced colon cancer. METHODS: Computed tomography with IV contrast was acquired from 67 patients before and after up to three cycles of preoperative treatment. All...

  12. New computational method for non-LTE, the linear response matrix

    International Nuclear Information System (INIS)

    Fournier, K.B.; Grasiani, F.R.; Harte, J.A.; Libby, S.B.; More, R.M.; Zimmerman, G.B.

    1998-01-01

    My coauthors have done extensive theoretical and computational calculations that lay the groundwork for a linear response matrix method to calculate non-LTE (local thermodynamic equilibrium) opacities. I will briefly review some of their work and list references. Then I will describe what has been done to utilize this theory to create a computational package to rapidly calculate mild non-LTE emission and absorption opacities suitable for use in hydrodynamic calculations. The opacities are obtained by performing table look-ups on data that has been generated with a non-LTE package. This scheme is currently under development. We can see that it offers a significant computational speed advantage. It is suitable for mild non-LTE, quasi-steady conditions. And it offers a new insertion path for high-quality non-LTE data. Currently, the linear response matrix data file is created using XSN. These data files could be generated by more detailed and rigorous calculations without changing any part of the implementation in the hydro code. The scheme is running in Lasnex and is being tested and developed
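
    As a rough illustration of the table look-up scheme described above, the sketch below interpolates a pre-tabulated opacity grid. The grid ranges and the toy opacity formula are assumptions standing in for data a non-LTE package such as XSN would generate, and scipy's RegularGridInterpolator stands in for whatever interpolation the hydro code actually uses.

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

# Hypothetical pre-tabulated opacity data on a (temperature, density) grid,
# standing in for tables generated offline by a non-LTE package.
temps = np.logspace(0, 3, 50)        # eV, illustrative range
densities = np.logspace(-4, 1, 40)   # g/cc, illustrative range
T, rho = np.meshgrid(temps, densities, indexing="ij")
opacity_table = 1.0 / T**1.5 * (1.0 + rho)   # toy stand-in values

# Interpolate in log space, since opacity tables usually span many decades.
lookup = RegularGridInterpolator(
    (np.log10(temps), np.log10(densities)), np.log10(opacity_table))

def opacity(T_eV, rho_gcc):
    """Fast table look-up replacing an in-line non-LTE calculation."""
    return 10.0 ** lookup([np.log10(T_eV), np.log10(rho_gcc)])[0]

print(opacity(150.0, 0.01))
```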

  13. Visual and psychological stress during computer work in healthy, young females-physiological responses.

    Science.gov (United States)

    Mork, Randi; Falkenberg, Helle K; Fostervold, Knut Inge; Thorud, Hanne Mari S

    2018-05-30

    Among computer workers, visual complaints and neck pain are highly prevalent. This study explores how simulated occupational stressors during computer work, like glare and psychosocial stress, affect physiological responses in young females with normal vision. The study was a within-subject laboratory experiment with a counterbalanced, repeated design. Forty-three females performed four 10-min computer-work sessions with different stress exposures: (1) minimal stress; (2) visual stress (direct glare); (3) psychological stress; and (4) combined visual and psychological stress. Muscle activity and muscle blood flow in trapezius, muscle blood flow in orbicularis oculi, heart rate, blood pressure, blink rate and postural angles were continuously recorded. Immediately after each computer-work session, fixation disparity was measured and a questionnaire regarding perceived workstation lighting and stress was completed. Exposure to direct glare resulted in increased trapezius muscle blood flow, increased blink rate, and forward bending of the head. Psychological stress induced a transient increase in trapezius muscle activity and a more forward-bent posture. Bending forward towards the computer screen was correlated with higher productivity (reading speed), indicating a concentration or stress response. A forward-bent posture was also associated with changes in fixation disparity. Furthermore, during computer work per se, trapezius muscle activity and blood flow, orbicularis oculi muscle blood flow, and heart rate were increased compared to rest. Exposure to glare and psychological stress during computer work was shown to influence the trapezius muscle, posture, and blink rate in young, healthy females with normal binocular vision, but in different ways. Accordingly, both visual and psychological factors must be taken into account when optimizing computer workstations to reduce physiological responses that may cause excessive eyestrain and musculoskeletal load.

  14. Multiple-Choice versus Constructed-Response Tests in the Assessment of Mathematics Computation Skills.

    Science.gov (United States)

    Gadalla, Tahany M.

    The equivalence of multiple-choice (MC) and constructed response (discrete) (CR-D) response formats as applied to mathematics computation at grade levels two to six was tested. The difference between total scores from the two response formats was tested for statistical significance, and the factor structure of items in both response formats was…

  15. Computational mechanics of nonlinear response of shells

    Energy Technology Data Exchange (ETDEWEB)

    Kraetzig, W.B. (Bochum Univ. (Germany, F.R.). Inst. fuer Statik und Dynamik); Onate, E. (Universidad Politecnica de Cataluna, Barcelona (Spain). Escuela Tecnica Superior de Ingenieros de Caminos) (eds.)

    1990-01-01

    Shell structures and their components are utilized in a wide spectrum of engineering fields, ranging from space and aircraft structures, pipes and pressure vessels, through liquid storage tanks, off-shore installations, cooling towers and domes, to the bodywork of motor vehicles. Of continuously increasing importance is their nonlinear behavior, in which large deformations and large rotations are involved as well as nonlinear material properties. The book starts with a survey of nonlinear shell theories from the rigorous point of view of continuum mechanics, this starting point being unavoidable for modern computational concepts. There follows a series of papers on nonlinear, especially unstable, shell responses, which draw computational connections to well-established tools in the field of static and dynamic stability of systems. Several papers are then concerned with new finite element derivations for nonlinear shell problems, and finally a series of authors contribute specific applications, opening a small window onto the above-mentioned wide spectrum. (orig./HP) With 159 figs.

  16. Computational mechanics of nonlinear response of shells

    International Nuclear Information System (INIS)

    Kraetzig, W.B.; Onate, E.

    1990-01-01

    Shell structures and their components are utilized in a wide spectrum of engineering fields, ranging from space and aircraft structures, pipes and pressure vessels, through liquid storage tanks, off-shore installations, cooling towers and domes, to the bodywork of motor vehicles. Of continuously increasing importance is their nonlinear behavior, in which large deformations and large rotations are involved as well as nonlinear material properties. The book starts with a survey of nonlinear shell theories from the rigorous point of view of continuum mechanics, this starting point being unavoidable for modern computational concepts. There follows a series of papers on nonlinear, especially unstable, shell responses, which draw computational connections to well-established tools in the field of static and dynamic stability of systems. Several papers are then concerned with new finite element derivations for nonlinear shell problems, and finally a series of authors contribute specific applications, opening a small window onto the above-mentioned wide spectrum. (orig./HP) With 159 figs

  17. Evaluating and tuning system response in the MFTF-B control and diagnostics computers

    International Nuclear Information System (INIS)

    Palasek, R.L.; Butner, D.N.; Minor, E.G.

    1983-01-01

    The software system running on the Supervisory Control and Diagnostics System (SCDS) of MFTF-B is, for the major part, an event-driven one. Regular, periodic polling of sensors' outputs takes place only at the local level, in the sensors' corresponding local control microcomputers (LCCs). An LCC reports a sensor's value to the supervisory computer only if there was a significant change. This report is passed as a message, routed among and acted upon by a network of applications and systems tasks within the supervisory computer (SCDS). Commands from the operator's console are similarly routed through a network of tasks, but in the opposite direction, to the experiment's hardware. In a network such as this, response time is partially determined by system traffic. Because the hardware of MFTF-B will not be connected to the computer system for another two years, we are using the local control computers to simulate the event-driven traffic that we expect to see during MFTF-B operation. In this paper we show how we are using the simulator to measure and evaluate response, loading, throughput, and utilization of components within the computer system. Measurement of the system under simulation allows us to identify bottlenecks and verify that they have been relieved. We also use the traffic simulators to evaluate prototypes of different algorithms for selected tasks, comparing their responses under the spectrum of traffic intensities

  18. Therapy response evaluation with positron emission tomography-computed tomography.

    Science.gov (United States)

    Segall, George M

    2010-12-01

    Positron emission tomography-computed tomography with F-18-fluorodeoxyglucose is widely used for evaluation of therapy response in patients with solid tumors but has not been as readily adopted in clinical trials because of the variability of acquisition and processing protocols and the absence of universal response criteria. Criteria proposed for clinical trials are difficult to apply in clinical practice, and gestalt impression is probably accurate in individual patients, especially with respect to the presence of progressive disease and complete response. Semiquantitative methods of determining tissue glucose metabolism, such as standard uptake value, can be a useful descriptor for levels of tissue glucose metabolism and changes in response to therapy if technical quality control measures are carefully maintained. The terms partial response, complete response, and progressive disease are best used in clinical trials in which the terms have specific meanings and precise definitions. In clinical practice, it may be better to use descriptive terminology agreed upon by imaging physicians and clinicians in their own practice. Copyright © 2010. Published by Elsevier Inc.

  19. Computer Security Incident Response Team Effectiveness: A Needs Assessment.

    Science.gov (United States)

    Van der Kleij, Rick; Kleinhuis, Geert; Young, Heather

    2017-01-01

    Computer security incident response teams (CSIRTs) respond to a computer security incident when the need arises. Failure of these teams can have far-reaching effects for the economy and national security. CSIRTs often have to work on an ad hoc basis, in close cooperation with other teams, and in time-constrained environments. It could be argued that under these working conditions CSIRTs would be likely to encounter problems. A needs assessment was done to see to what extent this argument holds true. We constructed an incident response needs model to assist in identifying areas that require improvement. We envisioned a model consisting of four assessment categories: Organization, Team, Individual and Instrumental. Central to this is the idea that both problems and needs can have an organizational, team, individual, or technical origin or a combination of these levels. To gather data we conducted a literature review. This resulted in a comprehensive list of challenges and needs that could hinder or improve, respectively, the performance of CSIRTs. Then, semi-structured, in-depth interviews were held with team coordinators and team members of five public and private sector Dutch CSIRTs to ground these findings in practice and to identify gaps between current and desired incident handling practices. This paper presents the findings of our needs assessment and ends with a discussion of potential solutions to problems with performance in incident response.

  20. Computer Security Incident Response Team Effectiveness: A Needs Assessment

    Directory of Open Access Journals (Sweden)

    Rick Van der Kleij

    2017-12-01

    Full Text Available Computer security incident response teams (CSIRTs) respond to a computer security incident when the need arises. Failure of these teams can have far-reaching effects for the economy and national security. CSIRTs often have to work on an ad hoc basis, in close cooperation with other teams, and in time-constrained environments. It could be argued that under these working conditions CSIRTs would be likely to encounter problems. A needs assessment was done to see to what extent this argument holds true. We constructed an incident response needs model to assist in identifying areas that require improvement. We envisioned a model consisting of four assessment categories: Organization, Team, Individual and Instrumental. Central to this is the idea that both problems and needs can have an organizational, team, individual, or technical origin or a combination of these levels. To gather data we conducted a literature review. This resulted in a comprehensive list of challenges and needs that could hinder or improve, respectively, the performance of CSIRTs. Then, semi-structured, in-depth interviews were held with team coordinators and team members of five public and private sector Dutch CSIRTs to ground these findings in practice and to identify gaps between current and desired incident handling practices. This paper presents the findings of our needs assessment and ends with a discussion of potential solutions to problems with performance in incident response.

  1. 78 FR 38949 - Computer Security Incident Coordination (CSIC): Providing Timely Cyber Incident Response

    Science.gov (United States)

    2013-06-28

    ... exposed to various forms of cyber attack. In some cases, attacks can be thwarted through the use of...-3383-01] Computer Security Incident Coordination (CSIC): Providing Timely Cyber Incident Response... systems will be successfully attacked. When a successful attack occurs, the job of a Computer Security...

  2. Implementation of distributed computing system for emergency response and contaminant spill monitoring

    International Nuclear Information System (INIS)

    Ojo, T.O.; Sterling, M.C.Jr.; Bonner, J.S.; Fuller, C.B.; Kelly, F.; Page, C.A.

    2003-01-01

    The availability and use of real-time environmental data greatly enhances emergency response and spill monitoring in coastal and near shore environments. The data would include surface currents, wind speed, wind direction, and temperature. Model predictions (fate and transport) or forensics can also be included. In order to achieve an integrated system suitable for application in spill or emergency response situations, a link is required because this information exists on many different computing platforms. When real-time measurements are needed to monitor a spill, the use of a wide array of sensors and ship-based post-processing methods help reduce the latency in data transfer between field sampling stations and the Incident Command Centre. The common thread linking all these modules is the Transmission Control Protocol/Internet Protocol (TCP/IP), and the result is an integrated distributed computing system (DCS). The in-situ sensors are linked to an onboard computer through the use of a ship-based local area network (LAN) using a submersible device server. The onboard computer serves as both the data post-processor and communications server. It links the field sampling station with other modules, and is responsible for transferring data to the Incident Command Centre. This link is facilitated by a wide area network (WAN) based on wireless broadband communications facilities. This paper described the implementation of the DCS. The test results for the communications link and system readiness were also included. 6 refs., 2 tabs., 3 figs

  3. Computational methods for describing the laser-induced mechanical response of tissue

    Energy Technology Data Exchange (ETDEWEB)

    Trucano, T.; McGlaun, J.M.; Farnsworth, A.

    1994-02-01

    Detailed computational modeling of laser surgery requires treatment of the photoablation of human tissue by high intensity pulses of laser light and the subsequent thermomechanical response of the tissue. Three distinct physical regimes must be considered to accomplish this: (1) the immediate absorption of the laser pulse by the tissue and following tissue ablation, which is dependent upon tissue light absorption characteristics; (2) the near field thermal and mechanical response of the tissue to this laser pulse, and (3) the potential far field (and longer time) mechanical response of witness tissue. Both (2) and (3) are dependent upon accurate constitutive descriptions of the tissue. We will briefly review tissue absorptivity and mechanical behavior, with an emphasis on dynamic loads characteristic of the photoablation process. In this paper our focus will center on the requirements of numerical modeling and the uncertainties of mechanical tissue behavior under photoablation. We will also discuss potential contributions that computational simulations can make in the design of surgical protocols which utilize lasers, for example, in assessing the potential for collateral mechanical damage by laser pulses.

  4. Seismic response computations for a long span bridge

    International Nuclear Information System (INIS)

    McCallen, D.B.

    1994-01-01

    The authors are performing large-scale numerical computations to simulate the earthquake response of a major long-span bridge that crosses the San Francisco Bay. The overall objective of the study is to estimate the response of the bridge to potential large-magnitude earthquakes generated on the nearby San Andreas and Hayward earthquake faults. Generation of a realistic model of the bridge system is complicated by the existence of large pile group foundations that extend deep into soft, saturated clay soils, and by the numerous expansion joints that segment the overall bridge structure. In the current study, advanced, nonlinear, finite element technology is being applied to rigorously model the detailed behavior of the bridge system and to shed light on the influence of the foundations and joints of the bridge

  5. COMPUTATIONAL MODELING OF SIGNALING PATHWAYS MEDIATING CELL CYCLE AND APOPTOTIC RESPONSES TO IONIZING RADIATION MEDIATED DNA DAMAGE

    Science.gov (United States)

    Demonstrated the use of a computational systems biology approach to model dose-response relationships. Also discussed how the biologically motivated dose-response models have only limited reference to the underlying molecular level. Discussed the integration of Computational S...

  6. In Law We Trust? Trusted Computing and Legal Responsibility for Internet Security

    Science.gov (United States)

    Danidou, Yianna; Schafer, Burkhard

    This paper analyses potential legal responses and consequences to the anticipated roll out of Trusted Computing (TC). It is argued that TC constitutes such a dramatic shift in power away from users to the software providers, that it is necessary for the legal system to respond. A possible response is to mirror the shift in power by a shift in legal responsibility, creating new legal liabilities and duties for software companies as the new guardians of internet security.

  7. Computational optimization of biodiesel combustion using response surface methodology

    Directory of Open Access Journals (Sweden)

    Ganji Prabhakara Rao

    2017-01-01

    Full Text Available The present work focuses on optimization of biodiesel combustion phenomena through a parametric approach using response surface methodology. Physical properties of biodiesel play a vital role in accurate simulations of the fuel spray, atomization, combustion, and emission formation processes. Typically, methyl-based biodiesel consists of five main types of esters in its composition: methyl palmitate, methyl oleate, methyl stearate, methyl linoleate, and methyl linolenate. Based on the amounts of methyl esters present, the properties of pongamia biodiesel and its blends were estimated. CONVERGE™ computational fluid dynamics software was used to simulate the fuel spray, turbulence and combustion phenomena. The simulation responses, such as indicated specific fuel consumption, NOx, and soot, were analyzed using design of experiments. Regression equations were developed for each of these responses. The optimum parameters were found to be: compression ratio, 16.75; start of injection, 21.9° before top dead center; and exhaust gas re-circulation, 10.94%. Results have been compared with the baseline case.
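
    A minimal sketch of the response-surface step described in this abstract: fit a full quadratic surface to design-of-experiments data, then minimize it within the experimental region. The design points and response values below are invented for illustration; only the factor ranges echo the abstract.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative DOE data: factors are compression ratio, start of injection
# (deg bTDC) and EGR (%); the response could be ISFC. Values are made up.
X = np.array([[15, 19, 8], [15, 23, 14], [17, 19, 14], [17, 23, 8],
              [16, 21, 11], [18, 21, 11], [16, 25, 11], [16, 21, 5],
              [18, 25, 14], [17, 25, 5], [15, 21, 11]], dtype=float)
y = np.array([212., 208., 206., 209., 204., 205., 207., 210., 206., 211., 209.])

def quad_features(x):
    # Full quadratic model: intercept, linear, squared and interaction terms.
    x1, x2, x3 = x
    return np.array([1, x1, x2, x3, x1*x1, x2*x2, x3*x3, x1*x2, x1*x3, x2*x3])

A = np.array([quad_features(row) for row in X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)   # least-squares regression fit

# Minimize the fitted response surface inside the experimental region.
res = minimize(lambda x: quad_features(x) @ coef,
               x0=[16.5, 22.0, 11.0],
               bounds=[(15, 18), (19, 25), (5, 14)])
print(res.x, res.fun)
```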

  8. Computational methods for coupling microstructural and micromechanical materials response simulations

    Energy Technology Data Exchange (ETDEWEB)

    HOLM,ELIZABETH A.; BATTAILE,CORBETT C.; BUCHHEIT,THOMAS E.; FANG,HUEI ELIOT; RINTOUL,MARK DANIEL; VEDULA,VENKATA R.; GLASS,S. JILL; KNOROVSKY,GERALD A.; NEILSEN,MICHAEL K.; WELLMAN,GERALD W.; SULSKY,DEBORAH; SHEN,YU-LIN; SCHREYER,H. BUCK

    2000-04-01

    Computational materials simulations have traditionally focused on individual phenomena: grain growth, crack propagation, plastic flow, etc. However, real materials behavior results from a complex interplay between phenomena. In this project, the authors explored methods for coupling mesoscale simulations of microstructural evolution and micromechanical response. In one case, massively parallel (MP) simulations for grain evolution and microcracking in alumina stronglink materials were dynamically coupled. In the other, codes for domain coarsening and plastic deformation in CuSi braze alloys were iteratively linked. This program provided the first comparison of two promising ways to integrate mesoscale computer codes. Coupled microstructural/micromechanical codes were applied to experimentally observed microstructures for the first time. In addition to the coupled codes, this project developed a suite of new computational capabilities (PARGRAIN, GLAD, OOF, MPM, polycrystal plasticity, front tracking). The problem of plasticity length scale in continuum calculations was recognized and a solution strategy was developed. The simulations were experimentally validated on stockpile materials.

  9. Towards SSVEP-based, portable, responsive Brain-Computer Interface.

    Science.gov (United States)

    Kaczmarek, Piotr; Salomon, Pawel

    2015-08-01

    A Brain-Computer Interface in a motion control application requires high system responsiveness and accuracy. An SSVEP interface consisting of 2-8 stimuli and a 2-channel EEG amplifier is presented in this paper. The observed stimulus is recognized based on a canonical correlation calculated in a 1-second window, ensuring high interface responsiveness. A threshold classifier with hysteresis (T-H) was proposed for recognition purposes. The obtained results suggest that the T-H classifier significantly increases classifier performance (resulting in an accuracy of 76%, while maintaining an average false-positive detection rate of 2-13% for stimuli other than the observed one, depending on stimulus frequency). It was shown that the parameters of the T-H classifier that maximize the true-positive rate can be estimated by gradient-based search, since a single maximum was observed. Moreover, preliminary results obtained on a test group (N=4) suggest that for the T-H classifier there exists a set of parameters for which the system accuracy is similar to that obtained with a user-trained classifier.
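
    The recognition pipeline described above (canonical correlation over a 1-second window, followed by a threshold classifier with hysteresis) can be sketched as follows. This is an assumed reconstruction, not the authors' code: the sampling rate, harmonic count, and threshold values are illustrative.

```python
import numpy as np
from sklearn.cross_decomposition import CCA

FS = 256          # sampling rate in Hz (assumed)
WINDOW = FS       # 1-second analysis window, as in the paper

def reference_signals(freq, n_samples, fs=FS, harmonics=2):
    """Sine/cosine reference set for one stimulus frequency."""
    t = np.arange(n_samples) / fs
    refs = [f(2 * np.pi * h * freq * t)
            for h in range(1, harmonics + 1) for f in (np.sin, np.cos)]
    return np.column_stack(refs)

def cca_score(eeg, freq):
    """Canonical correlation between a 2-channel EEG window and references."""
    u, v = CCA(n_components=1).fit_transform(eeg, reference_signals(freq, len(eeg)))
    return abs(np.corrcoef(u[:, 0], v[:, 0])[0, 1])

class ThresholdHysteresis:
    """T-H classifier: a stimulus is reported once its score rises above
    t_on and dropped only when it falls below t_off < t_on. Threshold
    values are illustrative; the paper tunes them by gradient search."""
    def __init__(self, t_on=0.55, t_off=0.40):
        self.t_on, self.t_off, self.active = t_on, t_off, None

    def update(self, scores):   # scores: {frequency: canonical correlation}
        if self.active is not None and scores[self.active] >= self.t_off:
            return self.active              # hysteresis keeps current choice
        best = max(scores, key=scores.get)
        self.active = best if scores[best] >= self.t_on else None
        return self.active

# Usage on a stand-in EEG window (random noise, so no stimulus detected).
rng = np.random.default_rng(1)
eeg = rng.standard_normal((WINDOW, 2))
clf = ThresholdHysteresis()
print(clf.update({f: cca_score(eeg, f) for f in (8.0, 10.0, 12.0, 15.0)}))
```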

  10. Computational Fluid Dynamics Simulation of Combustion Instability in Solid Rocket Motor : Implementation of Pressure Coupled Response Function

    OpenAIRE

    S. Saha; D. Chakraborty

    2016-01-01

    Combustion instability in solid propellant rocket motor is numerically simulated by implementing propellant response function with quasi steady homogeneous one dimensional formulation. The convolution integral of propellant response with pressure history is implemented through a user defined function in commercial computational fluid dynamics software. The methodology is validated against literature reported motor test and other simulation results. Computed amplitude of pressure fluctuations ...

  11. Peer-Allocated Instant Response (PAIR): Computational allocation of peer tutors in learning communities

    NARCIS (Netherlands)

    Westera, Wim

    2009-01-01

    Westera, W. (2007). Peer-Allocated Instant Response (PAIR): Computational allocation of peer tutors in learning communities. Journal of Artificial Societies and Social Simulation, http://jasss.soc.surrey.ac.uk/10/2/5.html

  12. Ethical Responsibility Key to Computer Security.

    Science.gov (United States)

    Lynn, M. Stuart

    1989-01-01

    The pervasiveness of powerful computers and computer networks has raised the specter of new forms of abuse and of concomitant ethical issues. Blurred boundaries, hackers, the Computer Worm, ethical issues, and implications for academic institutions are discussed. (MLW)

  13. Ark of Inquiry: Responsible Research and Innovation through Computer-Based Inquiry Learning

    NARCIS (Netherlands)

    Margus Pedaste; Leo Siiman; Bregje de Vries; Mirjam Burget; Tomi Jaakkola; Emanuele Bardone; Meelis Brikker; Mario Mäeots; Marianne Lind; Koen Veermans

    2015-01-01

    Ark of Inquiry is a learning platform that uses a computer-based inquiry learning approach to raise youth awareness to Responsible Research and Innovation (RRI). It is developed in the context of a large-scale European project (http://www.arkofinquiry.eu) and provides young European citizens

  14. Computational biomechanics of bone's responses to dental prostheses - osseointegration, remodeling and resorption

    International Nuclear Information System (INIS)

    Li Wei; Rungsiyakull, Chaiy; Field, Clarice; Lin, Daniel; Zhang Leo; Li Qing; Swain, Michael

    2010-01-01

    Clinical and experimental studies showed that human bone has the ability to remodel itself to better adapt to its biomechanical environment by changing both its material properties and geometry. As a consequence of the rapid development and extensive applications of major dental restorations such as implantation and fixed partial denture (FPD), the effect of bone remodeling on the success of a dental restorative surgery is becoming critical for prosthetic design and pre-surgical assessment. This paper aims to provide a computational biomechanics framework to address dental bone's responses as a result of dental restoration. It explored three important issues of resorption, apposition and osseointegration in terms of remodeling simulation. The published remodeling data in long bones were regulated to drive the computational remodeling prediction for the dental bones by correlating the results to clinical data. It is anticipated that the study will provide a more predictive model of dental bone response and help develop a new design methodology for patient-specific dental prosthetic restoration.

  15. A discrete ordinate response matrix method for massively parallel computers

    International Nuclear Information System (INIS)

    Hanebutte, U.R.; Lewis, E.E.

    1991-01-01

    A discrete ordinate response matrix method is formulated for the solution of neutron transport problems on massively parallel computers. The response matrix formulation eliminates iteration on the scattering source. The nodal matrices which result from the diamond-differenced equations are utilized in a factored form which minimizes memory requirements and significantly reduces the required number of operations. The algorithm utilizes massive parallelism by assigning each spatial node to a processor. The algorithm is accelerated effectively by a synthetic method in which the low-order diffusion equations are also solved by massively parallel red/black iterations. The method has been implemented on a 16k Connection Machine-2, and S8 and S16 solutions have been obtained for fixed-source benchmark problems in X-Y geometry
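
    The red/black acceleration named in the abstract colors grid points like a checkerboard so that every point of one color can be updated simultaneously. Below is a serial numpy sketch of a red/black Gauss-Seidel sweep for a model 2-D diffusion (Poisson) problem; the grid size, source, and sweep count are illustrative, and the vectorized slices stand in for true per-processor parallelism.

```python
import numpy as np

def red_black_gauss_seidel(phi, source, h, sweeps=100):
    """Red/black Gauss-Seidel for -laplacian(phi) = source on a unit square.
    All points of one color can be updated at once, which is what makes
    the scheme suited to massively parallel hardware."""
    for _ in range(sweeps):
        for color in (0, 1):                      # 0 = red, 1 = black
            for i in range(1, phi.shape[0] - 1):
                j0 = 1 + (i + color) % 2          # checkerboard offset
                phi[i, j0:-1:2] = 0.25 * (
                    phi[i - 1, j0:-1:2] + phi[i + 1, j0:-1:2] +
                    phi[i, j0 - 1:-2:2] + phi[i, j0 + 1::2] +
                    h * h * source[i, j0:-1:2])
    return phi

n = 33
phi = np.zeros((n, n))                 # Dirichlet zero boundary
src = np.ones((n, n))                  # uniform fixed source
print(red_black_gauss_seidel(phi, src, h=1.0 / (n - 1))[n // 2, n // 2])
```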

  16. Psychophysiological Assessment Of Fear Experience In Response To Sound During Computer Video Gameplay

    DEFF Research Database (Denmark)

    Garner, Tom Alexander; Grimshaw, Mark

    2013-01-01

    The potential value of a looping biometric feedback system as a key component of adaptive computer video games is significant. Psychophysiological measures are essential to the development of an automated emotion recognition program, capable of interpreting physiological data into models of affect... and systematically altering the game environment in response. This article presents empirical data whose analysis advocates electrodermal activity and electromyography as suitable physiological measures to work effectively within a computer video game-based biometric feedback loop, within which sound...

  17. Predictive computational modeling of the mucosal immune responses during Helicobacter pylori infection.

    Directory of Open Access Journals (Sweden)

    Adria Carbo

    Full Text Available T helper (Th) cells play a major role in the immune response and pathology at the gastric mucosa during Helicobacter pylori infection. There is a limited mechanistic understanding regarding the contributions of CD4+ T cell subsets to gastritis development during H. pylori colonization. We used two computational approaches: ordinary differential equation (ODE)-based and agent-based modeling (ABM) to study the mechanisms underlying cellular immune responses to H. pylori and how CD4+ T cell subsets influenced initiation, progression and outcome of disease. To calibrate the model, in vivo experimentation was performed by infecting C57BL/6 mice intragastrically with H. pylori and assaying immune cell subsets in the stomach and gastric lymph nodes (GLN) on days 0, 7, 14, 30 and 60 post-infection. Our computational model reproduced the dynamics of effector and regulatory pathways in the gastric lamina propria (LP) in silico. Simulation results show the induction of a Th17 response and a dominant Th1 response, together with a regulatory response characterized by high levels of mucosal Treg cells. We also investigated the potential role of peroxisome proliferator-activated receptor γ (PPARγ) activation on the modulation of host responses to H. pylori by using loss-of-function approaches. Specifically, in silico results showed a predominance of Th1 and Th17 cells in the stomach of the cell-specific PPARγ knockout system when compared to the wild-type simulation. Spatio-temporal, object-oriented ABM approaches suggested similar dynamics in induction of host responses showing analogous T cell distributions to ODE modeling and facilitated tracking lesion formation. In addition, sensitivity analysis predicted a crucial contribution of Th1 and Th17 effector responses as mediators of histopathological changes in the gastric mucosa during chronic stages of infection, which were experimentally validated in mice. These integrated immunoinformatics approaches
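
    To make the ODE side of the two modeling approaches concrete, here is a toy system loosely in the spirit of the paper: naive CD4+ T cells differentiate into Th1, Th17, and Treg pools in response to bacterial load, with Tregs suppressing the effectors. All rate constants and the model structure are invented for illustration; only the assay days echo the study.

```python
import numpy as np
from scipy.integrate import solve_ivp

def rhs(t, y):
    # State: naive CD4+ T cells, Th1, Th17, Treg, normalized bacterial load.
    naive, th1, th17, treg, load = y
    k_th1, k_th17, k_treg, k_sup, k_clear = 0.10, 0.06, 0.04, 0.50, 0.02
    d_th1 = k_th1 * naive * load - k_sup * treg * th1 - 0.01 * th1
    d_th17 = k_th17 * naive * load - k_sup * treg * th17 - 0.01 * th17
    d_treg = k_treg * naive * load - 0.01 * treg
    d_naive = -(k_th1 + k_th17 + k_treg) * naive * load + 0.05
    # Logistic pathogen growth, cleared by the effector (Th1/Th17) pools.
    d_load = 0.3 * load * (1 - load) - k_clear * (th1 + th17) * load
    return [d_naive, d_th1, d_th17, d_treg, d_load]

sol = solve_ivp(rhs, (0, 60), [1.0, 0.0, 0.0, 0.0, 0.01], dense_output=True)
for day in (7, 14, 30, 60):                 # assay days from the study
    naive, th1, th17, treg, load = sol.sol(day)
    print(f"day {day:2d}: Th1={th1:.2f} Th17={th17:.2f} Treg={treg:.2f}")
```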

  18. Splitting method for computing coupled hydrodynamic and structural response

    International Nuclear Information System (INIS)

    Ash, J.E.

    1977-01-01

    A numerical method is developed for application to unsteady fluid dynamics problems, in particular to the mechanics following a sudden release of high energy. Solution of the initial compressible flow phase provides input to a power-series method for the incompressible fluid motions. The system is split into spatial and time domains leading to the convergent computation of a sequence of elliptic equations. Two sample problems are solved, the first involving an underwater explosion and the second the response of a nuclear reactor containment shell structure to a hypothetical core accident. The solutions are correlated with experimental data

  19. Computer-aided breast MR image feature analysis for prediction of tumor response to chemotherapy

    International Nuclear Information System (INIS)

    Aghaei, Faranak; Tan, Maxine; Liu, Hong; Zheng, Bin; Hollingsworth, Alan B.; Qian, Wei

    2015-01-01

    Purpose: To identify a new clinical marker based on quantitative kinetic image features analysis and assess its feasibility to predict tumor response to neoadjuvant chemotherapy. Methods: The authors assembled a dataset involving breast MR images acquired from 68 cancer patients before undergoing neoadjuvant chemotherapy. Among them, 25 patients had complete response (CR) and 43 had partial and nonresponse (NR) to chemotherapy based on the response evaluation criteria in solid tumors. The authors developed a computer-aided detection scheme to segment breast areas and tumors depicted on the breast MR images and computed a total of 39 kinetic image features from both tumor and background parenchymal enhancement regions. The authors then applied and tested two approaches to classify between CR and NR cases. The first one analyzed each individual feature and applied a simple feature fusion method that combines classification results from multiple features. The second approach tested an attribute selected classifier that integrates an artificial neural network (ANN) with a wrapper subset evaluator, which was optimized using a leave-one-case-out validation method. Results: In the pool of 39 features, 10 yielded relatively higher classification performance with the areas under receiver operating characteristic curves (AUCs) ranging from 0.61 to 0.78 to classify between CR and NR cases. Using a feature fusion method, the maximum AUC = 0.85 ± 0.05. Using the ANN-based classifier, AUC value significantly increased to 0.96 ± 0.03 (p < 0.01). Conclusions: This study demonstrated that quantitative analysis of kinetic image features computed from breast MR images acquired prechemotherapy has potential to generate a useful clinical marker in predicting tumor response to chemotherapy

  20. Computer-aided breast MR image feature analysis for prediction of tumor response to chemotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Aghaei, Faranak; Tan, Maxine; Liu, Hong; Zheng, Bin, E-mail: Bin.Zheng-1@ou.edu [School of Electrical and Computer Engineering, University of Oklahoma, Norman, Oklahoma 73019 (United States); Hollingsworth, Alan B. [Mercy Women’s Center, Mercy Health Center, Oklahoma City, Oklahoma 73120 (United States); Qian, Wei [Department of Electrical and Computer Engineering, University of Texas, El Paso, Texas 79968 (United States)

    2015-11-15

    Purpose: To identify a new clinical marker based on quantitative kinetic image features analysis and assess its feasibility to predict tumor response to neoadjuvant chemotherapy. Methods: The authors assembled a dataset involving breast MR images acquired from 68 cancer patients before undergoing neoadjuvant chemotherapy. Among them, 25 patients had complete response (CR) and 43 had partial and nonresponse (NR) to chemotherapy based on the response evaluation criteria in solid tumors. The authors developed a computer-aided detection scheme to segment breast areas and tumors depicted on the breast MR images and computed a total of 39 kinetic image features from both tumor and background parenchymal enhancement regions. The authors then applied and tested two approaches to classify between CR and NR cases. The first one analyzed each individual feature and applied a simple feature fusion method that combines classification results from multiple features. The second approach tested an attribute selected classifier that integrates an artificial neural network (ANN) with a wrapper subset evaluator, which was optimized using a leave-one-case-out validation method. Results: In the pool of 39 features, 10 yielded relatively higher classification performance with the areas under receiver operating characteristic curves (AUCs) ranging from 0.61 to 0.78 to classify between CR and NR cases. Using a feature fusion method, the maximum AUC = 0.85 ± 0.05. Using the ANN-based classifier, AUC value significantly increased to 0.96 ± 0.03 (p < 0.01). Conclusions: This study demonstrated that quantitative analysis of kinetic image features computed from breast MR images acquired prechemotherapy has potential to generate a useful clinical marker in predicting tumor response to chemotherapy.
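
    The leave-one-case-out evaluation of the ANN-based classifier described above can be sketched with scikit-learn as follows. The feature matrix is synthetic (matching only the 25 CR / 43 NR case counts), the MLP stands in for the paper's attribute-selected ANN, and the wrapper feature-selection step is omitted for brevity.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import LeaveOneOut
from sklearn.metrics import roc_auc_score
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Synthetic stand-in for the 68-case dataset (25 CR, 43 NR) with a handful
# of kinetic features; the real features come from breast MR segmentation.
rng = np.random.default_rng(42)
n_cr, n_nr, n_feat = 25, 43, 10
X = np.vstack([rng.normal(0.6, 0.8, (n_cr, n_feat)),
               rng.normal(0.0, 1.0, (n_nr, n_feat))])
y = np.r_[np.ones(n_cr), np.zeros(n_nr)]

# Leave-one-case-out validation, as in the paper's second approach.
scores = np.empty(len(y))
for train, test in LeaveOneOut().split(X):
    model = make_pipeline(StandardScaler(),
                          MLPClassifier(hidden_layer_sizes=(8,),
                                        max_iter=1000, random_state=0))
    model.fit(X[train], y[train])
    scores[test] = model.predict_proba(X[test])[:, 1]

print("LOOCV AUC:", roc_auc_score(y, scores))
```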

  1. Correlation of uptake patterns on single-photon emission computed tomography/computed tomography (SPECT/CT) and treatment response in patients with knee pain

    International Nuclear Information System (INIS)

    Koh, Geon; Hwang, Kyung Hoon; Lee, Hae Jin; Kim, Seog Gyun; Lee, Beom Koo

    2016-01-01

    To determine whether treatment response in patients with knee pain could be predicted using uptake patterns on single-photon emission computed tomography/computed tomography (SPECT/CT) images. Ninety-five patients with knee pain who had undergone SPECT/CT were included in this retrospective study. Subjects were divided into three groups: increased focal uptake (FTU), increased irregular tracer uptake (ITU), and no tracer uptake (NTU). A numeric rating scale (NRS-11) assessed pain intensity. We analyzed the association between uptake patterns and treatment response using Pearson's chi-square test and Fisher's exact test. Uptake was quantified from SPECT/CT with region of interest (ROI) counting, and an intraclass correlation coefficient (ICC) calculated agreement. We used Student's t-test to calculate statistically significant differences of counts between groups and the Pearson correlation to measure the relationship between counts and initial NRS-11. Multivariate logistic regression analysis determined which variables were significantly associated with uptake. The FTU group included 32 patients; ITU, 39; and NTU, 24. With conservative management, 64% of patients with increased tracer uptake (TU, both focal and irregular) and 36% with NTU showed a positive response. The conservative treatment response of FTU was better than that of NTU, but did not differ from that of ITU. The conservative treatment response of TU was significantly different from that of NTU (OR 3.1; p = 0.036). A moderate positive correlation was observed between ITU and initial NRS-11. Age and initial NRS-11 significantly predicted uptake. Patients with uptake in their knee(s) on SPECT/CT showed a positive treatment response under conservative treatment

  2. Correlation of uptake patterns on single-photon emission computed tomography/computed tomography (SPECT/CT) and treatment response in patients with knee pain

    Energy Technology Data Exchange (ETDEWEB)

    Koh, Geon; Hwang, Kyung Hoon; Lee, Hae Jin; Kim, Seog Gyun; Lee, Beom Koo [Gachon University Gil Hospital, Incheon (Korea, Republic of)

    2016-06-15

    To determine whether treatment response in patients with knee pain could be predicted using uptake patterns on single-photon emission computed tomography/computed tomography (SPECT/CT) images. Ninety-five patients with knee pain who had undergone SPECT/CT were included in this retrospective study. Subjects were divided into three groups: increased focal uptake (FTU), increased irregular tracer uptake (ITU), and no tracer uptake (NTU). A numeric rating scale (NRS-11) assessed pain intensity. We analyzed the association between uptake patterns and treatment response using Pearson's chi-square test and Fisher's exact test. Uptake was quantified from SPECT/CT with region of interest (ROI) counting, and an intraclass correlation coefficient (ICC) calculated agreement. We used Student's t-test to calculate statistically significant differences of counts between groups and the Pearson correlation to measure the relationship between counts and initial NRS-11. Multivariate logistic regression analysis determined which variables were significantly associated with uptake. The FTU group included 32 patients; ITU, 39; and NTU, 24. With conservative management, 64% of patients with increased tracer uptake (TU, both focal and irregular) and 36% with NTU showed a positive response. The conservative treatment response of FTU was better than that of NTU, but did not differ from that of ITU. The conservative treatment response of TU was significantly different from that of NTU (OR 3.1; p = 0.036). A moderate positive correlation was observed between ITU and initial NRS-11. Age and initial NRS-11 significantly predicted uptake. Patients with uptake in their knee(s) on SPECT/CT showed a positive treatment response under conservative treatment.
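
    The 2x2 association test reported above can be reproduced approximately. The contingency counts below are back-computed from the abstract's percentages (64% response among the 71 TU patients, 36% among the 24 NTU patients) and are therefore assumptions, though they recover the reported odds ratio of about 3.1.

```python
import numpy as np
from scipy.stats import fisher_exact, chi2_contingency

# 2x2 contingency table (assumed counts, reconstructed from percentages):
#                 responder  non-responder
table = np.array([[46, 25],   # TU  (focal + irregular uptake, n = 71)
                  [ 9, 15]])  # NTU (no tracer uptake, n = 24)

odds_ratio, p_fisher = fisher_exact(table)        # Fisher's exact test
chi2, p_chi2, dof, _ = chi2_contingency(table)    # Pearson's chi-square test
print(f"OR = {odds_ratio:.2f}, Fisher p = {p_fisher:.3f}, chi2 p = {p_chi2:.3f}")
```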

  3. Computation of Dielectric Response in Molecular Solids for High Capacitance Organic Dielectrics.

    Science.gov (United States)

    Heitzer, Henry M; Marks, Tobin J; Ratner, Mark A

    2016-09-20

    The dielectric response of a material is central to numerous processes spanning the fields of chemistry, materials science, biology, and physics. Despite this broad importance across these disciplines, describing the dielectric environment of a molecular system at the level of first-principles theory and computation remains a great challenge and is of importance to understand the behavior of existing systems as well as to guide the design and synthetic realization of new ones. Furthermore, with recent advances in molecular electronics, nanotechnology, and molecular biology, it has become necessary to predict the dielectric properties of molecular systems that are often difficult or impossible to measure experimentally. In these scenarios, it would be highly desirable to be able to determine dielectric response through efficient, accurate, and chemically informative calculations. A good example of where theoretical modeling of dielectric response would be valuable is in the development of high-capacitance organic gate dielectrics for unconventional electronics such as those that could be fabricated by high-throughput printing techniques. Gate dielectrics are fundamental components of all transistor-based logic circuitry, and the combination of high dielectric constant and nanoscopic thickness (i.e., high capacitance) is essential to achieving high switching speeds and low power consumption. Molecule-based dielectrics offer the promise of cheap, flexible, and mass-producible electronics when used in conjunction with unconventional organic or inorganic semiconducting materials to fabricate organic field effect transistors (OFETs). The molecular dielectrics developed to date typically have limited dielectric response, which results in low capacitances, translating into poor performance of the resulting OFETs. Furthermore, the development of better performing dielectric materials has been hindered by the current highly empirical and labor-intensive pace of synthetic

  4. Experimental and computational analysis of pressure response in a multiphase flow loop

    Science.gov (United States)

    Morshed, Munzarin; Amin, Al; Rahman, Mohammad Azizur; Imtiaz, Syed

    2016-07-01

    The characteristics of multiphase fluid flow in pipes are useful to understand the fluid mechanics encountered in the oil and gas industries. Present-day oil and gas exploration is increasingly moving to subsea operations in deep-sea and arctic conditions. During the transport of petroleum products, understanding the fluid dynamics inside the pipe network is important for flow assurance. In this case, information regarding static and dynamic pressure response, pressure loss, optimum flow rate, pipe diameter, etc. comprises the important parameters for flow assurance. The principal aim of this research is to present computational and experimental analyses of multiphase (L/G) flow in a pipe network. The computational study considers a two-phase fluid flow through a horizontal flow loop at different Reynolds numbers in order to determine the pressure distribution and frictional pressure loss profiles by the volume of fluid (VOF) method. The numerical simulations are validated against the experimental data. The experiment is conducted in a 76.20 mm ID transparent circular pipe using water and air in the flow loop. Static pressure transducers are used to measure the local pressure response in the multiphase pipeline.
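
    For context on the frictional pressure-loss measurements described above, a single-phase baseline is often computed from the Darcy-Weisbach equation; a sketch using the Blasius smooth-pipe friction factor is shown below. The pipe diameter comes from the abstract, while the velocities, pipe length, and water properties are assumed.

```python
def darcy_pressure_drop(v, d, length, rho=998.0, mu=1.0e-3):
    """Single-phase frictional pressure drop (Pa) from Darcy-Weisbach,
    with laminar f = 64/Re or the turbulent Blasius correlation; a
    first-cut baseline against which two-phase loop data can be compared."""
    re = rho * v * d / mu                       # Reynolds number
    f = 64.0 / re if re < 2300.0 else 0.316 * re ** -0.25
    return f * (length / d) * 0.5 * rho * v * v

# 76.20 mm ID pipe from the experiment; flow velocities and the 10 m
# length are assumptions for illustration.
d = 0.0762  # m
for v in (0.5, 1.0, 2.0):  # superficial water velocities, m/s (assumed)
    print(f"v = {v} m/s: dP = {darcy_pressure_drop(v, d, length=10.0):.1f} Pa")
```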

  5. INTRANS. A computer code for the non-linear structural response analysis of reactor internals under transient loads

    International Nuclear Information System (INIS)

    Ramani, D.T.

    1977-01-01

    The 'INTRANS' system is a general-purpose computer code designed to perform linear and non-linear structural stress and deflection analysis of impacting or non-impacting nuclear reactor internals components coupled with the reactor vessel, shield building, and external as well as internal gapped spring support systems. This paper describes a unique computational procedure for evaluating the dynamic response of reactor internals, discretised as a beam and lumped-mass structural system and subjected to external transient loads such as seismic and LOCA time-history forces. The computational procedure is outlined in the INTRANS code, which computes component flexibilities of a discrete lumped-mass planar model of reactor internals by idealising an assemblage of finite elements consisting of linear elastic beams with bending, torsional and shear stiffnesses, interacting with an external or internal, linear as well as non-linear, multi-gapped spring support system. The method of analysis is based on the displacement method, and the code uses the fourth-order Runge-Kutta numerical integration technique as a basis for the solution of the dynamic equilibrium equations of motion for the system. During the computing process, the dynamic response of each lumped mass is calculated at a specific instant of time using a well-known step-by-step procedure. At any instant of time, the transient dynamic motions of the system are held stationary, based on the predicted motions and internal forces of the previous instant, from which the complete response at any time step of interest may then be computed. Using this iterative process, the relationship between motions and internal forces is satisfied step by step throughout the time interval
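
    The step-by-step integration described above, fourth-order Runge-Kutta applied to a lumped mass with a gapped (non-linear) support, can be illustrated in miniature. This is not INTRANS itself: the single-degree-of-freedom model, stiffnesses, gap, and load pulse are all invented for the sketch.

```python
import numpy as np

def gapped_spring_force(x, gap=0.01, k=1.0e6):
    """Support that engages only after the gap closes (non-linear)."""
    if x > gap:
        return -k * (x - gap)
    if x < -gap:
        return -k * (x + gap)
    return 0.0

def rk4_step(f, t, y, dt):
    """Classical fourth-order Runge-Kutta step for y' = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + dt / 2, y + dt / 2 * k1)
    k3 = f(t + dt / 2, y + dt / 2 * k2)
    k4 = f(t + dt, y + dt * k3)
    return y + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

# Single lumped mass on a linear beam stiffness plus a gapped support,
# driven by an illustrative decaying transient load (all values assumed).
m, k_beam = 100.0, 5.0e5

def load(t):
    return 2.0e4 * np.exp(-t / 0.05)   # LOCA-like pressure pulse

def f(t, y):
    x, v = y
    a = (load(t) + gapped_spring_force(x) - k_beam * x) / m
    return np.array([v, a])

y, dt = np.array([0.0, 0.0]), 1.0e-4
for step in range(2000):               # integrate 0.2 s of response
    y = rk4_step(f, step * dt, y, dt)
print("displacement at 0.2 s:", y[0])
```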

  6. Towards an integrative computational model for simulating tumor growth and response to radiation therapy

    Science.gov (United States)

    Marrero, Carlos Sosa; Aubert, Vivien; Ciferri, Nicolas; Hernández, Alfredo; de Crevoisier, Renaud; Acosta, Oscar

    2017-11-01

    Understanding the response to irradiation in cancer radiotherapy (RT) may help devise new strategies with improved tumor local control. Computational models may help unravel the underlying radiosensitivity mechanisms intervening in the dose-response relationship. By using extensive simulations, a wide range of parameters may be evaluated, providing insights on tumor response and thus generating useful data to plan modified treatments. We propose in this paper a computational model of tumor growth and radiation response which allows a whole RT protocol to be simulated. Proliferation of tumor cells, the cell life-cycle, oxygen diffusion, radiosensitivity, RT response and resorption of killed cells were implemented in a multiscale framework. The model was developed in C++, using the Multi-formalism Modeling and Simulation Library (M2SL). Radiosensitivity parameters extracted from the literature enabled us to simulate prostate tissue on a regular (voxel-wise) grid. Histopathological specimens with different aggressiveness levels extracted from patients after prostatectomy were used to initialize the in silico simulations. Results on tumor growth exhibit good agreement with data from in vitro studies. Moreover, a standard fractionation of 2 Gy/fraction, to a total dose of 80 Gy as in a real RT treatment, was applied with varying radiosensitivity and oxygen diffusion parameters. As expected, the high influence of these parameters was observed by measuring the percentage of surviving tumor cells after RT. This work paves the way to further models allowing increased doses to be simulated in modified hypofractionated schemes and new patient-specific combined therapies to be developed.
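
    A common way to model the radiation-response step of such simulations is the linear-quadratic (LQ) survival model applied per fraction, sketched below for the 80 Gy / 2 Gy-per-fraction schedule mentioned in the abstract. The radiosensitivity values and the oxygen modification factor are illustrative assumptions, not the paper's calibrated parameters.

```python
import numpy as np

def surviving_fraction(dose_per_fraction, n_fractions, alpha, beta, omf=1.0):
    """Linear-quadratic (LQ) cell survival after a fractionated course.
    omf scales the effective dose to mimic reduced radiosensitivity under
    hypoxia (an oxygen modification factor); all values are illustrative."""
    d = dose_per_fraction * omf
    per_fraction = np.exp(-(alpha * d + beta * d * d))
    return per_fraction ** n_fractions

# Standard schedule from the abstract: 40 fractions x 2 Gy = 80 Gy.
for omf in (1.0, 0.7, 0.5):     # well-oxygenated -> increasingly hypoxic
    sf = surviving_fraction(2.0, 40, alpha=0.15, beta=0.048, omf=omf)
    print(f"OMF {omf:.1f}: surviving fraction {sf:.2e}")
```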

  7. Impaired Expected Value Computations Coupled With Overreliance on Stimulus-Response Learning in Schizophrenia.

    Science.gov (United States)

    Hernaus, Dennis; Gold, James M; Waltz, James A; Frank, Michael J

    2018-04-03

    While many have emphasized impaired reward prediction error signaling in schizophrenia, multiple studies suggest that some decision-making deficits may arise from overreliance on stimulus-response systems together with a compromised ability to represent expected value. Guided by computational frameworks, we formulated and tested two scenarios in which maladaptive representations of expected value should be most evident, thereby delineating conditions that may evoke decision-making impairments in schizophrenia. In a modified reinforcement learning paradigm, 42 medicated people with schizophrenia and 36 healthy volunteers learned to select the most frequently rewarded option in a 75-25 pair: once when presented with a more deterministic (90-10) pair and once when presented with a more probabilistic (60-40) pair. Novel and old combinations of choice options were presented in a subsequent transfer phase. Computational modeling was employed to elucidate contributions from stimulus-response systems (actor-critic) and expected value (Q-learning). People with schizophrenia showed robust performance impairments with increasing value difference between two competing options, which strongly correlated with decreased contributions from expected value-based learning (Q-learning). Moreover, a subtle yet consistent contextual choice bias for the probabilistic 75 option was present in people with schizophrenia, which could be accounted for by a context-dependent reward prediction error in the actor-critic. We provide evidence that decision-making impairments in schizophrenia increase monotonically with demands placed on expected value computations. A contextual choice bias is consistent with overreliance on stimulus-response learning, which may signify a deficit secondary to the maladaptive representation of expected value. These results shed new light on conditions under which decision-making impairments may arise. Copyright © 2018 Society of Biological Psychiatry.
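    For readers unfamiliar with the two modeled systems, the following toy sketch contrasts the textbook Q-learning and actor-critic updates on a 75-25 option pair; it is illustrative only and not the authors' fitted model (the learning rate, softmax temperature, and trial count are arbitrary).

```python
import numpy as np
rng = np.random.default_rng(0)

# Hypothetical two-option pair: option A rewarded 75% of trials, B 25%.
p_reward = {"A": 0.75, "B": 0.25}
alpha = 0.1  # learning rate

Q = {"A": 0.0, "B": 0.0}   # Q-learning: expected value of each option
w = {"A": 0.0, "B": 0.0}   # actor-critic: actor's action weights
V = 0.0                    # actor-critic: critic's state value

def softmax_choice(values, beta=3.0):
    keys = list(values)
    x = np.array([values[k] for k in keys]) * beta
    p = np.exp(x - x.max()); p /= p.sum()
    return rng.choice(keys, p=p)

for _ in range(1000):
    # Q-learning: the prediction error compares reward to the chosen
    # option's own value, so choices track expected value directly.
    a = softmax_choice(Q)
    r = float(rng.random() < p_reward[a])
    Q[a] += alpha * (r - Q[a])

    # Actor-critic: the prediction error compares reward to the *state*
    # value, so the same reward can teach differently depending on the
    # context in which the option appears (the bias discussed above).
    a = softmax_choice(w)
    r = float(rng.random() < p_reward[a])
    delta = r - V
    V += alpha * delta
    w[a] += alpha * delta

print("Q-values (expected value):", {k: round(v, 2) for k, v in Q.items()})
print("actor weights (S-R habit):", {k: round(v, 2) for k, v in w.items()})
```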

  8. Impulse-response analysis of planar computed tomography for nondestructive test

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Dae Cheon; Kim, Seung Ho; Kim, Ho Kyung [KAERI, Daejeon (Korea, Republic of)

    2016-05-15

    It has been reported that the use of radiation imaging such as digital radiography, computed tomography (CT), and digital tomosynthesis (DTS) for nondestructive testing (NDT) is spreading widely. These methods have merits and demerits of their own in terms of image quality and inspection speed; images used for NDT should therefore have acceptable quality while being acquired at high speed. In this study, we quantitatively evaluate the impulse responses of images reconstructed with filtered backprojection (FBP), which is the most widely used method in planar computed tomography (pCT) systems. We first evaluate image performance metrics related to contrast and depth resolution, and then we design a figure of merit combining image performance with system parameters such as tube load and reconstruction speed. The final goal of this study is the application of these methods to nondestructive testing, which requires further work: first, obtaining the ASF (artifact spread function) results for various numbers of views; second, analyzing the modulation transfer function, noise power spectrum, and detective quantum efficiency for various angular ranges and numbers of views.
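    The impulse response of an FBP reconstruction can be probed directly by reconstructing a point phantom, as in this rough sketch; it uses scikit-image's radon/iradon as a stand-in for the authors' pCT geometry, and the degradation with fewer views mirrors the angular-range analysis proposed above.

```python
import numpy as np
from skimage.transform import radon, iradon

# Impulse phantom: a single bright pixel at the center of the field.
n = 128
phantom = np.zeros((n, n))
phantom[n // 2, n // 2] = 1.0

for n_views in (180, 45, 15):
    theta = np.linspace(0.0, 180.0, n_views, endpoint=False)
    sino = radon(phantom, theta=theta)
    recon = iradon(sino, theta=theta, filter_name="ramp")
    # The reconstruction of a point source *is* the impulse response;
    # its peak drops and its energy spreads as the number of views falls.
    spread = np.abs(recon).sum() - np.abs(recon).max()
    print(f"{n_views:3d} views: peak={recon.max():.3f}, "
          f"energy outside peak={spread:.2f}")
```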

  9. The quantitative assessment of peri-implant bone responses using histomorphometry and micro-computed tomography.

    NARCIS (Netherlands)

    Schouten, C.; Meijer, G.J.; Beucken, J.J.J.P van den; Spauwen, P.H.M.; Jansen, J.A.

    2009-01-01

    In the present study, the effects of implant design and surface properties on peri-implant bone response were evaluated with both conventional histomorphometry and micro-computed tomography (micro-CT), using two geometrically different dental implants (Screw type, St; Push-in, Pi) either or not

  10. Molecular Imaging and Precision Medicine: PET/Computed Tomography and Therapy Response Assessment in Oncology.

    Science.gov (United States)

    Sheikhbahaei, Sara; Mena, Esther; Pattanayak, Puskar; Taghipour, Mehdi; Solnes, Lilja B; Subramaniam, Rathan M

    2017-01-01

    A variety of methods have been developed to assess tumor response to therapy. Standardized qualitative criteria based on 18F-fluoro-deoxyglucose PET/computed tomography have been proposed to evaluate treatment effectiveness in specific cancers, allowing more accurate therapy response assessment and survival prognostication. Multiple studies have addressed the utility of volumetric PET biomarkers as prognostic indicators, but there is no consensus about the preferred segmentation methodology for these metrics. Heterogeneous intratumoral uptake has been proposed as a novel PET metric for therapy response assessment. PET imaging techniques will be used to study the biological behavior of cancers during therapy. Copyright © 2016 Elsevier Inc. All rights reserved.

  11. Coupling of EIT with computational lung modeling for predicting patient-specific ventilatory responses.

    Science.gov (United States)

    Roth, Christian J; Becher, Tobias; Frerichs, Inéz; Weiler, Norbert; Wall, Wolfgang A

    2017-04-01

    Providing optimal personalized mechanical ventilation for patients with acute or chronic respiratory failure remains a case-by-case challenge in the clinical setting. In this article, we integrate electrical impedance tomography (EIT) monitoring into a powerful patient-specific computational lung model to create an approach for personalizing protective ventilatory treatment. The underlying computational lung model is based on a single computed tomography scan and is able to predict global airflow quantities, as well as local tissue aeration and strains, for any ventilation maneuver. For validation, a novel "virtual EIT" module is added to our computational lung model, allowing EIT images to be simulated based on the patient's thorax geometry and the results of our numerically predicted tissue aeration. Clinically measured EIT images are not used to calibrate the computational model; thus they provide an independent method to validate the computational predictions at high temporal resolution. The performance of this coupling approach has been tested in an example patient with acute respiratory distress syndrome. The method shows good agreement between computationally predicted and clinically measured airflow data and EIT images. These results imply that the proposed framework can be used for numerical prediction of patient-specific responses to certain therapeutic measures before applying them to an actual patient. In the long run, the definition of patient-specific optimal ventilation protocols might be assisted by computational modeling. NEW & NOTEWORTHY In this work, we present a patient-specific computational lung model that is able to predict global and local ventilatory quantities for a given patient and any selected ventilation protocol. For the first time, such a predictive lung model is equipped with a virtual electrical impedance tomography module allowing real-time validation of the computed results with the patient measurements. First promising results

  12. On computing the geoelastic response to a disk load

    Science.gov (United States)

    Bevis, M.; Melini, D.; Spada, G.

    2016-06-01

    We review the theory of the Earth's elastic and gravitational response to a surface disk load. The solutions for displacement of the surface and the geoid are developed using expansions of Legendre polynomials, their derivatives and the load Love numbers. We provide a MATLAB function called diskload that computes the solutions for both uncompensated and compensated disk loads. In order to numerically implement the Legendre expansions, it is necessary to choose a harmonic degree, nmax, at which to truncate the series used to construct the solutions. We present a rule of thumb (ROT) for choosing an appropriate value of nmax, describe the consequences of truncating the expansions prematurely and provide a means to judiciously violate the ROT when that becomes a practical necessity.
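    As a rough illustration of the truncation issue (this is not the published MATLAB diskload function, which implements the full theory), the sketch below sums a Legendre series for a spherical-cap load using made-up, degree-independent load Love numbers; only the convergence behavior with nmax is the point.

```python
import numpy as np

def legendre_all(nmax, x):
    """P_0 .. P_nmax at scalar x, via the stable three-term recurrence."""
    p = np.empty(nmax + 1)
    p[0] = 1.0
    if nmax >= 1:
        p[1] = x
    for n in range(1, nmax):
        p[n + 1] = ((2 * n + 1) * x * p[n] - n * p[n - 1]) / (n + 1)
    return p

def vertical_response(theta, alpha, h_load, nmax):
    """Truncated Legendre series for a (scaled) vertical displacement at
    angular distance theta from the center of a uniform disk load of
    angular radius alpha. h_load holds hypothetical load Love numbers
    h'_n; the degree-0 term is omitted for simplicity."""
    pa = legendre_all(nmax + 1, np.cos(alpha))  # for the cap coefficients
    pt = legendre_all(nmax, np.cos(theta))      # for the evaluation point
    n = np.arange(1, nmax + 1)
    sigma = (pa[:-2] - pa[2:]) / (2 * n + 1)    # P_{n-1}(ca) - P_{n+1}(ca)
    return np.sum(sigma * h_load[1:nmax + 1] * pt[1:])

alpha, theta = np.radians(1.0), np.radians(0.5)  # 1-deg disk, point inside it
h_load = -6.0 * np.ones(50001)                   # made-up, flat h'_n spectrum
for nmax in (100, 1000, 10000, 50000):           # premature vs. ample truncation
    u = vertical_response(theta, alpha, h_load, nmax)
    print(f"nmax={nmax:6d}: scaled displacement = {u:.4f}")
```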

  13. Biomaterials and computation: a strategic alliance to investigate emergent responses of neural cells.

    Science.gov (United States)

    Sergi, Pier Nicola; Cavalcanti-Adam, Elisabetta Ada

    2017-03-28

    Topographical and chemical cues drive migration, outgrowth and regeneration of neurons in different and crucial biological conditions. In the natural extracellular matrix, their influences are so closely coupled that they result in complex cellular responses. As a consequence, engineered biomaterials are widely used to simplify in vitro conditions, disentangling intricate in vivo behaviours and narrowing the investigation to particular emergent responses. Nevertheless, how topographical and chemical cues affect the emergent response of neural cells is still unclear, so in silico models are used as additional tools to reproduce and investigate the interactions between cells and engineered biomaterials. This work presents the synergistic use of biomaterials-based experiments and computation as a strategic way to promote the discovery of complex neural responses and to allow the interactions between cells and biomaterials to be quantitatively investigated, fostering a rational design of experiments.

  14. A noninvasive brain computer interface using visually-induced near-infrared spectroscopy responses.

    Science.gov (United States)

    Chen, Cheng-Hsuan; Ho, Ming-Shan; Shyu, Kuo-Kai; Hsu, Kou-Cheng; Wang, Kuo-Wei; Lee, Po-Lei

    2014-09-19

    Visually-induced near-infrared spectroscopy (NIRS) responses were utilized to design a brain computer interface (BCI) system. Four circular checkerboards driven by distinct flickering sequences were displayed on an LCD screen as visual stimuli to induce the subjects' NIRS responses. Each flickering sequence was a concatenation of alternating flickering segments and resting segments. The flickering segment was designed with a fixed duration of 3 s, whereas the resting segment was chosen randomly within 15-20 s to create mutual independence among the different flickering sequences. Six subjects were recruited in this study, and the subjects were requested to gaze at the four visual stimuli one after another in a random order. Since visual responses in the human brain are time-locked to the onsets of visual stimuli, and the flicker sequences of distinct visual stimuli were designed to be mutually independent, the NIRS responses induced by the user's gazed target can be discerned from non-gazed targets by applying a simple averaging process. The accuracies for the six subjects were higher than 90% after 10 or more epochs were averaged. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
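    A toy version of the averaging idea, with a synthetic channel, an assumed 10 Hz sampling rate, and an invented response shape: time-locked averaging over onsets of the gazed stimulus reinforces its response, while the independent timing of the other stimuli averages their contributions away.

```python
import numpy as np
rng = np.random.default_rng(1)

fs = 10.0                      # Hz; an assumed NIRS sampling rate
t = np.arange(0, 300, 1 / fs)  # five minutes of recording
# Toy hemodynamic-like response shape (zero before the onset).
hrf = lambda tau: np.clip(tau, 0, None) * np.exp(-np.clip(tau, 0, None))

# Mutually independent onset trains for the gazed and a non-gazed stimulus.
onsets_gazed = np.array([5, 25, 48, 70, 95, 120, 150, 178, 205, 240])
onsets_other = np.array([12, 33, 55, 80, 104, 131, 160, 188, 215, 250])

signal = np.zeros_like(t)
for on in onsets_gazed:        # only the gazed stimulus drives this channel
    signal += hrf(t - on)
signal += 0.5 * rng.standard_normal(t.size)   # measurement noise

def epoch_average(sig, onsets, fs, win=10.0):
    """Average fixed-length windows time-locked to stimulus onsets."""
    n = int(win * fs)
    epochs = [sig[int(o * fs): int(o * fs) + n] for o in onsets]
    return np.mean(epochs, axis=0)

# Averaging locked to the gazed onsets keeps the response; averaging
# locked to the other stimulus's onsets washes it out.
print("peak, locked to gazed onsets :", epoch_average(signal, onsets_gazed, fs).max())
print("peak, locked to other onsets :", epoch_average(signal, onsets_other, fs).max())
```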

  15. Computer-assisted detection (CAD) methodology for early detection of response to pharmaceutical therapy in tuberculosis patients

    Science.gov (United States)

    Lieberman, Robert; Kwong, Heston; Liu, Brent; Huang, H. K.

    2009-02-01

    The chest x-ray radiological features of tuberculosis patients are well documented, and the radiological features that change in response to successful pharmaceutical therapy can be followed with longitudinal studies over time. The patients can also be classified as either responsive or resistant to pharmaceutical therapy based on clinical improvement. We have retrospectively collected time series chest x-ray images of 200 patients diagnosed with tuberculosis receiving the standard pharmaceutical treatment. Computer algorithms can be created to utilize image texture features to assess the temporal changes in the chest x-rays of the tuberculosis patients. This methodology provides a framework for a computer-assisted detection (CAD) system that may provide physicians with the ability to detect poor treatment response earlier in pharmaceutical therapy. Early detection allows physicians to respond with more timely treatment alternatives and improved outcomes. Such a system has the potential to increase treatment efficacy for millions of patients each year.
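    One plausible realization of the texture-feature idea (the abstract does not specify which features are used) is gray-level co-occurrence statistics tracked across the longitudinal series; the sketch below computes a few such features with scikit-image on synthetic stand-in patches.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

rng = np.random.default_rng(0)
# Stand-ins for a baseline and a follow-up chest x-ray patch (8-bit).
before = rng.integers(0, 256, (128, 128), dtype=np.uint8)
after = (before * 0.7 + 30).astype(np.uint8)   # smoother follow-up patch

def texture_features(img):
    """A few GLCM texture features of the kind a CAD system could
    track over a longitudinal series of chest x-rays."""
    glcm = graycomatrix(img, distances=[1], angles=[0], levels=256,
                        symmetric=True, normed=True)
    return {p: float(graycoprops(glcm, p)[0, 0])
            for p in ("contrast", "homogeneity", "energy")}

print("baseline :", texture_features(before))
print("follow-up:", texture_features(after))
# A responsive patient would show a consistent temporal trend in such
# features; a classifier over those trends could flag poor response early.
```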

  16. Computation Offloading for Frame-Based Real-Time Tasks under Given Server Response Time Guarantees

    Directory of Open Access Journals (Sweden)

    Anas S. M. Toma

    2014-11-01

    Full Text Available Computation offloading has been adopted to improve the performance of embedded systems by offloading the computation of some tasks, especially computation-intensive tasks, to servers or clouds. This paper explores computation offloading for real-time tasks in embedded systems, given response-time guarantees from the servers, to decide which tasks should be offloaded to get the results in time. We consider frame-based real-time tasks with the same period and relative deadline. When the execution order of the tasks is given, the problem can be solved in linear time. However, when the execution order is not specified, we prove that the problem is NP-complete. We develop a pseudo-polynomial-time algorithm for deriving feasible schedules, if they exist. An approximation scheme is also developed to trade off the error of the algorithm against its complexity. Our algorithms are extended to minimize the period/relative deadline of the tasks for performance maximization. The algorithms are evaluated with a case study for a surveillance system and synthesized benchmarks.
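    To fix ideas, here is a brute-force toy version of the offloading decision under a simplified timing model (the device is busy only while sending, and a flat server response-time guarantee applies to every offloaded task; all task times are hypothetical). The paper's actual contributions are the linear-time rule for fixed orders and the pseudo-polynomial-time algorithm, which this sketch does not implement.

```python
from itertools import product

def min_frame_completion(tasks, response):
    """Toy offloading decision for frame-based tasks in a fixed order.
    tasks: list of (local_time, send_time); response: the server's
    response-time guarantee, counted from the end of a task's upload.
    Returns the smallest frame completion time over all offload subsets
    (O(2^n) enumeration, purely for clarity)."""
    best = float("inf")
    for choice in product((0, 1), repeat=len(tasks)):  # 1 = offload
        t, finish = 0.0, 0.0
        for (c, s), off in zip(tasks, choice):
            if off:
                t += s                         # device busy only while sending
                finish = max(finish, t + response)  # result arrives later
            else:
                t += c                         # device computes the task itself
                finish = max(finish, t)
        best = min(best, finish)
    return best

tasks = [(5.0, 1.0), (8.0, 2.0), (3.0, 2.5), (6.0, 1.5)]  # hypothetical times
print("best frame completion time:", min_frame_completion(tasks, response=6.0))
```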

  17. Computer-mediated communication and time pressure induce higher cardiovascular responses in the preparatory and execution phases of cooperative tasks.

    Science.gov (United States)

    Costa Ferrer, Raquel; Serrano Rosa, Miguel Ángel; Zornoza Abad, Ana; Salvador Fernández-Montejo, Alicia

    2010-11-01

    The cardiovascular (CV) response to social challenge and stress is associated with the etiology of cardiovascular diseases. New ways of communication, time pressure, and different types of information are common in our society. In this study, the cardiovascular response to two different tasks (open vs. closed information) was examined employing different communication channels (computer-mediated vs. face-to-face) and different pace control (self vs. external). Our results indicate that there was a higher CV response in the computer-mediated condition, on the closed information task, and in the externally paced condition. The role of these factors should be considered when studying the consequences of social stress and their underlying mechanisms.

  18. Pacing a data transfer operation between compute nodes on a parallel computer

    Science.gov (United States)

    Blocksome, Michael A [Rochester, MN

    2011-09-13

    Methods, systems, and products are disclosed for pacing a data transfer between compute nodes on a parallel computer that include: transferring, by an origin compute node, a chunk of an application message to a target compute node; sending, by the origin compute node, a pacing request to a target direct memory access (`DMA`) engine on the target compute node using a remote get DMA operation; determining, by the origin compute node, whether a pacing response to the pacing request has been received from the target DMA engine; and transferring, by the origin compute node, a next chunk of the application message if the pacing response to the pacing request has been received from the target DMA engine.

  19. The position of a standard optical computer mouse affects cardiorespiratory responses during the operation of a computer under time constraints.

    Science.gov (United States)

    Sako, Shunji; Sugiura, Hiromichi; Tanoue, Hironori; Kojima, Makoto; Kono, Mitsunobu; Inaba, Ryoichi

    2014-08-01

    This study investigated the association between task-induced stress and fatigue by examining the cardiovascular responses of subjects using different mouse positions while operating a computer under time constraints. Sixteen young, healthy men participated in the study, which examined the use of optical mouse devices affixed to laptop computers. Two mouse positions were investigated: (1) the distal position (DP), in which the subjects place their forearms on the desk accompanied by the abduction and flexion of their shoulder joints, and (2) the proximal position (PP), in which the subjects place only their wrists on the desk without using an armrest. The subjects continued each task for 16 min. We assessed differences in several characteristics according to mouse position, including expired gas values, autonomic nerve activities (based on cardiorespiratory responses), operating efficiencies (based on word counts), and fatigue levels (based on the visual analog scale - VAS). Oxygen consumption (VO(2)), the ratio of inspiration time to respiration time (T(i)/T(total)), respiratory rate (RR), minute ventilation (VE), and the ratio of expiration to inspiration (Te/T(i)) were significantly lower when the participants performed the task in the DP than in the PP. Tidal volume (VT), carbon dioxide output rates (VCO(2)/VE), and oxygen extraction fractions (VO(2)/VE) were significantly higher for the DP than for the PP. No significant difference in VAS was observed between the positions; however, as the task progressed, autonomic nerve activities were lower and operating efficiencies were significantly higher for the DP than for the PP. Our results suggest that the DP has fewer effects on cardiorespiratory functions, causes lower levels of sympathetic nerve activity and mental stress, and produces a higher total workload than the PP. This suggests that the DP is preferable to the PP when operating a computer.

  20. Computational systems biology and dose-response modeling in relation to new directions in toxicity testing.

    Science.gov (United States)

    Zhang, Qiang; Bhattacharya, Sudin; Andersen, Melvin E; Conolly, Rory B

    2010-02-01

    The new paradigm envisioned for toxicity testing in the 21st century advocates shifting from the current animal-based testing process to a combination of in vitro cell-based studies, high-throughput techniques, and in silico modeling. A strategic component of the vision is the adoption of the systems biology approach to acquire, analyze, and interpret toxicity pathway data. As key toxicity pathways are identified and their wiring details elucidated using traditional and high-throughput techniques, there is a pressing need to understand their qualitative and quantitative behaviors in response to perturbation by both physiological signals and exogenous stressors. The complexity of these molecular networks makes the task of understanding cellular responses merely by human intuition challenging, if not impossible. This process can be aided by mathematical modeling and computer simulation of the networks and their dynamic behaviors. A number of theoretical frameworks were developed in the last century for understanding dynamical systems in science and engineering disciplines. These frameworks, which include metabolic control analysis, biochemical systems theory, nonlinear dynamics, and control theory, can greatly facilitate the process of organizing, analyzing, and understanding toxicity pathways. Such analysis will require a comprehensive examination of the dynamic properties of "network motifs"--the basic building blocks of molecular circuits. Network motifs like feedback and feedforward loops appear repeatedly in various molecular circuits across cell types and enable vital cellular functions like homeostasis, all-or-none response, memory, and biological rhythm. These functional motifs and associated qualitative and quantitative properties are the predominant source of nonlinearities observed in cellular dose response data. Complex response behaviors can arise from toxicity pathways built upon combinations of network motifs. While the field of computational cell
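    A minimal example of the motif-level analysis described above: a toy negative-feedback loop integrated with SciPy, showing the homeostatic, less-than-proportional steady-state response to an increasing stressor load. All rate constants are invented.

```python
from scipy.integrate import solve_ivp

def negative_feedback(t, y, stressor):
    """Toy homeostat: X activates its own inhibitor Y (a negative
    feedback motif); the stressor adds extra production load on X."""
    x, yv = y
    dx = 1.0 + stressor - 0.5 * x - 1.5 * x * yv
    dy = 0.8 * x - 0.6 * yv
    return [dx, dy]

for stressor in (0.0, 1.0, 3.0):
    sol = solve_ivp(negative_feedback, (0, 50), [1.0, 1.0], args=(stressor,))
    print(f"stressor={stressor}: steady X \u2248 {sol.y[0, -1]:.2f}")
# X rises far less than proportionally with the stressor -- the kind of
# nonlinearity/homeostasis the abstract attributes to feedback motifs.
```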

  1. A qualitatively validated mathematical-computational model of the immune response to the yellow fever vaccine.

    Science.gov (United States)

    Bonin, Carla R B; Fernandes, Guilherme C; Dos Santos, Rodrigo W; Lobosco, Marcelo

    2018-05-25

    Although a safe and effective yellow fever vaccine was developed more than 80 years ago, several issues regarding its use remain unclear. For example, what is the minimum dose that can provide immunity against the disease? A useful tool that can help researchers answer this and other related questions is a computational simulator that implements a mathematical model describing the human immune response to vaccination against yellow fever. This work uses a system of ten ordinary differential equations to represent a few important populations in the response process generated by the body after vaccination. The main populations include viruses, APCs, CD8+ T cells, short-lived and long-lived plasma cells, B cells, and antibodies. In order to qualitatively validate our model, four experiments were carried out, and their computational results were compared to experimental data obtained from the literature. The four experiments were: a) simulation of a scenario in which an individual was vaccinated against yellow fever for the first time; b) simulation of a booster dose ten years after the first dose; c) simulation of the immune response to the yellow fever vaccine in individuals with different levels of naïve CD8+ T cells; and d) simulation of the immune response to distinct doses of the yellow fever vaccine. This work shows that the simulator was able to qualitatively reproduce some of the experimental results reported in the literature, such as antibody levels and viremia over time, as well as other behaviors of the immune response reported in the literature, such as those that occur after a booster dose of the vaccine.
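    The sketch below is a heavily reduced, four-population caricature of the kind of ODE system described (the actual model has ten equations and literature-derived parameters); it only illustrates the simulation mechanics of viremia rising and then being cleared after vaccination. All rate constants are invented.

```python
import numpy as np
from scipy.integrate import solve_ivp

def reduced_immune_model(t, y):
    """Toy 4-population reduction: vaccine virus V, activated APCs A,
    CD8+ T cells T, antibodies B. Illustrative rates only."""
    V, A, T, B = y
    dV = 2.0 * V - 0.5 * A * V - 4.0 * B * V - 1.0 * T * V  # growth vs. clearance
    dA = 0.1 * V - 0.05 * A                                 # APC activation by virus
    dT = 0.02 + 0.3 * A * T - 0.05 * T                      # APC-driven CD8 expansion
    dB = 0.2 * A - 0.01 * B                                 # antibody production
    return [dV, dA, dT, dB]

sol = solve_ivp(reduced_immune_model, (0, 30), [1e-3, 0.0, 0.01, 0.0],
                dense_output=True)
days = np.array([5, 10, 15, 30])
V = np.clip(sol.sol(days)[0], 0, None)
print("viremia by day:", dict(zip(days.tolist(), np.round(V, 4))))
```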

  2. Computational Analysis of Single Nucleotide Polymorphisms Associated with Altered Drug Responsiveness in Type 2 Diabetes

    Directory of Open Access Journals (Sweden)

    Valerio Costa

    2016-06-01

    Full Text Available Type 2 diabetes (T2D) is one of the most frequent mortality causes in western countries, with rapidly increasing prevalence. Anti-diabetic drugs are the first therapeutic approach, although many patients develop drug resistance. Most drug responsiveness variability can be explained by genetic causes. Inter-individual variability is principally due to single nucleotide polymorphisms, and differential drug responsiveness has been correlated to alterations in genes involved in drug metabolism (CYP2C9) or insulin signaling (IRS1, ABCC8, KCNJ11 and PPARG). However, most genome-wide association studies did not provide clues about the contribution of DNA variations to impaired drug responsiveness. Thus, characterizing T2D drug responsiveness variants is needed to guide clinicians toward tailored therapeutic approaches. Here, we extensively investigated polymorphisms associated with altered drug response in T2D, predicting their effects in silico. Combining different computational approaches, we focused on the expression pattern of genes correlated to drug resistance, inferred the evolutionary conservation of polymorphic residues, and computationally predicted the biochemical properties of polymorphic proteins. Using RNA-Sequencing followed by targeted validation, we identified and experimentally confirmed that two nucleotide variations in the CAPN10 gene—currently annotated as intronic—fall within two new transcripts in this locus. Additionally, we found that a Single Nucleotide Polymorphism (SNP), currently reported as intergenic, maps to the intron of a new transcript, harboring the CAPN10 and GPR35 genes, which undergoes nonsense-mediated decay. Finally, we analyzed variants that fall into non-coding regulatory regions of yet underestimated functional significance, predicting that some of them can potentially affect gene expression and/or the post-transcriptional regulation of mRNAs, affecting splicing.

  3. Computational Model and Numerical Simulation for Submerged Mooring Monitoring Platform’s Dynamical Response

    Directory of Open Access Journals (Sweden)

    He Kongde

    2015-01-01

    Full Text Available A computational model and numerical simulation of a submerged mooring monitoring platform were formulated to study its dynamical response under the action of flow forces. The model is based on Hopkinson impact load theory, takes into account the catenary effect of the mooring cable, and corrects the difference between the tension and the tangential action force through an equivalent modulus of elasticity. The equations were solved using hydraulics theory and the structural mechanics theory of ocean engineering, and the response of the buoy to flow forces was studied. The validity of the model was checked and the results were in good agreement. The results show that the buoy undergoes large heave and sway displacements, but the sway displacement stabilizes quickly, while the heave displacement causes vibration due to the vortex-induced action of the flow.

  4. SHOCK-JR: a computer program to analyze impact response of shipping container

    International Nuclear Information System (INIS)

    Ikushima, Takeshi; Nakazato, Chikara; Shimoda, Osamu; Uchino, Mamoru.

    1983-02-01

    This report describes the use of the computer program SHOCK-JR, which analyzes the impact response of shipping containers. It covers the mathematical model, the method of analysis, the structure of the program, and the input and output variables. The program solves the equations of motion for a one-dimensional, lumped-mass and nonlinear-spring model. The solution procedure uses the Runge-Kutta-Gill and Newmark-β methods. SHOCK-JR is a revised version of SHOCK, which was developed by ORNL. In SHOCK-JR, SI units are used and graphical output is available. (author)
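    Alongside the Runge-Kutta-Gill scheme, the Newmark-β method mentioned above can be sketched as follows for a single lumped mass with a nonlinear spring. This is a generic average-acceleration implementation with invented parameters and a simple per-step tangent-stiffness linearization, not the SHOCK-JR source.

```python
import numpy as np

def newmark_step(u, v, a, dt, m, c, k_t, f_next, beta=0.25, gamma=0.5):
    """One Newmark-beta step (average acceleration) for m*u'' + c*u' + k*u = f.
    k_t is the current tangent stiffness, so a nonlinear spring can be
    handled by re-evaluating k_t each step (a crude linearization)."""
    k_eff = k_t + gamma / (beta * dt) * c + m / (beta * dt ** 2)
    f_eff = (f_next
             + m * (u / (beta * dt ** 2) + v / (beta * dt)
                    + (1 / (2 * beta) - 1) * a)
             + c * (gamma / (beta * dt) * u + (gamma / beta - 1) * v
                    + dt / 2 * (gamma / beta - 2) * a))
    u_next = f_eff / k_eff
    a_next = ((u_next - u) / (beta * dt ** 2) - v / (beta * dt)
              - (1 / (2 * beta) - 1) * a)
    v_next = v + dt * ((1 - gamma) * a + gamma * a_next)
    return u_next, v_next, a_next

# Hypothetical hardening spring under a half-sine impact pulse.
m, c, dt = 10.0, 5.0, 1e-4
stiffness = lambda u: 1e4 * (1 + 1e4 * u ** 2)   # nonlinear tangent stiffness
pulse = lambda t: 500.0 * np.sin(np.pi * t / 0.01) if t < 0.01 else 0.0
u, v, a, t, u_max = 0.0, 0.0, 0.0, 0.0, 0.0
while t < 0.05:
    u, v, a = newmark_step(u, v, a, dt, m, c, stiffness(u), pulse(t + dt))
    t += dt
    u_max = max(u_max, abs(u))
print(f"peak displacement \u2248 {u_max:.4e} m")
```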

  5. Children's Responses to Computer-Synthesized Speech in Educational Media: Gender Consistency and Gender Similarity Effects

    Science.gov (United States)

    Lee, Kwan Min; Liao, Katharine; Ryu, Seoungho

    2007-01-01

    This study examines children's social responses to gender cues in synthesized speech in a computer-based instruction setting. Eighty 5th-grade elementary school children were randomly assigned to one of the conditions in a full-factorial 2 (participant gender) x 2 (voice gender) x 2 (content gender) experiment. Results show that children apply…

  6. Three-dimensional computer code for the nonlinear dynamic response of an HTGR core

    International Nuclear Information System (INIS)

    Subudhi, M.; Lasker, L.; Koplik, B.; Curreri, J.; Goradia, H.

    1979-01-01

    A three-dimensional dynamic code has been developed to determine the nonlinear response of an HTGR core. The HTGR core consists of several thousand hexagonal core blocks. These are arranged in layers stacked together. Each layer contains many core blocks surrounded on their outer periphery by reflector blocks. The entire assembly is contained within a prestressed concrete reactor vessel. Gaps exist between adjacent blocks in any horizontal plane. Each core block in a given layer is connected to the blocks directly above and below it via three dowel pins. The present analytical study is directed towards an investigation of the nonlinear response of the reactor core blocks in the event of a seismic occurrence. The computer code is developed for a specific mathematical model which represents a vertical arrangement of layers of blocks. This comprises a block module of core elements which would be obtained by cutting a cylindrical portion consisting of seven fuel blocks per layer. It is anticipated that a number of such modules properly arranged could represent the entire core. Hence, the predicted response of this module would exhibit the response characteristics of the core

  7. Semiquantitative visual approach to scoring lung cancer treatment response using computed tomography: a pilot study.

    Science.gov (United States)

    Gottlieb, Ronald H; Kumar, Prasanna; Loud, Peter; Klippenstein, Donald; Raczyk, Cheryl; Tan, Wei; Lu, Jenny; Ramnath, Nithya

    2009-01-01

    Our objective was to compare a newly developed semiquantitative visual scoring (SVS) method with the current standard, the Response Evaluation Criteria in Solid Tumors (RECIST) method, in the categorization of treatment response and reader agreement for patients with metastatic lung cancer followed by computed tomography. The 18 subjects (5 women and 13 men; mean age, 62.8 years) were from an institutional review board-approved phase 2 study that evaluated a second-line chemotherapy regimen for metastatic (stages III and IV) non-small cell lung cancer. Four radiologists, blinded to the patient outcome and each other's reads, evaluated the change in the patients' tumor burden from the baseline to the first restaging computed tomographic scan using either the RECIST or the SVS method. We compared the numbers of patients placed into the partial response, the stable disease (SD), and the progressive disease (PD) categories (Fisher exact test) and observer agreement (kappa statistic). Requiring the concordance of 3 of the 4 readers resulted in the RECIST placing 17 (100%) of 17 patients in the SD category compared with the SVS placing 9 (60%) of 15 patients in the partial response, 5 (33%) of the 15 patients in the SD, and 1 (6.7%) of the 15 patients in the PD categories (P < 0.0001). Interobserver agreement was higher among the readers using the SVS method (kappa, 0.54; P < 0.0001) compared with that of the readers using the RECIST method (kappa, -0.01; P = 0.5378). Using the SVS method, the readers more finely discriminated between the patient response categories with superior agreement compared with the RECIST method, which could potentially result in large differences in early treatment decisions for advanced lung cancer.

  8. The position of a standard optical computer mouse affects cardiorespiratory responses during the operation of a computer under time constraints

    Directory of Open Access Journals (Sweden)

    Shunji Sako

    2014-08-01

    Full Text Available Objectives: This study investigated the association between task-induced stress and fatigue by examining the cardiovascular responses of subjects using different mouse positions while operating a computer under time constraints. Material and Methods: Sixteen young, healthy men participated in the study, which examined the use of optical mouse devices affixed to laptop computers. Two mouse positions were investigated: (1) the distal position (DP), in which the subjects place their forearms on the desk accompanied by the abduction and flexion of their shoulder joints, and (2) the proximal position (PP), in which the subjects place only their wrists on the desk without using an armrest. The subjects continued each task for 16 min. We assessed differences in several characteristics according to mouse position, including expired gas values, autonomic nerve activities (based on cardiorespiratory responses), operating efficiencies (based on word counts), and fatigue levels (based on the visual analog scale - VAS). Results: Oxygen consumption (VO2), the ratio of inspiration time to respiration time (Ti/Ttotal), respiratory rate (RR), minute ventilation (VE), and the ratio of expiration to inspiration (Te/Ti) were significantly lower when the participants performed the task in the DP than in the PP. Tidal volume (VT), carbon dioxide output rates (VCO2/VE), and oxygen extraction fractions (VO2/VE) were significantly higher for the DP than for the PP. No significant difference in VAS was observed between the positions; however, as the task progressed, autonomic nerve activities were lower and operating efficiencies were significantly higher for the DP than for the PP. Conclusions: Our results suggest that the DP has fewer effects on cardiorespiratory functions, causes lower levels of sympathetic nerve activity and mental stress, and produces a higher total workload than the PP. This suggests that the DP is preferable to the PP when operating a computer.

  9. Positron computed tomography studies of cerebral metabolic responses to complex motor tasks

    International Nuclear Information System (INIS)

    Phelps, M.E.; Mazziotta, J.C.

    1984-01-01

    Human motor system organization was explored in 8 right-handed male subjects using 18F-fluorodeoxyglucose and positron computed tomography to measure cerebral glucose metabolism. Five subjects had triple studies (eyes closed) including: control (hold pen in right hand without moving), normal size writing (subject repeatedly writes name) and large (10-15 × normal) name writing. In these studies normal and large size writing had a similar distribution of metabolic responses when compared to control studies. Activations (percent change from control) were in the range of 12-20% and occurred in the striatum bilaterally > contralateral Rolandic cortex > contralateral thalamus. No significant activations were observed in the ipsilateral thalamus, Rolandic cortex or cerebellum (supplementary motor cortex was not examined). The magnitude of the metabolic response in the striatum was greater with the large versus normal sized writing. This differential response may be due to an increased number and topographic distribution of neurons responding with the same average activity between tasks or an increase in the functional activity of the same neuronal population between the two tasks (present spatial resolution inadequate to differentiate). When subjects (N=3) performed novel sequential finger movements, the maximal metabolic response was in the contralateral Rolandic cortex > striatum. Such studies provide a means of exploring human motor system organization, motor learning and provide a basis for examining patients with motor system disorders

  10. Positron emission tomography/computed tomography and biomarkers for early treatment response evaluation in metastatic colon cancer

    DEFF Research Database (Denmark)

    Engelmann, Bodil E.; Loft, Annika; Kjær, Andreas

    2014-01-01

    BACKGROUND: Treatment options for metastatic colon cancer (mCC) are widening. We prospectively evaluated serial 2-deoxy-2-[18F]fluoro-d-glucose positron-emission tomography/computed tomography (PET/CT) and measurements of tissue inhibitor of metalloproteinases-1 (TIMP-1), carcinoembryonic antigen...... evaluated by PET/CT before treatment, after one and four treatment series. Morphological and metabolic response was independently assessed according to Response Evaluation Criteria in Solid Tumors and European Organization for Research and Treatment of Cancer PET criteria. Plasma TIMP-1, plasma u...

  11. Analysis of the computational methods on the equipment shock response based on ANSYS environments

    International Nuclear Information System (INIS)

    Wang Yu; Li Zhaojun

    2005-01-01

    With the development and maturation of equipment shock and vibration theory, mathematical calculation methods, simulation techniques, and other aspects, equipment shock calculation methods have gradually evolved from static to dynamic and from linear to non-linear. The equipment shock calculation methods currently applied worldwide in engineering practice mainly include the equivalent static force method, the Dynamic Design Analysis Method (abbreviated to DDAM), and the real-time simulation method. DDAM is based on modal analysis theory: it takes the shock design spectrum as the shock load and obtains the shock response of the integrated system by applying a separate cross-modal integration method in the frequency domain. The real-time simulation method carries out the computational analysis of the equipment shock response in the time domain: it uses time-history curves obtained from real-time measurement or spectrum transformation as the equipment shock load and solves the differential equations of motion of the system iteratively within the time domain. Conclusions: Using DDAM and the real-time simulation method separately, this paper carried out a shock analysis of a three-dimensional frame floating raft in the ANSYS environment, analyzed the results, and drew the following conclusions. Because DDAM does not account for damping, non-linear effects, or phase differences between modal responses, its results are much larger than those of the real-time simulation method. The coupling response is complex when the modal results of a three-dimensional structure are calculated, and with DDAM the coupling response in the non-shock direction is also much larger than that of the real-time simulation method. Both DDAM and the real-time simulation method have their own merits and scopes of application; designers should select the design method that is economical and appropriate according to the features and anti-shock requirements of the equipment.

  12. Novel application of quantitative single-photon emission computed-tomography/computed tomography to predict early response to methimazole in Graves' disease

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hyun Joo; Bang, Ji In; Kim, Ji Young; Moon, Jae Hoon [Seoul National University Bundang Hospital, Seoul National University College of Medicine, Seongnam (Korea, Republic of); So, Young [Dept. of Nuclear Medicine, Konkuk University Medical Center, Seoul (Korea, Republic of); Lee, Won Woo [Institute of Radiation Medicine, Medical Research Center, Seoul National University, Seoul (Korea, Republic of)

    2017-06-15

    Since Graves' disease (GD) is resistant to antithyroid drugs (ATDs), an accurate quantitative thyroid function measurement is required for the prediction of early responses to ATDs. Quantitative parameters derived from the novel technology, single-photon emission computed tomography/computed tomography (SPECT/CT), were investigated for the prediction of achievement of euthyroidism after methimazole (MMI) treatment in GD. A total of 36 GD patients (10 males, 26 females; mean age, 45.3 ± 13.8 years) were enrolled in this study, from April 2015 to January 2016. They underwent quantitative thyroid SPECT/CT 20 minutes post-injection of 99mTc-pertechnetate (5 mCi). Associations between the time to biochemical euthyroidism after MMI treatment and uptake, standardized uptake value (SUV), functional thyroid mass (SUVmean × thyroid volume) from the SPECT/CT, and clinical/biochemical variables were investigated. GD patients had a significantly greater %uptake (6.9 ± 6.4%) than historical control euthyroid patients (n = 20, 0.8 ± 0.5%, p < 0.001) from the same quantitative SPECT/CT protocol. Euthyroidism was achieved in 14 patients at 156 ± 62 days post-MMI treatment, but 22 patients had still not achieved euthyroidism by the last follow-up time-point (208 ± 80 days). In the univariate Cox regression analysis, the initial MMI dose (p = 0.014), %uptake (p = 0.015), and functional thyroid mass (p = 0.016) were significant predictors of euthyroidism in response to MMI treatment. However, only uptake remained significant in a multivariate Cox regression analysis (p = 0.034). An uptake cutoff of 5.0% dichotomized faster-responding from slower-responding GD patients (p = 0.006). A novel parameter of thyroid uptake from quantitative SPECT/CT is a predictive indicator of an early response to MMI in GD patients.

  13. Parallel Implementation of Triangular Cellular Automata for Computing Two-Dimensional Elastodynamic Response on Arbitrary Domains

    Science.gov (United States)

    Leamy, Michael J.; Springer, Adam C.

    In this research we report the parallel implementation of a Cellular Automata-based simulation tool for computing elastodynamic response on complex, two-dimensional domains. Elastodynamic simulation using Cellular Automata (CA) has recently been presented as an alternative, inherently object-oriented technique for accurately and efficiently computing linear and nonlinear wave propagation in arbitrarily-shaped geometries. The local, autonomous nature of the method should lead to straightforward and efficient parallelization. We address this notion on symmetric multiprocessor (SMP) hardware using a Java-based object-oriented CA code implementing triangular state machines (i.e., automata) and the MPI bindings written in Java (MPJ Express). We use MPJ Express to reconfigure our existing CA code to distribute a domain's automata to the cores present on a dual quad-core shared-memory system (eight processors in total). We note that this message-passing parallelization strategy is directly applicable to cluster computing, which will be the focus of follow-on research. Results on the shared-memory platform indicate nearly ideal, linear speed-up. We conclude that the CA-based elastodynamic simulator is easily configured to run in parallel and yields excellent speed-up on SMP hardware.
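    The halo-exchange pattern behind this kind of domain decomposition can be sketched in a few lines. The version below uses Python with mpi4py rather than the paper's Java/MPJ Express, a 1D rank layout, an arbitrary script name, and a toy local update rule standing in for the triangular elastodynamic automata.

```python
# Run with, e.g.:  mpiexec -n 4 python ca_halo.py   (filename arbitrary)
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

n_local = 1000                    # automata owned by this rank
state = np.zeros(n_local + 2)     # +2 ghost cells for neighbor data
if rank == 0:
    state[1] = 1.0                # an initial disturbance on rank 0

left = rank - 1 if rank > 0 else MPI.PROC_NULL
right = rank + 1 if rank < size - 1 else MPI.PROC_NULL

for step in range(500):
    # Exchange boundary automata with neighbors (ghost-cell update).
    comm.Sendrecv(state[1:2], dest=left, recvbuf=state[-1:], source=right)
    comm.Sendrecv(state[-2:-1], dest=right, recvbuf=state[0:1], source=left)
    # Local, autonomous update rule (a toy diffusion-like stand-in).
    state[1:-1] += 0.25 * (state[:-2] - 2 * state[1:-1] + state[2:])

total = comm.reduce(state[1:-1].sum(), op=MPI.SUM, root=0)
if rank == 0:
    print("total state after 500 steps:", total)
```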

  14. Experimental and computational investigation of lateral gauge response in polycarbonate

    Science.gov (United States)

    Eliot, Jim; Harris, Ernst; Hazell, Paul; Appleby-Thomas, Gareth; Winter, Ronald; Wood, David; Owen, Gareth

    2011-06-01

    Polycarbonate's use in personal armour systems means that its high strain-rate response has been extensively studied. Interestingly, embedded lateral manganin stress gauges in polycarbonate have shown gradients behind incident shocks, suggestive of increasing shear strength. However, such gauges need to be embedded in a central (typically epoxy) interlayer - an inherently invasive approach. Recently, research has suggested that in such metal systems the interlayer/target impedance mismatch may contribute to observed gradients in lateral stress. Here, experimental T-gauge (Vishay Micro-Measurements® type J2M-SS-580SF-025) traces from polycarbonate targets are compared to computational simulations. This work extends previous efforts to the case where similar impedance exists across the interlayer/matrix (target) interface. Further, experiments and simulations are presented investigating the effects of a ``dry joint'' in polycarbonate, in which no encapsulating medium is employed.

  15. Computer-aided global breast MR image feature analysis for prediction of tumor response to chemotherapy: performance assessment

    Science.gov (United States)

    Aghaei, Faranak; Tan, Maxine; Hollingsworth, Alan B.; Zheng, Bin; Cheng, Samuel

    2016-03-01

    Dynamic contrast-enhanced breast magnetic resonance imaging (DCE-MRI) has been used increasingly in breast cancer diagnosis and assessment of cancer treatment efficacy. In this study, we applied a computer-aided detection (CAD) scheme to automatically segment breast regions depicted on MR images and used the kinetic image features computed from the global breast MR images acquired before neoadjuvant chemotherapy to build a new quantitative model to predict the response of breast cancer patients to the chemotherapy. To assess the performance and robustness of this new prediction model, an image dataset involving breast MR images acquired from 151 cancer patients before undergoing neoadjuvant chemotherapy was retrospectively assembled and used. Among them, 63 patients had a "complete response" (CR) to chemotherapy, in which the enhanced contrast levels inside the tumor volume (pre-treatment) were reduced to the level of the normal enhanced background parenchymal tissues (post-treatment), while 88 patients had a "partial response" (PR), in which high contrast enhancement remained in the tumor regions after treatment. We analyzed the correlations among the 22 global kinetic image features and then selected a set of 4 optimal features. Applying an artificial neural network trained with the fusion of these 4 kinetic image features, the prediction model yielded an area under the ROC curve (AUC) of 0.83 ± 0.04. This study demonstrated that, by avoiding tumor segmentation, which is often difficult and unreliable, the fusion of kinetic image features computed from global breast MR images can still generate a useful clinical marker for predicting the efficacy of chemotherapy.

  16. A computational relationship between thalamic sensory neural responses and contrast perception.

    Science.gov (United States)

    Jiang, Yaoguang; Purushothaman, Gopathy; Casagrande, Vivien A

    2015-01-01

    Uncovering the relationship between sensory neural responses and perceptual decisions remains a fundamental problem in neuroscience. Decades of experimental and modeling work in the sensory cortex have demonstrated that a perceptual decision pool is usually composed of tens to hundreds of neurons, the responses of which are significantly correlated not only with each other, but also with the behavioral choices of an animal. Few studies, however, have measured neural activity in the sensory thalamus of awake, behaving animals. Therefore, it remains unclear how many thalamic neurons are recruited and how the information from these neurons is pooled at subsequent cortical stages to form a perceptual decision. In a previous study we measured neural activity in the macaque lateral geniculate nucleus (LGN) during a two alternative forced choice (2AFC) contrast detection task, and found that single LGN neurons were significantly correlated with the monkeys' behavioral choices, despite their relatively poor contrast sensitivity and a lack of overall interneuronal correlations. We have now computationally tested a number of specific hypotheses relating these measured LGN neural responses to the contrast detection behavior of the animals. We modeled the perceptual decisions with different numbers of neurons and using a variety of pooling/readout strategies, and found that the most successful model consisted of about 50-200 LGN neurons, with individual neurons weighted differentially according to their signal-to-noise ratios (quantified as d-primes). These results supported the hypothesis that in contrast detection the perceptual decision pool consists of multiple thalamic neurons, and that the response fluctuations in these neurons can influence contrast perception, with the more sensitive thalamic neurons likely to exert a greater influence.
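    A toy version of the pooling comparison: simulated neurons with heterogeneous d-primes, read out in a 2AFC-style comparison either with uniform weights or with weights proportional to d-prime (the latter is the optimal linear read-out for independent, equal-variance Gaussian responses). All numbers are illustrative, not the recorded LGN data.

```python
import numpy as np
rng = np.random.default_rng(2)

n_neurons, n_trials = 100, 2000
# Hypothetical per-neuron sensitivities: "present" trials shift each
# neuron's mean spike count by d-prime * sigma relative to "absent".
dprimes = rng.uniform(0.05, 1.0, n_neurons)
sigma = 2.0

def trial_counts(present):
    mu = 10.0 + (dprimes * sigma if present else 0.0)
    return rng.normal(mu, sigma, size=(n_trials, n_neurons))

absent, present = trial_counts(False), trial_counts(True)

def detect_accuracy(weights):
    """2AFC read-out: the pooled response on a 'present' trial must
    exceed the pooled response on an 'absent' trial."""
    return np.mean(present @ weights > absent @ weights)

print(f"uniform pooling accuracy : {detect_accuracy(np.ones(n_neurons)):.3f}")
print(f"d-prime-weighted accuracy: {detect_accuracy(dprimes):.3f}")
```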

  17. Efficient sparse matrix-matrix multiplication for computing periodic responses by shooting method on Intel Xeon Phi

    Science.gov (United States)

    Stoykov, S.; Atanassov, E.; Margenov, S.

    2016-10-01

    Many scientific applications involve sparse or dense matrix operations, such as solving linear systems, matrix-matrix products, eigensolvers, etc. In structural nonlinear dynamics, the computation of periodic responses and the determination of the stability of the solution are of primary interest. The shooting method is widely used for obtaining periodic responses of nonlinear systems. The method involves simultaneous operations with sparse and dense matrices. One of the computationally expensive operations in the method is the multiplication of sparse by dense matrices. In the current work, a new algorithm for sparse-by-dense matrix products is presented. The algorithm takes into account the structure of the sparse matrix, which is obtained by space discretization of the nonlinear Mindlin plate equation of motion by the finite element method. The algorithm is developed to use the vector engine of Intel Xeon Phi coprocessors. It is compared with the standard sparse-by-dense matrix algorithm and with the one developed by Intel MKL, and it is shown that better algorithms can be developed by considering the properties of the sparse matrix.
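    The kernel in question, in NumPy/SciPy form rather than the authors' Xeon Phi code: a generic CSR-times-dense product versus a diagonal-wise product that exploits the banded structure a finite-element discretization typically produces. The band width, matrix size, and number of right-hand-side columns are invented.

```python
import numpy as np
import scipy.sparse as sp
from time import perf_counter

# A sparse matrix with the banded structure typical of FEM
# discretizations (a stand-in for the Mindlin-plate matrices).
n, band = 20000, 9
offsets = list(range(-band, band + 1))
diags = [np.random.rand(n - abs(k)) for k in offsets]
A = sp.diags(diags, offsets=offsets, format="csr")

# Dense block of state vectors, as when shooting iterations
# propagate many columns at once.
X = np.random.rand(n, 64)

t0 = perf_counter()
Y = A @ X            # SciPy's generic CSR-times-dense kernel
print(f"CSR x dense   : {perf_counter() - t0:.3f} s, result {Y.shape}")

# Exploiting the known structure: for a purely banded matrix, a
# diagonal-wise product avoids per-entry index lookups entirely.
t0 = perf_counter()
Y2 = np.zeros_like(X)
for d, k in zip(diags, offsets):
    if k >= 0:
        Y2[: n - k] += d[:, None] * X[k:]
    else:
        Y2[-k:] += d[:, None] * X[: n + k]
print(f"diagonal-wise : {perf_counter() - t0:.3f} s, "
      f"max |diff| = {np.abs(Y - Y2).max():.2e}")
```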

  18. The role of dedicated data computing centers in the age of cloud computing

    Science.gov (United States)

    Caramarcu, Costin; Hollowell, Christopher; Strecker-Kellogg, William; Wong, Antonio; Zaytsev, Alexandr

    2017-10-01

    Brookhaven National Laboratory (BNL) anticipates significant growth in scientific programs with large computing and data storage needs in the near future and has recently reorganized support for scientific computing to meet these needs. A key component is the enhanced role of the RHIC-ATLAS Computing Facility (RACF) in support of high-throughput and high-performance computing (HTC and HPC) at BNL. This presentation discusses the evolving role of the RACF at BNL, in light of its growing portfolio of responsibilities and its increasing integration with cloud (academic and for-profit) computing activities. We also discuss BNL’s plan to build a new computing center to support the new responsibilities of the RACF and present a summary of the cost benefit analysis done, including the types of computing activities that benefit most from a local data center vs. cloud computing. This analysis is partly based on an updated cost comparison of Amazon EC2 computing services and the RACF, which was originally conducted in 2012.

  19. Green Computing

    Directory of Open Access Journals (Sweden)

    K. Shalini

    2013-01-01

    Full Text Available Green computing is all about using computers in a smarter and eco-friendly way. It is the environmentally responsible use of computers and related resources, which includes the implementation of energy-efficient central processing units, servers, and peripherals as well as reduced resource consumption and proper disposal of electronic waste. Computers certainly make up a large part of many people's lives and traditionally are extremely damaging to the environment. Manufacturers of computers and their parts have been espousing the green cause to help protect the environment from computers and electronic waste. Research continues into key areas such as making the use of computers as energy-efficient as possible, and designing algorithms and systems for efficiency-related computer technologies.

  20. Computational micromechanics analysis of electron hopping and interfacial damage induced piezoresistive response in carbon nanotube-polymer nanocomposites

    International Nuclear Information System (INIS)

    Chaurasia, A K; Seidel, G D; Ren, X

    2014-01-01

    Carbon nanotube (CNT)-polymer nanocomposites have been observed to exhibit an effective macroscale piezoresistive response, i.e., a change in macroscale resistivity when subjected to applied deformation. The macroscale piezoresistive response of CNT-polymer nanocomposites leads to deformation/strain sensing capabilities. It is believed that the nanoscale phenomenon of electron hopping is the major driving force behind the observed macroscale piezoresistivity of such nanocomposites. Additionally, CNT-polymer nanocomposites provide damage sensing capabilities through local changes in electron hopping pathways at the nanoscale caused by the initiation/evolution of damage. The primary focus of the current work is to explore the effect of interfacial separation and damage at the nanoscale CNT-polymer interface on the effective macroscale piezoresistive response. Interfacial separation and damage are allowed to evolve at the CNT-polymer interface through coupled electromechanical cohesive zones, within a finite element based computational micromechanics framework, resulting in electron hopping based current density across the separated CNT-polymer interface. The macroscale effective material properties and gauge factors are evaluated using micromechanics techniques based on electrostatic energy equivalence. The impact of the electron hopping mechanism, nanoscale interface separation, and damage evolution on the effective nanocomposite electrostatic and piezoresistive response is studied in comparison with the perfectly bonded interface, whose effective electrostatic/piezoresistive response is obtained with a computational micromechanics model developed in the authors' earlier work. It is observed that the macroscale effective gauge factors are highly sensitive to strain-induced formation/disruption of electron hopping pathways, interface separation, and the initiation/evolution of interfacial damage. (paper)

  1. Improving the psychometric properties of dot-probe attention measures using response-based computation.

    Science.gov (United States)

    Evans, Travis C; Britton, Jennifer C

    2018-09-01

    Abnormal threat-related attention in anxiety disorders is most commonly assessed and modified using the dot-probe paradigm; however, poor psychometric properties of reaction-time measures may contribute to inconsistencies across studies. Typically, standard attention measures are derived using average reaction-times obtained in experimentally-defined conditions. However, current approaches based on experimentally-defined conditions are limited. In this study, the psychometric properties of a novel response-based computation approach to analyze dot-probe data are compared to standard measures of attention. 148 adults (19.19 ± 1.42 years, 84 women) completed a standardized dot-probe task including threatening and neutral faces. We generated both standard and response-based measures of attention bias, attentional orientation, and attentional disengagement. We compared overall internal consistency, number of trials necessary to reach internal consistency, test-retest reliability (n = 72), and criterion validity obtained using each approach. Compared to standard attention measures, response-based measures demonstrated uniformly high levels of internal consistency with relatively few trials and varying improvements in test-retest reliability. Additionally, response-based measures demonstrated specific evidence of anxiety-related associations above and beyond both standard attention measures and other confounds. Future studies are necessary to validate this approach in clinical samples. Response-based attention measures demonstrate superior psychometric properties compared to standard attention measures, which may improve the detection of anxiety-related associations and treatment-related changes in clinical samples. Copyright © 2018 Elsevier Ltd. All rights reserved.

  2. A three-dimensional computer code for the nonlinear dynamic response of an HTGR core

    International Nuclear Information System (INIS)

    Subudhi, M.; Lasker, L.; Koplik, B.; Curreri, J.; Goradia, H.

    1979-01-01

    A three-dimensional dynamic code has been developed to determine the nonlinear response of an HTGR core. The HTGR core consists of several thousand hexagonal core blocks. These are arranged in layers stacked together. Each layer contains many core blocks surrounded on their outer periphery by reflector blocks. The entire assembly is contained within a prestressed concrete reactor vessel. Gaps exist between adjacent blocks in any horizontal plane. Each core block in a given layer is connected to the blocks directly above and below it via three dowel pins. The present analytical study is directed towards an investigation of the nonlinear response of the reactor core blocks in the event of a seismic occurrence. The computer code is developed for a specific mathematical model which represents a vertical arrangement of layers of blocks. This comprises a 'block module' of core elements which would be obtained by cutting a cylindrical portion consisting of seven fuel blocks per layer. It is anticipated that a number of such modules properly arranged could represent the entire core. Hence, the predicted response of this module would exhibit the response characteristics of the core. (orig.)

  3. Development of methodology and computer programs for the ground response spectrum and the probabilistic seismic hazard analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Joon Kyoung [Semyung Univ., Research Institute of Industrial Science and Technol , Jecheon (Korea, Republic of)

    1996-12-15

    The objective of this study is to investigate and develop methodologies and corresponding computer codes, compatible with the domestic seismological and geological environments, for estimating the ground response spectrum and the probabilistic seismic hazard. Using the PSHA computer program, the cumulative probability distribution functions (CPDF) and probability density functions (PDF) of the annual exceedance have been investigated for the analysis of the uncertainty space of the annual probability at ten seismic hazard levels of interest (0.1 g to 0.99 g). These functions have also been compared with the corresponding results from different input parameter spaces.

  4. On teaching computer ethics within a computer science department.

    Science.gov (United States)

    Quinn, Michael J

    2006-04-01

    The author has surveyed a quarter of the accredited undergraduate computer science programs in the United States. More than half of these programs offer a 'social and ethical implications of computing' course taught by a computer science faculty member, and there appears to be a trend toward teaching ethics classes within computer science departments. Although the decision to create an 'in house' computer ethics course may sometimes be a pragmatic response to pressure from the accreditation agency, this paper argues that teaching ethics within a computer science department can provide students and faculty members with numerous benefits. The paper lists topics that can be covered in a computer ethics course and offers some practical suggestions for making the course successful.

  5. Topics in Modeling of Cochlear Dynamics: Computation, Response and Stability Analysis

    Science.gov (United States)

    Filo, Maurice G.

    This thesis touches upon several topics in cochlear modeling. Throughout the literature, mathematical models of the cochlea vary according to the degree of biological realism to be incorporated. This thesis casts the cochlear model as a continuous space-time dynamical system using operator language. This framework encompasses a wider class of cochlear models and makes the dynamics more transparent and easier to analyze before applying any numerical method to discretize space. In fact, several numerical methods are investigated to study the computational efficiency of the finite dimensional realizations in space. Furthermore, we study the effects of the active gain perturbations on the stability of the linearized dynamics. The stability analysis is used to explain possible mechanisms underlying spontaneous otoacoustic emissions and tinnitus. Dynamic Mode Decomposition (DMD) is introduced as a useful tool to analyze the response of nonlinear cochlear models. Cochlear response features are illustrated using DMD, which has the advantage of explicitly revealing the spatial modes of vibration occurring in the Basilar Membrane (BM). Finally, we address the dynamic estimation problem of BM vibrations using Extended Kalman Filters (EKF). Due to the limitations of noninvasive sensing schemes, such algorithms are indispensable for estimating the dynamic behavior of a living cochlea.
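
    DMD itself is a standard algorithm; a compact sketch of exact DMD applied to invented traveling-wave snapshots (a stand-in for BM displacement data) might read:

```python
import numpy as np

def dmd(X, rank):
    """Exact dynamic mode decomposition of a snapshot matrix X (space x time)."""
    X1, X2 = X[:, :-1], X[:, 1:]
    U, s, Vh = np.linalg.svd(X1, full_matrices=False)
    U, s, V = U[:, :rank], s[:rank], Vh.conj().T[:, :rank]
    Atilde = U.conj().T @ X2 @ V / s           # low-rank one-step operator
    evals, W = np.linalg.eig(Atilde)
    modes = X2 @ V / s @ W                     # exact DMD spatial modes
    return evals, modes

# Toy usage: a decaying traveling wave standing in for BM displacement snapshots
dt = 1e-3
x = np.linspace(0, 1, 200)[:, None]
t = np.arange(100)[None, :]*dt
X = np.real(np.exp(1j*(40*x - 2*np.pi*300*t))*np.exp(-5*t))
evals, modes = dmd(X, rank=4)
print(np.angle(evals)/(2*np.pi*dt))            # modal frequencies [Hz], ~ +/-300
```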

  6. A comparison of computational models with and without genotyping for prediction of response to second-line HIV therapy

    NARCIS (Netherlands)

    Revell, A. D.; Boyd, M. A.; Wang, D.; Emery, S.; Gazzard, B.; Reiss, P.; van Sighem, A. I.; Montaner, J. S.; Lane, H. C.; Larder, B. A.

    2014-01-01

    We compared the use of computational models developed with and without HIV genotype vs. genotyping itself to predict effective regimens for patients experiencing first-line virological failure. Two sets of models predicted virological response for 99 three-drug regimens for patients on a failing

  7. Computational Model of Antidepressant Response Heterogeneity as Multi-pathway Neuroadaptation

    Directory of Open Access Journals (Sweden)

    Mariam B. Camacho

    2017-12-01

    Current hypotheses cannot fully explain the clinically observed heterogeneity in antidepressant response. The therapeutic latency of antidepressants suggests that therapeutic outcomes are achieved not by the acute effects of the drugs, but rather by the homeostatic changes that occur as the brain adapts to their chronic administration. We present a computational model that represents the known interactions between the monoaminergic neurotransmitter-producing brain regions and associated non-monoaminergic neurotransmitter systems, and use the model to explore the possible ways in which the brain can homeostatically adjust to chronic antidepressant administration. The model also represents the neuron-specific neurotransmitter receptors that are known to adjust their strengths (expressions or sensitivities) in response to chronic antidepressant administration, and neuroadaptation in the model occurs through sequential adjustments in these receptor strengths. The main result is that the model can reach similar levels of adaptation to chronic administration of the same antidepressant drug or combination along many different pathways, arriving correspondingly at many different receptor strength configurations, but not all of those adapted configurations are also associated with therapeutic elevations in monoamine levels. When expressed as the percentage of adapted configurations that are also associated with elevations in one or more of the monoamines, our modeling results largely agree with the percentage efficacy rates of antidepressants and antidepressant combinations observed in clinical trials. Our neuroadaptation model provides an explanation for the clinical reports of heterogeneous outcomes among patients chronically administered the same antidepressant drug regimen.

  8. Fast neutron detection with germanium detectors: computation of response functions for the 692 keV inelastic scattering peak

    International Nuclear Information System (INIS)

    Fehrenbacher, G.; Meckbach, R.; Paretzke, H.G.

    1996-01-01

    The dependence of the shape of the right-sided broadening of the inelastic scattering peak at 692 keV in the pulse-height distribution measured with a Ge detector in fast neutron fields on the energy of the incident neutrons has been analyzed. A model incorporating the processes contributing to the energy deposition that engenders the peak, including the partitioning of the energy deposition by the Ge recoils, was developed. With a Monte Carlo code based on this model, the detector response associated with this peak was computed and compared with results of measurements with quasi-monoenergetic neutrons for energies between 0.88 and 2.1 MeV. A set of 80 response functions for neutron energies in the range from the reaction threshold at 0.7 to 6 MeV was computed, which will serve as a starting point for methods that aim at obtaining information on the spectral distribution of fast neutron fields in this energy range from measurements with a Ge detector. (orig.)

  9. Man and computer

    International Nuclear Information System (INIS)

    Fischbach, K.F.

    1981-01-01

    The discussion of cultural and sociological consequences of computer evolution is hindered by human prejudice. For example, the sentence 'a computer is at best as intelligent as its programmer' veils actual developments. Theoretical limits of computer intelligence are the limits of intelligence in general. Modern computer systems replace not only human labour, but also human decision making and thereby human responsibility. The historical situation is unique. Human head-work is being automated and man is losing function. (orig.)

  10. Activity-based computing: computational management of activities reflecting human intention

    DEFF Research Database (Denmark)

    Bardram, Jakob E; Jeuris, Steven; Houben, Steven

    2015-01-01

    paradigm that has been applied in personal information management applications as well as in ubiquitous, multidevice, and interactive surface computing. ABC has emerged as a response to the traditional application- and file-centered computing paradigm, which is oblivious to a notion of a user’s activity...

  11. Adaptive Response in Female Fathead Minnows Exposed to an Aromatase Inhibitor: Computational Modeling of the Hypothalamic-Pituitary-Gonadal Axis

    Science.gov (United States)

    Exposure to endocrine disrupting chemicals can affect reproduction and development in both humans and wildlife. We are developing a mechanistic computational model of the hypothalamic-pituitary-gonadal (HPG) axis in female fathead minnows to predict dose-response and time-course ...

  12. Becoming Technosocial Change Agents: Intersectionality and Culturally Responsive Pedagogies as Vital Resources for Increasing Girls' Participation in Computing

    Science.gov (United States)

    Ashcraft, Catherine; Eger, Elizabeth K.; Scott, Kimberly A.

    2017-01-01

    Drawing from our two-year ethnography, we juxtapose the experiences of two cohorts in one culturally responsive computing program, examining how the program fostered girls' emerging identities as technosocial change agents. In presenting this in-depth and up-close exploration, we simultaneously identify conditions that both facilitated and limited…

  13. Cloud Computing in Support of Synchronized Disaster Response Operations

    Science.gov (United States)

    2010-09-01

    scalable, Web application based on cloud computing technologies to facilitate communication between a broad range of public and private entities without...requiring them to compromise security or competitive advantage. The proposed design applies the unique benefits of cloud computing architectures such as

  14. From Computational Thinking to Computational Empowerment: A 21st Century PD Agenda

    DEFF Research Database (Denmark)

    Iversen, Ole Sejer; Smith, Rachel Charlotte; Dindler, Christian

    2018-01-01

    We propose computational empowerment as an approach, and a Participatory Design response, to challenges related to digitalization of society and the emerging need for digital literacy in K12 education. Our approach extends the current focus on computational thinking to include contextual, human-centred and societal challenges and impacts involved in students' creative and critical engagement with digital technology. Our research is based on the FabLab@School project, in which a PD approach to computational empowerment provided opportunities as well as further challenges for the complex agenda of digital technology in education. We argue that PD has the potential to drive a computational empowerment agenda in education, by connecting political PD with contemporary visions for addressing a future digitalized labor market and society.

  15. Social Skills Instruction for Urban Learners with Emotional and Behavioral Disorders: A Culturally Responsive and Computer-Based Intervention

    Science.gov (United States)

    Robinson-Ervin, Porsha; Cartledge, Gwendolyn; Musti-Rao, Shobana; Gibson, Lenwood, Jr.; Keyes, Starr E.

    2016-01-01

    This study examined the effects of culturally relevant/responsive, computer-based social skills instruction on the social skill acquisition and generalization of 6 urban African American sixth graders with emotional and behavioral disorders (EBD). A multiple-probe across participants design was used to evaluate the effects of the social skills…

  16. Reduced computational cost in the calculation of worst case response time for real time systems

    OpenAIRE

    Urriza, José M.; Schorb, Lucas; Orozco, Javier D.; Cayssials, Ricardo

    2009-01-01

    Modern Real Time Operating Systems require reducing computational costs even though microprocessors become more powerful each day. It is usual that Real Time Operating Systems for embedded systems have advanced features to administer the resources of the applications that they support. In order to guarantee either the schedulability of the system or the schedulability of a new task in a dynamic Real Time System, it is necessary to know the Worst Case Response Time of the Real Time tasks ...
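
    The underlying quantity is classically obtained by the fixed-point iteration for worst-case response time under fixed-priority preemptive scheduling (Joseph-Pandya/Audsley style): R_i = C_i + sum over higher-priority tasks j of ceil(R_i/T_j)*C_j. A minimal sketch with an assumed task set:

```python
import math

def wcrt(tasks):
    """tasks: list of (C, T) pairs sorted by decreasing priority."""
    results = []
    for i, (C, T) in enumerate(tasks):
        R, prev = C, 0.0
        while R != prev:                  # iterate to a fixed point
            prev = R
            R = C + sum(math.ceil(prev/Tj)*Cj for Cj, Tj in tasks[:i])
            if R > T:                     # misses its deadline (= period)
                R = None
                break
        results.append(R)
    return results

print(wcrt([(1, 4), (2, 6), (3, 13)]))    # -> [1, 3, 10]
```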

  17. Computer performance evaluation of FACOM 230-75 computer system, (2)

    International Nuclear Information System (INIS)

    Fujii, Minoru; Asai, Kiyoshi

    1980-08-01

    In this report are described computer performance evaluations for the FACOM 230-75 computers in JAERI. The evaluations are performed on the following items: (1) Cost/benefit analysis of timesharing terminals, (2) Analysis of the response time of timesharing terminals, (3) Analysis of throughput time for batch job processing, (4) Estimation of current potential demands for computer time, (5) Determination of the appropriate number of card readers and line printers. These evaluations are done mainly from the standpoint of cost reduction of computing facilities. The techniques adopted are very practical ones. This report will be useful for those people who are concerned with the management of a computing installation. (author)

  18. New ATLAS Software & Computing Organization

    CERN Multimedia

    Barberis, D

    Following the election by the ATLAS Collaboration Board of Dario Barberis (Genoa University/INFN) as Computing Coordinator and David Quarrie (LBNL) as Software Project Leader, it was considered necessary to modify the organization of the ATLAS Software & Computing ("S&C") project. The new organization is based upon the following principles: separation of the responsibilities for computing management from those of software development, with the appointment of a Computing Coordinator and a Software Project Leader who are both members of the Executive Board; hierarchical structure of responsibilities and reporting lines; coordination at all levels between TDAQ, S&C and Physics working groups; integration of the subdetector software development groups with the central S&C organization. A schematic diagram of the new organization can be seen in Fig. 1 (Figure 1: new ATLAS Software & Computing organization). Two Management Boards will help the Computing Coordinator and the Software Project...

  19. Using Noninvasive Brain Measurement to Explore the Psychological Effects of Computer Malfunctions on Users during Human-Computer Interactions

    Directory of Open Access Journals (Sweden)

    Leanne M. Hirshfield

    2014-01-01

    In today’s technologically driven world, there is a need to better understand the ways that common computer malfunctions affect computer users. These malfunctions may have measurable influences on computer users’ cognitive, emotional, and behavioral responses. An experiment was conducted where participants conducted a series of web search tasks while wearing functional near-infrared spectroscopy (fNIRS) and galvanic skin response sensors. Two computer malfunctions were introduced during the sessions which had the potential to influence correlates of user trust and suspicion. Surveys were given after each session to measure users’ perceived emotional state, cognitive load, and perceived trust. Results suggest that fNIRS can be used to measure the different cognitive and emotional responses associated with computer malfunctions. These cognitive and emotional changes were correlated with users’ self-reported levels of suspicion and trust, and they in turn suggest future work that further explores the capability of fNIRS for the measurement of user experience during human-computer interactions.

  20. Computational Modeling of Hypothalamic-Pituitary-Gonadal Axis to Predict Adaptive Responses in Female Fathead Minnows Exposed to an Aromatase Inhibitor

    Science.gov (United States)

    Exposure to endocrine disrupting chemicals can affect reproduction and development in both humans and wildlife. We are developing a mechanistic computational model of the hypothalamic-pituitary-gonadal (HPG) axis in female fathead minnows to predict dose response and time-course...

  1. Computational model of dose response for low-LET-induced complex chromosomal aberrations

    International Nuclear Information System (INIS)

    Eidelman, Y.A.; Andreev, S.G.

    2015-01-01

    Experiments with full-colour mFISH chromosome painting have revealed high yield of radiation-induced complex chromosomal aberrations (CAs). The ratio of complex to simple aberrations is dependent on cell type and linear energy transfer. Theoretical analysis has demonstrated that the mechanism of CA formation as a result of interaction between lesions at a surface of chromosome territories does not explain high complexes-to-simples ratio in human lymphocytes. The possible origin of high yields of γ-induced complex CAs was investigated in the present work by computer simulation. CAs were studied on the basis of chromosome structure and dynamics modelling and the hypothesis of CA formation on nuclear centres. The spatial organisation of all chromosomes in a human interphase nucleus was predicted by simulation of mitosis-to-interphase chromosome structure transition. Two scenarios of CA formation were analysed, 'static' (existing in a nucleus prior to irradiation) centres and 'dynamic' (formed in response to irradiation) centres. The modelling results reveal that under certain conditions, both scenarios explain quantitatively the dose-response relationships for both simple and complex γ-induced inter-chromosomal exchanges observed by mFISH chromosome painting in the first post-irradiation mitosis in human lymphocytes. (authors)

  2. Prediction of lung density changes after radiotherapy by cone beam computed tomography response markers and pre-treatment factors for non-small cell lung cancer patients

    DEFF Research Database (Denmark)

    Bernchou, Uffe; Hansen, Olfred; Schytte, Tine

    2015-01-01

    BACKGROUND AND PURPOSE: This study investigates the ability of pre-treatment factors and response markers extracted from standard cone-beam computed tomography (CBCT) images to predict the lung density changes induced by radiotherapy for non-small cell lung cancer (NSCLC) patients. METHODS AND MATERIALS: Density changes in follow-up computed tomography scans were evaluated for 135 NSCLC patients treated with radiotherapy. Early response markers were obtained by analysing changes in lung density in CBCT images acquired during the treatment course. The ability of pre-treatment factors and CBCT

  3. A statistical mechanical approach for the computation of the climatic response to general forcings

    Directory of Open Access Journals (Sweden)

    V. Lucarini

    2011-01-01

    The climate belongs to the class of non-equilibrium forced and dissipative systems, for which most results of quasi-equilibrium statistical mechanics, including the fluctuation-dissipation theorem, do not apply. In this paper we show for the first time how the Ruelle linear response theory, developed for studying rigorously the impact of perturbations on general observables of non-equilibrium statistical mechanical systems, can be applied with great success to analyze the climatic response to general forcings. The crucial value of the Ruelle theory lies in the fact that it allows one to compute the response of the system in terms of expectation values of explicit and computable functions of the phase space averaged over the invariant measure of the unperturbed state. We choose as test bed a classical version of the Lorenz 96 model, which, in spite of its simplicity, has a well-recognized prototypical value as it is a spatially extended one-dimensional model and presents the basic ingredients, such as dissipation, advection and the presence of an external forcing, of the actual atmosphere. We recapitulate the main aspects of the general response theory and propose some new general results. We then analyze the frequency dependence of the response of both local and global observables to perturbations having localized as well as global spatial patterns. We derive analytically several properties of the corresponding susceptibilities, such as asymptotic behavior, validity of Kramers-Kronig relations, and sum rules, whose main ingredient is the causality principle. We show that all the coefficients of the leading asymptotic expansions as well as the integral constraints can be written as linear functions of parameters that describe the unperturbed properties of the system, such as its average energy. Some newly obtained empirical closure equations for such parameters allow one to define such properties as an explicit function of the unperturbed forcing
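
    The Ruelle formalism itself is beyond a short snippet, but a crude finite-difference stand-in, estimating the response of the mean energy of the Lorenz 96 model to a small forcing increment by ensemble averaging, conveys the experimental setup. All numerical choices below are assumptions, and this is not the paper's rigorous response computation.

```python
import numpy as np

def l96_step(x, F, dt=0.005):
    # Lorenz 96: dx_i/dt = (x_{i+1} - x_{i-2}) x_{i-1} - x_i + F, one RK4 step
    def rhs(x):
        return (np.roll(x, -1) - np.roll(x, 2))*np.roll(x, 1) - x + F
    k1 = rhs(x); k2 = rhs(x + 0.5*dt*k1)
    k3 = rhs(x + 0.5*dt*k2); k4 = rhs(x + dt*k3)
    return x + dt*(k1 + 2*k2 + 2*k3 + k4)/6

rng = np.random.default_rng(0)
N, F, dF, n_ens, n_steps = 36, 8.0, 0.1, 20, 2000
estimates = []
for _ in range(n_ens):
    x = F + rng.normal(0, 1, N)
    for _ in range(1000):                 # relax onto the attractor
        x = l96_step(x, F)
    xp, e0, ep = x.copy(), 0.0, 0.0
    for _ in range(n_steps):              # run control and perturbed copies
        x, xp = l96_step(x, F), l96_step(xp, F + dF)
        e0 += np.mean(x**2)/2; ep += np.mean(xp**2)/2
    estimates.append((ep - e0)/n_steps/dF)
print("dE/dF ~", np.mean(estimates), "+/-", np.std(estimates)/np.sqrt(n_ens))
```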

  4. Computation for LHC experiments: a worldwide computing grid

    International Nuclear Information System (INIS)

    Fairouz, Malek

    2010-01-01

    In normal operating conditions the LHC detectors are expected to record about 10^10 collisions each year. The processing of all the consequent experimental data is a real computing challenge in terms of equipment, software and organization: it requires sustaining data flows of a few 10^9 octets per second and a recording capacity of a few tens of 10^15 octets each year. In order to meet this challenge a computing network implying the dispatch and sharing of tasks has been set up. The W-LCG grid (Worldwide LHC Computing Grid) is made up of 4 tiers. Tier 0 is the computer centre at CERN; it is responsible for collecting and recording the raw data from the LHC detectors and dispatching it to the 11 Tier 1 centres. A Tier 1 is typically a national centre; it is responsible for making a copy of the raw data and for processing it in order to recover relevant data with a physical meaning and to transfer the results to the 150 Tier 2 centres. A Tier 2 is at the level of an institute or laboratory; it is in charge of the final analysis of the data and of the production of the simulations. Tier 3 centres are at the level of the laboratories; they provide a complementary and local resource to Tier 2 in terms of data analysis. (A.C.)

  5. Clinical responses to ERK inhibition in BRAFV600E-mutant colorectal cancer predicted using a computational model.

    Science.gov (United States)

    Kirouac, Daniel C; Schaefer, Gabriele; Chan, Jocelyn; Merchant, Mark; Orr, Christine; Huang, Shih-Min A; Moffat, John; Liu, Lichuan; Gadkar, Kapil; Ramanujan, Saroja

    2017-01-01

    Approximately 10% of colorectal cancers harbor BRAF V600E mutations, which constitutively activate the MAPK signaling pathway. We sought to determine whether ERK inhibitor (GDC-0994)-containing regimens may be of clinical benefit to these patients based on data from in vitro (cell line) and in vivo (cell- and patient-derived xenograft) studies of cetuximab (EGFR), vemurafenib (BRAF), cobimetinib (MEK), and GDC-0994 (ERK) combinations. Preclinical data was used to develop a mechanism-based computational model linking cell surface receptor (EGFR) activation, the MAPK signaling pathway, and tumor growth. Clinical predictions of anti-tumor activity were enabled by the use of tumor response data from three Phase 1 clinical trials testing combinations of EGFR, BRAF, and MEK inhibitors. Simulated responses to GDC-0994 monotherapy (overall response rate = 17%) accurately predicted results from a Phase 1 clinical trial regarding the number of responding patients (2/18) and the distribution of tumor size changes ("waterfall plot"). Prospective simulations were then used to evaluate potential drug combinations and predictive biomarkers for increasing responsiveness to MEK/ERK inhibitors in these patients.

  6. Evaluating a Computer Flash-Card Sight-Word Recognition Intervention with Self-Determined Response Intervals in Elementary Students with Intellectual Disability

    Science.gov (United States)

    Cazzell, Samantha; Skinner, Christopher H.; Ciancio, Dennis; Aspiranti, Kathleen; Watson, Tiffany; Taylor, Kala; McCurdy, Merilee; Skinner, Amy

    2017-01-01

    A concurrent multiple-baseline across-tasks design was used to evaluate the effectiveness of a computer flash-card sight-word recognition intervention with elementary-school students with intellectual disability. This intervention allowed the participants to self-determine each response interval and resulted in both participants acquiring…

  7. MACKLIB-IV: a library of nuclear response functions generated with the MACK-IV computer program from ENDF/B-IV

    International Nuclear Information System (INIS)

    Gohar, Y.; Abdou, M.A.

    1978-03-01

    MACKLIB-IV employs the CTR energy group structure of 171 neutron groups and 36 gamma groups. A retrieval computer program is included with the library to permit collapsing into any other energy group structure. The library is in the new format of the 'MACK-Activity Table', which uses a fixed position for each specific response function. This permits the user, when employing the library with present transport codes, to obtain directly the nuclear responses (e.g. the total nuclear heating) summed for all isotopes and integrated over any geometrical volume. The response functions included in the library are neutron kerma factor, gamma kerma factor, gas production and tritium-breeding functions, and all important reaction cross sections. Pertinent information about the library and a graphical display of six response functions for all materials in the library are given
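
    The collapsing operation such a retrieval program performs is, in essence, flux-weighted group averaging. A minimal sketch with invented fine-group data:

```python
import numpy as np

def collapse(fine_response, fine_flux, group_map):
    """Flux-weighted collapse of a fine-group response function onto a coarser
    structure; group_map[g] gives the coarse-group index of fine group g."""
    fine_response = np.asarray(fine_response, dtype=float)
    fine_flux = np.asarray(fine_flux, dtype=float)
    n_coarse = max(group_map) + 1
    num, den = np.zeros(n_coarse), np.zeros(n_coarse)
    for g, G in enumerate(group_map):
        num[G] += fine_response[g]*fine_flux[g]   # response x weighting flux
        den[G] += fine_flux[g]
    return num/den

r   = [3.1, 2.9, 2.2, 1.8, 0.9, 0.4]     # e.g. a kerma factor per fine group
phi = [1.0, 0.8, 0.6, 0.5, 0.3, 0.2]     # assumed weighting flux
print(collapse(r, phi, [0, 0, 1, 1, 2, 2]))
```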

  8. Behavioral response and pain perception to computer controlled local anesthetic delivery system and cartridge syringe

    Directory of Open Access Journals (Sweden)

    T D Yogesh Kumar

    2015-01-01

    Aim: The present study evaluated and compared the pain perception, behavioral response, physiological parameters, and the role of topical anesthetic administration during local anesthetic administration with cartridge syringe and computer controlled local anesthetic delivery system (CCLAD). Design: A randomized controlled crossover study was carried out with 120 children aged 7-11 years. They were randomly divided into Group A: receiving injection with CCLAD during the first visit; Group B: receiving injection with cartridge syringe during the first visit. They were further subdivided into three subgroups based on the topical application used: (a) 20% benzocaine; (b) pressure with a cotton applicator; (c) no topical application. Pulse rate and blood pressure were recorded before and during the injection procedure. Objective evaluation of disruptive behavior and subjective evaluation of pain were done using the Face, Legs, Activity, Cry, Consolability scale and the modified facial image scale, respectively. The washout period between the two visits was 1 week. Results: Injections with CCLAD produced significantly lower pain response, disruptive behavior (P < 0.001), and pulse rate (P < 0.05) when compared to cartridge syringe injections. Application of benzocaine produced a lower pain response and less disruptive behavior when compared to the other two subgroups, although the result was not significant. Conclusion: Usage of techniques which enhance behavioral response in children, like injections with CCLAD, can be considered a possible step toward achieving a pain-free pediatric dental practice.

  9. Computer science handbook

    CERN Document Server

    Tucker, Allen B

    2004-01-01

    Due to the great response to the famous Computer Science Handbook edited by Allen B. Tucker, … in 2004 Chapman & Hall/CRC published a second edition of this comprehensive reference book. Within more than 70 chapters, every one new or significantly revised, one can find any kind of information and references about computer science one can imagine. … All in all, there is absolutely nothing about computer science that cannot be found in the encyclopedia with its 110 survey articles …-Christoph Meinel, Zentralblatt MATH

  10. Seismic Response Prediction of Buildings with Base Isolation Using Advanced Soft Computing Approaches

    Directory of Open Access Journals (Sweden)

    Mosbeh R. Kaloop

    2017-01-01

    Modeling the response of structures under seismic loads is an important factor in civil engineering as it crucially affects the design and management of structures, especially for high-risk areas. In this study, novel applications of advanced soft computing techniques are utilized for predicting the behavior of concentrically braced frame (CBF) buildings with a lead-rubber bearing (LRB) isolation system under ground motion effects. These techniques include least square support vector machine (LSSVM), wavelet neural networks (WNN), and adaptive neurofuzzy inference system (ANFIS), along with wavelet denoising. The simulation of a 2D frame model and eight ground motions are considered in this study to evaluate the prediction models. The comparison results indicate that the least square support vector machine is superior to the other techniques in estimating the behavior of smart structures.

  11. A BENCHMARK PROGRAM FOR EVALUATION OF METHODS FOR COMPUTING SEISMIC RESPONSE OF COUPLED BUILDING-PIPING/EQUIPMENT WITH NON-CLASSICAL DAMPING

    International Nuclear Information System (INIS)

    Xu, J.; Degrassi, G.; Chokshi, N.

    2001-01-01

    Under the auspices of the US Nuclear Regulatory Commission (NRC), Brookhaven National Laboratory (BNL) developed a comprehensive program to evaluate state-of-the-art methods and computer programs for seismic analysis of typical coupled nuclear power plant (NPP) systems with nonclassical damping. In this program, four benchmark models of coupled building-piping/equipment systems with different damping characteristics were analyzed for a suite of earthquakes by program participants applying their uniquely developed methods and computer programs. This paper presents the results of their analyses and their comparison to the benchmark solutions generated by BNL using time domain direct integration methods. The participants' analysis results established using complex modal time history methods showed good comparison with the BNL solutions, while the analyses produced with either complex-mode response spectrum methods or the classical normal-mode response spectrum method, in general, produced more conservative results when averaged over a suite of earthquakes. However, when coupling due to damping is significant, complex-mode response spectrum methods performed better than the classical normal-mode response spectrum method. Furthermore, as part of the program objectives, a parametric assessment is also presented in this paper, aimed at evaluating the applicability of various analysis methods to problems with different dynamic characteristics unique to coupled NPP systems. It is believed that the findings and insights learned from this program will be useful in developing new acceptance criteria and providing guidance for future regulatory activities involving licensing applications of these alternate methods to coupled systems.

  12. Correlation of Computed Tomography Imaging Features With Pain Response in Patients With Spine Metastases After Radiation Therapy

    International Nuclear Information System (INIS)

    Mitera, Gunita; Probyn, Linda; Ford, Michael; Donovan, Andrea; Rubenstein, Joel; Finkelstein, Joel; Christakis, Monique; Zhang, Liying; Campos, Sarah; Culleton, Shaelyn; Nguyen, Janet; Sahgal, Arjun; Barnes, Elizabeth; Tsao, May; Danjoux, Cyril; Holden, Lori; Yee, Albert; Khan, Luluel; Chow, Edward

    2011-01-01

    Purpose: To correlate computed tomography (CT) imaging features of spinal metastases with pain relief after radiotherapy (RT). Methods and Materials: Thirty-three patients receiving computed tomography (CT)-simulated RT for spinal metastases in an outpatient palliative RT clinic from January 2007 to October 2008 were retrospectively reviewed. Forty spinal metastases were evaluated. Pain response was rated using the International Bone Metastases Consensus Working Party endpoints. Three musculoskeletal radiologists and two orthopaedic surgeons evaluated CT features, including osseous and soft tissue tumor extent, presence of a pathologic fracture, severity of vertebral height loss, and presence of kyphosis. Results: The mean patient age was 69 years; 24 were men and 9 were women. The mean worst pain score was 7/10, and the mean total daily oral morphine equivalent was 77.3 mg. Treatment doses included 8 Gy in one fraction (22/33), 20 Gy in five fractions (10/33), and 20 Gy in eight fractions (1/33). The CT imaging appearance of spinal metastases included vertebral body involvement (40/40), pedicle involvement (23/40), and lamina involvement (18/40). Soft tissue component (10/40) and nerve root compression (9/40) were less common. Pathologic fractures existed in 11/40 lesions, with resultant vertebral body height loss in 10/40 and kyphosis in 2/40 lesions. At months 1, 2, and 3 after RT, 18%, 69%, and 70% of patients experienced pain relief. Pain response was observed with various CT imaging features. Conclusions: Pain response after RT did not differ in patients with and without pathologic fracture, kyphosis, or any other CT features related to extent of tumor involvement. All patients with painful spinal metastases may benefit from palliative RT.

  13. Computational methods for predicting the response of critical as-built infrastructure to dynamic loads (architectural surety)

    Energy Technology Data Exchange (ETDEWEB)

    Preece, D.S.; Weatherby, J.R.; Attaway, S.W.; Swegle, J.W.; Matalucci, R.V.

    1998-06-01

    Coupled blast-structural computational simulations using supercomputer capabilities will significantly advance the understanding of how complex structures respond under dynamic loads caused by explosives and earthquakes, an understanding with application to the surety of both federal and nonfederal buildings. Simulation of the effects of explosives on structures is a challenge because the explosive response can best be simulated using Eulerian computational techniques and structural behavior is best modeled using Lagrangian methods. Due to the different methodologies of the two computational techniques and code architecture requirements, they are usually implemented in different computer programs. Explosive and structure modeling in two different codes make it difficult or next to impossible to do coupled explosive/structure interaction simulations. Sandia National Laboratories has developed two techniques for solving this problem. The first is called Smoothed Particle Hydrodynamics (SPH), a relatively new gridless method comparable to Eulerian, that is especially suited for treating liquids and gases such as those produced by an explosive. The SPH capability has been fully implemented into the transient dynamics finite element (Lagrangian) codes PRONTO-2D and -3D. A PRONTO-3D/SPH simulation of the effect of a blast on a protective-wall barrier is presented in this paper. The second technique employed at Sandia National Laboratories uses a relatively new code called ALEGRA which is an ALE (Arbitrary Lagrangian-Eulerian) wave code with specific emphasis on large deformation and shock propagation. ALEGRA is capable of solving many shock-wave physics problems but it is especially suited for modeling problems involving the interaction of decoupled explosives with structures.

  14. Desk-top computer assisted processing of thermoluminescent dosimeters

    International Nuclear Information System (INIS)

    Archer, B.R.; Glaze, S.A.; North, L.B.; Bushong, S.C.

    1977-01-01

    An accurate dosimetric system utilizing a desk-top computer and high sensitivity ribbon type TLDs has been developed. The system incorporates an exposure history file and procedures designed for constant spatial orientation of each dosimeter. Processing of information is performed by two computer programs. The first calculates relative response factors to ensure that the corrected response of each TLD is identical following a given dose of radiation. The second program computes a calibration factor and uses it and the relative response factor to determine the actual dose registered by each TLD. (U.K.)
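
    The two-program bookkeeping can be summarized in a few lines. All readings below are invented and the delivered doses are assumptions; this is a sketch of the calculation, not the original programs.

```python
import numpy as np

# Pass 1: all chips receive the same dose; relative response factors (RRF)
# scale each chip so that corrected readings agree.
uniform = np.array([102.0, 98.5, 105.2, 95.3])   # readings [nC], equal dose
rrf = uniform.mean()/uniform

# Pass 2: a known dose fixes the calibration factor (mGy per corrected nC).
known_dose_mGy = 10.0
cal_reading = (np.array([101.0, 99.2, 104.8, 96.0])*rrf).mean()
cal_factor = known_dose_mGy/cal_reading

# Field measurement: actual dose registered by each chip
field = np.array([55.0, 51.7, 57.4, 49.9])
print(field*rrf*cal_factor)                      # ~5.2 to 5.5 mGy per chip
```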

  15. Community Cloud Computing

    Science.gov (United States)

    Marinos, Alexandros; Briscoe, Gerard

    Cloud Computing is rising fast, with its data centres growing at an unprecedented rate. However, this has come with concerns over privacy, efficiency at the expense of resilience, and environmental sustainability, because of the dependence on Cloud vendors such as Google, Amazon and Microsoft. Our response is an alternative model for the Cloud conceptualisation, providing a paradigm for Clouds in the community, utilising networked personal computers for liberation from the centralised vendor model. Community Cloud Computing (C3) offers an alternative architecture, created by combining the Cloud with paradigms from Grid Computing, principles from Digital Ecosystems, and sustainability from Green Computing, while remaining true to the original vision of the Internet. It is more technically challenging than Cloud Computing, having to deal with distributed computing issues, including heterogeneous nodes, varying quality of service, and additional security constraints. However, these are not insurmountable challenges, and with the need to retain control over our digital lives and the potential environmental consequences, it is a challenge we must pursue.

  16. Cloud Computing for Science Data Processing in Support of Emergency Response

    Data.gov (United States)

    National Aeronautics and Space Administration — Cloud computing enables users to create virtual computers, each one with the optimal configuration of hardware and software for a job. The number of virtual...

  17. A computer tool for a minimax criterion in binary response and heteroscedastic simple linear regression models.

    Science.gov (United States)

    Casero-Alonso, V; López-Fidalgo, J; Torsney, B

    2017-01-01

    Binary response models are used in many real applications. For these models the Fisher information matrix (FIM) is proportional to the FIM of a weighted simple linear regression model. The same is also true when the weight function has a finite integral. Thus, optimal designs for one binary model are also optimal for the corresponding weighted linear regression model. The main objective of this paper is to provide a tool for the construction of MV-optimal designs, minimizing the maximum of the variances of the estimates, for a general design space. MV-optimality is a potentially difficult criterion because of its nondifferentiability at equal variance designs. A methodology for obtaining MV-optimal designs where the design space is a compact interval [a, b] will be given for several standard weight functions. The methodology will allow us to build a user-friendly computer tool based on Mathematica to compute MV-optimal designs. Some illustrative examples will show a representation of MV-optimal designs in the Euclidean plane, taking a and b as the axes. The applet will be explained using two relevant models. In the first one the case of a weighted linear regression model is considered, where the weight function is directly chosen from a typical family. In the second example a binary response model is assumed, where the probability of the outcome is given by a typical probability distribution. Practitioners can use the provided applet to identify the solution and to know the exact support points and design weights.
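
    For a weighted simple linear regression, the MV criterion reduces to the maximum diagonal entry of the inverse Fisher information matrix. A sketch with an assumed exponential weight function (one typical choice, not necessarily one of the paper's):

```python
import numpy as np

def mv_criterion(xs, ps, w):
    """Max variance (up to sigma^2) of the (b0, b1) estimates in
    E[y] = b0 + b1*x under the design {support xs, weights ps}."""
    M = np.zeros((2, 2))
    for x, p in zip(xs, ps):
        f = np.array([1.0, x])
        M += p*w(x)*np.outer(f, f)         # Fisher information matrix
    return np.diag(np.linalg.inv(M)).max()

w = lambda x: np.exp(-x)
# Two candidate designs on [a, b] = [0, 2]: unequal weights do better here
print(mv_criterion([0.0, 2.0], [0.5, 0.5], w))   # ~4.19
print(mv_criterion([0.0, 2.0], [0.3, 0.7], w))   # ~3.47
```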

  18. Improvements in fast-response flood modeling: desktop parallel computing and domain tracking

    Energy Technology Data Exchange (ETDEWEB)

    Judi, David R [Los Alamos National Laboratory; Mcpherson, Timothy N [Los Alamos National Laboratory; Burian, Steven J [UNIV. OF UTAH

    2009-01-01

    It is becoming increasingly important to have the ability to accurately forecast flooding, as flooding accounts for the most losses due to natural disasters in the world and the United States. Flood inundation modeling has been dominated by one-dimensional approaches. These models are computationally efficient and are considered by many engineers to produce reasonably accurate water surface profiles. However, because the profiles estimated in these models must be superimposed on digital elevation data to create a two-dimensional map, the result may be sensitive to the ability of the elevation data to capture relevant features (e.g., dikes/levees, roads, walls). Moreover, one-dimensional models do not explicitly represent the complex flow processes present in floodplains and urban environments, and because two-dimensional models based on the shallow water equations have significantly greater ability to determine flow velocity and direction, the National Research Council (NRC) has recommended that two-dimensional models be used over one-dimensional models for flood inundation studies. This paper has shown that two-dimensional flood modeling computational time can be greatly reduced through the use of Java multithreading on multi-core computers, which effectively provides a means for parallel computing on a desktop computer. In addition, this paper has shown that when desktop parallel computing is coupled with a domain tracking algorithm, significant computation time can be eliminated when computations are completed only on inundated cells. The drastic reduction in computational time shown here enhances the ability of two-dimensional flood inundation models to be used as a near-real time flood forecasting tool, engineering design tool, or planning tool. Perhaps even of greater significance, the reduction in computation time makes the incorporation of risk and uncertainty/ensemble forecasting more feasible for flood inundation modeling (NRC 2000; Sayers et al
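
    The paper's implementation uses Java threads; the following Python analogue (toy diffusive spreading, not the shallow water equations) illustrates the two ideas: domain tracking, where only rows near wet cells are updated, and dispatching row blocks to a thread pool.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

ny, nx, alpha = 400, 400, 0.2                    # grid size and toy coefficient
h = np.zeros((ny, nx)); h[200, 200] = 5.0        # initial flood depth [m]

def update_block(rows, h, h_new):
    for j in rows:                               # 5-point update on one row block
        lap = h[j-1,1:-1] + h[j+1,1:-1] + h[j,:-2] + h[j,2:] - 4*h[j,1:-1]
        h_new[j,1:-1] = np.maximum(h[j,1:-1] + alpha*lap, 0.0)

with ThreadPoolExecutor(max_workers=4) as pool:
    for _ in range(100):
        wet = np.flatnonzero(h[1:-1].sum(axis=1) > 0) + 1
        active = np.unique(np.concatenate([wet - 1, wet, wet + 1]))
        active = active[(active >= 1) & (active <= ny - 2)]   # domain tracking
        h_new = h.copy()
        blocks = [b for b in np.array_split(active, 4) if b.size]
        list(pool.map(lambda rows: update_block(rows, h, h_new), blocks))
        h = h_new
```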

  19. Rayleigh radiance computations for satellite remote sensing: accounting for the effect of sensor spectral response function.

    Science.gov (United States)

    Wang, Menghua

    2016-05-30

    To understand and assess the effect of the sensor spectral response function (SRF) on the accuracy of the top of the atmosphere (TOA) Rayleigh-scattering radiance computation, new TOA Rayleigh radiance lookup tables (LUTs) over global oceans and inland waters have been generated. The new Rayleigh LUTs include spectral coverage of 335-2555 nm, all possible solar-sensor geometries, and surface wind speeds of 0-30 m/s. Using the new Rayleigh LUTs, the sensor SRF effect on the accuracy of the TOA Rayleigh radiance computation has been evaluated for spectral bands of the Visible Infrared Imaging Radiometer Suite (VIIRS) on the Suomi National Polar-orbiting Partnership (SNPP) satellite and the Joint Polar Satellite System (JPSS)-1, showing some important uncertainties for VIIRS-SNPP particularly for large solar- and/or sensor-zenith angles as well as for large Rayleigh optical thicknesses (i.e., short wavelengths) and bands with broad spectral bandwidths. To accurately account for the sensor SRF effect, a new correction algorithm has been developed for VIIRS spectral bands, which improves the TOA Rayleigh radiance accuracy to ~0.01% even for the large solar-zenith angles of 70°-80°, compared with the error of ~0.7% without applying the correction for the VIIRS-SNPP 410 nm band. The same methodology that accounts for the sensor SRF effect on the Rayleigh radiance computation can be used for other satellite sensors. In addition, with the new Rayleigh LUTs, the effect of surface atmospheric pressure variation on the TOA Rayleigh radiance computation can be calculated precisely, and no specific atmospheric pressure correction algorithm is needed. There are some other important applications and advantages to using the new Rayleigh LUTs for satellite remote sensing, including an efficient and accurate TOA Rayleigh radiance computation for hyperspectral satellite remote sensing, detector-based TOA Rayleigh radiance computation, Rayleigh radiance calculations for high altitude
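
    The SRF effect amounts to band-averaging a steep (roughly lambda^-4) Rayleigh spectrum over the response function. A sketch with an invented Gaussian SRF (not actual VIIRS band parameters) shows why broad bands deviate from the band-center value:

```python
import numpy as np

wl = np.linspace(380, 440, 601)                 # wavelength grid [nm]
L_rayleigh = (wl/410.0)**(-4)                   # ~lambda^-4 relative spectral shape

center, fwhm = 410.0, 20.0                      # assumed band center and width [nm]
sigma = fwhm/(2*np.sqrt(2*np.log(2)))
srf = np.exp(-0.5*((wl - center)/sigma)**2)     # toy Gaussian SRF

# SRF-weighted band average vs. the value at the nominal band center (= 1.0)
band_avg = np.trapz(srf*L_rayleigh, wl)/np.trapz(srf, wl)
print(100*(band_avg - 1.0), "% difference from the band-center value")
```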

  20. Computer program for post-flight evaluation of the control surface response for an attitude controlled missile

    Science.gov (United States)

    Knauber, R. N.

    1982-01-01

    A FORTRAN IV coded computer program is presented for post-flight analysis of a missile's control surface response. It includes preprocessing of digitized telemetry data for time lags, biases, non-linear calibration changes and filtering. Measurements include autopilot attitude rate and displacement gyro output and four control surface deflections. Simple first order lags are assumed for the pitch, yaw and roll axes of control. Each actuator is also assumed to be represented by a first order lag. Mixing of pitch, yaw and roll commands to four control surfaces is assumed. A pseudo-inverse technique is used to obtain the pitch, yaw and roll components from the four measured deflections. This program has been used for over 10 years on the NASA/SCOUT launch vehicle for post-flight analysis and was helpful in detecting incipient actuator stall due to excessive hinge moments. The program is currently set up for a CDC CYBER 175 computer system. It requires 34K words of memory and contains 675 cards. A sample problem presented herein including the optional plotting requires eleven (11) seconds of central processor time.
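
    The pseudo-inverse step can be illustrated directly. The mixing matrix below is an assumed example, not the SCOUT vehicle's actual control mixing:

```python
import numpy as np

# Recover pitch/yaw/roll components from four measured surface deflections.
M = np.array([[ 1.0,  0.0, 1.0],     # surface 1 = pitch + roll
              [-1.0,  0.0, 1.0],     # surface 2 = -pitch + roll
              [ 0.0,  1.0, 1.0],     # surface 3 = yaw + roll
              [ 0.0, -1.0, 1.0]])    # surface 4 = -yaw + roll
deflections = np.array([2.1, -1.9, 0.6, -0.4])   # measured deflections [deg]
pitch, yaw, roll = np.linalg.pinv(M) @ deflections
print(pitch, yaw, roll)              # least-squares components: 2.0, 0.5, 0.1
```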

  1. Variation in the human ribs geometrical properties and mechanical response based on X-ray computed tomography images resolution.

    Science.gov (United States)

    Perz, Rafał; Toczyski, Jacek; Subit, Damien

    2015-01-01

    Computational models of the human body are commonly used for injury prediction in automobile safety research. To create these models, the geometry of the human body is typically obtained from segmentation of medical images such as computed tomography (CT) images, which have a resolution between 0.2 and 1 mm/pixel. While the accuracy of the geometrical and structural information obtained from these images depends greatly on their resolution, the effect of image resolution on the estimation of the ribs' geometrical properties has yet to be established. To do so, each of the thirty-four sections of ribs obtained from a Post Mortem Human Surrogate (PMHS) was imaged using three different CT modalities: standard clinical CT (clinCT), high resolution clinical CT (HRclinCT), and microCT. The images were processed to estimate the rib cross-section geometry and mechanical properties, and the results were compared to those obtained from the microCT images by computing the 'deviation factor', a metric that quantifies the relative difference between the results obtained from clinCT or HRclinCT and those obtained from microCT. Overall, clinCT images gave a deviation greater than 100%, and were therefore deemed inadequate for the purpose of this study. HRclinCT overestimated the rib cross-sectional area by 7.6%, the moments of inertia by about 50%, and the cortical shell area by 40.2%, while underestimating the trabecular area by 14.7%. Next, a parametric analysis was performed to quantify how the variations in the estimate of the geometrical properties affected the rib's predicted mechanical response under antero-posterior loading. A variation of up to 45% for the predicted peak force and up to 50% for the predicted stiffness was observed. These results provide a quantitative estimate of the sensitivity of the response of the FE model to the resolution of the images used to generate it. They also suggest that a correction factor could be derived from the comparison between microCT and
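
    The deviation factor as described reduces to a relative percentage difference. A one-liner with assumed values:

```python
# Relative difference (in percent) between a property estimated from clinical
# CT and the microCT reference; the numbers below are invented.
def deviation_factor(estimate, reference):
    return 100.0 * abs(estimate - reference) / reference

# e.g. a rib cross-sectional area overestimated by HRclinCT
print(deviation_factor(estimate=86.1, reference=80.0))   # -> ~7.6 (percent)
```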

  2. X-Ray Computed Tomography Reveals the Response of Root System Architecture to Soil Texture

    Science.gov (United States)

    Rogers, Eric D.; Monaenkova, Daria; Mijar, Medhavinee; Goldman, Daniel I.

    2016-01-01

    Root system architecture (RSA) impacts plant fitness and crop yield by facilitating efficient nutrient and water uptake from the soil. A better understanding of the effects of soil on RSA could improve crop productivity by matching roots to their soil environment. We used x-ray computed tomography to perform a detailed three-dimensional quantification of changes in rice (Oryza sativa) RSA in response to the physical properties of a granular substrate. We characterized the RSA of eight rice cultivars in five different growth substrates and determined that RSA is the result of interactions between genotype and growth environment. We identified cultivar-specific changes in RSA in response to changing growth substrate texture. The cultivar Azucena exhibited low RSA plasticity in all growth substrates, whereas cultivar Bala root depth was a function of soil hardness. Our imaging techniques provide a framework to study RSA in different growth environments, the results of which can be used to improve root traits with agronomic potential. PMID:27208237

  3. Integration of process computer systems to Cofrentes NPP

    International Nuclear Information System (INIS)

    Saettone Justo, A.; Pindado Andres, R.; Buedo Jimenez, J.L.; Jimenez Fernandez-Sesma, A.; Delgado Muelas, J.A.

    1997-01-01

    The existence of three different process computer systems in Cofrentes NPP and the ageing of two of them have led to the need for their integration into a single real time computer system, known as Integrated ERIS-Computer System (SIEC), which covers the functionality of the three systems: Process Computer (PC), Emergency Response Information System (ERIS) and Nuclear Calculation Computer (OCN). The paper describes the integration project developed, which has essentially consisted in the integration of PC, ERIS and OCN databases into a single database, the migration of programs from the old process computer into the new SIEC hardware-software platform and the installation of a communications programme to transmit all necessary data for OCN programs from the SIEC computer, which in the new configuration is responsible for managing the databases of the whole system. (Author)

  4. A Look at Computer-Assisted Testing Operations. The Illinois Series on Educational Application of Computers, No. 12e.

    Science.gov (United States)

    Muiznieks, Viktors; Dennis, J. Richard

    In computer assisted test construction (CATC) systems, the computer is used to perform the mechanical aspects of testing while the teacher retains control over question content. Advantages of CATC systems include question banks, decreased importance of test item security, computer analysis and response to student test answers, item analysis…

  5. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing readiness challenges (CCRC’08), the CMS computing team had achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files and with a high writing speed to tapes.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1’s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  6. ROCKING. A computer program for seismic response analysis of radioactive materials transport AND/OR storage casks

    International Nuclear Information System (INIS)

    Ikushima, Takeshi

    1995-11-01

    The computer program ROCKING has been developed for seismic response analysis, which includes rocking and sliding behavior, of radioactive materials transport and/or storage casks. Main features of ROCKING are as follows: (1) The cask is treated as a rigid body. (2) Rocking and sliding behavior are considered. (3) Impact forces are represented by the spring-dashpot model located at impact points. (4) Friction force is calculated at the interface between a cask and a floor. (5) Forces of wire ropes against tip-over act only as tensile loads. In the paper, the calculation model, the calculation equations, validity calculations and the user's manual are shown. (author)
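
    As an illustration of rigid-body rocking dynamics, here is a Housner-type sketch with an assumed excitation; it is deliberately simpler than ROCKING's spring-dashpot contact, sliding and wire-rope models, and all parameter values are assumptions.

```python
import numpy as np

g, R, alpha = 9.81, 1.0, 0.25          # gravity, corner distance [m], slenderness [rad]
p2 = 3.0*g/(4.0*R)                     # m*g*R/I0 for a rectangular block
r = 0.9                                # angular-velocity retention at each impact
dt, th, om = 1e-4, 0.0, 0.0

for step in range(int(10.0/dt)):
    ag = 3.0*np.sin(2*np.pi*2.0*step*dt)   # assumed ground acceleration [m/s^2]
    if th == 0.0 and om == 0.0 and abs(ag) <= g*np.tan(alpha):
        continue                       # at rest: no uplift below the rocking threshold
    s = np.sign(th) or np.sign(om) or np.sign(ag)   # side the block rocks on
    om += -p2*(np.sin(alpha*s - th) - (ag/g)*np.cos(alpha*s - th))*dt
    th_new = th + om*dt
    if th != 0.0 and np.sign(th_new) != np.sign(th):
        th_new, om = 0.0, r*om         # impact at the corner: kinetic energy loss
    th = th_new
```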

  7. The Diffraction Response Interpolation Method

    DEFF Research Database (Denmark)

    Jespersen, Søren Kragh; Wilhjelm, Jens Erik; Pedersen, Peder C.

    1998-01-01

    Computer modeling of the output voltage in a pulse-echo system is computationally very demanding, particularly when considering reflector surfaces of arbitrary geometry. A new, efficient computational tool, the diffraction response interpolation method (DRIM), for modeling of reflectors in a fluid medium, is presented. The DRIM is based on the velocity potential impulse response method, adapted to pulse-echo applications by the use of acoustical reciprocity. Specifically, the DRIM operates by dividing the reflector surface into planar elements, finding the diffraction response at the corners...

  8. Application of a brain-computer interface for person authentication using EEG responses to photo stimuli.

    Science.gov (United States)

    Mu, Zhendong; Yin, Jinhai; Hu, Jianfeng

    2018-01-01

    In this paper, a person authentication system that can effectively identify individuals by generating unique electroencephalogram signal features in response to self-face and non-self-face photos is presented. In order to achieve good stability, the sequence of self-face photos, including first-occurrence and non-first-occurrence positions, is taken into account in the serial presentation of visual stimuli. In addition, a Fisher linear classification method and an event-related potential technique for feature analysis are adopted, yielding remarkably better outcomes than most of the existing methods in the field. The results have shown that EEG-based person authentication via a brain-computer interface can be considered a suitable approach for a biometric authentication system.

  9. Contempt-LT: a computer program for predicting containment pressure-temperature response to a loss-of-coolant accident

    International Nuclear Information System (INIS)

    Wheat, L.L.; Wagner, R.J.; Niederauer, G.F.; Obenchain, C.F.

    1975-06-01

    CONTEMPT-LT is a digital computer program, written in FORTRAN IV, developed to describe the long-term behavior of water-cooled nuclear reactor containment systems subjected to postulated loss-of-coolant accident (LOCA) conditions. The program calculates the time variation of compartment pressures, temperatures, mass and energy inventories, heat structure temperature distributions, and energy exchange with adjacent compartments. The program is capable of describing the effects of leakage on containment response. Models are provided to describe fan cooler and cooling spray engineered safety systems. Up to four compartments can be modeled with CONTEMPT-LT, and any compartment except the reactor system may have both a liquid pool region and an air-vapor atmosphere region above the pool. Each region is assumed to have a uniform temperature, but the temperatures of the two regions may be different. CONTEMPT-LT can be used to model all current boiling water reactor pressure suppression systems, including containments with either vertical or horizontal vent systems. CONTEMPT-LT can also be used to model pressurized water reactor dry containments, subatmospheric containments, and dual volume containments with an annulus region, and can be used to describe containment responses in experimental containment systems. The program user defines which compartments are used, specifies input mass and energy additions, defines heat structure and leakage systems, and describes the time advancement and output control. CONTEMPT-LT source decks are available in double precision extended-binary-coded-decimal-interchange-code (EBCDIC) versions. Sample problems have been run on the IBM360/75 computer. (U.S.)

  10. Path-integral computation of superfluid densities

    International Nuclear Information System (INIS)

    Pollock, E.L.; Ceperley, D.M.

    1987-01-01

    The normal and superfluid densities are defined by the response of a liquid to sample boundary motion. The free-energy change due to uniform boundary motion can be calculated by path-integral methods from the distribution of the winding number of the paths around a periodic cell. This provides a conceptually and computationally simple way of calculating the superfluid density for any Bose system. The linear-response formulation relates the superfluid density to the momentum-density correlation function, which has a short-ranged part related to the normal density and, in the case of a superfluid, a long-ranged part whose strength is proportional to the superfluid density. These facts are discussed in the context of path-integral computations and demonstrated for liquid 4He along the saturated vapor-pressure curve. Below the experimental superfluid transition temperature the computed superfluid fractions agree with the experimental values to within the statistical uncertainties of a few percent in the computations. The computed transition is broadened by finite-sample-size effects
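
    The winding-number estimator referred to above has a closed form for a cubic 3D cell, rho_s/rho = m <W^2> L^2 / (3 hbar^2 beta N). A sketch in which random integers stand in for the winding numbers that a real path-integral run would supply:

```python
import numpy as np

hbar, kB = 1.0545718e-34, 1.380649e-23
m = 6.6464731e-27                         # 4He atomic mass [kg]
L, N, T = 1.0e-9, 64, 1.5                 # box edge [m], atoms, temperature [K] (assumed)
beta = 1.0/(kB*T)

W = np.random.default_rng(1).integers(-2, 3, size=(10000, 3))   # stand-in samples
W2 = (W**2).sum(axis=1).mean()            # <W^2>, summed over the three directions
print(m * W2 * L**2 / (3 * hbar**2 * beta * N))   # superfluid fraction estimate
```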

  11. Computer Security: Security operations at CERN (4/4)

    CERN Document Server

    CERN. Geneva

    2012-01-01

    Stefan Lueders, PhD, graduated from the Swiss Federal Institute of Technology in Zurich and joined CERN in 2002. Initially a developer of a common safety system used in all four experiments at the Large Hadron Collider, he gathered expertise in cyber-security issues of control systems. Consequently, in 2004 he took over responsibilities in securing CERN's accelerator and infrastructure control systems against cyber-threats. Subsequently, he joined the CERN Computer Security Incident Response Team and today heads this team as CERN's Computer Security Officer, with the mandate to coordinate all aspects of CERN's computer security --- office computing security, computer centre security, GRID computing security and control system security --- whilst taking into account CERN's operational needs. Dr. Lueders has presented on these topics on many occasions to international bodies, governments, and companies, and has published several articles. With the prevalence of modern information technologies and...

  12. The basics of cloud computing: understanding the fundamentals of cloud computing in theory and practice

    CERN Document Server

    Rountree, Derrick

    2013-01-01

    As part of the Syngress Basics series, The Basics of Cloud Computing provides readers with an overview of the cloud and how to implement cloud computing in their organizations. Cloud computing continues to grow in popularity, and while many people hear the term and use it in conversation, many are confused by it or unaware of what it really means. This book helps readers understand what the cloud is and how to work with it, even if it isn't a part of their day-to-day responsibility. Authors Derrick Rountree and Ileana Castrillo explain the concepts of cloud computing in prac

  13. Computer modeling of flow induced in-reactor vibrations

    International Nuclear Information System (INIS)

    Turula, P.; Mulcahy, T.M.

    1977-01-01

    An assessment of the reliability of finite element method computer models, as applied to the computation of flow induced vibration response of components used in nuclear reactors, is presented. The prototype under consideration was the Fast Flux Test Facility reactor being constructed for US-ERDA. Data were available from an extensive test program which used a scale model simulating the hydraulic and structural characteristics of the prototype components, subjected to scaled prototypic flow conditions as well as to laboratory shaker excitations. Corresponding analytical solutions of the component vibration problems were obtained using the NASTRAN computer code. Modal analyses and response analyses were performed. The effect of the surrounding fluid was accounted for. Several possible forcing function definitions were considered. Results indicate that modal computations agree well with experimental data. Response amplitude comparisons are good only under conditions favorable to a clear definition of the structural and hydraulic properties affecting the component motion. 20 refs

  14. Accommodation and convergence during sustained computer work.

    Science.gov (United States)

    Collier, Juanita D; Rosenfield, Mark

    2011-07-01

    With computer usage becoming almost universal in contemporary society, the reported prevalence of computer vision syndrome (CVS) is extremely high. However, the precise physiological mechanisms underlying CVS remain unclear. Although abnormal accommodation and vergence responses have been cited as being responsible for the symptoms produced, there is little objective evidence to support this claim. Accordingly, this study measured both of these oculomotor parameters during a sustained period of computer use. Subjects (N = 20) were required to read text aloud from a laptop computer at a viewing distance of 50 cm for a sustained 30-minute period through their habitual refractive correction. At 2-minute intervals, the accommodative response (AR) to the computer screen was measured objectively using a Grand Seiko WAM 5500 optometer (Grand Seiko, Hiroshima, Japan). Additionally, the vergence response was assessed by measuring the associated phoria (AP), i.e., the prism needed to eliminate fixation disparity, using a customized fixation disparity target that appeared on the computer screen. Subjects were asked to rate the degree of difficulty of the reading task on a scale from 1 to 10. Mean accommodation and AP values during the task were 1.07 diopters and 0.74∆ base-in (BI), respectively. The mean discomfort score was 4.9. No significant changes in accommodation or vergence were observed during the course of the 30-minute test period. There was no significant difference in the AR as a function of subjective difficulty. However, the mean AP for the subjects who reported the least and greatest discomfort during the task was 1.55∆ BI and 0, respectively (P = 0.02). CVS after 30 minutes was worse in subjects exhibiting zero fixation disparity than in those having a BI AP, but does not appear to be related to differences in accommodation. A slightly reduced vergence response increases subject comfort during the task. Copyright © 2011 American Optometric Association.

  15. Coping with distributed computing

    International Nuclear Information System (INIS)

    Cormell, L.

    1992-09-01

    The rapid increase in the availability of high performance, cost-effective RISC/UNIX workstations has been both a blessing and a curse. The blessing of having extremely powerful computing engines available on the desk top is well-known to many users. The user has tremendous freedom, flexibility, and control of his environment. That freedom can, however, become the curse of distributed computing. The user must become a system manager to some extent, he must worry about backups, maintenance, upgrades, etc. Traditionally these activities have been the responsibility of a central computing group. The central computing group, however, may find that it can no longer provide all of the traditional services. With the plethora of workstations now found on so many desktops throughout the entire campus or lab, the central computing group may be swamped by support requests. This talk will address several of these computer support and management issues by providing some examples of the approaches taken at various HEP institutions. In addition, a brief review of commercial directions or products for distributed computing and management will be given

  16. The quantitative assessment of peri-implant bone responses using histomorphometry and micro-computed tomography.

    Science.gov (United States)

    Schouten, Corinne; Meijer, Gert J; van den Beucken, Jeroen J J P; Spauwen, Paul H M; Jansen, John A

    2009-09-01

    In the present study, the effects of implant design and surface properties on the peri-implant bone response were evaluated with both conventional histomorphometry and micro-computed tomography (micro-CT), using two geometrically different dental implants (screw type, St; push-in, Pi) with or without surface modification (non-coated, CaP-coated, or CaP-coated + TGF-beta1). After 12 weeks of implantation in a goat femoral condyle model, the peri-implant bone response was evaluated in three different zones (inner: 0-500 microm; middle: 500-1000 microm; and outer: 1000-1500 microm) around the implant. Results indicated the superiority of conventional histomorphometry over micro-CT, as the latter is hampered by deficits in discrimination at the implant/tissue interface. Beyond this interface, the two analysis techniques can be regarded as complementary. Histomorphometrical analysis showed an overall higher bone volume around St compared to Pi implants, but no effects of surface modification were observed. St implants showed the lowest bone volumes in the outer zone, whereas for Pi implants the inner zone was lowest. These results indicate that for Pi implants bone formation started from two different directions (contact and distance osteogenesis). For St implants it was concluded that the undersized implantation technique and the loosening of bone fragments compress the zones for contact and distance osteogenesis, thereby significantly improving bone volume at the interface.

  17. Fast computation of Krawtchouk moments

    Czech Academy of Sciences Publication Activity Database

    Honarvar Shakibaei Asli, B.; Flusser, Jan

    2014-01-01

    Roč. 288, č. 1 (2014), s. 73-86 ISSN 0020-0255 R&D Projects: GA ČR GAP103/11/1552 Institutional support: RVO:67985556 Keywords : Krawtchouk polynomial * Krawtchouk moment * Geometric moment * Impulse response * Fast computation * Digital filter Subject RIV: JD - Computer Applications, Robotics Impact factor: 4.038, year: 2014 http://library.utia.cas.cz/separaty/2014/ZOI/flusser-0432452.pdf

  18. Computer-Aided Instruction in Automated Instrumentation.

    Science.gov (United States)

    Stephenson, David T.

    1986-01-01

    Discusses functions of automated instrumentation systems, i.e., systems which combine electrical measuring instruments and a controlling computer to measure responses of a unit under test. The computer-assisted tutorial then described is programmed for use on such a system--a modern microwave spectrum analyzer--to introduce engineering students to…

  19. Low-complexity computer simulation of multichannel room impulse responses

    NARCIS (Netherlands)

    Martínez Castañeda, J.A.

    2013-01-01

    The "telephone'' model has been, for the last one hundred thirty years, the base of modern telecommunications with virtually no changes in its fundamental concept. The arise of smaller and more powerful computing devices have opened new possibilities. For example, to build systems able to give to

  20. A physics computing bureau

    CERN Document Server

    Laurikainen, P

    1975-01-01

    The author first reviews the services offered by the Bureau to the user community scattered over three separate physics departments and a theory research institute. Limited services are offered also to non-physics research in the University, in collaboration with the University Computing Center. The personnel is divided into operations sections, responsible for terminal and data archive management, punching and document services, etc., and into analyst sections with half a dozen full-time scientific programmers recruited among promising graduate-level physics students rather than computer scientists or mathematicians. Analysts are thus able not only to communicate with physicists but also to participate in research to some extent. Only the more demanding program development tasks are handled by the Bureau; most of the routine data processing is the user's responsibility.

  1. The social impact of computers

    CERN Document Server

    Rosenberg, Richard S

    1992-01-01

    The Social Impact of Computers should be read as a guide to the social implications of current and future applications of computers. Among the basic themes presented are the following: the changing nature of work in response to technological innovation as well as the threat to jobs; personal freedom in the machine age as manifested by challenges to privacy, dignity, and work; the relationship between advances in computer and communications technology and the possibility of increased centralization of authority; and the emergence and influence of artificial intelligence and its role in decision

  2. Privacy and legal issues in cloud computing

    CERN Document Server

    Weber, Rolf H

    2015-01-01

    Adopting a multi-disciplinary and comparative approach, this book focuses on emerging and innovative attempts to tackle privacy and legal issues in cloud computing, such as personal data privacy, security and intellectual property protection. Leading international academics and practitioners in the fields of law and computer science examine the specific legal implications of cloud computing pertaining to jurisdiction, biomedical practice and information ownership. This collection offers original and critical responses to the rising challenges posed by cloud computing.

  3. Neural and cortisol responses during play with human and computer partners in children with autism

    Science.gov (United States)

    Edmiston, Elliot Kale; Merkle, Kristen

    2015-01-01

    Children with autism spectrum disorder (ASD) exhibit impairment in reciprocal social interactions, including play, which can manifest as a failure to show social preference or discrimination between social and nonsocial stimuli. To explore the mechanisms underlying these deficits, we collected salivary cortisol from 42 children aged 8–12 years with ASD or typical development (TD) during a playground interaction with a confederate child. Participants underwent functional MRI during a prisoner's dilemma game requiring cooperation or defection with a human (confederate) or computer partner. Region-of-interest analyses were based on previous research (e.g., insula, amygdala, temporal parietal junction (TPJ)). There were significant group differences in neural activation based on partner and response pattern. When playing with a human partner, children with ASD showed limited engagement of a social salience brain circuit during defection. Reduced insula activation during defection in the ASD children relative to TD children, regardless of partner type, was also a prominent finding. Insula and TPJ BOLD during defection was also associated with stress responsivity and behavior in the ASD group under playground conditions. Children with ASD engage social salience networks less than TD children during conditions of social salience, supporting a fundamental disturbance of social engagement. PMID:25552572

  4. Considerations on command and response language features for a network of heterogeneous autonomous computers

    Science.gov (United States)

    Engelberg, N.; Shaw, C., III

    1984-01-01

    The design of a uniform command language to be used in a local area network of heterogeneous, autonomous nodes is considered. After examining the major characteristics of such a network, and after considering the profile of a scientist using the computers on the net as an investigative aid, a set of reasonable requirements for the command language are derived. Taking into account the possible inefficiencies in implementing a guest-layered network operating system and command language on a heterogeneous net, the authors examine command language naming, process/procedure invocation, parameter acquisition, help and response facilities, and other features found in single-node command languages, and conclude that some features may extend simply to the network case, others extend after some restrictions are imposed, and still others require modifications. In addition, it is noted that some requirements considered reasonable (user accounting reports, for example) demand further study before they can be efficiently implemented on a network of the sort described.

  5. Architectural analysis for wirelessly powered computing platforms

    NARCIS (Netherlands)

    Kapoor, A.; Pineda de Gyvez, J.

    2013-01-01

    We present a design framework for wirelessly powered generic computing platforms that takes into account various system parameters in response to a time-varying energy source. These parameters are the charging profile of the energy source, computing speed (f_clk), digital supply voltage (V_DD), energy

  6. Computational Modeling of Micrometastatic Breast Cancer Radiation Dose Response

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Daniel L.; Debeb, Bisrat G. [Department of Radiation Oncology, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Morgan Welch Inflammatory Breast Cancer Research Program and Clinic, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Thames, Howard D. [Department of Biostatistics, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Woodward, Wendy A., E-mail: wwoodward@mdanderson.org [Department of Radiation Oncology, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Morgan Welch Inflammatory Breast Cancer Research Program and Clinic, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States)

    2016-09-01

    Purpose: Prophylactic cranial irradiation (PCI) involves giving radiation to the entire brain with the goals of reducing the incidence of brain metastasis and improving overall survival. Experimentally, we have demonstrated that PCI prevents brain metastases in a breast cancer mouse model. We developed a computational model to expand on and aid in the interpretation of our experimental results. Methods and Materials: MATLAB was used to develop a computational model of brain metastasis and PCI in mice. Model input parameters were optimized such that the model output would match the experimental number of metastases per mouse from the unirradiated group. An independent in vivo limiting-dilution experiment was performed to validate the model. The effect of whole-brain irradiation at different measurement points after tumor cells were injected was evaluated in terms of the incidence, number of metastases, and tumor burden and was then compared with the corresponding experimental data. Results: In the optimized model, the correlation between the number of metastases per mouse and the experimental fits was >95%. Our attempt to validate the model with a limiting-dilution assay produced 99.9% correlation with respect to the incidence of metastases. The model accurately predicted the effect of whole-brain irradiation given 3 weeks after cell injection but substantially underestimated its effect when delivered 5 days after cell injection. The model further demonstrated that delaying whole-brain irradiation until the development of gross disease introduces a dose threshold that must be reached before a reduction in incidence can be realized. Conclusions: Our computational model of mouse brain metastasis and PCI correlated strongly with our experiments with unirradiated mice. The results further suggest that early treatment of subclinical disease is more effective than irradiating established disease.

  7. Effective Response to Attacks On Department of Defense Computer Networks

    National Research Council Canada - National Science Library

    Shaha, Patrick

    2001-01-01

    .... For the Commanders-in-Chief (CINCs), computer networking has proven especially useful in maintaining contact and sharing data with elements forward deployed as well as with host nation governments and agencies...

  8. Computation of restoration of ligand response in the random kinetics of a prostate cancer cell signaling pathway.

    Science.gov (United States)

    Dana, Saswati; Nakakuki, Takashi; Hatakeyama, Mariko; Kimura, Shuhei; Raha, Soumyendu

    2011-01-01

    Mutation and/or dysfunction of signaling proteins in the mitogen-activated protein kinase (MAPK) signal transduction pathway are frequently observed in various kinds of human cancer. Consistent with this fact, in the present study we experimentally observe that the epidermal growth factor (EGF) induced activation profile of MAP kinase signaling is not straightforwardly dose-dependent in PC3 prostate cancer cells. To find out which parameters and reactions in the pathway are involved in this departure from normal dose-dependency, a model-based pathway analysis is performed. The pathway is mathematically modeled with 28 rate equations, yielding as many ordinary differential equations (ODEs) with kinetic rate constants that have been reported to take random values in the existing literature. This led us to treat the ODE model of the pathway's kinetics as a random differential equation (RDE) system in which the parameters are random variables. We show that our RDE model captures the uncertainty in the kinetic rate constants as seen in the behavior of the experimental data and, more importantly, upon simulation exhibits the abnormal EGF dose-dependency of the activation profile of MAP kinase signaling in PC3 prostate cancer cells. The most likely set of values of the kinetic rate constants, obtained by fitting the RDE model to the experimental data, is then used in a direct-transcription-based dynamic optimization method for computing the changes needed in these kinetic rate constant values to restore the normal EGF dose response. This last computation identifies the parameters, i.e., the kinetic rate constants in the RDE model, that are most sensitive to the change in the EGF dose-response behavior of the PC3 prostate cancer cells. The reactions in which these most sensitive parameters participate emerge as candidate drug targets on the signaling pathway. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
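
    The paper treats the ODE system as a random differential equation by letting the rate constants be random variables. A minimal, hypothetical illustration of that idea is sketched below: a toy two-state kinetic model with log-normally distributed rate constants, propagated by Monte Carlo sampling through an ODE solver. This is not the authors' 28-equation MAPK model; every name and number is a placeholder.

        import numpy as np
        from scipy.integrate import solve_ivp

        rng = np.random.default_rng(1)

        def rhs(t, y, k1, k2):
            # Toy kinetics: S1 -> S2 at rate k1, S2 degraded at rate k2
            s1, s2 = y
            return [-k1 * s1, k1 * s1 - k2 * s2]

        # RDE idea: rate constants are random variables, so we sample them
        # and propagate each draw through the deterministic ODE solver.
        t_eval = np.linspace(0.0, 10.0, 101)
        trajectories = []
        for _ in range(500):
            k1 = rng.lognormal(mean=np.log(0.5), sigma=0.3)
            k2 = rng.lognormal(mean=np.log(0.2), sigma=0.3)
            sol = solve_ivp(rhs, (0.0, 10.0), [1.0, 0.0], args=(k1, k2), t_eval=t_eval)
            trajectories.append(sol.y[1])

        traj = np.array(trajectories)
        mean, sd = traj.mean(axis=0), traj.std(axis=0)
        print("S2 at t=10: mean %.3f, sd %.3f" % (mean[-1], sd[-1]))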

  9. Interactive visualization of Earth and Space Science computations

    Science.gov (United States)

    Hibbard, William L.; Paul, Brian E.; Santek, David A.; Dyer, Charles R.; Battaiola, Andre L.; Voidrot-Martinez, Marie-Francoise

    1994-01-01

    Computers have become essential tools for scientists simulating and observing nature. Simulations are formulated as mathematical models but are implemented as computer algorithms to simulate complex events. Observations are also analyzed and understood in terms of mathematical models, but the number of these observations usually dictates that we automate analyses with computer algorithms. In spite of their essential role, computers are also barriers to scientific understanding. Unlike hand calculations, automated computations are invisible and, because of the enormous numbers of individual operations in automated computations, the relation between an algorithm's input and output is often not intuitive. This problem is illustrated by the behavior of meteorologists responsible for forecasting weather. Even in this age of computers, many meteorologists manually plot weather observations on maps, then draw isolines of temperature, pressure, and other fields by hand (special pads of maps are printed for just this purpose). Similarly, radiologists use computers to collect medical data but are notoriously reluctant to apply image-processing algorithms to that data. To these scientists with life-and-death responsibilities, computer algorithms are black boxes that increase rather than reduce risk. The barrier between scientists and their computations can be bridged by techniques that make the internal workings of algorithms visible and that allow scientists to experiment with their computations. Here we describe two interactive systems developed at the University of Wisconsin-Madison Space Science and Engineering Center (SSEC) that provide these capabilities to Earth and space scientists.

  10. Quantifying fish swimming behavior in response to acute exposure of aqueous copper using computer assisted video and digital image analysis

    Science.gov (United States)

    Calfee, Robin D.; Puglis, Holly J.; Little, Edward E.; Brumbaugh, William G.; Mebane, Christopher A.

    2016-01-01

    Behavioral responses of aquatic organisms to environmental contaminants can be precursors of other effects such as survival, growth, or reproduction. However, these responses may be subtle, and measurement can be challenging. Using juvenile white sturgeon (Acipenser transmontanus) with copper exposures, this paper illustrates techniques used for quantifying behavioral responses using computer-assisted video and digital image analysis. In previous studies, severe impairments in swimming behavior were observed among early life stage white sturgeon during acute and chronic exposures to copper. Sturgeon behavior was rapidly impaired, to the extent that survival in the field would be jeopardized, as fish would be swept downstream or readily captured by predators. The objectives of this investigation were to illustrate protocols to quantify swimming activity during a series of acute copper exposures, to determine time to effect during early life stage development, and to understand the significance of these responses relative to the survival of these vulnerable early life stage fish. With mortality being on a time continuum, determining when copper first affects swimming ability helps us to understand the implications for population-level effects. The techniques used are readily adaptable to experimental designs with other organisms and stressors.

  11. Improving the Reliability of Student Scores from Speeded Assessments: An Illustration of Conditional Item Response Theory Using a Computer-Administered Measure of Vocabulary

    Science.gov (United States)

    Petscher, Yaacov; Mitchell, Alison M.; Foorman, Barbara R.

    2015-01-01

    A growing body of literature suggests that response latency, the amount of time it takes an individual to respond to an item, may be an important factor to consider when using assessment data to estimate the ability of an individual. Considering that tests of passage and list fluency are being adapted to a computer administration format, it is…

  12. Computer Security: Introduction to information and computer security (1/4)

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    Sebastian Lopienski is CERN's Deputy Computer Security Officer. He works on security strategy and policies; offers internal consultancy and audit services; develops and maintains security tools for vulnerability assessment and intrusion detection; provides training and awareness raising; and does incident investigation and response. During his work at CERN since 2001, Sebastian has had various assignments, including designing and developing software to manage and support services hosted in the CERN Computer Centre; providing Central CVS Service for software projects at CERN; and development of applications for accelerator controls in Java. He graduated from the University of Warsaw (MSc in Computer Science) in 2002, and earned an MBA degree at the Enterprise Administration Institute in Aix-en-Provence and Haute Ecole de Gestion in Geneva in 2010. His professional interests include software and network security, distributed systems, and Web and mobile technologies. With the prevalence of modern information te...

  13. Advances in photonic reservoir computing

    Directory of Open Access Journals (Sweden)

    Van der Sande Guy

    2017-05-01

    We review a novel paradigm that has emerged in analogue neuromorphic optical computing. The goal is to implement a reservoir computer in optics, where information is encoded in the intensity and phase of the optical field. Reservoir computing is a bio-inspired approach especially suited for processing time-dependent information. The reservoir’s complex and high-dimensional transient response to the input signal is capable of universal computation. The reservoir does not need to be trained, which makes it very well suited for optics. As such, much of the promise of photonic reservoirs lies in their minimal hardware requirements, a tremendous advantage over other hardware-intensive neural network models. We review the two main approaches to optical reservoir computing: networks implemented with multiple discrete optical nodes and the continuous system of a single nonlinear device coupled to delayed feedback.

  14. Advances in photonic reservoir computing

    Science.gov (United States)

    Van der Sande, Guy; Brunner, Daniel; Soriano, Miguel C.

    2017-05-01

    We review a novel paradigm that has emerged in analogue neuromorphic optical computing. The goal is to implement a reservoir computer in optics, where information is encoded in the intensity and phase of the optical field. Reservoir computing is a bio-inspired approach especially suited for processing time-dependent information. The reservoir's complex and high-dimensional transient response to the input signal is capable of universal computation. The reservoir does not need to be trained, which makes it very well suited for optics. As such, much of the promise of photonic reservoirs lies in their minimal hardware requirements, a tremendous advantage over other hardware-intensive neural network models. We review the two main approaches to optical reservoir computing: networks implemented with multiple discrete optical nodes and the continuous system of a single nonlinear device coupled to delayed feedback.

  15. Normalization as a canonical neural computation

    Science.gov (United States)

    Carandini, Matteo; Heeger, David J.

    2012-01-01

    There is increasing evidence that the brain relies on a set of canonical neural computations, repeating them across brain regions and modalities to apply similar operations to different problems. A promising candidate for such a computation is normalization, in which the responses of neurons are divided by a common factor that typically includes the summed activity of a pool of neurons. Normalization was developed to explain responses in the primary visual cortex and is now thought to operate throughout the visual system, and in many other sensory modalities and brain regions. Normalization may underlie operations such as the representation of odours, the modulatory effects of visual attention, the encoding of value and the integration of multisensory information. Its presence in such a diversity of neural systems in multiple species, from invertebrates to mammals, suggests that it serves as a canonical neural computation. PMID:22108672
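
    In its commonly cited form (the standard normalization model associated with Carandini and Heeger, restated here from the general literature rather than quoted from the article), the computation divides a neuron's driving input by the summed activity of a normalization pool:

        \[
        R_j \;=\; \gamma \, \frac{D_j^{\,n}}{\sigma^{\,n} + \sum_{k} D_k^{\,n}} ,
        \]

    where D_j is the driving input to neuron j, the sum runs over the normalization pool, sigma prevents division by zero and sets the gain at low input, and the exponent n (often near 2) controls the nonlinearity.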

  16. Electrodermal Response in Gaming

    Directory of Open Access Journals (Sweden)

    J. Christopher Westland

    2011-01-01

    Steady improvements in technologies that measure human emotional response offer new possibilities for making computer games more immersive. This paper reviews the history of designs in a particular branch of affective technologies that acquire electrodermal response readings from human subjects. Electrodermal response meters have gone through continual improvements to better measure these nervous responses, but still fall short of the capabilities of today's technology. Electrodermal response studies have traditionally been labor intensive: protocols and transcriptions of subject responses were recorded on separate documents, forcing constant shifts of attention between scripts, electrodermal measuring devices, and observations of subject responses. These problems can be resolved by collecting more information and integrating it in a computer interface, that is, by adding relevant sensors in addition to the basic electrodermal resistance reading to untangle (1) body resistance; (2) skin resistance; (3) grip movements; and (4) other factors affecting the neural processing for regulation of the body. A device that solves these problems is presented and discussed. It is argued that electrodermal response datastreams can be enriched through the use of added sensors and the digital acquisition and processing of information, which should further experimentation with and use of the technology.

  17. Factors influencing exemplary science teachers' levels of computer use

    Science.gov (United States)

    Hakverdi, Meral

    This study examines exemplary science teachers' use of technology in science instruction, factors influencing their level of computer use, their level of knowledge/skills in using specific computer applications for science instruction, their use of computer-related applications/tools during their instruction, and their students' use of computer applications/tools in or for their science class. After a review of the relevant literature, certain variables were selected for analysis. These variables included personal self-efficacy in teaching with computers, outcome expectancy, pupil-control ideology, level of computer use, age, gender, teaching experience, personal computer use, professional computer use, and science teachers' level of knowledge/skills in using specific computer applications for science instruction. The sample for this study includes middle and high school science teachers who received the Presidential Award for Excellence in Science Teaching (sponsored by the White House and the National Science Foundation) between the years 1997 and 2003, from all 50 states and U.S. territories. Award-winning science teachers were contacted about the survey via e-mail or letter with an enclosed return envelope. Of the 334 award-winning science teachers, usable responses were received from 92, yielding a response rate of 27.5%. Analysis of the survey responses indicated that exemplary science teachers have a variety of knowledge/skills in using computer-related applications/tools. The most commonly used computer applications/tools are information retrieval via the Internet, presentation tools, online communication, digital cameras, and data collection probes. Results of the study revealed that students' use of technology in their science classroom is highly correlated with the frequency of their science teachers' use of computer applications/tools. The results of the multiple regression analysis revealed that personal self-efficacy related to

  18. Elaboration of a computer code for the solution of a two-dimensional two-energy group diffusion problem using the matrix response method

    International Nuclear Information System (INIS)

    Alvarenga, M.A.B.

    1980-12-01

    An analytical procedure to solve the neutron diffusion equation in two dimensions and two energy groups was developed. The response matrix method was used, coupled with an expansion of the neutron flux in finite Fourier series. A computer code, 'MRF2D', was developed to implement this procedure for PWR reactor core calculations. Different core symmetry options are allowed by the code, which is also flexible enough to allow for improvements by means of algorithm optimization. The code performance was compared with that of a corner-mesh finite difference code named TVEDIM, using an International Atomic Energy Agency (IAEA) standard problem. The MRF2D code requires 12.7% less computer processing time to reach the same precision on the criticality eigenvalue. (Author) [pt
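
    For context, the steady-state two-group diffusion equations that such a code solves can be written as follows (standard textbook form with fast group 1 and thermal group 2; the notation is assumed here, not taken from the report):

        \[
        -\nabla\cdot\left(D_1 \nabla \phi_1\right) + \left(\Sigma_{a1} + \Sigma_{1\to 2}\right)\phi_1
          \;=\; \frac{1}{k_{\mathrm{eff}}}\left(\nu\Sigma_{f1}\,\phi_1 + \nu\Sigma_{f2}\,\phi_2\right),
        \]
        \[
        -\nabla\cdot\left(D_2 \nabla \phi_2\right) + \Sigma_{a2}\,\phi_2 \;=\; \Sigma_{1\to 2}\,\phi_1 ,
        \]

    where the phi_g are the group fluxes, D_g the diffusion coefficients, Sigma_a the absorption cross sections, Sigma_{1->2} the downscattering cross section, and k_eff the criticality eigenvalue. The response matrix method couples nodes through partial currents at their interfaces rather than solving globally for the flux, which is what makes a nodal Fourier-series expansion attractive.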

  19. Computer Simulation as a Tool for Assessing Decision-Making in Pandemic Influenza Response Training

    Directory of Open Access Journals (Sweden)

    James M Leaming

    2013-05-01

    Introduction: We sought to develop and test a computer-based, interactive simulation of a hypothetical pandemic influenza outbreak. Fidelity was enhanced with integrated video and branching decision trees built upon the 2007 federal planning assumptions. We conducted a before-and-after study of the simulation's effectiveness in assessing participants' beliefs regarding their own hospitals' mass casualty incident preparedness. Methods: Development: Using a Delphi process, we finalized a simulation that presents more than 50 key decisions to 6 role-players on networked laptops in a conference area. The simulation plays out an 8-week scenario, beginning with pre-incident decisions. Testing: Role-players and trainees (N=155) were facilitated in making decisions during the simulated pandemic. Because decision responses vary, the simulation plays out differently each time, and a casualty counter quantifies hypothetical losses. The facilitator reviews and critiques key factors for casualty control, including effective communications, working with external organizations, development of internal policies and procedures, maintaining supplies and services, technical infrastructure support, public relations, and training. Pre- and post-survey data from trainees were compared. Results: Post-simulation, trainees indicated a greater likelihood of needing to improve their organization in terms of communications, mass casualty incident planning, public information, and training. Participants also recognized which key factors required immediate attention at their own home facilities. Conclusion: The use of a computer simulation was effective in providing a facilitated environment for determining the perception of preparedness, evaluating general preparedness concepts, and introducing participants to critical decisions involved in handling a regional pandemic influenza surge. [West J Emerg Med. 2013;14(3):236–242.]

  20. Quantum Computing and Second Quantization

    International Nuclear Information System (INIS)

    Makaruk, Hanna Ewa

    2017-01-01

    Quantum computers are by their nature many-particle quantum systems. Both the many-particle arrangement and being quantum are necessary for the existence of the entangled states, which are responsible for the parallelism of quantum computers. Second quantization is a very important approximate method of describing such systems. This lecture presents the general idea of second quantization and briefly discusses some of its most important formulations.

  1. Distributed computing at the SSCL

    International Nuclear Information System (INIS)

    Cormell, L.; White, R.

    1993-05-01

    The rapid increase in the availability of high performance, cost-effective RISC/UNIX workstations has been both a blessing and a curse. The blessing of having extremely powerful computing engines available on the desk top is well-known to many users. The user has tremendous freedom, flexibility, and control of his environment. That freedom can, however, become the curse of distributed computing. The user must become a system manager to some extent, he must worry about backups, maintenance, upgrades, etc. Traditionally these activities have been the responsibility of a central computing group. The central computing group, however, may find that it can no longer provide all of the traditional services. With the plethora of workstations now found on so many desktops throughout the entire campus or lab, the central computing group may be swamped by support requests. This talk will address several of these computer support and management issues by discussing the approach taken at the Superconducting Super Collider Laboratory. In addition, a brief review of the future directions of commercial products for distributed computing and management will be given

  2. Distributed computing at the SSCL

    International Nuclear Information System (INIS)

    Cormell, L.R.; White, R.C.

    1994-01-01

    The rapid increase in the availability of high performance, cost-effective RISC/UNIX workstations has been both a blessing and a curse. The blessing of having extremely powerful computing engines available on the desk top is well-known to many users. The user has tremendous freedom, flexibility, and control of his environment. That freedom can, however, become the curse of distributed computing. The user must become a system manager to some extent, he must worry about backups, maintenance, upgrades, etc. Traditionally these activities have been the responsibility of a central computing group. The central computing group, however, may find that it can no longer provide all of the traditional services. With the plethora of workstations now found on so many desktops throughout the entire campus or lab, the central computing group may be swamped by support requests. This talk will address several of these computer support and management issues by discussing the approach taken at the Superconducting Super Collider Laboratory (SSCL). In addition, a brief review of the future directions of commercial products for distributed computing and management will be given

  3. Minimal models of multidimensional computations.

    Directory of Open Access Journals (Sweden)

    Jeffrey D Fitzgerald

    2011-03-01

    The multidimensional computations performed by many biological systems are often characterized with limited information about the correlations between inputs and outputs. Given this limitation, our approach is to construct the maximum noise entropy response function of the system, leading to a closed-form and minimally biased model consistent with a given set of constraints on the input/output moments; the result is equivalent to conditional random field models from machine learning. For systems with binary outputs, such as neurons encoding sensory stimuli, the maximum noise entropy models are logistic functions whose arguments depend on the constraints. A constraint on the average output turns the binary maximum noise entropy models into minimum mutual information models, allowing for the calculation of the information content of the constraints and an information theoretic characterization of the system's computations. We use this approach to analyze the nonlinear input/output functions in macaque retina and thalamus; although these systems have been previously shown to be responsive to two input dimensions, the functional form of the response function in this reduced space had not been unambiguously identified. A second order model based on the logistic function is found to be both necessary and sufficient to accurately describe the neural responses to naturalistic stimuli, accounting for an average of 93% of the mutual information with a small number of parameters. Thus, despite the fact that the stimulus is highly non-Gaussian, the vast majority of the information in the neural responses is related to first and second order correlations. Our results suggest a principled and unbiased way to model multidimensional computations and determine the statistics of the inputs that are being encoded in the outputs.
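
    The "second order model based on the logistic function" referred to above has, in the maximum-noise-entropy framework, the following general form (notation assumed here for illustration, not quoted from the paper):

        \[
        P(y = 1 \mid \mathbf{s}) \;=\;
        \frac{1}{1 + \exp\!\left[-\left(a + \mathbf{h}^{\top}\mathbf{s}
              + \mathbf{s}^{\top}\mathbf{J}\,\mathbf{s}\right)\right]},
        \]

    where s is the stimulus, y the binary response, and the bias a, linear filter h, and symmetric matrix J are chosen so that the model reproduces the measured first- and second-order input/output moments while maximizing the noise entropy.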

  4. Positron emission tomography-computed tomography standardized uptake values in clinical practice and assessing response to therapy.

    Science.gov (United States)

    Kinahan, Paul E; Fletcher, James W

    2010-12-01

    The use of standardized uptake values (SUVs) is now commonplace in clinical 2-deoxy-2-[(18)F]fluoro-D-glucose (FDG) positron emission tomography-computed tomography oncology imaging and has a specific role in assessing patient response to cancer therapy. Ideally, the use of SUVs removes variability introduced by differences in patient size and the amount of injected FDG. However, in practice there are several sources of bias and variance that are introduced in the measurement of FDG uptake in tumors and also in the conversion of the image count data to SUVs. In this article the overall imaging process is reviewed and estimates of the magnitude of errors, where known, are given. Recommendations are provided for best practices in improving SUV accuracy. Copyright © 2010 Elsevier Inc. All rights reserved.
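
    For reference, the body-weight-normalized SUV discussed here is conventionally defined as (standard definition, stated from the general literature rather than quoted from the article):

        \[
        \mathrm{SUV} \;=\; \frac{C_{\mathrm{img}}(t)}{A_{\mathrm{inj}} / W} ,
        \]

    where C_img(t) is the decay-corrected activity concentration measured in the image (e.g., kBq/mL), A_inj the injected activity (kBq), and W the patient's body weight (g); with these units and an assumed tissue density of 1 g/mL, the SUV is dimensionless.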

  5. Computer Games for the Math Achievement of Diverse Students

    Science.gov (United States)

    Kim, Sunha; Chang, Mido

    2010-01-01

    Although computer games as a way to improve students' learning have received attention by many educational researchers, no consensus has been reached on the effects of computer games on student achievement. Moreover, there is lack of empirical research on differential effects of computer games on diverse learners. In response, this study…

  6. Statistics of the Von Mises Stress Response For Structures Subjected To Random Excitations

    Directory of Open Access Journals (Sweden)

    Mu-Tsang Chen

    1998-01-01

    Finite element-based random vibration analysis is increasingly used in computer-aided engineering software for computing statistics (e.g., root-mean-square values) of structural responses such as displacements, stresses, and strains. However, these statistics can often be computed only for Cartesian responses. For the design of metal structures, a failure criterion based on an equivalent stress response, commonly known as the von Mises stress, is more appropriate and often used. This paper presents an approach for computing the statistics of the von Mises stress response for structures subjected to random excitations. Random vibration analysis is first performed to compute the covariance matrices of the Cartesian stress responses. Monte Carlo simulation is then used to perform scatter and failure analyses using the von Mises stress response.
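
    The von Mises stress named in this record is the standard equivalent stress (general definition, restated here for convenience):

        \[
        \sigma_{vM} \;=\; \sqrt{\tfrac{1}{2}\left[(\sigma_x-\sigma_y)^2 + (\sigma_y-\sigma_z)^2 + (\sigma_z-\sigma_x)^2\right] + 3\left(\tau_{xy}^2+\tau_{yz}^2+\tau_{zx}^2\right)} .
        \]

    Its nonlinearity in the Cartesian stress components is precisely why its statistics under random excitation cannot be read directly off the Cartesian covariance matrices, motivating the Monte Carlo step described above.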

  7. Carbon dioxide and climate impulse response functions for the computation of greenhouse gas metrics: a multi-model analysis

    Directory of Open Access Journals (Sweden)

    F. Joos

    2013-03-01

    The responses of carbon dioxide (CO2) and other climate variables to an emission pulse of CO2 into the atmosphere are often used to compute the Global Warming Potential (GWP) and Global Temperature change Potential (GTP), to characterize the response timescales of Earth System models, and to build reduced-form models. In this carbon cycle-climate model intercomparison project, which spans the full model hierarchy, we quantify responses to emission pulses of different magnitudes injected under different conditions. The CO2 response shows the known rapid decline in the first few decades followed by a millennium-scale tail. For a 100 Gt-C emission pulse added to a constant CO2 concentration of 389 ppm, 25 ± 9% is still found in the atmosphere after 1000 yr; the ocean has absorbed 59 ± 12% and the land the remainder (16 ± 14%). The response in global mean surface air temperature is an increase by 0.20 ± 0.12 °C within the first twenty years; thereafter and until year 1000, temperature decreases only slightly, whereas ocean heat content and sea level continue to rise. Our best estimate for the Absolute Global Warming Potential, given by the time-integrated response in CO2 at year 100 multiplied by its radiative efficiency, is 92.5 × 10⁻¹⁵ yr W m⁻² per kg-CO2. This value very likely (5 to 95% confidence) lies within the range of (68 to 117) × 10⁻¹⁵ yr W m⁻² per kg-CO2. Estimates for the time-integrated response in CO2 published in the IPCC First, Second, and Fourth Assessment and our multi-model best estimate all agree within 15% during the first 100 yr. The integrated CO2 response, normalized by the pulse size, is lower for pre-industrial conditions compared to present day, and lower for smaller pulses than larger pulses. In contrast, the response in temperature, sea level, and ocean heat content is less sensitive to these choices. Although choices in pulse size, background concentration, and model lead to uncertainties, the most important and
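
    The metrics mentioned here are built from time-integrated responses; in standard notation (assumed here, consistent with IPCC usage rather than quoted from the paper), the Absolute Global Warming Potential of CO2 and the GWP of a gas x over a time horizon H are:

        \[
        \mathrm{AGWP}_{\mathrm{CO_2}}(H) \;=\; \int_0^{H} \mathrm{RE}_{\mathrm{CO_2}}\,\mathrm{IRF}_{\mathrm{CO_2}}(t)\, dt ,
        \qquad
        \mathrm{GWP}_x(H) \;=\; \frac{\mathrm{AGWP}_x(H)}{\mathrm{AGWP}_{\mathrm{CO_2}}(H)} ,
        \]

    where RE is the radiative efficiency and IRF_CO2(t) is the impulse response function, i.e., the fraction of the emission pulse remaining in the atmosphere at time t, which is exactly what the multi-model ensemble in this record quantifies.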

  8. Evaluating tablet computers as a survey tool in rural communities.

    Science.gov (United States)

    Newell, Steve M; Logan, Henrietta L; Guo, Yi; Marks, John G; Shepperd, James A

    2015-01-01

    Although tablet computers offer advantages in data collection over traditional paper-and-pencil methods, little research has examined whether the 2 formats yield similar responses, especially with underserved populations. We compared the 2 survey formats and tested whether participants' responses to common health questionnaires or perceptions of usability differed by survey format. We also tested whether we could replicate established paper-and-pencil findings via tablet computer. We recruited a sample of low-income community members living in the rural southern United States. Participants were 170 residents (black = 49%; white = 36%; other races and missing data = 15%) drawn from 2 counties meeting Florida's state statutory definition of rural with 100 persons or fewer per square mile. We randomly assigned participants to complete scales (Center for Epidemiologic Studies Depression Inventory and Regulatory Focus Questionnaire) along with survey format usability ratings via paper-and-pencil or tablet computer. All participants rated a series of previously validated posters using a tablet computer. Finally, participants completed comparisons of the survey formats and reported survey format preferences. Participants preferred using the tablet computer and showed no significant differences between formats in mean responses, scale reliabilities, or in participants' usability ratings. Overall, participants reported similar scale responses and usability ratings between formats. However, participants reported both preferring and enjoying responding via tablet computer more. Collectively, these findings are among the first data to show that, for an underrepresented rural sample, tablet computers represent a suitable substitute for paper-and-pencil methodology in survey research. Published 2014. This article is a U.S. Government work and is in the public domain in the USA.

  9. Reliability of a computer and Internet survey (Computer User Profile) used by adults with and without traumatic brain injury (TBI).

    Science.gov (United States)

    Kilov, Andrea M; Togher, Leanne; Power, Emma

    2015-01-01

    To determine the test-retest reliability of the 'Computer User Profile' (CUP) in people with and without TBI. The CUP was administered on two occasions to people with and without TBI. The CUP investigated the nature and frequency of participants' computer and Internet use. Intra-class correlation coefficients and kappa coefficients were computed to measure the reliability of individual CUP items. Descriptive statistics were used to summarize the content of responses. Sixteen adults with TBI and 40 adults without TBI were included in the study. All participants were reliable in reporting demographic information, frequency of social communication and leisure activities, and computer/Internet habits and usage. Adults with TBI were reliable in 77% of their responses to survey items. Adults without TBI were reliable in 88% of their responses to survey items. The CUP was practical and valuable in capturing information about the social, leisure, communication, and computer/Internet habits of people with and without TBI. Adults without TBI scored more items with satisfactory reliability overall in their surveys. Future studies may include larger samples and could also include an exploration of how people with/without TBI use other digital communication technologies. This may provide further information on determining technology readiness for people with TBI in therapy programmes.

  10. Reach and get capability in a computing environment

    Science.gov (United States)

    Bouchard, Ann M [Albuquerque, NM; Osbourn, Gordon C [Albuquerque, NM

    2012-06-05

    A reach and get technique includes invoking a reach command from a reach location within a computing environment. A user can then navigate to an object within the computing environment and invoke a get command on the object. In response to invoking the get command, the computing environment is automatically navigated back to the reach location and the object copied into the reach location.
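
    Purely as a toy illustration of the interaction pattern this record describes (not the patented implementation), the following Python sketch models the reach-and-get sequence: a reach command records the current location, a get command copies the target object and navigates back automatically. All class and method names are invented for illustration.

        class Environment:
            """Toy computing environment with a current location and a reach slot."""

            def __init__(self):
                self.location = "desktop"
                self.reach_location = None
                self.objects = {"reports": ["q3.pdf"], "desktop": []}

            def reach(self):
                # Invoke 'reach' at the spot where the object should end up.
                self.reach_location = self.location

            def navigate(self, where):
                self.location = where

            def get(self, obj):
                # Copy the object into the reach location, then automatically
                # navigate back to where 'reach' was invoked.
                self.objects[self.reach_location].append(obj)
                self.location = self.reach_location

        env = Environment()
        env.reach()                 # mark the destination
        env.navigate("reports")     # wander off to find the object
        env.get("q3.pdf")           # copy it; we are returned to 'desktop'
        print(env.location, env.objects["desktop"])   # desktop ['q3.pdf']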

  11. A Revolution in Information Technology - Cloud Computing

    OpenAIRE

    Divya BHATT

    2012-01-01

    What is the Internet? It is a collection of “interconnected networks” represented as a Cloud in network diagrams, and Cloud Computing is a metaphor for certain parts of the Internet. IT enterprises and individuals are searching for a way to reduce the cost of computation, storage, and communication. Cloud Computing is an Internet-based technology providing “On-Demand” solutions for addressing these scenarios, which should be flexible enough for adaptation and responsive to requirements. The hug...

  12. Academic Training Lecture Regular Programme: Computer Security - Introduction to information and computer security (1/4)

    CERN Multimedia

    2012-01-01

    Computer Security: Introduction to information and computer security (1/4), by Sebastian Lopienski (CERN).   Monday, 21 May, 2012 from 11:00 to 12:00 (Europe/Zurich) at CERN ( 31-3-004 - IT Auditorium ) Sebastian Lopienski is CERN's Deputy Computer Security Officer. He works on security strategy and policies; offers internal consultancy and audit services; develops and maintains security tools for vulnerability assessment and intrusion detection; provides training and awareness raising; and does incident investigation and response. During his work at CERN since 2001, Sebastian has had various assignments, including designing and developing software to manage and support services hosted in the CERN Computer Centre; providing Central CVS Service for software projects at CERN; and development of applications for accelerator controls in Java. He graduated from the University of Warsaw (MSc in Computer Science) in 2002, and earned an MBA degree at the Enterprise Administration Institute in Ai...

  13. Implantation of FRAPCON-2 code in HB computer

    International Nuclear Information System (INIS)

    Silva, C.F. da.

    1987-05-01

    The modifications carried out to implement the FRAPCON-2 computer code on the HB DPS-T7 computer are presented. The FRAPCON-2 code calculates the steady-state thermo-mechanical response of fuel rods of PWR-type reactors over long periods of burnup. (M.C.K.)

  14. A novel computational approach of image analysis to quantify behavioural response to heat shock in Chironomus Ramosus larvae (Diptera: Chironomidae

    Directory of Open Access Journals (Sweden)

    Bimalendu B. Nath

    2015-07-01

    All living cells respond to temperature stress through coordinated cellular, biochemical, and molecular events known as the “heat shock response”, whose genetic basis has been found to be evolutionarily conserved. Despite marked advances in stress research, this ubiquitous heat shock response has never been analysed quantitatively at the whole-organism level using behavioural correlates. We have investigated the behavioural response to heat shock in a tropical midge, Chironomus ramosus Chaudhuri, Das and Sublette. The filter-feeding aquatic Chironomus larvae exhibit a characteristic undulatory movement, and this innate pattern of movement was taken as the behavioural parameter in the present study. We have developed a novel computer-aided image analysis tool, “Chiro”, for the quantification of behavioural responses to heat shock. Behavioural responses were quantified by recording the number of undulations performed by each larva per unit time at a given ambient temperature. Quantitative analysis showed that undulation frequency, an innate behavioural pattern, is modulated as a function of ambient temperature. Midge larvae are known bioindicators of aquatic environments; the “Chiro” technique can therefore be tested with other potential biomonitoring organisms from natural aquatic habitats, using undulatory motion as the behavioural parameter.
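
    The record does not describe the internals of “Chiro”; as a hypothetical sketch of the measurement it reports (undulations per unit time derived from video), one could count oscillation peaks in a tracked body-point displacement signal, for example:

        import numpy as np
        from scipy.signal import find_peaks

        # Assumed input: y(t) is the lateral displacement of a tracked point on
        # the larva, sampled from video at fps frames per second. Synthetic
        # stand-in data is used here in place of real tracking output.
        fps, duration = 25.0, 60.0
        t = np.arange(0.0, duration, 1.0 / fps)
        y = np.sin(2 * np.pi * 0.8 * t) \
            + 0.1 * np.random.default_rng(2).normal(size=t.size)

        # One undulation per oscillation peak; enforce a minimum peak spacing
        # so that tracking noise is not counted as extra undulations.
        peaks, _ = find_peaks(y, distance=int(0.3 * fps))
        undulations_per_minute = len(peaks) / duration * 60.0
        print(f"undulation frequency: {undulations_per_minute:.1f} per minute")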

  15. Prediction of lung density changes after radiotherapy by cone beam computed tomography response markers and pre-treatment factors for non-small cell lung cancer patients.

    Science.gov (United States)

    Bernchou, Uffe; Hansen, Olfred; Schytte, Tine; Bertelsen, Anders; Hope, Andrew; Moseley, Douglas; Brink, Carsten

    2015-10-01

    This study investigates the ability of pre-treatment factors and response markers extracted from standard cone-beam computed tomography (CBCT) images to predict the lung density changes induced by radiotherapy in non-small cell lung cancer (NSCLC) patients. Density changes in follow-up computed tomography scans were evaluated for 135 NSCLC patients treated with radiotherapy. Early response markers were obtained by analysing changes in lung density in CBCT images acquired during the treatment course. The ability of pre-treatment factors and CBCT markers to predict lung density changes induced by radiotherapy was investigated. Age and CBCT markers extracted at the 10th, 20th, and 30th treatment fractions significantly predicted lung density changes in a multivariable analysis, and a set of response models based on these parameters was established. The correlation coefficient for the models was 0.35, 0.35, and 0.39 when based on the markers obtained at the 10th, 20th, and 30th fraction, respectively. The study indicates that younger patients without lung tissue reactions early in their treatment course may have minimal radiation-induced lung density increase at follow-up. Further investigations are needed to examine the ability of the models to identify patients at low risk of symptomatic toxicity. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  16. Language evolution and human-computer interaction

    Science.gov (United States)

    Grudin, Jonathan; Norman, Donald A.

    1991-01-01

    Many of the issues that confront designers of interactive computer systems also appear in natural language evolution. Natural languages and human-computer interfaces share as their primary mission the support of extended 'dialogues' between responsive entities. Because in each case one participant is a human being, some of the pressures operating on natural languages, causing them to evolve in order to better support such dialogue, also operate on human-computer 'languages' or interfaces. This does not necessarily push interfaces in the direction of natural language - since one entity in this dialogue is not a human, this is not to be expected. Nonetheless, by discerning where the pressures that guide natural language evolution also appear in human-computer interaction, we can contribute to the design of computer systems and obtain a new perspective on natural languages.

  17. Computer security engineering management

    International Nuclear Information System (INIS)

    McDonald, G.W.

    1988-01-01

    For best results, computer security should be engineered into a system during its development rather than appended later on. This paper addresses the implementation of computer security in eight stages through the life cycle of the system, starting with the definition of security policies and ending with continuing support for the security aspects of the system throughout its operational life cycle. Security policy is addressed relative to successive decomposition of security objectives (through policy, standard, and control stages) into system security requirements. This is followed by a discussion of computer security organization and responsibilities. Next the paper directs itself to analysis and management of security-related risks, followed by discussion of design and development of the system itself. Discussion of security test and evaluation preparations, and approval to operate (certification and accreditation), is followed by discussion of computer security training for users and, finally, by coverage of life cycle support for the security of the system.

  18. ATLAS Distributed Computing: Its Central Services core

    CERN Document Server

    Lee, Christopher Jon; The ATLAS collaboration

    2018-01-01

    The ATLAS Distributed Computing (ADC) Project is responsible for the off-line processing of data produced by the ATLAS experiment at the Large Hadron Collider (LHC) at CERN. It facilitates data and workload management for ATLAS computing on the Worldwide LHC Computing Grid (WLCG). ADC Central Services operations (CSops) is a vital part of ADC, responsible for the deployment and configuration of services needed by ATLAS computing and the operation of those services on CERN IT infrastructure, providing knowledge of CERN IT services to ATLAS service managers and developers, and supporting them in case of issues. Currently this entails the management of thirty-seven different OpenStack projects, with more than five thousand cores allocated for these virtual machines, as well as overseeing the distribution of twenty-nine petabytes of storage space in EOS for ATLAS. As the LHC begins to get ready for the next long shut-down, which will bring in many new upgrades to allow for more data to be captured by the on-line syste...

  19. Operational mesoscale atmospheric dispersion prediction using high performance parallel computing cluster for emergency response

    International Nuclear Information System (INIS)

    Srinivas, C.V.; Venkatesan, R.; Muralidharan, N.V.; Das, Someshwar; Dass, Hari; Eswara Kumar, P.

    2005-08-01

    An operational atmospheric dispersion prediction system has been implemented on a cluster supercomputer for 'Online Emergency Response' at the Kalpakkam nuclear site. The numerical system constitutes a parallel version of a nested-grid mesoscale meteorological model, MM5, coupled to a random-walk particle dispersion model, FLEXPART. The system provides a 48-hour forecast of the local weather and of radioactive plume dispersion due to hypothetical airborne releases within a range of 100 km around the site. The parallel code was implemented on different cluster configurations, including distributed and shared memory systems. Results of MM5 run-time performance for a 1-day prediction are reported for all the machines available for testing. A 5-fold reduction in runtime is achieved using 9 dual Xeon nodes (18 physical/36 logical processors) compared to a single-node sequential run. Based on these run-time results, a 9-node dual Xeon cluster computing facility was commissioned at IGCAR for model operation. The run time of a triple-nested-domain MM5 is about 4 h for a 24 h forecast. The system has been operated continuously for a few months and results were posted on the IMSc home page. Initial and periodic boundary condition data for MM5 are provided by NCMRWF, New Delhi; an alternative source is NCEP, USA. These two sources provide the input data to the operational models at different spatial and temporal resolutions, using different assimilation methods. A comparative study of the forecast results from these two data sources for present operational use is presented. A slight improvement is noticed in rainfall, winds, geopotential heights and the vertical atmospheric structure when using NCEP data, probably because of its higher spatial and temporal resolution. (author)
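
    The reported 5-fold runtime reduction on 18 physical processors translates directly into a parallel efficiency figure; the short calculation below uses the standard definitions of speedup and efficiency, not formulas taken from the report.

```python
# Parallel speedup and efficiency from the reported MM5 timings:
# a 5-fold runtime reduction on 9 dual-Xeon nodes (18 processors)
# relative to a single-node sequential run.
def speedup(t_serial, t_parallel):
    return t_serial / t_parallel

def efficiency(s, n_procs):
    return s / n_procs

s = 5.0   # reported speedup
n = 18    # physical processors
print(f"speedup = {s:.1f}, efficiency = {efficiency(s, n):.0%}")  # about 28%
```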

  20. Computer Use by School Teachers in Teaching-Learning Process

    Science.gov (United States)

    Bhalla, Jyoti

    2013-01-01

    Developing countries have a responsibility not merely to provide computers for schools, but also to foster a habit of infusing a variety of ways in which computers can be integrated into teaching-learning amongst the end users of these tools. Earlier research lacked a systematic study of the manner and the extent of computer use by teachers. The…

  1. Brain-computer interfaces

    DEFF Research Database (Denmark)

    Treder, Matthias S.; Miklody, Daniel; Blankertz, Benjamin

    ... of perceptual and cognitive biases. Furthermore, subjects can only report on stimuli if they have a clear percept of them. On the other hand, the electroencephalogram (EEG), the electrical brain activity measured with electrodes on the scalp, is a more direct measure. It allows us to tap into the ongoing neural ... auditory processing stream. In particular, it can tap brain processes that are pre-conscious or even unconscious, such as the earliest brain responses to sound stimuli in primary auditory cortex. In a series of studies, we used a machine learning approach to show that the EEG can accurately reflect ... quality measure'. We were able to show that for stimuli close to the perceptual threshold, there was sometimes a discrepancy between overt responses and brain responses, shedding light on subjects using different response criteria (e.g., more liberal or more conservative). To conclude, brain-computer...

  2. Optically Controlled Quantum Dot Spins for Scaleable Quantum Computing

    National Research Council Canada - National Science Library

    Steel, Duncan G

    2006-01-01

    .... Sham is responsible for theoretical support and concept development. The group at Michigan, along with this QuaCGR student, is responsible for key experimental demonstrations for quantum computing...

  3. Designing a responsive web site

    OpenAIRE

    Fejzić , Diana

    2016-01-01

    Due to the increasing prevalence of smartphones and tablet computers, responsive design has become a crucial part of web design. For a user, responsive web design enables the best user experience, regardless of whether the user is visiting the site via a mobile phone, a tablet or a computer. This thesis covers the process of planning, designing and developing a responsive web site, for a fictitious company named “Creative Design d.o.o.”, with the help of web technologies. In the initial part of the thesis, w...

  4. Computer Modeling of Thoracic Response to Blast

    Science.gov (United States)

    1988-01-01

    ... be solved at reasonable cost. In order to determine if the gas content of the sheep ... the spatial and temporal distribution of the load can be reasonably predicted by the ... to the approximate location where intrathoracic pressure meas... Intrathoracic pressure responses for subjects wearing ballistic ... were compared with data. Two extreme cases had the rumen filled with ... is that sheep have large, multiple stomachs that have a considerable air content. It

  5. Simulation of salt behavior using in situ response

    International Nuclear Information System (INIS)

    Li, W.T.

    1986-01-01

    The time-dependent nonlinear structural behavior in a salt formation around openings can be obtained either by performing computational analysis or by measuring in situ responses. However, analysis using laboratory test data may deviate from the actual in situ conditions, and geomechanical instruments can provide information only up to the time when the measurements were taken. A method has previously been suggested for simulating salt behavior by utilizing the steady-state portion of the in situ response history. Governing equations for computational analysis were normalized to the creep constant, the equations were solved, and the analytical response history was then computed in terms of normalized time. By synchronizing the response history obtained from the analysis with the one measured at the site, the creep constant was determined, and the structural response of the salt was then computed. This paper presents an improved method for simulating salt behavior. In this method, the governing equations are normalized to the creep function, which represents both the transient and the steady-state creep behavior. Both the transient and the steady-state portions of the in situ response history are used in determining the creep function. Finally, a nonlinear mapping process relating the normalized and real time domains determines the behavior of the salt.
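
    As a rough illustration of the steady-state version of this idea, the sketch below estimates a creep constant by matching the measured steady-state closure rate against the slope of a normalized analytical response. Variable names and data are hypothetical; the paper's full method also uses the transient portion and a nonlinear time mapping.

```python
import numpy as np

def creep_constant(t_meas, u_meas, slope_norm, steady_start):
    """Estimate the creep constant C by matching the measured
    steady-state closure rate to the slope of the normalized
    analytical response (displacement per unit normalized time).

    t_meas, u_meas : measured time (days) and closure (mm)
    slope_norm     : steady-state slope of the normalized analysis
    steady_start   : time after which the response is steady-state
    """
    mask = t_meas >= steady_start
    slope_meas = np.polyfit(t_meas[mask], u_meas[mask], 1)[0]
    # Normalized time is tau = C * t, so slope_meas = C * slope_norm.
    return slope_meas / slope_norm

# Hypothetical record: transient decays by ~100 days, then 0.05 mm/day.
t = np.linspace(0, 200, 101)
u = 1.0 * (1 - np.exp(-t / 20)) + 0.05 * t
print(creep_constant(t, u, slope_norm=2.0, steady_start=100))  # ~0.025
```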

  6. Responsibility Towards The Customers Of Subscription-Based Software Solutions In The Context Of Using The Cloud Computing Technology

    Directory of Open Access Journals (Sweden)

    Bogdan Ștefan Ionescu

    2003-12-01

    Full Text Available The continuous transformation of contemporary society and of its IT environment has led to the emergence of cloud computing technology, which provides access to infrastructure as well as to subscription-based software services. In the context of a growing number of providers of cloud software services, the paper aims to identify the perception of some current or potential users of cloud solutions, selected from among students enrolled in the accounting (professional or research) master programs organized by the Bucharest University of Economic Studies, in terms of their expectations for cloud services, as well as the extent to which SaaS providers are responsible for the provided services.

  7. Computer access security code system

    Science.gov (United States)

    Collins, Earl R., Jr. (Inventor)

    1990-01-01

    A security code system for controlling access to computer and computer-controlled entry situations comprises a plurality of subsets of alpha-numeric characters disposed in random order in matrices of at least two dimensions forming theoretical rectangles, cubes, etc., such that when access is desired, at least one pair of previously unused character subsets not found in the same row or column of the matrix is chosen at random and transmitted by the computer. The proper response to gain access is transmittal of subsets which complete the rectangle, and/or a parallelepiped whose opposite corners were defined by first groups of code. Once used, subsets are not used again to absolutely defeat unauthorized access by eavesdropping, and the like.
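
    A minimal sketch of the scheme as described: the computer issues two previously unused subsets that do not share a row or column, and the correct response is the pair of subsets at the opposite corners of the implied rectangle. The matrix size and alphabet below are illustrative, not taken from the patent.

```python
import random
import string

N = 5  # matrix dimension (illustrative)
# Random matrix of two-character subsets, assumed printed once on a card.
chars = string.ascii_uppercase + string.digits
matrix = [[''.join(random.sample(chars, 2)) for _ in range(N)]
          for _ in range(N)]

def challenge(used):
    """Pick two unused cells that share neither a row nor a column."""
    cells = [(r, c) for r in range(N) for c in range(N)]
    while True:
        (r1, c1), (r2, c2) = random.sample(cells, 2)
        if r1 != r2 and c1 != c2 and \
           (r1, c1) not in used and (r2, c2) not in used:
            return (r1, c1), (r2, c2)

def correct_response(cell_a, cell_b):
    """The two subsets at the opposite corners of the rectangle."""
    (r1, c1), (r2, c2) = cell_a, cell_b
    return {matrix[r1][c2], matrix[r2][c1]}

used = set()
a, b = challenge(used)
used.update([a, b])  # never reuse subsets, defeating eavesdroppers
print("challenge:", matrix[a[0]][a[1]], matrix[b[0]][b[1]])
print("expected response:", correct_response(a, b))
```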

  8. Identifying a Computer Forensics Expert: A Study to Measure the Characteristics of Forensic Computer Examiners

    Directory of Open Access Journals (Sweden)

    Gregory H. Carlton

    2010-03-01

    Full Text Available The usage of digital evidence from electronic devices has been rapidly expanding within litigation, and along with this increased usage, the reliance upon forensic computer examiners to acquire, analyze, and report upon this evidence is also rapidly growing. This growing demand for forensic computer examiners raises questions concerning the selection of individuals qualified to perform this work. While courts have mechanisms for qualifying witnesses that provide testimony based on scientific data, such as digital data, the qualifying criteria cover a wide variety of characteristics including education, experience, training, professional certifications, or other special skills. In this study, we compare task performance responses from forensic computer examiners with an expert review panel and measure the relationship between the characteristics of the examiners and the quality of their responses. The results of this analysis provide insight into identifying forensic computer examiners that provide high-quality responses.

  9. Vehicle - Bridge interaction, comparison of two computing models

    Science.gov (United States)

    Melcer, Jozef; Kuchárová, Daniela

    2017-07-01

    The paper presents the calculation of the bridge response to a vehicle moving along the bridge at various velocities. A multi-body plane computing model of the vehicle is adopted. The bridge computing models are created in two variants: one represents the bridge as a Bernoulli-Euler beam with continuously distributed mass, and the second represents the bridge as a lumped-mass model with one degree of freedom. The mid-span bridge dynamic deflections are calculated for both computing models, and the results are mutually compared and quantitatively evaluated.
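
    For the Bernoulli-Euler variant, a common formulation (assumed here, not quoted from the paper) expands the deflection in sine modes and integrates the modal equations under a moving force. The sketch below computes the mid-span deflection this way; the bridge and vehicle parameters are illustrative, and the vehicle is reduced to a constant force rather than the paper's multi-body model.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative parameters (SI units), not the paper's.
L, EI, m = 30.0, 2.0e10, 1.0e4  # span, bending stiffness, mass per length
P, v = 1.0e5, 20.0              # moving force and its speed
n_modes = 3

# Natural frequencies of a simply supported Bernoulli-Euler beam.
omega = np.array([(k * np.pi / L) ** 2 * np.sqrt(EI / m)
                  for k in range(1, n_modes + 1)])

def rhs(t, y):
    # Modal equations: q_k'' + w_k^2 q_k = (2P/(mL)) sin(k*pi*v*t/L)
    q, qd = y[:n_modes], y[n_modes:]
    f = np.array([2 * P / (m * L) * np.sin(k * np.pi * v * t / L)
                  for k in range(1, n_modes + 1)])
    return np.concatenate([qd, f - omega ** 2 * q])

t_end = L / v  # time for the force to cross the span
sol = solve_ivp(rhs, (0.0, t_end), np.zeros(2 * n_modes), max_step=1e-3)

# Mid-span deflection is sum_k q_k(t) * sin(k*pi/2).
shape_mid = np.sin(np.arange(1, n_modes + 1) * np.pi / 2)
w_mid = shape_mid @ sol.y[:n_modes]
print("max mid-span deflection [m]:", float(w_mid.max()))
```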

  10. Brushless DC motor control system responsive to control signals generated by a computer or the like

    Science.gov (United States)

    Packard, Douglas T. (Inventor); Schmitt, Donald E. (Inventor)

    1987-01-01

    A control system for a brushless DC motor responsive to digital control signals is disclosed. The motor includes a multiphase wound stator and a permanent magnet rotor. The rotor is arranged so that each phase winding, when energized from a DC source, will drive the rotor through a predetermined angular position or step. A commutation signal generator responsive to the shaft position provides a commutation signal for each winding. A programmable control signal generator such as a computer or microprocessor produces individual digital control signals for each phase winding. The control signals and commutation signals associated with each winding are applied to an AND gate for that phase winding. Each gate controls a switch connected in series with the associated phase winding and the DC source so that each phase winding is energized only when the commutation signal and the control signal associated with that phase winding are present. The motor shaft may be advanced one step at a time to a desired position by applying a predetermined number of control signals in the proper sequence to the AND gates and the torque generated by the motor may be regulated by applying a separate control signal to each AND gate which is pulse width modulated to control the total time that each switch connects its associated winding to the DC source during each commutation period.
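
    A small sketch of the gating logic described above, with a single shared pulse-width-modulated control signal standing in for the patent's per-phase control signals; the sector width, duty cycle, and timing are illustrative.

```python
# A phase winding is energized only when both its commutation signal
# (derived from shaft position) and the control signal are present,
# mirroring the per-phase AND gates described in the abstract.
N_PHASES = 3

def commutation(shaft_angle_deg):
    """One active phase per 120-degree sector of shaft rotation."""
    sector = int(shaft_angle_deg % 360 // (360 / N_PHASES))
    return [phase == sector for phase in range(N_PHASES)]

def control(t, duty=0.6, pwm_period=1e-3):
    """PWM control signal: high for `duty` of each period, which
    regulates torque by limiting winding on-time."""
    return (t % pwm_period) < duty * pwm_period

def winding_energized(phase, shaft_angle_deg, t):
    return commutation(shaft_angle_deg)[phase] and control(t)

for step in range(4):
    t = step * 4e-4
    angle = 90.0  # sector 0, so only phase 0 is commutated
    print(t, [winding_energized(p, angle, t) for p in range(N_PHASES)])
```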

  11. High-performance computing for structural mechanics and earthquake/tsunami engineering

    CERN Document Server

    Hori, Muneo; Ohsaki, Makoto

    2016-01-01

    Huge earthquakes and tsunamis have caused serious damage to important structures such as civil infrastructure elements, buildings and power plants around the globe. To quantitatively evaluate such damage processes and to design effective prevention and mitigation measures, the latest high-performance computational mechanics technologies, which include terascale to petascale computers, can offer powerful tools. The phenomena covered in this book include seismic wave propagation in the crust and soil, seismic response of infrastructure elements such as tunnels considering soil-structure interactions, seismic response of high-rise buildings, seismic response of nuclear power plants, tsunami run-up over coastal towns and tsunami inundation considering fluid-structure interactions. The book provides all necessary information for addressing these phenomena, ranging from the fundamentals of high-performance computing for finite element methods, key algorithms of accurate dynamic structural analysis, fluid flows ...

  12. Future computing needs for Fermilab

    International Nuclear Information System (INIS)

    1983-12-01

    The following recommendations are made: (1) Significant additional computing capacity and capability beyond the present procurement should be provided by 1986. A working group with representation from the principal computer user community should be formed to begin immediately to develop the technical specifications. High priority should be assigned to providing a large user memory, software portability and a productive computing environment. (2) A networked system of VAX-equivalent super-mini computers should be established with at least one such computer dedicated to each reasonably large experiment for both online and offline analysis. The laboratory staff responsible for mini computers should be augmented in order to handle the additional work of establishing, maintaining and coordinating this system. (3) The laboratory should move decisively to a more fully interactive environment. (4) A plan for networking both inside and outside the laboratory should be developed over the next year. (5) The laboratory resources devoted to computing, including manpower, should be increased over the next two to five years. A reasonable increase would be 50% over the next two years increasing thereafter to a level of about twice the present one. (6) A standing computer coordinating group, with membership of experts from all the principal computer user constituents of the laboratory, should be appointed by and report to the director. This group should meet on a regularly scheduled basis and be charged with continually reviewing all aspects of the laboratory computing environment

  13. Computing shifts to monitor ATLAS distributed computing infrastructure and operations

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00068610; The ATLAS collaboration; Barberis, Dario; Crepe-Renaudin, Sabine Chrystel; De, Kaushik; Fassi, Farida; Stradling, Alden; Svatos, Michal; Vartapetian, Armen; Wolters, Helmut

    2017-01-01

    The ATLAS Distributed Computing (ADC) group established a new Computing Run Coordinator (CRC) shift at the start of LHC Run 2 in 2015. The main goal was to rely on a person with a good overview of the ADC activities to ease the ADC experts’ workload. The CRC shifter keeps track of ADC tasks related to their fields of expertise and responsibility. At the same time, the shifter maintains a global view of the day-to-day operations of the ADC system. During Run 1, this task was accomplished by a person of the expert team called the ADC Manager on Duty (AMOD), a position that was removed during the shutdown period due to the reduced number and availability of ADC experts foreseen for Run 2. The CRC position was proposed to cover some of the AMODs former functions, while allowing more people involved in computing to participate. In this way, CRC shifters help with the training of future ADC experts. The CRC shifters coordinate daily ADC shift operations, including tracking open issues, reporting, and representing...

  14. Computing shifts to monitor ATLAS distributed computing infrastructure and operations

    CERN Document Server

    Adam Bourdarios, Claire; The ATLAS collaboration

    2016-01-01

    The ATLAS Distributed Computing (ADC) group established a new Computing Run Coordinator (CRC) shift at the start of LHC Run2 in 2015. The main goal was to rely on a person with a good overview of the ADC activities to ease the ADC experts' workload. The CRC shifter keeps track of ADC tasks related to their fields of expertise and responsibility. At the same time, the shifter maintains a global view of the day-to-day operations of the ADC system. During Run1, this task was accomplished by the ADC Manager on Duty (AMOD), a position that was removed during the shutdown period due to the reduced number and availability of ADC experts foreseen for Run2. The CRC position was proposed to cover some of the AMOD’s former functions, while allowing more people involved in computing to participate. In this way, CRC shifters help train future ADC experts. The CRC shifters coordinate daily ADC shift operations, including tracking open issues, reporting, and representing ADC in relevant meetings. The CRC also facilitates ...

  15. Advanced Computational Modeling Approaches for Shock Response Prediction

    Science.gov (United States)

    Derkevorkian, Armen; Kolaini, Ali R.; Peterson, Lee

    2015-01-01

    Motivation: (1) The activation of pyroshock devices such as explosives, separation nuts, pin-pullers, etc. produces high-frequency transient structural response, typically from a few tens of Hz to several hundreds of kHz. (2) Lack of reliable analytical tools makes the prediction of appropriate design and qualification test levels a challenge. (3) In the past few decades, several attempts have been made to develop methodologies that predict the structural responses to shock environments. (4) Currently, there is no validated approach that is viable to predict shock environments over the full frequency range (i.e., 100 Hz to 10 kHz). Scope: (1) Model, analyze, and interpret space structural systems with complex interfaces and discontinuities, subjected to shock loads. (2) Assess the viability of a suite of numerical tools to simulate transient, non-linear solid mechanics and structural dynamics problems, such as shock wave propagation.

  16. Verification of structural analysis computer codes in nuclear engineering

    International Nuclear Information System (INIS)

    Zebeljan, Dj.; Cizelj, L.

    1990-01-01

    Sources of potential errors that can arise during the use of finite element method based computer programs are described in the paper, and error magnitudes are defined as acceptance criteria for those programs. Error sources are described as they are treated by the 'National Agency for Finite Element Methods and Standards (NAFEMS)'. Specific verification examples are taken from the literature of the Nuclear Regulatory Commission (NRC). A verification example is given for the PAFEC-FE computer code, applied to seismic response analysis of piping systems by the response spectrum method. (author)
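
    As an example of the response spectrum method mentioned above, the sketch below combines peak modal displacements by the square root of the sum of squares (SRSS), a standard combination rule; the participation factors, mode-shape values, and spectral displacements are invented for illustration.

```python
import numpy as np

def srss_peak(gamma, phi, Sd):
    """Peak displacement at one location by the response spectrum
    method: modal peaks r_k = gamma_k * phi_k * Sd_k combined by
    the square root of the sum of squares (SRSS)."""
    r = np.asarray(gamma) * np.asarray(phi) * np.asarray(Sd)
    return float(np.sqrt(np.sum(r ** 2)))

# Hypothetical 3-mode piping example (spectral displacements in mm).
print(srss_peak(gamma=[1.3, 0.5, 0.2],
                phi=[1.0, -0.7, 0.4],
                Sd=[12.0, 3.0, 1.0]))  # peak displacement, mm
```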

  17. Review your Computer Security Now and Frequently!

    CERN Multimedia

    IT Department

    2009-01-01

    The start-up of the LHC is foreseen to take place in the autumn and we will be in the public spotlight again. This increases the necessity to be vigilant with respect to computer security, and the defacement of an experiment’s Web page in September last year shows that we should be particularly attentive. Attackers are permanently probing CERN, so we must all do our utmost to reduce future risks. Security is a hierarchical responsibility and requires balancing the allocation of resources between making systems work and making them secure. Thus all of us, whether users, developers, system experts, administrators, or managers, are responsible for securing our computing assets. These include computers, software applications, documents, accounts and passwords. There is no "silver bullet" for securing systems, which can only be achieved by a painstaking search for all possible vulnerabilities followed by their mitigation. Additional advice on particular topics can be obtained from the relevant I...

  18. Hyperswitch Network For Hypercube Computer

    Science.gov (United States)

    Chow, Edward; Madan, Herbert; Peterson, John

    1989-01-01

    Data-driven dynamic switching enables high speed data transfer. Proposed hyperswitch network based on mixed static and dynamic topologies. Routing header modified in response to congestion or faults encountered as path established. Static topology meets requirement if nodes have switching elements that perform necessary routing header revisions dynamically. Hypercube topology now being implemented with switching element in each computer node aimed at designing very-richly-interconnected multicomputer system. Interconnection network connects great number of small computer nodes, using fixed hypercube topology, characterized by point-to-point links between nodes.
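
    The routing-header revisions described above build on static hypercube routing. As background, here is a minimal sketch of classic dimension-order (e-cube) routing, which is an assumed baseline rather than the hyperswitch algorithm itself: node labels are bit strings, and each hop flips one bit in which the current node differs from the destination.

```python
def ecube_route(src, dst):
    """Dimension-order (e-cube) routing in a hypercube: at each hop,
    flip the lowest-order bit in which the current node still
    differs from the destination node."""
    path = [src]
    node = src
    while node != dst:
        diff = node ^ dst         # dimensions still to correct
        node ^= diff & -diff      # flip the lowest differing bit
        path.append(node)
    return path

# 4-cube example: route from node 0101 to node 1010 (binary labels).
print([format(n, "04b") for n in ecube_route(0b0101, 0b1010)])
# ['0101', '0100', '0110', '0010', '1010']
```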

  19. Advanced computational simulations of water waves interacting with wave energy converters

    Science.gov (United States)

    Pathak, Ashish; Freniere, Cole; Raessi, Mehdi

    2017-03-01

    Wave energy converter (WEC) devices harness the renewable ocean wave energy and convert it into useful forms of energy, e.g. mechanical or electrical. This paper presents an advanced 3D computational framework to study the interaction between water waves and WEC devices. The computational tool solves the full Navier-Stokes equations and considers all important effects impacting the device performance. To enable large-scale simulations in fast turnaround times, the computational solver was developed in an MPI parallel framework. A fast multigrid preconditioned solver is introduced to solve the computationally expensive pressure Poisson equation. The computational solver was applied to two surface-piercing WEC geometries: bottom-hinged cylinder and flap. Their numerically simulated response was validated against experimental data. Additional simulations were conducted to investigate the applicability of Froude scaling in predicting full-scale WEC response from the model experiments.
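
    Froude scaling, mentioned above, maps model-test measurements to full scale with standard similarity exponents; a minimal sketch follows, assuming the same fluid at both scales and using illustrative numbers rather than values from the paper.

```python
import math

def froude_scale(scale, model):
    """Scale model-test measurements to full scale under Froude
    similarity (same fluid assumed). `scale` = L_full / L_model.
    Standard exponents: length ~ s, time ~ sqrt(s), force ~ s^3,
    power ~ s^3.5."""
    return {
        "length_m": model["length_m"] * scale,
        "period_s": model["period_s"] * math.sqrt(scale),
        "force_N":  model["force_N"]  * scale ** 3,
        "power_W":  model["power_W"]  * scale ** 3.5,
    }

# Hypothetical 1:20 model of a bottom-hinged flap WEC.
model = {"length_m": 0.5, "period_s": 1.8, "force_N": 40.0, "power_W": 2.0}
print(froude_scale(20, model))
```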

  20. COMPUTING

    CERN Multimedia

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  1. International Conference on Computational Engineering Science

    CERN Document Server

    Yagawa, G

    1988-01-01

    The aim of this Conference was to become a forum for discussion of both academic and industrial research in those areas of computational engineering science and mechanics which involve and enrich the rational application of computers, numerical methods, and mechanics, in modern technology. The papers presented at this Conference cover the following topics: Solid and Structural Mechanics, Constitutive Modelling, Inelastic and Finite Deformation Response, Transient Analysis, Structural Control and Optimization, Fracture Mechanics and Structural Integrity, Computational Fluid Dynamics, Compressible and Incompressible Flow, Aerodynamics, Transport Phenomena, Heat Transfer and Solidification, Electromagnetic Field, Related Soil Mechanics and MHD, Modern Variational Methods, Biomechanics, and Off-Shore-Structural Mechanics.

  2. Computational method for discovery of estrogen responsive genes

    DEFF Research Database (Denmark)

    Tang, Suisheng; Tan, Sin Lam; Ramadoss, Suresh Kumar

    2004-01-01

    Estrogen has a profound impact on human physiology and affects numerous genes. The classical estrogen reaction is mediated by its receptors (ERs), which bind to the estrogen response elements (EREs) in target gene's promoter region. Due to tedious and expensive experiments, a limited number of hu...

  3. Research on computer aided testing of pilot response to critical in-flight events

    Science.gov (United States)

    Giffin, W. C.; Rockwell, T. H.; Smith, P. J.

    1984-01-01

    Experiments on pilot decision making are described, with emphasis on the development of models of pilot decision making in critical in-flight events (CIFE). Work is reported on the development of: (1) a frame system representation describing how pilots use their knowledge in a fault diagnosis task; (2) assessment of script norms, distance measures, and Markov models developed from computer-aided testing (CAT) data; and (3) performance ranking of subject data. It is demonstrated that interactive computer-aided testing, whether by touch CRTs or personal computers, is a useful research and training device for measuring pilot information management in diagnosing system failures in simulated flight situations. Performance is dictated by knowledge of aircraft subsystems, initial pilot structuring of the failure symptoms and efficient testing of plausible causal hypotheses.

  4. Analytical models of optical response in one-dimensional semiconductors

    International Nuclear Information System (INIS)

    Pedersen, Thomas Garm

    2015-01-01

    The quantum mechanical description of the optical properties of crystalline materials typically requires extensive numerical computation. Including excitonic and non-perturbative field effects adds to the complexity. In one dimension, however, the analysis simplifies and optical spectra can be computed exactly. In this paper, we apply the Wannier exciton formalism to derive analytical expressions for the optical response in four cases of increasing complexity. Thus, we start from free carriers and, in turn, switch on electrostatic fields and electron–hole attraction and, finally, analyze the combined influence of these effects. In addition, the optical response of impurity-localized excitons is discussed. - Highlights: • Optical response of one-dimensional semiconductors including excitons. • Analytical model of excitonic Franz–Keldysh effect. • Computation of optical response of impurity-localized excitons

  5. Evaluation of Computer Based Testing in lieu of Regular Examinations in Computer Literacy

    Science.gov (United States)

    Murayama, Koichi

    Because computer based testing (CBT) has many advantages compared with the conventional paper and pencil testing (PPT) examination method, CBT has begun to be used in various situations in Japan, such as in qualifying examinations and in the TOEFL. This paper describes the usefulness and the problems of CBT applied to a regular college examination. The regular computer literacy examinations for first year students were held using CBT, and the results were analyzed. Responses to a questionnaire indicated many students accepted CBT with no unpleasantness and considered CBT a positive factor, improving their motivation to study. CBT also decreased the work of faculty in terms of marking tests and reducing data.

  6. Reducing power consumption while synchronizing a plurality of compute nodes during execution of a parallel application

    Science.gov (United States)

    Archer, Charles J [Rochester, MN; Blocksome, Michael A [Rochester, MN; Peters, Amanda E [Cambridge, MA; Ratterman, Joseph D [Rochester, MN; Smith, Brian E [Rochester, MN

    2012-04-17

    Methods, apparatus, and products are disclosed for reducing power consumption while synchronizing a plurality of compute nodes during execution of a parallel application that include: beginning, by each compute node, performance of a blocking operation specified by the parallel application, each compute node beginning the blocking operation asynchronously with respect to the other compute nodes; reducing, for each compute node, power to one or more hardware components of that compute node in response to that compute node beginning the performance of the blocking operation; and restoring, for each compute node, the power to the hardware components having power reduced in response to all of the compute nodes beginning the performance of the blocking operation.

  7. Reducing power consumption while synchronizing a plurality of compute nodes during execution of a parallel application

    Science.gov (United States)

    Archer, Charles J [Rochester, MN; Blocksome, Michael A [Rochester, MN; Peters, Amanda A [Rochester, MN; Ratterman, Joseph D [Rochester, MN; Smith, Brian E [Rochester, MN

    2012-01-10

    Methods, apparatus, and products are disclosed for reducing power consumption while synchronizing a plurality of compute nodes during execution of a parallel application that include: beginning, by each compute node, performance of a blocking operation specified by the parallel application, each compute node beginning the blocking operation asynchronously with respect to the other compute nodes; reducing, for each compute node, power to one or more hardware components of that compute node in response to that compute node beginning the performance of the blocking operation; and restoring, for each compute node, the power to the hardware components having power reduced in response to all of the compute nodes beginning the performance of the blocking operation.
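
    A hedged sketch of the idea common to the two patents above, using mpi4py's non-blocking barrier to detect when all nodes have begun the blocking operation; `set_component_power` is a stand-in stub, since the patents do not name a concrete power-control API.

```python
from mpi4py import MPI  # assumes mpi4py is available

def set_component_power(component, level):
    """Placeholder for a hardware power-control call (hypothetical)."""
    print(f"[power] {component} -> {level}")

comm = MPI.COMM_WORLD

def power_aware_barrier():
    # Each node enters the blocking operation asynchronously with
    # respect to its peers, reducing power as it does so.
    set_component_power("cpu", "low")
    req = comm.Ibarrier()  # non-blocking barrier entry
    req.Wait()             # completes once every node has entered
    # All nodes have begun the blocking operation; restore power.
    set_component_power("cpu", "nominal")

power_aware_barrier()
```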

  8. Structural-Vibration-Response Data Analysis

    Science.gov (United States)

    Smith, W. R.; Hechenlaible, R. N.; Perez, R. C.

    1983-01-01

    Computer program developed as structural-vibration-response data analysis tool for use in dynamic testing of Space Shuttle. Program provides fast and efficient time-domain least-squares curve-fitting procedure for reducing transient response data to obtain structural model frequencies and dampings from free-decay records. Procedure simultaneously identifies frequencies, damping values, and participation factors for noisy multiple-response records.
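
    The abstract describes a time-domain least-squares fit to free-decay records. The sketch below shows that general technique on synthetic data; the damped-sinusoid model and FFT seeding strategy are illustrative, not the program's documented algorithm.

```python
import numpy as np
from scipy.optimize import curve_fit

def free_decay(t, A, zeta, f, phi):
    # Exponentially damped sinusoid model of a free-decay record.
    return A * np.exp(-zeta * 2 * np.pi * f * t) * np.cos(2 * np.pi * f * t + phi)

# Synthetic noisy record: 4 Hz, 2% damping, 500 Hz sampling.
t = np.linspace(0.0, 3.0, 1500)
y = free_decay(t, 1.0, 0.02, 4.0, 0.3) + 0.02 * np.random.randn(t.size)

# Seed the frequency from the FFT peak to help the fit converge.
spec = np.abs(np.fft.rfft(y))
freqs = np.fft.rfftfreq(t.size, d=t[1] - t[0])
f0 = freqs[np.argmax(spec[1:]) + 1]

popt, _ = curve_fit(free_decay, t, y, p0=[1.0, 0.01, f0, 0.0])
A, zeta, f, phi = popt
print(f"frequency = {f:.2f} Hz, damping ratio = {zeta:.3f}")
```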

  9. Performing stencil computations

    Energy Technology Data Exchange (ETDEWEB)

    Donofrio, David

    2018-01-16

    A method and apparatus for performing stencil computations efficiently are disclosed. In one embodiment, a processor receives an offset, and in response, retrieves a value from a memory via a single instruction, where the retrieving comprises: identifying, based on the offset, one of a plurality of registers of the processor; loading an address stored in the identified register; and retrieving from the memory the value at the address.
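
    Independent of the patented register-offset mechanism, the stencil computation itself is simple to illustrate; below is a minimal NumPy sketch of a 3-point stencil (offsets -1 and +1) applied as a Jacobi relaxation sweep for the 1-D heat equation.

```python
import numpy as np

def jacobi_step(u):
    """One 3-point stencil sweep: each interior point becomes the
    average of its two neighbours (offsets -1 and +1)."""
    out = u.copy()
    out[1:-1] = 0.5 * (u[:-2] + u[2:])
    return out

u = np.zeros(11)
u[0], u[-1] = 0.0, 1.0      # fixed boundary conditions
for _ in range(500):
    u = jacobi_step(u)
print(np.round(u, 3))        # converges to a linear ramp
```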

  10. CARS 2008: Computer Assisted Radiology and Surgery. Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2008-06-15

    The proceedings contain contributions to the following topics: digital imaging, computed tomography, magnetic resonance, cardiac and vascular imaging, computer assisted radiation therapy, image processing and display, minimal invasive spinal surgery, computer assisted treatment of the prostate, the interventional radiology suite of the future, interventional oncology, computer assisted neurosurgery, computer assisted head and neck and ENT surgery, cardiovascular surgery, computer assisted orthopedic surgery, image processing and visualization, surgical robotics, instrumentation and navigation, surgical modelling, simulation and education, endoscopy and related techniques, workflow and new concepts in surgery, research training group 1126: intelligent surgery, digital operating room, image distribution and integration strategies, regional PACS and telemedicine, PACS - beyond radiology and E-learning, workflow and standardization, breast CAD, thoracic CAD, abdominal CAD, brain CAD, orthodontics, dentofacial orthopedics and airways, imaging and treating temporomandibular joint conditions, maxillofacial cone beam CT, craniomaxillofacial image fusion and CBCT incidental findings, image guided craniomaxillofacial procedures, imaging as a biomarker for therapy response, computer aided diagnosis. The Poster sessions cover the topics computer aided surgery, Euro PACS meeting, computer assisted radiology, computer aided diagnosis and computer assisted radiology and surgery.

  11. CARS 2008: Computer Assisted Radiology and Surgery. Proceedings

    International Nuclear Information System (INIS)

    2008-01-01

    The proceedings contain contributions to the following topics: digital imaging, computed tomography, magnetic resonance, cardiac and vascular imaging, computer assisted radiation therapy, image processing and display, minimal invasive spinal surgery, computer assisted treatment of the prostate, the interventional radiology suite of the future, interventional oncology, computer assisted neurosurgery, computer assisted head and neck and ENT surgery, cardiovascular surgery, computer assisted orthopedic surgery, image processing and visualization, surgical robotics, instrumentation and navigation, surgical modelling, simulation and education, endoscopy and related techniques, workflow and new concepts in surgery, research training group 1126: intelligent surgery, digital operating room, image distribution and integration strategies, regional PACS and telemedicine, PACS - beyond radiology and E-learning, workflow and standardization, breast CAD, thoracic CAD, abdominal CAD, brain CAD, orthodontics, dentofacial orthopedics and airways, imaging and treating temporomandibular joint conditions, maxillofacial cone beam CT, craniomaxillofacial image fusion and CBCT incidental findings, image guided craniomaxillofacial procedures, imaging as a biomarker for therapy response, computer aided diagnosis. The Poster sessions cover the topics computer aided surgery, Euro PACS meeting, computer assisted radiology, computer aided diagnosis and computer assisted radiology and surgery

  12. Computer networks and advanced communications

    International Nuclear Information System (INIS)

    Koederitz, W.L.; Macon, B.S.

    1992-01-01

    One of the major methods for getting the most productivity and benefits from computer usage is networking. However, for those who are contemplating a change from stand-alone computers to a network system, the investigation of actual networks in use presents a paradox: network systems can be highly productive and beneficial; at the same time, these networks can create many complex, frustrating problems. The issue becomes a question of whether the benefits of networking are worth the extra effort and cost. In response to this issue, the authors review in this paper the implementation and management of an actual network in the LSU Petroleum Engineering Department. The network, which has been in operation for four years, is large and diverse (50 computers, 2 sites, PC's, UNIX RISC workstations, etc.). The benefits, costs, and method of operation of this network will be described, and an effort will be made to objectively weigh these elements from the point of view of the average computer user

  13. SICOEM: emergency response data system

    International Nuclear Information System (INIS)

    Martin, A.; Villota, C.; Francia, L.

    1993-01-01

    The main characteristics of the SICOEM emergency response system are: direct electronic redundant transmission of certain operational parameters and plant status information from the plant process computer to a computer at the Regulatory Body site; use of the system in emergency situations; and the fact that SICOEM is not considered a safety class system. 1 fig

  14. SICOEM: emergency response data system

    Energy Technology Data Exchange (ETDEWEB)

    Martin, A.; Villota, C.; Francia, L. (UNESA, Madrid (Spain))

    1993-01-01

    The main characteristics of the SICOEM emergency response system are: direct electronic redundant transmission of certain operational parameters and plant status information from the plant process computer to a computer at the Regulatory Body site; use of the system in emergency situations; and the fact that SICOEM is not considered a safety class system. 1 fig.

  15. Development of a 3-dimensional seismic isolation floor for computer systems

    International Nuclear Information System (INIS)

    Kurihara, M.; Shigeta, M.; Nino, T.; Matsuki, T.

    1991-01-01

    In this paper, we investigated the applicability of a seismic isolation floor as a method for protecting computer systems from strong earthquakes, such as computer systems in nuclear power plants. Assuming that the computer system is guaranteed for 250 cm/s² of input acceleration in the horizontal and vertical directions as the seismic performance, the basic design specification of the seismic isolation floor is considered as follows. Against S1-level earthquakes, the maximum acceleration response of the seismic isolation floor in the horizontal and vertical directions is kept below 250 cm/s² to maintain continuous computer operation. Against S2-level earthquakes, the isolation floor allows large horizontal movement and large displacement of the isolation devices to reduce the acceleration response, although it is not guaranteed to be less than 250 cm/s². By reducing the acceleration response, however, serious damage to the computer systems is reduced, so that they can be restarted after an earthquake. Usually, seismic isolation floor systems permit 2-dimensional (horizontal) isolation. However, in the case of earthquakes occurring directly beneath the site, which have large vertical components, the vertical acceleration response of this system is amplified by the lateral vibration of the frame of the isolation floor. Therefore, in this study a 3-dimensional seismic isolation floor, including vertical isolation, was developed. This paper describes 1) the experimental results of the response characteristics of the 3-dimensional seismic isolation floor built as a trial using a 3-dimensional shaking table, and 2) comparison of a 2-dimensional analytical model, for motion in one horizontal direction and the vertical direction, to experimental results. (J.P.N.)

  16. Expert system technology to support emergency response: its prospects and limitations

    International Nuclear Information System (INIS)

    Belardo, S.; Wallace, W.A.

    1988-01-01

    The capabilities of computer technologies to provide decision support in emergency response are now well recognized. The information flow prior to, during, and after potentially catastrophic events must be managed in order to have an effective response, and we feel strongly that computer technology can be a crucial component in this management process. We first review a relatively new facet of computer technology - expert systems. We then provide a conceptual framework for decision making under crisis, a situation typified by emergency response. We follow with a discussion of a prototype expert system for response to an accident at a nuclear power generation facility. Our final section discusses the potential advantages and limitations of expert system technology in emergency response. (author)

  17. Response trees and expert systems for nuclear reactor operations

    International Nuclear Information System (INIS)

    Nelson, W.R.

    1984-02-01

    The United States Nuclear Regulatory Commission is sponsoring a project performed by EG and G Idaho, Inc., at the Idaho National Engineering Laboratory (INEL) to evaluate different display concepts for use in nuclear reactor control rooms. Included in this project is the evaluation of the response tree computer based decision aid and its associated displays. This report serves as an overview of the response tree methodology and how it has been implemented as a computer based decision aid utilizing color graphic displays. A qualitative assessment of the applicability of the response tree aid in the reactor control room is also made. Experience gained in evaluating the response tree aid is generalized to address a larger category of computer aids, those known as knowledge based expert systems. General characteristics of expert systems are discussed, as well as examples of their application in other domains. A survey of ongoing work on expert systems in the nuclear industry is presented, and an assessment of their potential applicability is made. Finally, recommendations for the design and evaluation of computer based decision aids are presented

  18. Programming of computers for the protection system for Savannah River reactors

    International Nuclear Information System (INIS)

    Finley, R.H.

    1977-06-01

    The monitoring requirements for the SRP safety computers are shown. The required fast response times, coupled with the large number of analog inputs to be scanned, imposed stringent program requirements. The system consists of two separate computers, each with its own inputs to monitor half the reactor positions. Either computer can provide the minimum required monitoring, and the desired redundant monitoring is provided when both computers are on-line. If both computers are off-line, the reactor is automatically shut down.

  19. Experimental quantum computing without entanglement.

    Science.gov (United States)

    Lanyon, B P; Barbieri, M; Almeida, M P; White, A G

    2008-11-14

    Deterministic quantum computation with one pure qubit (DQC1) is an efficient model of computation that uses highly mixed states. Unlike pure-state models, its power is not derived from the generation of a large amount of entanglement. Instead it has been proposed that other nonclassical correlations are responsible for the computational speedup, and that these can be captured by the quantum discord. In this Letter we implement DQC1 in an all-optical architecture, and experimentally observe the generated correlations. We find no entanglement, but large amounts of quantum discord, except in three cases where an efficient classical simulation is always possible. Our results show that even fully separable, highly mixed states can contain intrinsically quantum mechanical correlations and that these could offer a valuable resource for quantum information technologies.

  20. Computation-Guided Design of a Stimulus-Responsive Multienzyme Supramolecular Assembly.

    Science.gov (United States)

    Yang, Lu; Dolan, Elliott M; Tan, Sophia K; Lin, Tianyun; Sontag, Eduardo D; Khare, Sagar D

    2017-10-18

    The construction of stimulus-responsive supramolecular complexes of metabolic pathway enzymes, inspired by natural multienzyme assemblies (metabolons), provides an attractive avenue for efficient and spatiotemporally controllable one-pot biotransformations. We have constructed a phosphorylation- and optically responsive metabolon for the biodegradation of the environmental pollutant 1,2,3-trichloropropane. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. [Positron emission tomography combined with computed tomography in the initial evaluation and response assessment in primary central nervous system lymphoma].

    Science.gov (United States)

    Mercadal, Santiago; Cortés-Romera, Montserrat; Vélez, Patricia; Climent, Fina; Gámez, Cristina; González-Barca, Eva

    2015-06-08

    To evaluate the role of positron emission tomography combined with computed tomography (PET-CT) in the initial evaluation and response assessment in primary central nervous system lymphoma (PCNSL). Fourteen patients (8 males) with a median age of 59.5 years were diagnosed with PCNSL. A brain PET-CT and magnetic resonance imaging (MRI) were performed in the initial evaluation; in 7 patients a PET-CT after treatment was also performed. At diagnosis, PET-CT showed 31 hypermetabolic foci and MRI showed 47 lesions, with a good grade of concordance between both (k = 0.61; P = .005). In the response assessment, correlation between both techniques was good, and PET-CT was helpful in the interpretation of residual MRI lesions. Overall survival at 2 years for negative vs. positive PET-CT at the end of treatment was 100% vs. 37.5%, respectively (P = .045). PET-CT can be useful in the initial evaluation of PCNSL, and especially in the assessment of response. Although PET-CT detects fewer small lesions than MRI, a good correlation between MRI and PET-CT was observed, and it is effective in the evaluation of residual lesions. Prospective studies are needed to confirm its possible prognostic value. Copyright © 2014 Elsevier España, S.L.U. All rights reserved.
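
    The concordance grade quoted above (k = 0.61) is a kappa statistic. For illustration, here is a minimal sketch of Cohen's kappa computed from an agreement table; the counts below are invented, not the study's data.

```python
import numpy as np

def cohens_kappa(confusion):
    """Cohen's kappa for agreement between two raters or modalities,
    from a square confusion matrix of counts."""
    cm = np.asarray(confusion, dtype=float)
    n = cm.sum()
    po = np.trace(cm) / n                   # observed agreement
    pe = (cm.sum(axis=0) @ cm.sum(axis=1)) / n ** 2  # chance agreement
    return (po - pe) / (1 - pe)

# Hypothetical lesion-detection agreement table (PET-CT vs. MRI).
print(round(cohens_kappa([[28, 5], [4, 10]]), 2))
```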

  2. Computation Directorate 2008 Annual Report

    Energy Technology Data Exchange (ETDEWEB)

    Crawford, D L

    2009-03-25

    Whether a computer is simulating the aging and performance of a nuclear weapon, the folding of a protein, or the probability of rainfall over a particular mountain range, the necessary calculations can be enormous. Our computers help researchers answer these and other complex problems, and each new generation of system hardware and software widens the realm of possibilities. Building on Livermore's historical excellence and leadership in high-performance computing, Computation added more than 331 trillion floating-point operations per second (teraFLOPS) of power to LLNL's computer room floors in 2008. In addition, Livermore's next big supercomputer, Sequoia, advanced ever closer to its 2011-2012 delivery date, as architecture plans and the procurement contract were finalized. Hyperion, an advanced technology cluster test bed that teams Livermore with 10 industry leaders, made a big splash when it was announced during Michael Dell's keynote speech at the 2008 Supercomputing Conference. The Wall Street Journal touted Hyperion as a 'bright spot amid turmoil' in the computer industry. Computation continues to measure and improve the costs of operating LLNL's high-performance computing systems by moving hardware support in-house, by measuring causes of outages to apply resources asymmetrically, and by automating most of the account and access authorization and management processes. These improvements enable more dollars to go toward fielding the best supercomputers for science, while operating them at less cost and greater responsiveness to the customers.

  3. Efficient computation of smoothing splines via adaptive basis sampling

    KAUST Repository

    Ma, Ping

    2015-06-24

    © 2015 Biometrika Trust. Smoothing splines provide flexible nonparametric regression estimators. However, the high computational cost of smoothing splines for large datasets has hindered their wide application. In this article, we develop a new method, named adaptive basis sampling, for efficient computation of smoothing splines in super-large samples. Except for the univariate case where the Reinsch algorithm is applicable, a smoothing spline for a regression problem with sample size n can be expressed as a linear combination of n basis functions and its computational complexity is generally O(n³). We achieve a more scalable computation in the multivariate case by evaluating the smoothing spline using a smaller set of basis functions, obtained by an adaptive sampling scheme that uses values of the response variable. Our asymptotic analysis shows that smoothing splines computed via adaptive basis sampling converge to the true function at the same rate as full basis smoothing splines. Using simulation studies and a large-scale deep earth core-mantle boundary imaging study, we show that the proposed method outperforms a sampling method that does not use the values of response variables.

  4. Efficient computation of smoothing splines via adaptive basis sampling

    KAUST Repository

    Ma, Ping; Huang, Jianhua Z.; Zhang, Nan

    2015-01-01

    © 2015 Biometrika Trust. Smoothing splines provide flexible nonparametric regression estimators. However, the high computational cost of smoothing splines for large datasets has hindered their wide application. In this article, we develop a new method, named adaptive basis sampling, for efficient computation of smoothing splines in super-large samples. Except for the univariate case where the Reinsch algorithm is applicable, a smoothing spline for a regression problem with sample size n can be expressed as a linear combination of n basis functions and its computational complexity is generally O(n³). We achieve a more scalable computation in the multivariate case by evaluating the smoothing spline using a smaller set of basis functions, obtained by an adaptive sampling scheme that uses values of the response variable. Our asymptotic analysis shows that smoothing splines computed via adaptive basis sampling converge to the true function at the same rate as full basis smoothing splines. Using simulation studies and a large-scale deep earth core-mantle boundary imaging study, we show that the proposed method outperforms a sampling method that does not use the values of response variables.
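
    A loose sketch of the idea shared by the two records above, under stated simplifications: penalized regression on a reduced basis whose centres are sampled with response-dependent weights. A ridge-penalized Gaussian basis stands in for the paper's smoothing spline machinery, and the weighting rule is a crude proxy for the actual adaptive sampling scheme.

```python
import numpy as np

rng = np.random.default_rng(0)

# Data: n noisy observations of a smooth function (n would be huge).
n = 5000
x = np.sort(rng.uniform(0, 1, n))
y = np.sin(6 * np.pi * x) + 0.3 * rng.standard_normal(n)

# Sample q << n basis centres, oversampling where |y| is large
# (a crude stand-in for response-guided adaptive sampling).
q = 60
w = np.abs(y) + 0.1
centres = rng.choice(x, size=q, replace=False, p=w / w.sum())

def basis(x, c, h=0.03):
    # Gaussian radial basis evaluated at all x for all centres c.
    return np.exp(-0.5 * ((x[:, None] - c[None, :]) / h) ** 2)

# Ridge-penalized least squares on the reduced q-dimensional basis:
# solving a q x q system instead of an n x n one.
B = basis(x, centres)
lam = 1e-3
coef = np.linalg.solve(B.T @ B + lam * np.eye(q), B.T @ y)

fit = B @ coef
print("RMSE vs truth:", np.sqrt(np.mean((fit - np.sin(6 * np.pi * x)) ** 2)))
```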

  5. Artificial Intelligence and the Teaching of Reading and Writing by Computers.

    Science.gov (United States)

    Balajthy, Ernest

    1985-01-01

    Discusses how computers can "converse" with students for teaching purposes, demonstrates how these interactions are becoming more complex, and explains how the computer's role is becoming more "human" in giving intelligent responses to students. (HOD)

  6. A tactile P300 brain-computer interface

    NARCIS (Netherlands)

    Brouwer, A.M.; Erp, J.B.F. van

    2010-01-01

    The operation of the first brain-computer interface based on tactile EEG responses is demonstrated, and the effect of the number of vibro-tactile tactors used and of stimulus-timing parameters is investigated

  7. Computer stress study of bone with computed tomography

    International Nuclear Information System (INIS)

    Linden, M.J.; Marom, S.A.; Linden, C.N.

    1986-01-01

    A computer processing tool has been developed which, together with a finite element program, determines the stress-deformation pattern in a long bone, utilizing computed tomography (CT) data files for the geometry and radiographic density information. The geometry, together with mechanical properties and boundary conditions (loads and displacements), comprises the input of the finite element (FE) computer program; the output of the program is the stresses and deformations in the bone. The processor is capable of developing an accurate three-dimensional finite element model from a scanned human long bone owing to the high pixel resolution of CT and the local mechanical properties determined from the radiographic densities of the scanned bone. The processor, together with the finite element program, serves first as an analysis tool towards improved understanding of bone function and remodelling. In this first stage, actual long bones may be scanned and analyzed under applied loads and displacements determined from existing gait analyses. The stress-deformation patterns thus obtained may be used for studying the biomechanical behavior of particular long bones, such as bones with implants and with osteoporosis. As a second stage, this processor may serve as a diagnostic tool for analyzing the biomechanical response of a specific patient's long bone under applied loading by utilizing a CT data file of the specific bone as an input to the processor with the FE program.
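
    A key step such a processor performs is converting radiographic density into local mechanical properties for the FE input. A common approach, assumed here rather than taken from the paper, is a density power law E = a·rho^b; the calibration constants below are illustrative only.

```python
import numpy as np

def hu_to_modulus(hu):
    """Map CT Hounsfield units to an element-wise Young's modulus
    via a density power law E = a * rho^b. Both the linear HU-to-
    density calibration and the power-law constants are illustrative,
    not values from the cited study."""
    rho = np.clip(0.001 * hu + 1.0, 0.01, None)  # apparent density, g/cm^3
    a, b = 6850.0, 1.49                          # generic power law, MPa
    return a * rho ** b

# Example voxel densities from soft tissue up to cortical bone.
voxels_hu = np.array([-50.0, 200.0, 800.0, 1400.0])
print(np.round(hu_to_modulus(voxels_hu), 0))     # MPa per voxel/element
```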

  8. Interactive computer-assisted instruction in acid-base physiology for mobile computer platforms.

    Science.gov (United States)

    Longmuir, Kenneth J

    2014-03-01

    In this project, the traditional lecture hall presentation of acid-base physiology in the first-year medical school curriculum was replaced by interactive, computer-assisted instruction designed primarily for the iPad and other mobile computer platforms. Three learning modules were developed, each with ∼20 screens of information, on the subjects of the CO2-bicarbonate buffer system, other body buffer systems, and acid-base disorders. Five clinical case modules were also developed. For the learning modules, the interactive, active learning activities were primarily step-by-step learner control of explanations of complex physiological concepts, usually presented graphically. For the clinical cases, the active learning activities were primarily question-and-answer exercises that related clinical findings to the relevant basic science concepts. The student response was remarkably positive, with the interactive, active learning aspect of the instruction cited as the most important feature. Also, students cited the self-paced instruction, extensive use of interactive graphics, and side-by-side presentation of text and graphics as positive features. Most students reported that it took less time to study the subject matter with this online instruction compared with subject matter presented in the lecture hall. However, the approach to learning was highly examination driven, with most students delaying the study of the subject matter until a few days before the scheduled examination. Wider implementation of active learning computer-assisted instruction will require that instructors present subject matter interactively, that students fully embrace the responsibilities of independent learning, and that institutional administrations measure instructional effort by criteria other than scheduled hours of instruction.

  9. Texture analysis of advanced non-small cell lung cancer (NSCLC) on contrast-enhanced computed tomography: prediction of the response to the first-line chemotherapy

    International Nuclear Information System (INIS)

    Farina, Davide; Morassi, Mauro; Maroldi, Roberto; Roca, Elisa; Tassi, Gianfranco; Cavalleri, Giuseppe

    2013-01-01

    To assess whether tumour heterogeneity, quantified by texture analysis (TA) on contrast-enhanced computed tomography (CECT), can predict response to chemotherapy in advanced non-small cell lung cancer (NSCLC). Fifty-three CECT studies of patients with advanced NSCLC who had undergone first-line chemotherapy were retrospectively reviewed. Response to chemotherapy was evaluated according to RECIST1.1. Tumour uniformity was assessed by a TA method based on Laplacian of Gaussian filtering. The resulting parameters were correlated with treatment response and overall survival by multivariate analysis. Thirty-one out of 53 patients were non-responders and 22 were responders. Average overall survival was 13 months (range 4-35); the minimum follow-up was 12 months. In the adenocarcinoma group (n = 31), the product of tumour uniformity and grey level (GL*U) was the only independent variable correlating with treatment response. Dividing the GL*U (range 8.5-46.6) into tertiles, lesions belonging to the second and third tertiles had an 8.3-fold higher probability of treatment response compared with those in the first tertile. No association between texture features and response to treatment was observed in the non-adenocarcinoma group (n = 22). GL*U did not correlate with overall survival. TA on CECT images in advanced lung adenocarcinoma provides an independent predictive indicator of response to first-line chemotherapy. (orig.)
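
    The texture method above filters the image with a Laplacian of Gaussian and measures the uniformity of the filtered grey levels. A hedged sketch of such a pipeline follows; the filter scale, histogram binning and the exact GL*U definition are assumptions standing in for the published parameters.

        import numpy as np
        from scipy.ndimage import gaussian_laplace

        def texture_uniformity(roi, sigma=1.5, bins=64):
            # Laplacian-of-Gaussian filtering followed by histogram
            # uniformity U = sum(p_i**2); higher U = more homogeneous.
            filtered = gaussian_laplace(roi.astype(float), sigma=sigma)
            hist, _ = np.histogram(filtered, bins=bins)
            p = hist / hist.sum()
            return float(np.sum(p ** 2))

        roi = np.random.rand(48, 48) * 100        # stand-in tumour region
        u = texture_uniformity(roi)
        gl_u = roi.mean() * u                     # GL*U-style product (assumed form)
        print(u, gl_u)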

  10. TEACHERS’ COMPUTER SELF-EFFICACY AND THEIR USE OF EDUCATIONAL TECHNOLOGY

    Directory of Open Access Journals (Sweden)

    Vehbi TUREL

    2014-10-01

    Full Text Available This study examined the use of educational technology by primary and subject teachers (i.e. secondary and high school teachers) in a small town in the eastern part of Turkey in the spring of 2012. The study examined the primary, secondary and high school teachers' (a) personal and computer-related (demographic) characteristics, (b) computer self-efficacy perceptions, (c) computer-use level in certain software, (d) frequency of computer use for teaching, administrative and communication objectives, and (e) preferences for educational technology for preparation and teaching purposes. All primary, secondary and high school teachers in the small town were given the questionnaires to complete; 158 teachers (n=158) completed and returned them. The study was mostly quantitative and partly qualitative. The quantitative results were analysed with SPSS (mean, standard deviation, frequency, percentage, ANOVA). The qualitative data were analysed by examining the participants' responses to the open-ended questions and focusing on the themes shared among the responses. The results reveal that the teachers consider their computer self-efficacy good, that their level in certain programs is good, and that they often use computers for a wide range of purposes. There are also statistical differences between (a) their computer self-efficacy perceptions, (b) their frequency of computer use for certain purposes, and (c) their computer level in certain programs, in terms of different independent variables.

  11. A prototype nuclear emergency response decision making expert system

    International Nuclear Information System (INIS)

    Chang, C.; Shih, C.; Hong, M.; Yu, W.; Su, M.; Wang, S.

    1990-01-01

    A prototype emergency response expert system for nuclear power plants has been developed by the Institute of Nuclear Energy Research. Key elements implemented for emergency response include radioactive material dispersion assessment, dynamic transportation evacuation assessment, and meteorological parametric forecasting. A network system consisting of five 80386 Personal Computers (PCs) has been installed to perform the system functions above. A further project is continuing, aimed at a more comprehensive computer-aided integrated emergency response expert system

  12. Computational techniques in gamma-ray skyshine analysis

    International Nuclear Information System (INIS)

    George, D.L.

    1988-12-01

    Two computer codes were developed to analyze gamma-ray skyshine, the scattering of gamma photons by air molecules. A review of previous gamma-ray skyshine studies discusses several Monte Carlo codes, programs using a single-scatter model, and the MicroSkyshine program for microcomputers. A benchmark gamma-ray skyshine experiment performed at Kansas State University is also described. A single-scatter numerical model was presented which traces photons from the source to their first scatter, then applies a buildup factor along a direct path from the scattering point to a detector. The FORTRAN code SKY, developed with this model before the present study, was modified to use Gauss quadrature, recent photon attenuation data and a more accurate buildup approximation. The resulting code, SILOGP, computes response from a point photon source on the axis of a silo, with and without concrete shielding over the opening. Another program, WALLGP, was developed using the same model to compute response from a point gamma source behind a perfectly absorbing wall, with and without shielding overhead. 29 refs., 48 figs., 13 tabs
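
    The single-scatter model traces each photon to its first scattering point and then attenuates it, with a buildup factor, along the direct path to the detector. The toy quadrature below illustrates that structure for one emission ray; the attenuation coefficient, the linear buildup form and the step quadrature are placeholders for the Gauss quadrature and tabulated photon data used in SILOGP and WALLGP.

        import numpy as np

        def single_scatter_estimate(src, det, mu, n_steps=200, r_max=500.0):
            # Integrate over first-scatter points s along one emission ray:
            # exp(-mu*|src-s|) reaches s, mu*ds scatters there, and
            # B(mu*d)*exp(-mu*d)/(4*pi*d^2) carries the photon to the detector.
            direction = np.array([0.0, 0.0, 1.0])
            ds = r_max / n_steps
            total = 0.0
            for i in range(1, n_steps + 1):
                s = src + direction * (i * ds)
                d = np.linalg.norm(s - det)
                buildup = 1.0 + mu * d            # crude linear buildup factor
                total += (np.exp(-mu * i * ds) * mu * ds
                          * buildup * np.exp(-mu * d) / (4 * np.pi * d ** 2))
            return total

        print(single_scatter_estimate(np.zeros(3), np.array([100.0, 0.0, 0.0]), mu=0.01))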

  13. Response functions for computing absorbed dose to skeletal tissues from neutron irradiation

    Science.gov (United States)

    Bahadori, Amir A.; Johnson, Perry; Jokisch, Derek W.; Eckerman, Keith F.; Bolch, Wesley E.

    2011-11-01

    Spongiosa in the adult human skeleton consists of three tissues—active marrow (AM), inactive marrow (IM) and trabecularized mineral bone (TB). AM is considered to be the target tissue for assessment of both long-term leukemia risk and acute marrow toxicity following radiation exposure. The total shallow marrow (TM50), defined as all tissues lying within the first 50 µm of the bone surfaces, is considered to be the radiation target tissue of relevance for radiogenic bone cancer induction. For irradiation by sources external to the body, kerma to homogeneous spongiosa has been used as a surrogate for absorbed dose to both of these tissues, as direct dose calculations are not possible using computational phantoms with homogenized spongiosa. Recent micro-CT imaging of a 40 year old male cadaver has allowed for the accurate modeling of the fine microscopic structure of spongiosa in many regions of the adult skeleton (Hough et al 2011 Phys. Med. Biol. 56 2309-46). This microstructure, along with associated masses and tissue compositions, was used to compute specific absorbed fraction (SAF) values for protons originating in axial and appendicular bone sites (Jokisch et al 2011 Phys. Med. Biol. 56 6857-72). These proton SAFs, bone masses, tissue compositions and proton production cross sections were subsequently used to construct neutron dose-response functions (DRFs) for both AM and TM50 targets in each bone of the reference adult male. Kerma conditions were assumed for other resultant charged particles. For comparison, AM, TM50 and spongiosa kerma coefficients were also calculated. At low incident neutron energies, AM kerma coefficients for neutrons correlate well with values of the AM DRF, while total marrow (TM) kerma coefficients correlate well with values of the TM50 DRF. At high incident neutron energies, all kerma coefficients and DRFs tend to converge as charged-particle equilibrium is established across the bone site. In the range of 10 eV to 100 MeV …
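
    A DRF of this kind folds secondary-charged-particle yields and energies together with SAF values and target masses. The sketch below shows that combination in schematic form only; the particle list, numbers and unit handling are placeholders rather than the published reference-male data.

        def dose_response(secondaries, saf, target_mass):
            # Sum over secondary charged particles j of
            #   yield_j * mean_energy_j * SAF_j / target_mass,
            # giving absorbed dose per unit neutron fluence (toy units).
            dose = 0.0
            for name, (yld, e_mean) in secondaries.items():
                dose += yld * e_mean * saf[name]
            return dose / target_mass

        secondaries = {"proton": (0.02, 5.0), "alpha": (0.001, 2.0)}
        saf = {"proton": 0.4, "alpha": 1.0}   # alphas deposit locally (kerma-like)
        print(dose_response(secondaries, saf, target_mass=0.05))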

  14. Identification and Validation of Novel Hedgehog-Responsive Enhancers Predicted by Computational Analysis of Ci/Gli Binding Site Density

    Science.gov (United States)

    Richards, Neil; Parker, David S.; Johnson, Lisa A.; Allen, Benjamin L.; Barolo, Scott; Gumucio, Deborah L.

    2015-01-01

    The Hedgehog (Hh) signaling pathway directs a multitude of cellular responses during embryogenesis and adult tissue homeostasis. Stimulation of the pathway results in activation of Hh target genes by the transcription factor Ci/Gli, which binds to specific motifs in genomic enhancers. In Drosophila, only a few enhancers (patched, decapentaplegic, wingless, stripe, knot, hairy, orthodenticle) have been shown by in vivo functional assays to depend on direct Ci/Gli regulation. All but one (orthodenticle) contain more than one Ci/Gli site, prompting us to directly test whether homotypic clustering of Ci/Gli binding sites is sufficient to define a Hh-regulated enhancer. We therefore developed a computational algorithm to identify Ci/Gli clusters that are enriched over random expectation, within a given region of the genome. Candidate genomic regions containing Ci/Gli clusters were functionally tested in chicken neural tube electroporation assays and in transgenic flies. Of the 22 Ci/Gli clusters tested, seven novel enhancers (and the previously known patched enhancer) were identified as Hh-responsive and Ci/Gli-dependent in one or both of these assays, including: Cuticular protein 100A (Cpr100A); invected (inv), which encodes an engrailed-related transcription factor expressed at the anterior/posterior wing disc boundary; roadkill (rdx), the fly homolog of vertebrate Spop; the segment polarity gene gooseberry (gsb); and two previously untested regions of the Hh receptor-encoding patched (ptc) gene. We conclude that homotypic Ci/Gli clustering is not sufficient information to ensure Hh-responsiveness; however, it can provide a clue for enhancer recognition within putative Hedgehog target gene loci. PMID:26710299
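
    The core computational step is scoring genomic windows for Ci/Gli site density against random expectation. A minimal shuffle-based version is sketched below; the consensus motif, the window size and the permutation test are assumptions, since the published algorithm's motif model and background statistics differ.

        import re
        import random

        def cluster_scan(seq, motif="TGGGTGGTC", window=1000, trials=200):
            # Maximum number of motif hits in any window, compared with
            # shuffled sequences to estimate an empirical p-value.
            def max_window_count(s):
                hits = [m.start() for m in re.finditer(motif, s)]
                best = 0
                for h in hits:
                    best = max(best, sum(1 for x in hits if h <= x < h + window))
                return best

            observed = max_window_count(seq)
            null = [max_window_count("".join(random.sample(seq, len(seq))))
                    for _ in range(trials)]
            p = sum(1 for n in null if n >= observed) / trials
            return observed, p

        genome_region = ("ACGT" * 2000) + "TGGGTGGTC" * 3 + ("ACGT" * 2000)
        print(cluster_scan(genome_region))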

  15. All-optical reservoir computer based on saturation of absorption.

    Science.gov (United States)

    Dejonckheere, Antoine; Duport, François; Smerieri, Anteo; Fang, Li; Oudar, Jean-Louis; Haelterman, Marc; Massar, Serge

    2014-05-05

    Reservoir computing is a new bio-inspired computation paradigm. It exploits a dynamical system driven by a time-dependent input to carry out computation. For efficient information processing, only a few parameters of the reservoir need to be tuned, which makes it a promising framework for hardware implementation. Recently, electronic, opto-electronic and all-optical experimental reservoir computers were reported. In those implementations, the nonlinear response of the reservoir is provided by active devices such as optoelectronic modulators or optical amplifiers. By contrast, we propose here the first reservoir computer based on a fully passive nonlinearity, namely the saturable absorption of a semiconductor mirror. Our experimental setup constitutes an important step towards the development of ultrafast low-consumption analog computers.
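
    In any reservoir computer only the linear readout is trained; the driven nonlinear dynamics are left untouched. The software sketch below illustrates the paradigm with a tanh echo-state reservoir standing in for the optical saturable absorber; the sizes, scalings and the 3-step-recall task are arbitrary choices.

        import numpy as np

        rng = np.random.default_rng(0)
        N, T = 100, 2000
        W = rng.normal(0, 1, (N, N))
        W *= 0.9 / max(abs(np.linalg.eigvals(W)))     # echo-state scaling
        w_in = rng.normal(0, 0.5, N)

        u = rng.uniform(-1, 1, T)                     # input sequence
        target = np.roll(u, 3)                        # task: recall input 3 steps back
        x = np.zeros(N)
        states = np.zeros((T, N))
        for t in range(T):
            x = np.tanh(W @ x + w_in * u[t])          # saturating node response
            states[t] = x

        # Ridge-regression readout: the only trained part of the system.
        ridge = 1e-6
        w_out = np.linalg.solve(states.T @ states + ridge * np.eye(N),
                                states.T @ target)
        print(np.mean((states @ w_out - target) ** 2))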

  16. Computational description of nanocrystalline deformation based on crystal plasticity

    International Nuclear Information System (INIS)

    Fu, H.-H.; Benson, David J.; Andre Meyers, Marc

    2004-01-01

    The effect of grain size on the mechanical response of polycrystalline metals was investigated computationally and applied to the nanocrystalline domain. A phenomenological constitutive description is adopted to build the computational crystal model. Two approaches are implemented. In the first, the material is envisaged as a composite; the grain interior is modeled as a monocrystalline core surrounded by a mantle (grain boundary) with a lower yield stress and a higher work-hardening-rate response. Both quasi-isotropic and crystal plasticity approaches are used to simulate the grain interiors. The grain boundary is modeled either by an isotropic Voce equation (Model I) or by crystal plasticity (Model II). Elastic and plastic anisotropy are incorporated into this simulation. An implicit Eulerian finite element formulation with von Mises plasticity or rate-dependent crystal plasticity is used to study the nonuniform deformation and localized plastic flow. The computational predictions are compared with the experimentally determined mechanical response of copper with grain sizes of 1 μm and 26 nm. Shear localization is observed during work hardening in view of the inhomogeneous mechanical response. In the second approach, a continuous change in mechanical response, expressed by the magnitude of the maximum shear stress orientation gradient, is introduced. It is shown that the magnitude of the gradient is directly dependent on grain size. This gradient term is inserted into a constitutive equation that predicts the local stress-strain evolution

  17. A System Computational Model of Implicit Emotional Learning.

    Science.gov (United States)

    Puviani, Luca; Rama, Sidita

    2016-01-01

    Nowadays, the experimental study of emotional learning is commonly based on classical conditioning paradigms and models, which have been thoroughly investigated in the last century. Unfortunately, models based on classical conditioning are unable to explain or predict important psychophysiological phenomena, such as the failure of the extinction of emotional responses in certain circumstances (for instance, those observed in evaluative conditioning, in post-traumatic stress disorders and in panic attacks). In this manuscript, starting from the experimental results available from the literature, a computational model of implicit emotional learning based both on prediction errors computation and on statistical inference is developed. The model quantitatively predicts (a) the occurrence of evaluative conditioning, (b) the dynamics and the resistance-to-extinction of the traumatic emotional responses, (c) the mathematical relation between classical conditioning and unconditioned stimulus revaluation. Moreover, we discuss how the derived computational model can lead to the development of new animal models for resistant-to-extinction emotional reactions and novel methodologies of emotions modulation.
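
    At the heart of the model is a prediction-error update. The delta-rule sketch below shows only that core component under simple assumptions; the published model layers statistical inference on top of it, and the parameter values here are arbitrary.

        def rescorla_wagner(stimuli_present, rewards, alpha=0.1):
            # Minimal prediction-error learner: V <- V + alpha*(r - sum V_present).
            V = {}
            for present, r in zip(stimuli_present, rewards):
                pred = sum(V.get(s, 0.0) for s in present)
                delta = r - pred                  # prediction error
                for s in present:
                    V[s] = V.get(s, 0.0) + alpha * delta
            return V

        # Conditioning followed by extinction of a single cue "CS".
        trials = [["CS"]] * 40
        rewards = [1.0] * 20 + [0.0] * 20
        print(rescorla_wagner(trials, rewards))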

  18. Volunteered Cloud Computing for Disaster Management

    Science.gov (United States)

    Evans, J. D.; Hao, W.; Chettri, S. R.

    2014-12-01

    Disaster management relies increasingly on interpreting earth observations and running numerical models, which require significant computing capacity - usually on short notice and at irregular intervals. Peak computing demand during event detection, hazard assessment, or incident response may exceed agency budgets; however some of it can be met through volunteered computing, which distributes subtasks to participating computers via the Internet. This approach has enabled large projects in mathematics, basic science, and climate research to harness the slack computing capacity of thousands of desktop computers. This capacity is likely to diminish as desktops give way to battery-powered mobile devices (laptops, smartphones, tablets) in the consumer market; but as cloud computing becomes commonplace, it may offer significant slack capacity -- if its users are given an easy, trustworthy mechanism for participating. Such a "volunteered cloud computing" mechanism would also offer several advantages over traditional volunteered computing: tasks distributed within a cloud have fewer bandwidth limitations; granular billing mechanisms allow small slices of "interstitial" computing at no marginal cost; and virtual storage volumes allow in-depth, reversible machine reconfiguration. Volunteered cloud computing is especially suitable for "embarrassingly parallel" tasks, including ones requiring large data volumes: examples in disaster management include near-real-time image interpretation, pattern / trend detection, or large model ensembles. In the context of a major disaster, we estimate that cloud users (if suitably informed) might volunteer hundreds to thousands of CPU cores across a large provider such as Amazon Web Services. To explore this potential, we are building a volunteered cloud computing platform and targeting it to a disaster management context. Using a lightweight, fault-tolerant network protocol, this platform helps cloud users join parallel computing projects.
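
    Distributing embarrassingly parallel subtasks to transient volunteers needs little more than a lease-based work queue. The sketch below shows one such queue under simple assumptions (no authentication, no result verification), features a real volunteered-cloud platform would have to add.

        import time
        from queue import Queue

        class VolunteerQueue:
            # Toy fault-tolerant work queue: tasks are leased to volunteer
            # workers and re-queued if no result arrives before the lease
            # expires, so vanished workers do not stall the computation.
            def __init__(self, tasks, lease_s=30.0):
                self.todo = Queue()
                for t in tasks:
                    self.todo.put(t)
                self.leased = {}          # task -> lease expiry time
                self.lease_s = lease_s

            def checkout(self):
                self.reap()
                if self.todo.empty():
                    return None
                task = self.todo.get()
                self.leased[task] = time.time() + self.lease_s
                return task

            def complete(self, task, result):
                self.leased.pop(task, None)
                return result

            def reap(self):
                now = time.time()
                for task, expiry in list(self.leased.items()):
                    if expiry < now:      # worker vanished; reassign the task
                        del self.leased[task]
                        self.todo.put(task)

        q = VolunteerQueue(range(5), lease_s=0.01)
        t = q.checkout()
        time.sleep(0.02)                  # lease expires, task gets re-queued
        print(q.checkout())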

  19. Instrument Formatting with Computer Data Entry in Mind.

    Science.gov (United States)

    Boser, Judith A.; And Others

    Different formats for four types of research items were studied for ease of computer data entry. The types were: (1) numeric response items; (2) individual multiple choice items; (3) multiple choice items with the same response items; and (4) card column indicator placement. Each of the 13 experienced staff members of a major university's Data…

  20. SCINFUL-QMD: Monte Carlo based computer code to calculate response function and detection efficiency of a liquid organic scintillator for neutron energies up to 3 GeV

    International Nuclear Information System (INIS)

    Satoh, Daiki; Sato, Tatsuhiko; Shigyo, Nobuhiro; Ishibashi, Kenji

    2006-11-01

    The Monte Carlo based computer code SCINFUL-QMD has been developed to evaluate the response function and detection efficiency of a liquid organic scintillator for neutrons from 0.1 MeV to 3 GeV. This code is a modified version of SCINFUL, developed at Oak Ridge National Laboratory in 1988, to provide a calculated full response anticipated for neutron interactions in a scintillator. The upper limit of the applicable energy was extended from 80 MeV to 3 GeV by introducing the quantum molecular dynamics incorporated with the statistical decay model (QMD+SDM) in the high-energy nuclear reaction part. The particles generated in QMD+SDM are the neutron, proton, deuteron, triton, 3He nucleus, alpha particle, and charged pion. Secondary reactions by neutrons, protons, and pions inside the scintillator are also taken into account. With the extension of the applicable energy, the database of total cross sections for hydrogen and carbon nuclei was upgraded. This report describes the physical model, the computational flow and how to use the code. (author)
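
    A scintillator response function of this kind is built by sampling neutron interactions, converting recoil energies to scintillation light, and histogramming the result. The toy Monte Carlo below illustrates the idea only; the interaction model is reduced to a single scatter and the light-output curve is a smooth approximation, where SCINFUL-QMD uses full cross-section tables and reaction models.

        import numpy as np

        rng = np.random.default_rng(1)

        def light_output(ep):
            # Smooth proton light-output approximation (MeVee); the real code
            # uses measured tables instead.
            return 0.83 * ep - 2.82 * (1.0 - np.exp(-0.25 * ep ** 0.93))

        def response_function(e_n, n_hist=100_000, p_hydrogen=0.6):
            # Each detected neutron scatters once, off H (uniform recoil
            # 0..E_n) or off C (small recoil, crudely fixed at 0.05*E_n);
            # recoil light is histogrammed to form the pulse-height response.
            on_h = rng.random(n_hist) < p_hydrogen
            e_p = np.where(on_h, rng.random(n_hist) * e_n, 0.05 * e_n)
            light = np.clip(light_output(e_p), 0, None)
            hist, edges = np.histogram(light, bins=100, range=(0, e_n))
            return hist / n_hist, edges

        resp, edges = response_function(e_n=14.0)
        print(resp[:5])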

  1. Evolutionary computation in zoology and ecology.

    Science.gov (United States)

    Boone, Randall B

    2017-12-01

    Evolutionary computational methods have adopted attributes of natural selection and evolution to solve problems in computer science, engineering, and other fields. The method is growing in use in zoology and ecology. Evolutionary principles may be merged with an agent-based modeling perspective to have individual animals or other agents compete. Four main categories are discussed: genetic algorithms, evolutionary programming, genetic programming, and evolutionary strategies. In evolutionary computation, a population is represented in a way that allows for an objective function to be assessed that is relevant to the problem of interest. The poorest performing members are removed from the population, and remaining members reproduce and may be mutated. The fitness of the members is again assessed, and the cycle continues until a stopping condition is met. Case studies include optimizing egg shape given different clutch sizes; mate selection; migration of wildebeest, birds, and elk; vulture foraging behavior; algal bloom prediction; and species richness given energy constraints. Other case studies simulate the evolution of species and a means to project shifts in species ranges in response to a changing climate that includes competition and phenotypic plasticity. This introduction concludes by citing other uses of evolutionary computation and a review of the flexibility of the methods. For example, representing species' niche spaces subject to selective pressure allows studies on cladistics, the taxon cycle, neutral versus niche paradigms, fundamental versus realized niches, community structure and order of colonization, invasiveness, and responses to a changing climate.
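
    The evolutionary loop described above (score, cull, recombine, mutate, repeat) is compact enough to show directly. The sketch below is a generic version under simple assumptions; the real-valued representation, the operators and the sum-of-genes objective are placeholders for a real zoological objective such as those in the case studies.

        import random

        def evolve(fitness, genome_len=10, pop_size=30, generations=100,
                   cull=0.5, mut_rate=0.05):
            # Score the population, remove the poorest performers, refill by
            # recombining survivors with mutation, repeat until the stop.
            pop = [[random.random() for _ in range(genome_len)]
                   for _ in range(pop_size)]
            for _ in range(generations):
                pop.sort(key=fitness, reverse=True)
                survivors = pop[: int(pop_size * cull)]
                children = []
                while len(survivors) + len(children) < pop_size:
                    a, b = random.sample(survivors, 2)
                    child = [random.choice(pair) for pair in zip(a, b)]   # crossover
                    child = [g + random.gauss(0, 0.1) if random.random() < mut_rate
                             else g for g in child]                       # mutation
                    children.append(child)
                pop = survivors + children
            return max(pop, key=fitness)

        # Example objective: maximize the sum of genes (a stand-in for, e.g.,
        # an egg-shape or foraging objective from the case studies).
        best = evolve(fitness=sum)
        print(round(sum(best), 2))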

  2. Do Interviewers' Health Beliefs and Habits Modify Responses to Sensitive Questions? A study using Data Collected from Pregnant women by Means of Computer-assisted Telephone Interviews

    DEFF Research Database (Denmark)

    Andersen, Anne-Marie Nybo; Olsen, Jørn

    2002-01-01

    If interviewers' personal habits or attitudes influence respondents' answers to given questions, this may lead to bias, which should be taken into consideration when analyzing data. The authors examined a potential interviewer effect in a study of pregnant women in which exposure data were obtained through computer-assisted telephone interviews. The authors compared interviewer characteristics for 34 interviewers with the responses they obtained in 12,910 interviews carried out for the Danish National Birth Cohort Study. Response data on smoking and alcohol consumption in the first trimester of pregnancy were collected during the time period October 1, 1997-February 1, 1999. Overall, the authors found little evidence to suggest that interviewers' personal habits or attitudes toward smoking and alcohol consumption during pregnancy had consequences for the responses they obtained; neither did …

  3. Generating Computational Models for Serious Gaming

    NARCIS (Netherlands)

    Westera, Wim

    2018-01-01

    Many serious games include computational models that simulate dynamic systems. These models promote enhanced interaction and responsiveness. Under the social web paradigm more and more usable game authoring tools become available that enable prosumers to create their own games, but the inclusion of

  4. Response surface use in safety analyses

    International Nuclear Information System (INIS)

    Prosek, A.

    1999-01-01

    When thousands of complex computer code runs related to nuclear safety are needed for statistical analysis, a response surface is used to replace the computer code. The main purpose of the study was to develop and demonstrate a tool called the optimal statistical estimator (OSE), intended for response surface generation of complex and non-linear phenomena. The performance of the optimal statistical estimator was tested against the results of 59 different RELAP5/MOD3.2 code calculations of the small-break loss-of-coolant accident in a two-loop pressurized water reactor. The results showed that the OSE adequately predicted the response surface for the peak cladding temperature. Favourable characteristics of the OSE, such as monotonic behaviour between two neighbouring points and independence from the number of output parameters, suggest that the OSE can be used for response surface generation of any safety or system parameter in thermal-hydraulic safety analyses. (author)
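
    The surrogate idea is easy to show with an ordinary least-squares response surface: fit a cheap function to a handful of code runs, then sample it thousands of times for statistics. The sketch below uses a quadratic fit and an analytic stand-in for the code; the OSE method itself differs, and all numbers are illustrative.

        import numpy as np

        # Stand-in for expensive code runs: 59 (input, output) pairs, as in
        # the RELAP5 study; here the "code" is an analytic toy.
        rng = np.random.default_rng(2)
        X = rng.uniform(0, 1, (59, 2))                   # e.g. break size, power
        y = 900 + 300 * X[:, 0] - 150 * X[:, 1] ** 2 + rng.normal(0, 5, 59)

        def design(X):
            # Full quadratic basis in two inputs.
            x1, x2 = X[:, 0], X[:, 1]
            return np.column_stack([np.ones(len(X)), x1, x2,
                                    x1 * x2, x1 ** 2, x2 ** 2])

        beta, *_ = np.linalg.lstsq(design(X), y, rcond=None)

        # Cheap Monte Carlo on the surrogate instead of the code itself.
        samples = rng.uniform(0, 1, (100_000, 2))
        pct = np.percentile(design(samples) @ beta, 95)  # e.g. 95th-percentile PCT
        print(round(pct, 1))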

  5. Computer skills for the next generation of healthcare executives.

    Science.gov (United States)

    Côté, Murray J; Van Enyde, Donald F; DelliFraine, Jami L; Tucker, Stephen L

    2005-01-01

    Students beginning a career in healthcare administration must possess an array of professional and management skills in addition to a strong fundamental understanding of the field of healthcare administration. Proficient computer skills are a prime example of an essential management tool for healthcare administrators. However, it is unclear which computer skills are absolutely necessary for healthcare administrators and the extent of congruency between the computer skills possessed by new graduates and the needs of senior healthcare professionals. Our objectives in this research are to assess which computer skills are the most important to senior healthcare executives and recent healthcare administration graduates and examine the level of agreement between the two groups. Based on a survey of senior healthcare executives and graduate healthcare administration students, we identify a comprehensive and pragmatic array of computer skills and categorize them into four groups, according to their importance, for making recent health administration graduates valuable in the healthcare administration workplace. Traditional parametric hypothesis tests are used to assess congruency between responses of senior executives and of recent healthcare administration graduates. For each skill, responses of the two groups are averaged to create an overall ranking of the computer skills. Not surprisingly, both groups agreed on the importance of computer skills for recent healthcare administration graduates. In particular, computer skills such as word processing, graphics and presentation, using operating systems, creating and editing databases, spreadsheet analysis, using imported data, e-mail, using electronic bulletin boards, and downloading information were among the highest ranked computer skills necessary for recent graduates. However, there were statistically significant differences in perceptions between senior executives and healthcare administration students as to the extent

  6. IGMtransmission: Transmission curve computation

    Science.gov (United States)

    Harrison, Christopher M.; Meiksin, Avery; Stock, David

    2015-04-01

    IGMtransmission is a Java graphical user interface that implements Monte Carlo simulations to compute the corrections to colors of high-redshift galaxies due to intergalactic attenuation based on current models of the Intergalactic Medium. The effects of absorption due to neutral hydrogen are considered, with particular attention to the stochastic effects of Lyman Limit Systems. Attenuation curves are produced, as well as colors for a wide range of filter responses and model galaxy spectra. Photometric filters are included for the Hubble Space Telescope, the Keck telescope, the Mt. Palomar 200-inch, the SUBARU telescope and UKIRT; alternative filter response curves and spectra may be readily uploaded.

  7. Response matrix method for large LMFBR analysis

    International Nuclear Information System (INIS)

    King, M.J.

    1977-06-01

    The feasibility of using response matrix techniques for computational models of large LMFBRs is examined. Since finite-difference methods based on diffusion theory have generally found a place in fast-reactor codes, a brief review of their general matrix foundation is given first in order to contrast it to the general strategy of response matrix methods. Then, in order to present the general method of response matrix technique, two illustrative examples are given. Matrix algorithms arising in the application to large LMFBRs are discussed, and the potential of the response matrix method is explored for a variety of computational problems. Principal properties of the matrices involved are derived with a view to application of numerical methods of solution. The Jacobi iterative method as applied to the current-balance eigenvalue problem is discussed
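
    The current-balance eigenvalue problem mentioned above has the form J = (1/k) R J, with R the assembled response matrix and J the interface currents. The sketch below solves it by plain power iteration, a simple stand-in for the Jacobi iterative scheme discussed in the record; the random nonnegative R is not a real LMFBR matrix.

        import numpy as np

        def dominant_mode(R, tol=1e-10, max_iter=10_000):
            # Repeatedly apply the response matrix to the interface-current
            # vector and renormalize; k converges to the dominant eigenvalue.
            J = np.ones(R.shape[0])
            k = 1.0
            for _ in range(max_iter):
                J_new = R @ J
                k_new = np.linalg.norm(J_new) / np.linalg.norm(J)
                J_new /= np.linalg.norm(J_new)
                if abs(k_new - k) < tol:
                    return k_new, J_new
                J, k = J_new, k_new
            return k, J

        rng = np.random.default_rng(3)
        R = rng.random((20, 20)) * 0.1        # stand-in response matrix
        print(dominant_mode(R)[0])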

  8. Computer controlled drifting of Si(Li) detectors

    International Nuclear Information System (INIS)

    Landis, D.A.; Wong, Y.K.; Walton, J.T.; Goulding, F.S.

    1989-01-01

    A relatively inexpensive computer-controlled system for performing the drift process used in fabricating Si(Li) detectors is described. The system employs a small computer to monitor the leakage current, applied voltage and temperature on eight individual drift stations. The associated computer program initializes the drift process, monitors the drift progress and then terminates the drift when an operator set drift time has elapsed. The improved control of the drift with this system has been well demonstrated over the past three years in the fabrication of a variety of Si(Li) detectors. A few representative system responses to detector behavior during the drift process are described

  9. The Effects of Computer-Assisted Feedback Strategies in Technology Education: A Comparison of Learning Outcomes

    Science.gov (United States)

    Adams, Ruifang Hope; Strickland, Jane

    2012-01-01

    This study investigated the effects of computer-assisted feedback strategies that have been utilized by university students in a technology education curriculum. Specifically, the study examined the effectiveness of the computer-assisted feedback strategy "Knowledge of Response feedback" (KOR), and the "Knowledge of Correct Responses feedback"…

  10. Effect of different reading interfaces and conditions on the accommodation response

    Directory of Open Access Journals (Sweden)

    Xiao-Feng Wang

    2016-04-01

    Full Text Available AIM: To compare differences in accommodation response under various reading conditions, including a computer screen, a mobile phone screen and printed text, at different distances and brightness levels and under dynamic and static testing. METHODS: Thirty volunteer subjects with normal visual function were included. Reading targets on a computer screen, a mobile phone screen and paper were used, respectively. A Grand Seiko WAM 5500 infrared automatic refractometer was applied to measure the accommodation response. The influence of the different reading conditions on accommodation was compared using analysis of variance in SPSS 17.0. RESULTS: Accommodation lag under the computer screen with high brightness was 0.52±0.24D, that under paper was 0.73±0.28D, and that under the mobile phone was 0.72±0.29D. Accommodation lag under the high-brightness computer screen was less than that under the mobile phone or paper, and the differences were statistically significant. CONCLUSION: Accommodation lag under a computer screen with high brightness is relatively smaller than that under a mobile phone or paper. There is no significant difference between phone and paper. Within a certain range of computer-screen brightness, brightness has no effect on the accommodation response.

  11. Computer simulation of the hydroelastic response of a pressurized water reactor to a sudden depressurization

    International Nuclear Information System (INIS)

    Dienes, J.K.; Hirt, C.W.; Stein, L.R.

    1977-03-01

    A computer program is being developed to analyze the response of the core support barrel to a sudden loss of coolant in a pressurized water reactor. This program, SOLA-FLX, combines SOLA-DF, a two-dimensional, two-phase, hydrodynamic code with FLX, a finite-difference code that integrates the Timoshenko equations of elastic shell motion. The programs are coupled so that the shell motion determined by FLX is used as a boundary condition by SOLA. In turn, the pressure determined by SOLA is the forcing term that controls the shell motion. An axisymmetric version was first developed to provide a basis for comparing with a simple set of experiments and to serve as a test case for the more general, unsymmetric version. The unsymmetric version is currently under development. The report describes the hydrodynamic code, the symmetric shell code, the unsymmetric shell code, and the method of coupling. Test problems used to verify the shell codes and coupled codes are also reported. Work is continuing to verify both the symmetric and unsymmetric codes by making comparisons with experimental data and with theoretical test problems
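
    The coupling pattern described (FLX shell motion supplying the fluid boundary condition, SOLA pressure forcing the shell) can be sketched as a partitioned loop. The 1-DOF shell and the pressure-relaxation law below are drastic simplifications of the finite-difference fluid and Timoshenko shell solvers, chosen only to show the exchange.

        def coupled_step(p, w, v, dt, area=1.0, m=1.0, c=0.2, k=50.0, bulk=2.0e3):
            # Fluid pressure loads the shell; the shell's wall velocity feeds
            # back as a moving-boundary condition that relaxes the pressure.
            a = (p * area - c * v - k * w) / m   # shell acceleration from pressure
            v += a * dt                           # shell (wall) velocity
            w += v * dt                           # shell displacement
            p -= bulk * v * area * dt             # wall motion decompresses fluid
            return p, w, v

        p, w, v = 100.0, 0.0, 0.0                 # sudden depressurization load
        for _ in range(2000):
            p, w, v = coupled_step(p, w, v, dt=1e-4)
        print(round(p, 2), round(w, 5))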

  12. A Multiscale Computational Model of the Response of Swine Epidermis After Acute Irradiation

    Science.gov (United States)

    Hu, Shaowen; Cucinotta, Francis A.

    2012-01-01

    Radiation exposure from Solar Particle Events can lead to very high skin dose for astronauts on exploration missions outside the protection of the Earth's magnetic field [1]. The detrimental effects to human skin under such adverse conditions could be predicted by conducting terrestrial experiments on animal models. In this study we apply a computational approach to simulate the experimental data of the radiation response of swine epidermis, which closely resembles human epidermis [2]. Incorporating experimentally measured histological and cell kinetic parameters into a multiscale tissue modeling framework, we obtain results of population kinetics and proliferation index comparable to unirradiated and acutely irradiated swine experiments [3]. It is noted that the basal cell doubling time is 10 to 16 days in the intact population, but drops to 13.6 hr in the regenerating populations surviving irradiation. This complex 30-fold variation is proposed to be attributed to the shortening of the G1 phase duration. We investigate this radiation-induced effect by considering, at the sub-cellular level, the expression and signaling of TGF-beta, as it is recognized as a key regulatory factor of tissue formation and wound healing [4]. This integrated model will allow us to test the validity of various basic biological rules at the cellular level and sub-cellular mechanisms by qualitatively comparing simulation results with published research, and should lead to a fuller understanding of the pathophysiological effects of ionizing radiation on the skin.

  13. Computer-aided sperm analysis: a useful tool to evaluate patient's response to varicocelectomy.

    Science.gov (United States)

    Ariagno, Julia I; Mendeluk, Gabriela R; Furlan, María J; Sardi, M; Chenlo, P; Curi, Susana M; Pugliese, Mercedes N; Repetto, Herberto E; Cohen, Mariano

    2017-01-01

    Preoperative and postoperative sperm parameter values from infertile men with varicocele were analyzed by computer-aided sperm analysis (CASA) to assess whether sperm characteristics improved after varicocelectomy. Semen samples of men with proven fertility (n = 38) and men with varicocele-related infertility (n = 61) were also analyzed. Conventional semen analysis was performed according to WHO (2010) criteria and a CASA system was employed to assess kinetic parameters and sperm concentration. Seminal parameter values in the fertile group were far above those of the patients, either before or after surgery. No significant improvement in the percentage of normal sperm morphology (P = 0.10), sperm concentration (P = 0.52), total sperm count (P = 0.76), subjective motility (%) (P = 0.97) or kinematics (P = 0.30) was observed after varicocelectomy when all groups were compared. Neither was significant improvement found in the percentage of normal sperm morphology (P = 0.91), sperm concentration (P = 0.10), total sperm count (P = 0.89) or percentage motility (P = 0.77) after varicocelectomy in paired comparisons of preoperative and postoperative data. Analysis of paired samples revealed that the total sperm count (P = 0.01) and most sperm kinetic parameters: curvilinear velocity (P = 0.002), straight-line velocity (P = 0.0004), average path velocity (P = 0.0005), linearity (P = 0.02), and wobble (P = 0.006) improved after surgery. CASA offers the potential for accurate quantitative assessment of each patient's response to varicocelectomy.

  14. Implementing Computer Algebra Enabled Questions for the Assessment and Learning of Mathematics

    Science.gov (United States)

    Sangwin, Christopher J.; Naismith, Laura

    2008-01-01

    We present principles for the design of an online system to support computer algebra enabled questions for use within the teaching and learning of mathematics in higher education. The introduction of a computer algebra system (CAS) into a computer aided assessment (CAA) system affords sophisticated response processing of student provided answers.…

  15. On the computation of FRS

    International Nuclear Information System (INIS)

    Ciucchi, W.; Lazzeri, L.; Olivieri, M.

    1983-01-01

    The problem of the computation of FRS (floor response spectra) is considered and discussed; a procedure is used that defines the dynamic input by its power spectral density, which is computed by a numerical iterative method. Basically three ranges are found: a) light equipment, i.e. cases in which the feedback action of the equipment on the structure is negligible; b) intermediate range, i.e. cases in which the modes of the structure are not changed, but their amplitudes can be changed by the equipment action; c) heavy equipment, i.e. cases in which the modes of the structure are changed by the equipment reactions. (orig./HP)

  16. Children, computer exposure and musculoskeletal outcomes: the development of pathway models for school and home computer-related musculoskeletal outcomes.

    Science.gov (United States)

    Harris, Courtenay; Straker, Leon; Pollock, Clare; Smith, Anne

    2015-01-01

    Children's computer use is rapidly growing, together with reports of related musculoskeletal outcomes. Models and theories of adult-related risk factors demonstrate multivariate risk factors associated with computer use. Children's use of computers is different from adults' computer use at work. This study developed and tested a child-specific model demonstrating multivariate relationships between musculoskeletal outcomes, computer exposure and child factors. Using pathway modelling, factors such as gender, age, television exposure, computer anxiety, sustained attention (flow), socio-economic status and somatic complaints (headache and stomach pain) were found to have effects on children's reports of musculoskeletal symptoms. The potential for children's computer exposure to follow a dose-response relationship was also evident. Developing a child-related model can assist in understanding risk factors for children's computer use and support the development of recommendations to encourage children to use this valuable resource in educational, recreational and communication environments in a safe and productive manner. Computer use is an important part of children's school and home life. Application of this developed model, which encapsulates related risk factors, enables practitioners, researchers, teachers and parents to develop strategies that assist young people to use information technology for school, home and leisure in a safe and productive manner.

  17. A note on probabilistic computation of earthquake response spectrum amplitudes

    International Nuclear Information System (INIS)

    Anderson, J.G.; Trifunac, M.D.

    1979-01-01

    This paper analyzes a method for computation of Pseudo Relative Velocity (PSV) spectrum and Absolute Acceleration (SA) spectrum so that the amplitudes and the shapes of these spectra reflect the geometrical characteristics of the seismic environment of the site. The estimated spectra also incorporate the geologic characteristics at the site, direction of ground motion and the probability of exceeding these motions. An example of applying this method in a realistic setting is presented and the uncertainties of the results are discussed. (Auth.)

  18. Modeling the Structural Response of Reinforced Glass Beams using an SLA Scheme

    NARCIS (Netherlands)

    Louter, P.C.; Graaf, van de Anne; Rots, J.G.; Bos, Freek; Louter, Pieter Christiaan; Veer, Fred

    2010-01-01

    This paper investigates whether a novel computational sequentially linear analysis (SLA) technique, which is especially developed for modeling brittle material response, is applicable for modeling the structural response of metal reinforced glass beams. To do so, computational SLA results are

  19. Structural optimization for nonlinear dynamic response

    DEFF Research Database (Denmark)

    Dou, Suguang; Strachan, B. Scott; Shaw, Steven W.

    2015-01-01

    Much is known about the nonlinear resonant response of mechanical systems, but methods for the systematic design of structures that optimize aspects of these responses have received little attention. Progress in this area is particularly important in the area of micro-systems, where nonlinear resonant behaviour is being used for a variety of applications in sensing and signal conditioning. In this work, we describe a computational method that provides a systematic means for manipulating and optimizing features of nonlinear resonant responses of mechanical structures that are described by a single vibrating mode, or by a pair of internally resonant modes. The approach combines techniques from nonlinear dynamics, computational mechanics and optimization, and it allows one to relate the geometric and material properties of structural elements to terms in the normal form for a given resonance …

  20. Extension of a nonlinear systems theory to general-frequency unsteady transonic aerodynamic responses

    Science.gov (United States)

    Silva, Walter A.

    1993-01-01

    A methodology for modeling nonlinear unsteady aerodynamic responses, for subsequent use in aeroservoelastic analysis and design, using the Volterra-Wiener theory of nonlinear systems is presented. The methodology is extended to predict nonlinear unsteady aerodynamic responses of arbitrary frequency. The Volterra-Wiener theory uses multidimensional convolution integrals to predict the response of nonlinear systems to arbitrary inputs. The CAP-TSD (Computational Aeroelasticity Program - Transonic Small Disturbance) code is used to generate linear and nonlinear unit impulse responses that correspond to each of the integrals for a rectangular wing with a NACA 0012 section with pitch and plunge degrees of freedom. The computed kernels then are used to predict linear and nonlinear unsteady aerodynamic responses via convolution and compared to responses obtained using the CAP-TSD code directly. The results indicate that the approach can be used to predict linear unsteady aerodynamic responses exactly for any input amplitude or frequency at a significant cost savings. Convolution of the nonlinear terms results in nonlinear unsteady aerodynamic responses that compare reasonably well with those computed using the CAP-TSD code directly but at significant computational cost savings.
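
    Reusing identified kernels for arbitrary inputs amounts to a pair of convolutions. The sketch below evaluates a truncated Volterra series with a first- and second-order kernel; the kernels are toy exponentials, not CAP-TSD impulse responses.

        import numpy as np

        def volterra_response(u, h1, h2=None):
            # Truncated Volterra series:
            #   y[n] = sum_k h1[k] u[n-k]
            #        + sum_{k1,k2} h2[k1,k2] u[n-k1] u[n-k2]
            n, m = len(u), len(h1)
            y = np.convolve(u, h1)[:n]            # linear (first-order) term
            if h2 is not None:
                for i in range(n):                # quadratic (second-order) term
                    for k1 in range(min(m, i + 1)):
                        for k2 in range(min(m, i + 1)):
                            y[i] += h2[k1, k2] * u[i - k1] * u[i - k2]
            return y

        u = np.sin(0.2 * np.arange(100))          # arbitrary input motion
        h1 = np.exp(-0.3 * np.arange(20))         # toy linear kernel
        h2 = 0.01 * np.outer(h1, h1)              # toy quadratic kernel
        print(volterra_response(u, h1, h2)[:5])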

  1. Inferring biological functions of guanylyl cyclases with computational methods

    KAUST Repository

    Alquraishi, May Majed; Meier, Stuart Kurt

    2013-01-01

    A number of studies have shown that functionally related genes are often co-expressed and that computational based co-expression analysis can be used to accurately identify functional relationships between genes and by inference, their encoded proteins. Here we describe how a computational based co-expression analysis can be used to link the function of a specific gene of interest to a defined cellular response. Using a worked example we demonstrate how this methodology is used to link the function of the Arabidopsis Wall-Associated Kinase-Like 10 gene, which encodes a functional guanylyl cyclase, to host responses to pathogens. © Springer Science+Business Media New York 2013.
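
    Linking a query gene to a cellular response by co-expression reduces to correlating expression profiles and ranking the neighbours. A minimal version is sketched below; the expression matrix, the gene names (including the "WAKL10" label) and the demo data are stand-ins for real Arabidopsis data.

        import numpy as np

        def coexpression_ranks(expr, gene_names, query="WAKL10"):
            # Rank genes by Pearson correlation with the query gene's profile
            # across conditions; high-ranking partners suggest a shared role.
            idx = gene_names.index(query)
            profiles = expr - expr.mean(axis=1, keepdims=True)
            profiles /= np.linalg.norm(profiles, axis=1, keepdims=True)
            r = profiles @ profiles[idx]
            order = np.argsort(-r)
            return [(gene_names[i], round(float(r[i]), 3))
                    for i in order if i != idx]

        rng = np.random.default_rng(4)
        expr = rng.normal(size=(5, 30))                     # 5 genes x 30 conditions
        expr[2] = expr[0] * 0.9 + rng.normal(0, 0.3, 30)    # a correlated partner
        names = ["WAKL10", "geneA", "geneB", "geneC", "geneD"]
        print(coexpression_ranks(expr, names)[:2])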

  3. MoCog1: A computer simulation of recognition-primed human decision making

    Science.gov (United States)

    Gevarter, William B.

    1991-01-01

    The results of the first stage of a research effort to develop a 'sophisticated' computer model of human cognitive behavior are described. Most human decision making is an experience-based, relatively straight-forward, largely automatic response to internal goals and drives, utilizing cues and opportunities perceived from the current environment. The development of the architecture and computer program (MoCog1) associated with such 'recognition-primed' decision making is discussed. The resultant computer program was successfully utilized as a vehicle to simulate earlier findings that relate how an individual's implicit theories orient the individual toward particular goals, with resultant cognitions, affects, and behavior in response to their environment.

  4. Seismic Safety Margins Research Program (Phase I). Project IV. Structural building response; Structural Building Response Review

    International Nuclear Information System (INIS)

    Healey, J.J.; Wu, S.T.; Murga, M.

    1980-02-01

    As part of the Phase I effort of the Seismic Safety Margins Research Program (SSMRP) being performed by the University of California Lawrence Livermore Laboratory for the US Nuclear Regulatory Commission, the basic objective of Subtask IV.1 (Structural Building Response Review) is to review and summarize current methods and data pertaining to seismic response calculations particularly as they relate to the objectives of the SSMRP. This material forms one component in the development of the overall computational methodology involving state of the art computations including explicit consideration of uncertainty and aimed at ultimately deriving estimates of the probability of radioactive releases due to seismic effects on nuclear power plant facilities

  5. Spectrum of tablet computer use by medical students and residents at an academic medical center.

    Science.gov (United States)

    Robinson, Robert

    2015-01-01

    Introduction. The value of tablet computer use in medical education is an area of considerable interest, with preliminary investigations showing that the majority of medical trainees feel that tablet computers added value to the curriculum. This study investigated potential differences in tablet computer use between medical students and resident physicians. Materials & Methods. Data collection for this survey was accomplished with an anonymous online questionnaire shared with the medical students and residents at Southern Illinois University School of Medicine (SIU-SOM) in July and August of 2012. Results. There were 76 medical student responses (26% response rate) and 66 resident/fellow responses to this survey (21% response rate). Residents/fellows were more likely to use tablet computers several times daily than medical students (32% vs. 20%, p = 0.035). The most common reported uses were for accessing medical reference applications (46%), e-Books (45%), and board study (32%). Residents were more likely than students to use a tablet computer to access an electronic medical record (41% vs. 21%, p = 0.010), review radiology images (27% vs. 12%, p = 0.019), and enter patient care orders (26% vs. 3%, p < 0.001). Discussion. This study shows a high prevalence and frequency of tablet computer use among physicians in training at this academic medical center. Most residents and students use tablet computers to access medical references, e-Books, and to study for board exams. Residents were more likely to use tablet computers to complete clinical tasks. Conclusions. Tablet computer use among medical students and resident physicians was common in this survey. All learners used tablet computers for point of care references and board study. Resident physicians were more likely to use tablet computers to access the EMR, enter patient care orders, and review radiology studies. This difference is likely due to the differing educational and professional demands placed on resident physicians. Further study is needed to better understand how tablet computers and other mobile devices may assist in medical education and patient care.

  6. Collateral circulation on perfusion-computed tomography-source images predicts the response to stroke intravenous thrombolysis.

    Science.gov (United States)

    Calleja, A I; Cortijo, E; García-Bermejo, P; Gómez, R D; Pérez-Fernández, S; Del Monte, J M; Muñoz, M F; Fernández-Herranz, R; Arenillas, J F

    2013-05-01

    Perfusion-computed tomography-source images (PCT-SI) may allow a dynamic assessment of leptomeningeal collateral arteries (LMC) filling and emptying in middle cerebral artery (MCA) ischaemic stroke. We described a regional LMC scale on PCT-SI and hypothesized that a higher collateral score would predict a better response to intravenous (iv) thrombolysis. We studied consecutive ischaemic stroke patients with an acute MCA occlusion documented by transcranial Doppler/transcranial color-coded duplex, treated with iv thrombolysis, who underwent PCT prior to treatment. Readers evaluated PCT-SI in a blinded fashion to assess LMC within the hypoperfused MCA territory. LMC were scored as follows: 0, absence of vessels; 1, collateral supply filling ≤50%; 2, between >50% and <100%; 3, equal or more prominent when compared with the unaffected hemisphere. The scale was divided into good (scores 2-3) vs. poor (scores 0-1) collaterals. The predetermined primary end-point was a good 3-month functional outcome, while early neurological recovery, transcranial duplex-assessed 24-h MCA recanalization, 24-h hypodensity volume and hemorrhagic transformation were considered secondary end-points. Fifty-four patients were included (55.5% women, median NIHSS 10), and 4-13-23-14 patients had LMC scores (LMCs) of 0-1-2-3, respectively. The probability of a good long-term outcome increased gradually with increasing LMCs: (0) 0%; (1) 15.4%; (2) 65.2%; (3) 64.3%, P = 0.004. Good LMCs were independently associated with a good outcome [OR 21.02 (95% CI 2.23-197.75), P = 0.008]. Patients with good LMCs had better early neurological recovery (P = 0.001), smaller hypodensity volumes (P < 0.001) and a clear trend towards a higher recanalization rate. A higher degree of LMC assessed by PCT-SI predicts good response to iv thrombolysis in MCA ischaemic stroke patients. © 2012 The Author(s) European Journal of Neurology © 2012 EFNS.

  7. AEROS: a real-time emergency response system for atmospheric releases of toxic material

    International Nuclear Information System (INIS)

    Nasstrom, J.S.; Greenly, G.D.

    1986-01-01

    The Atmospheric Release Advisory Capability (ARAC) at the Lawrence Livermore National Laboratory has developed a sophisticated computer-based real-time emergency response system for radiotoxic releases into the atmosphere. The ARAC Emergency Response Operating System (AEROS) has a centralized computer facility linked to remote site computers, meteorological towers, and meteorological data sources. The system supports certain fixed sites, but has the ability to respond to accidents at arbitrary locations. Product quality and response time are optimized by using complex three-dimensional dispersion models; extensive on-line data bases; automated data processing; and an efficient user interface, employing graphical computer displays and computer-displayed forms. Upon notification, the system automatically initiates a response to an emergency and proceeds through preliminary calculations, automatically processing accident information, meteorological data, and model parameters. The model calculations incorporate mass-consistent three-dimensional wind fields, terrain effects, and particle-in-cell diffusion. Model products are color images of dose or deposition contours overlaid on a base map

  8. Characteristics of the TRISTAN control computer network

    International Nuclear Information System (INIS)

    Kurokawa, Shinichi; Akiyama, Atsuyoshi; Katoh, Tadahiko; Kikutani, Eiji; Koiso, Haruyo; Oide, Katsunobu; Shinomoto, Manabu; Kurihara, Michio; Abe, Kenichi

    1986-01-01

    Twenty-four minicomputers forming an N-to-N token-ring network control the TRISTAN accelerator complex. The computers are linked by optical fiber cables with 10 Mbps transmission speed. The software system is based on NODAL, a multicomputer interpretive language developed at the CERN SPS. The high-level services offered to the users of the network are remote execution by the EXEC, EXEC-P and IMEX commands of NODAL and uniform file access throughout the system. The network software was designed to achieve the fast response of the EXEC command. The performance of the network is also reported. Tasks that overload the minicomputers are processed on the KEK central computers. One minicomputer in the network serves as a gateway to KEKNET, which connects the minicomputer network and the central computers. The communication with the central computers is managed within the framework of the KEK NODAL system. NODAL programs communicate with the central computers calling NODAL functions; functions for exchanging data between a data set on the central computers and a NODAL variable, submitting a batch job to the central computers, checking the status of the submitted job, etc. are prepared. (orig.)

  9. Symbolic computation of nonlinear wave interactions on MACSYMA

    International Nuclear Information System (INIS)

    Bers, A.; Kulp, J.L.; Karney, C.F.F.

    1976-01-01

    In this paper the use of a large symbolic computation system - MACSYMA - in determining approximate analytic expressions for the nonlinear coupling of waves in an anisotropic plasma is described. MACSYMA was used to implement the solution of the nonlinear partial differential equations of a fluid plasma model by perturbation expansions and subsequent iterative analytic computations. By interacting with the details of the symbolic computation, the physical processes responsible for particular nonlinear wave interactions could be uncovered and appropriate approximations introduced so as to simplify the final analytic result. Details of the MACSYMA system and its use are discussed and illustrated. (Auth.)

  10. Computation for LHC experiments: a worldwide computing grid; Le calcul scientifique des experiences LHC: une grille de production mondiale

    Energy Technology Data Exchange (ETDEWEB)

    Fairouz, Malek [Universite Joseph-Fourier, LPSC, CNRS-IN2P3, Grenoble I, 38 (France)

    2010-08-15

    In normal operating conditions the LHC detectors are expected to record about 10^10 collisions each year. The processing of all the consequent experimental data is a real computing challenge in terms of equipment, software and organization: it requires sustaining data flows of a few 10^9 bytes per second and a recording capacity of a few tens of 10^15 bytes each year. In order to meet this challenge a computing network implying the dispatch and sharing of tasks has been set up. The W-LCG grid (Worldwide LHC Computing Grid) is made up of 4 tiers. Tier 0 is the computer centre at CERN; it is responsible for collecting and recording the raw data from the LHC detectors and dispatching it to the 11 Tier 1 centres. A Tier 1 centre is typically a national centre; it is responsible for making a copy of the raw data, for processing it in order to recover relevant data with a physical meaning, and for transferring the results to the 150 Tier 2 centres. A Tier 2 centre is at the level of an institute or laboratory; it is in charge of the final analysis of the data and of the production of simulations. Tier 3 centres are at the level of the laboratories; they provide a complementary and local resource to Tier 2 in terms of data analysis. (A.C.)

  11. Detection of advance item knowledge using response times in computer adaptive testing

    NARCIS (Netherlands)

    Meijer, R.R.; Sotaridona, Leonardo

    2006-01-01

    We propose a new method for detecting item preknowledge in a CAT based on an estimate of “effective response time” for each item. Effective response time is defined as the time required for an individual examinee to answer an item correctly. An unusually short response time relative to the expected

  12. RC Circuits: Some Computer-Interfaced Experiments.

    Science.gov (United States)

    Jolly, Pratibha; Verma, Mallika

    1994-01-01

    Describes a simple computer-interface experiment for recording the response of an RC network to an arbitrary input excitation. The setup is used to pose a variety of open-ended investigations in network modeling by varying the initial conditions, input signal waveform, and the circuit topology. (DDR)

  14. Correlated responses in tissue weights measured in vivo by computer tomography in Dorset Down sheep selected for lean tissue growth

    International Nuclear Information System (INIS)

    Nsoso, S.J.; Young, M.J.; Beatson, P.R.

    2003-01-01

    The aim of this study was to estimate correlated responses in lean, fat and bone weights in vivo in Dorset Down sheep selected for lean tissue growth. Over the period 1986-1992 inclusive, the lean tissue growth line had been selected using two economic indices for an increased aggregate breeding value incorporating predicted lean and fat weights with positive and negative economic weightings, respectively. The control line was selected for no change in lean tissue growth each year. Animals were born and run on pasture all year round. X-ray computer tomography was used to estimate the weights of lean, fat and bone in vivo in the 1994-born sheep, aged 265-274 days and selected randomly into 12 rams and 12 ewes from the selected line and 10 rams and 9 ewes from the control line. The lean tissue growth line had significantly greater responses in lean weight (+0.65 ± 0.10 kg) and lean percentage (+1.19 ± 0.17%) and significantly lower fat weight (-0.36 ± 0.08 kg) and fat percentage (-1.88 ± 0.20%) compared to the control line. There was a significant increase in bone weight (+0.27 ± 0.03 kg) and bone percentage (+0.69 ± 0.09%) in the lean tissue growth line compared to the control line. Responses differed significantly between sexes of the lean tissue growth line, rams having a greater response in weight of lean (+1.22 ± 0.20 vs. +0.08 ± 0.22 kg) and bone (+0.45 ± 0.06 vs. +0.09 ± 0.07 kg), and a smaller response in weight of fat (-0.03 ± 0.15 vs. -0.70 ± 0.16 kg) than the ewes. Selection led to significant changes in lean (increase) and fat (decrease) weights, while bone weight also increased. Although responses in the lean tissue growth line differed significantly between sexes, there were confounding factors due to differences in management and a lack of comparison at an equal stage of development. Therefore, to assess real genetic differences, further studies should be conducted taking these factors into consideration.

  15. Data input from an analog-to-digital converter into the M-6000 computer

    International Nuclear Information System (INIS)

    Kalashnikov, A.M.; Sheremet'ev, A.K.

    1978-01-01

    A device for spectrometric data input from the ADC-4096 into the memory of the M-6000 computer operating in the information storage regime is described. The input device, built on integrated circuits, matches the signal levels of the fast-response analog-to-digital converter and the computer with the help of resistors and inverters. In addition, the input device forms a strobe to trigger an increment channel used to record information into the computer memory. The use of the input device removes the need for intermediate information storage in the analyzer memory and ensures fast response of the devices.

  16. Primary pulmonary lymphoma-role of fluoro-deoxy-glucose positron emission tomography-computed tomography in the initial staging and evaluating response to treatment - case reports and review of literature

    International Nuclear Information System (INIS)

    Agarwal, Krishan Kant; Dhanapathi, Halanaik; Nazar, Aftab Hasan; Kumar, Rakesh

    2016-01-01

    Primary pulmonary lymphoma (PPL) is an uncommon entity of non-Hodgkin lymphoma, which accounts for <1% of all cases of lymphoma. We present two rare cases of PPL of diffuse large B-cell type, which underwent fluorine-18 fluoro-deoxy-glucose positron emission tomography-computed tomography for initial staging and response evaluation after chemotherapy.

  17. Reducing Conservatism of Analytic Transient Response Bounds via Shaping Filters

    Science.gov (United States)

    Kwan, Aiyueh; Bedrossian, Nazareth; Jan, Jiann-Woei; Grigoriadis, Karolos; Hua, Tuyen (Technical Monitor)

    1999-01-01

    Recent results show that the peak transient response of a linear system to bounded-energy inputs can be computed using the energy-to-peak gain of the system. However, the analytically computed peak response bound can be conservative for a class of bounded-energy signals, specifically pulse trains generated from jet firings encountered in space vehicles. In this paper, shaping filters are proposed as a methodology to reduce the conservatism of analytic peak response bounds. This methodology was applied to a realistic Space Station assembly operation subject to jet firings. The results indicate that shaping filters indeed reduce the predicted peak response bounds.
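
    For a stable linear system x' = Ax + Bw, z = Cx, the energy-to-peak gain referred to here can be computed as sqrt(lambda_max(C P C^T)), where P is the controllability Gramian. The sketch below shows only this gain computation; the shaping-filter step, which would augment the input channel with filter dynamics before recomputing the gain, is omitted, and the example system is invented.

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov, eigvalsh

def energy_to_peak_gain(A, B, C):
    """Energy-to-peak (L2-to-Linf) gain of a stable LTI system
    x' = Ax + Bw, z = Cx: sqrt(lambda_max(C P C^T)), where the
    controllability Gramian P solves A P + P A^T + B B^T = 0."""
    P = solve_continuous_lyapunov(A, -B @ B.T)
    return np.sqrt(eigvalsh(C @ P @ C.T).max())

# Lightly damped oscillator: peak |z(t)| <= gain * sqrt(input energy).
A = np.array([[0.0, 1.0], [-1.0, -0.05]])
B = np.array([[0.0], [1.0]])
C = np.array([[1.0, 0.0]])
print(energy_to_peak_gain(A, B, C))
```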

  18. Opinions on Computing Education in Korean K-12 System: Higher Education Perspective

    Science.gov (United States)

    Kim, Dae-Kyoo; Jeong, Dongwon; Lu, Lunjin; Debnath, Debatosh; Ming, Hua

    2015-01-01

    The need for computing education in the K-12 curriculum has grown globally. The Republic of Korea is not an exception. In response to the need, the Korean Ministry of Education has announced an outline for software-centric computing education in the K-12 system, which aims at enhancing the current computing education with software emphasis. In…

  19. High fidelity computational characterization of the mechanical response of thermally aged polycarbonate

    Science.gov (United States)

    Zhang, Zesheng; Zhang, Lili; Jasa, John; Li, Wenlong; Gazonas, George; Negahban, Mehrdad

    2017-07-01

    A representative all-atom molecular dynamics (MD) system of polycarbonate (PC) is built and conditioned to capture and predict the behaviour of PC in response to a broad range of thermo-mechanical loadings for various degrees of thermal aging. The PC system is constructed to have a distribution of molecular weights comparable to a widely used commercial PC (LEXAN 9034), and thermally conditioned to produce models for aged and unaged PC. The MD responses of these models are evaluated through comparisons to existing experimental results carried out at much lower loading rates, but over a broad range of temperatures and loading modes. These experiments include monotonic extension/compression/shear, unilaterally and bilaterally confined compression, and load reversal during shear. The MD simulations show both qualitative and quantitative similarity with the experimental response. The quantitative similarity is evaluated by comparing the dilatational response under bilaterally confined compression, the shear flow viscosity and the equivalent yield stress. The consistency of the in silico response with real laboratory experiments strongly suggests that the current PC models are physically and mechanically relevant and can potentially be used to investigate the thermo-mechanical response to loading conditions that would not easily be realizable in the laboratory. These MD models may provide valuable insight into the molecular sources of certain observations, and could offer new perspectives on how to develop constitutive models grounded in a better understanding of the response of PC under complex loadings. To this latter end, the models are used to predict the response of PC to complex loading modes that would normally be difficult to apply or that involve characteristics that would be difficult to measure. These include the responses of unaged and aged PC to unilaterally confined extension/compression, cyclic uniaxial/shear loadings, and saw-tooth extension/compression/shear.

  20. An Idle-State Detection Algorithm for SSVEP-Based Brain-Computer Interfaces Using a Maximum Evoked Response Spatial Filter.

    Science.gov (United States)

    Zhang, Dan; Huang, Bisheng; Wu, Wei; Li, Siliang

    2015-11-01

    Although accurate recognition of the idle state is essential for the application of brain-computer interfaces (BCIs) in real-world situations, it remains a challenging task due to the variability of the idle state. In this study, a novel algorithm was proposed for idle state detection in a steady-state visual evoked potential (SSVEP)-based BCI. The proposed algorithm aims to solve the idle state detection problem by constructing a better model of the control states. For feature extraction, a maximum evoked response (MER) spatial filter was developed to extract neurophysiologically plausible SSVEP responses, by finding the combination of multi-channel electroencephalogram (EEG) signals that maximized the evoked responses while suppressing the unrelated background EEG. The extracted SSVEP responses at the frequencies of both the attended and the unattended stimuli were then used to form feature vectors, and a series of binary classifiers for recognition of each control state and the idle state were constructed. EEG data from nine subjects in a three-target SSVEP BCI experiment with a variety of idle state conditions were used to evaluate the proposed algorithm. Compared to the most popular canonical correlation analysis-based algorithm and the conventional power spectrum-based algorithm, the proposed algorithm outperformed them by achieving an offline control state classification accuracy of 88.0 ± 11.1% and idle state false positive rates (FPRs) ranging from 7.4 ± 5.6% to 14.2 ± 10.1%, depending on the specific idle state conditions. Moreover, the online simulation reported BCI performance close to practical use: 22.0 ± 2.9 out of the 24 control commands were correctly recognized, and FPRs as low as approximately 0.5 event/min in the idle state conditions with eyes open and 0.05 event/min in the idle state condition with eyes closed were achieved. These results demonstrate the potential of the proposed algorithm for implementing practical SSVEP BCI systems.
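
    The paper's exact MER construction is not reproduced here, but a generic max-SNR spatial filter of the same flavour solves a generalized eigenvalue problem: maximize the ratio of evoked power to background power across channels. A sketch on synthetic 8-channel data (all signal parameters below are invented):

```python
import numpy as np
from scipy.linalg import eigh

def max_snr_spatial_filter(evoked_cov, background_cov):
    """Leading generalized eigenvector of (S, N): the channel weighting w
    maximizing (w' S w) / (w' N w), i.e. evoked over background power."""
    vals, vecs = eigh(evoked_cov, background_cov)
    return vecs[:, -1]               # eigh sorts eigenvalues ascending

rng = np.random.default_rng(0)
t = np.arange(500) / 250.0                       # 2 s of EEG at 250 Hz
ssvep = np.sin(2 * np.pi * 12 * t)               # 12 Hz flicker response
mix = np.zeros((8, 1))
mix[:3, 0] = [1.0, 0.7, 0.4]                     # response on 3 channels
trials = rng.standard_normal((20, 8, 500)) + mix * ssvep

avg = trials.mean(axis=0)                        # evoked part survives averaging
S = avg @ avg.T / avg.shape[1]                   # evoked covariance
resid = trials - avg
N = sum(r @ r.T for r in resid) / (20 * 500)     # background covariance
w = max_snr_spatial_filter(S, N)
print((w @ avg) @ ssvep / 500)                   # filtered signal tracks the SSVEP
```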

  1. Abnormal response to mental stress in patients with Takotsubo cardiomyopathy detected by gated single photon emission computed tomography

    International Nuclear Information System (INIS)

    Sciagra, Roberto; Genovese, Sabrina; Pupi, Alberto; Parodi, Guido; Bellandi, Benedetta; Antoniucci, David; Del Pace, Stefano; Zampini, Linda; Gensini, Gian Franco

    2010-01-01

    Persistent abnormalities are usually not detected in patients with Takotsubo cardiomyopathy (TTC). Since sympathetically mediated myocardial damage has been proposed as a causative mechanism of TTC, we explored whether mental stress could evoke abnormalities in these patients. One month after an acute event, 22 patients fulfilling all TTC diagnostic criteria and 11 controls underwent resting and mental stress gated single photon emission computed tomography (SPECT). Perfusion, wall motion, transient ischaemic dilation (TID) and left ventricular (LV) ejection fraction (EF) were evaluated. None of the controls showed stress-induced abnormalities. Mental stress evoked regional changes (perfusion defects and/or wall motion abnormality) in 16 TTC subjects and global abnormalities (LVEF fall >5% and/or TID >1.10) in 13; 3 had a completely negative response. TID, delta LVEF and delta wall motion score were significantly different in TTC vs control patients: 1.08 ± 0.20 vs 0.95 ± 0.11 (p < 0.05), -1.7 ± 6% vs 4 ± 5% (p < 0.02) and 2.5 (0, 4.25) vs 0 (0, 0) (p < 0.002), respectively. Mental stress may evoke regional and/or global abnormalities in most TTC patients. The abnormal response to mental stress supports the role of sympathetic stimulation in TTC. Mental stress could thus be helpful for TTC evaluation. (orig.)

  2. Computer assisted procedure maintenance

    International Nuclear Information System (INIS)

    Bisio, R.; Hulsund, J. E.; Nilsen, S.

    2004-04-01

    The maintenance of operating procedures in an NPP is a tedious and complicated task. Throughout their whole life cycle the procedures are dynamic, 'living' documents. Several aspects of a procedure must be considered in a revision process, and its pertinent details and attributes must be checked. An organizational structure must be created and responsibilities allotted for drafting, revising, reviewing and publishing procedures. Powerful computer technology now available provides solutions for document management and the computerisation of procedures; these solutions can also support the maintenance of procedures. Not all parts of the procedure life cycle are equally amenable to computerized support. This report looks at the procedure life cycle in today's NPPs and discusses the possibilities associated with the introduction of computer technology to assist the maintenance of procedures. (Author)

  3. Radiogenomics and radiotherapy response modeling

    Science.gov (United States)

    El Naqa, Issam; Kerns, Sarah L.; Coates, James; Luo, Yi; Speers, Corey; West, Catharine M. L.; Rosenstein, Barry S.; Ten Haken, Randall K.

    2017-08-01

    Advances in patient-specific information and biotechnology have contributed to a new era of computational medicine. Radiogenomics has emerged as a new field that investigates the role of genetics in treatment response to radiation therapy. Radiation oncology is currently attempting to embrace these recent advances and add to its rich history by maintaining its prominent role as a quantitative leader in oncologic response modeling. Here, we provide an overview of radiogenomics starting with genotyping, data aggregation, and application of different modeling approaches based on modifying traditional radiobiological methods or application of advanced machine learning techniques. We highlight the current status and potential for this new field to reshape the landscape of outcome modeling in radiotherapy and drive future advances in computational oncology.

  4. Computing level-impulse responses of log-specified VAR systems

    NARCIS (Netherlands)

    Wieringa, J.E.; Horvath, C.

    2005-01-01

    Impulse response functions (IRFs) are often used to analyze the dynamic behavior of a vector autoregressive (VAR) system. In many applications of VAR modelling, the variables are log-transformed before the model is estimated. If this is the case, the results of the IRFs do not have a direct
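
    The core issue can be illustrated with a toy back-transformation: an impulse response of delta log-points moves the level of a variable from Y0 to Y0*exp(delta), so the level response depends on the baseline. The sketch below shows only this naive transformation; the paper's full treatment (e.g., any lognormal mean correction based on forecast-error variances) is not reproduced, and the numbers are invented.

```python
import numpy as np

def level_irf(log_irf, baseline):
    """Convert an impulse response estimated on log-transformed data into
    an (approximate) response in the original levels of the variable:
    a log-IRF of delta implies the level moves from Y0 to Y0*exp(delta)."""
    return baseline * (np.exp(np.asarray(log_irf)) - 1.0)

# A shock raising log sales by 0.05 on impact, decaying geometrically:
log_irf = 0.05 * 0.6 ** np.arange(8)
print(level_irf(log_irf, baseline=2000.0))  # units sold, not log-points
```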

  5. Automatic computation of radioimmunoassay data

    International Nuclear Information System (INIS)

    Toyota, Takayoshi; Kudo, Mikihiko; Abe, Kanji; Kawamata, Fumiaki; Uehata, Shigeru.

    1975-01-01

    Radioimmunoassay provided dose-response curves which showed linearity with the use of the logistic transformation (Rodbard). This transformation, applicable to radioimmunoassay, should be useful for the computer processing of insulin and C-peptide assays. In the present studies, standard curves were analysed by testing the fit of analytic functions to radioimmunoassays of insulin and C-peptide. A program for use in combination with the double antibody technique was made by Dr. Kawamata. This approach proved useful for the automatic computation of data derived from double antibody assays of insulin and C-peptide. Automatic corrected calculation of insulin radioimmunoassay data was found to be satisfactory. (auth.)
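
    Rodbard-style standard-curve fitting is commonly done with a four-parameter logistic; a sketch under that assumption (the dose and count values are invented, and this is not the authors' program):

```python
import numpy as np
from scipy.optimize import curve_fit

def four_pl(x, a, b, c, d):
    """Four-parameter logistic: response (bound counts) at dose x."""
    return d + (a - d) / (1.0 + (x / c) ** b)

def inverse_four_pl(y, a, b, c, d):
    """Dose that produces response y: reads unknowns off the curve."""
    return c * ((a - d) / (y - d) - 1.0) ** (1.0 / b)

dose = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0])        # standard doses
counts = np.array([9800, 9200, 7400, 4600, 2100, 900])    # bound counts
params, _ = curve_fit(four_pl, dose, counts, p0=[10000, 1.0, 2.0, 500])
print(inverse_four_pl(5000.0, *params))   # dose of an unknown sample
```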

  6. Second Annual AEC Scientific Computer Information Exhange Meeting. Proceedings of the technical program theme: computer graphics

    Energy Technology Data Exchange (ETDEWEB)

    Peskin,A.M.; Shimamoto, Y.

    1974-01-01

    The topic of computer graphics serves well to illustrate that AEC affiliated scientific computing installations are well represented in the forefront of computing science activities. The participant response to the technical program was overwhelming--both in number of contributions and quality of the work described. Session I, entitled Advanced Systems, contains presentations describing systems that contain features not generally found in graphics facilities. These features can be roughly classified as extensions of standard two-dimensional monochromatic imaging to higher dimensions including color and time as well as multidimensional metrics. Session II presents seven diverse applications ranging from high energy physics to medicine. Session III describes a number of important developments in establishing facilities, techniques and enhancements in the computer graphics area. Although an attempt was made to schedule as many of these worthwhile presentations as possible, it appeared impossible to do so given the scheduling constraints of the meeting. A number of prospective presenters 'came to the rescue' by graciously withdrawing from the sessions. Some of their abstracts have been included in the Proceedings.

  7. Babcock and Wilcox revisions to CONTEMPT, computer program for predicting containment pressure-temperature response to a loss-of-coolant accident

    International Nuclear Information System (INIS)

    Hsii, Y.H.

    1975-01-01

    The CONTEMPT computer program predicts the pressure-temperature response of a single-volume reactor building to a loss-of-coolant accident. The analytical model used for the program is described. CONTEMPT assumes that the loss-of-coolant accident can be separated into two phases; the primary system blowdown and reactor building pressurization. The results of the blowdown analysis serve as the boundary conditions and are input to the CONTEMPT program. Thus, the containment model is only concerned with the pressure and temperature in the reactor building and the temperature distribution through the reactor building structures. The program also calculates building leakage and the effects of engineered safety features such as reactor building sprays, decay heat coolers, sump coolers, etc. 11 references. (U.S.)

  8. Computer screens and brain cancer

    International Nuclear Information System (INIS)

    Wood, A.W.

    1995-01-01

    Concern has been expressed in Australia, both in the media and at the federal government level, over possible links between screen-based computer use and cancer, brain tumour in particular. The screen emissions assumed to be the source of the putative hazard are the magnetic fields responsible for horizontal and vertical scanning of the display. Time-varying fluctuations in these magnetic fields induce electrical current flows in exposed tissues. This paper estimates that the induced current densities in the brain of the computer user are up to 1 mA/m² (due to the vertical flyback). Corresponding values for other electrical appliances or installations are in general much lower. The epidemiological literature shows no obvious signs of a sudden increase in brain tumour incidence, but the widespread use of computers is a relatively recent phenomenon. The occupational use of other equipment based on cathode ray tubes (such as TV repair) has a much longer history and has been statistically linked to brain tumour in some studies. A number of factors make this an unreliable indicator of the risk from computer screens, however. 42 refs., 3 tabs., 2 figs

  9. NNSA's Computing Strategy, Acquisition Plan, and Basis for Computing Time Allocation

    Energy Technology Data Exchange (ETDEWEB)

    Nikkel, D J

    2009-07-21

    This report is in response to the Omnibus Appropriations Act, 2009 (H.R. 1105; Public Law 111-8) in its funding of the National Nuclear Security Administration's (NNSA) Advanced Simulation and Computing (ASC) Program. This bill called for a report on ASC's plans for computing and platform acquisition strategy in support of stockpile stewardship. Computer simulation is essential to the stewardship of the nation's nuclear stockpile. Annual certification of the country's stockpile systems, Significant Finding Investigations (SFIs), and execution of Life Extension Programs (LEPs) are dependent on simulations employing the advanced ASC tools developed over the past decade plus; indeed, without these tools, certification would not be possible without a return to nuclear testing. ASC is an integrated program involving investments in computer hardware (platforms and computing centers), software environments, integrated design codes and physical models for these codes, and validation methodologies. The significant progress ASC has made in the past derives from its focus on mission and from its strategy of balancing support across the key investment areas necessary for success. All these investment areas must be sustained for ASC to adequately support current stockpile stewardship mission needs and to meet ever more difficult challenges as the weapons continue to age or undergo refurbishment. The appropriations bill called for this report to address three specific issues, which are responded to briefly here but are expanded upon in the subsequent document: (1) Identify how computing capability at each of the labs will specifically contribute to stockpile stewardship goals, and on what basis computing time will be allocated to achieve the goal of a balanced program among the labs. (2) Explain the NNSA's acquisition strategy for capacity and capability of machines at each of the labs and how it will fit within the existing budget constraints. (3

  10. The Evolution of Cloud Computing in ATLAS

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00224309; Berghaus, Frank; Brasolin, Franco; Cordeiro, Cristovao; Desmarais, Ron; Field, Laurence; Gable, Ian; Giordano, Domenico; Di Girolamo, Alessandro; Hover, John; Leblanc, Matthew Edgar; Love, Peter; Paterson, Michael; Sobie, Randall; Zaytsev, Alexandr

    2015-01-01

    The ATLAS experiment has successfully incorporated cloud computing technology and cloud resources into its primarily grid-based model of distributed computing. Cloud R&D activities continue to mature and transition into stable production systems, while ongoing evolutionary changes are still needed to adapt and refine the approaches used, in response to changes in prevailing cloud technology. In addition, completely new developments are needed to handle emerging requirements. This paper describes the overall evolution of cloud computing in ATLAS. The current status of the virtual machine (VM) management systems used for harnessing infrastructure as a service (IaaS) resources are discussed. Monitoring and accounting systems tailored for clouds are needed to complete the integration of cloud resources within ATLAS' distributed computing framework. We are developing and deploying new solutions to address the challenge of operation in a geographically distributed multi-cloud scenario, including a system for ma...

  11. The Evolution of Cloud Computing in ATLAS

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00224309; The ATLAS collaboration; Berghaus, Frank; Love, Peter; Leblanc, Matthew Edgar; Di Girolamo, Alessandro; Paterson, Michael; Gable, Ian; Sobie, Randall; Field, Laurence

    2015-01-01

    The ATLAS experiment has successfully incorporated cloud computing technology and cloud resources into its primarily grid-based model of distributed computing. Cloud R&D activities continue to mature and transition into stable production systems, while ongoing evolutionary changes are still needed to adapt and refine the approaches used, in response to changes in prevailing cloud technology. In addition, completely new developments are needed to handle emerging requirements. This work will describe the overall evolution of cloud computing in ATLAS. The current status of the VM management systems used for harnessing IaaS resources will be discussed. Monitoring and accounting systems tailored for clouds are needed to complete the integration of cloud resources within ATLAS' distributed computing framework. We are developing and deploying new solutions to address the challenge of operation in a geographically distributed multi-cloud scenario, including a system for managing VM images across multiple clouds, ...

  12. Computer users' risk factors for developing shoulder, elbow and back symptoms

    DEFF Research Database (Denmark)

    Juul-Kristensen, Birgit; Søgaard, Karen; Strøyer, Jesper

    2004-01-01

    OBJECTIVES: This prospective study concentrated on determining factors of computer work that predict musculoskeletal symptoms in the shoulder, elbow, and low-back regions. METHODS: A questionnaire on ergonomics, work pauses, work techniques, and psychosocial and work factors was delivered to 5033 office workers at baseline in early 1999 (response rate 69%) and to 3361 respondents at the time of the follow-up in late 2000 (response rate 77%). An increased frequency or intensity of symptoms was the outcome variable, including only nonsymptomatic respondents from the baseline questionnaire (symptom ...). RESULTS: ... previous symptoms was a significant predictor for symptoms in all regions. Computer worktime and psychosocial dimensions were not significant predictors. CONCLUSIONS: Influence on work pauses, reduction of glare or reflection, and screen height are important factors in the design of future computer ...

  13. Numerical Computational Technique for Scattering from Underwater Objects

    OpenAIRE

    T. Ratna Mani; Raj Kumar; Odamapally Vijay Kumar

    2013-01-01

    This paper presents a computational technique for mono-static and bi-static scattering from underwater objects of different shapes, such as submarines. The scattering has been computed using the finite element time domain (FETD) method, based on the superposition of reflections from the different elements reaching the receiver at a particular instant in time. The results calculated by this method have been verified against published results based on the ramp response technique. An in-depth parametric s...

  14. Quantum computers and quantum computations

    International Nuclear Information System (INIS)

    Valiev, Kamil' A

    2005-01-01

    This review outlines the principles of operation of quantum computers and their elements. The theory of ideal computers that do not interact with the environment and are immune to quantum decohering processes is presented. Decohering processes in quantum computers are investigated. The review considers methods for correcting quantum computing errors arising from the decoherence of the state of the quantum computer, as well as possible methods for the suppression of the decohering processes. A brief enumeration of proposed quantum computer realizations concludes the review. (reviews of topical problems)

  15. Quantum Computing for Computer Architects

    CERN Document Server

    Metodi, Tzvetan

    2011-01-01

    Quantum computers can (in theory) solve certain problems far faster than a classical computer running any known classical algorithm. While existing technologies for building quantum computers are in their infancy, it is not too early to consider their scalability and reliability in the context of the design of large-scale quantum computers. To architect such systems, one must understand what it takes to design and model a balanced, fault-tolerant quantum computer architecture. The goal of this lecture is to provide architectural abstractions for the design of a quantum computer and to explore

  16. Computed tomography in malignant primary bone tumours

    International Nuclear Information System (INIS)

    Kersjes, W.; Harder, T.; Haeffner, P.

    1990-01-01

    The value of computed tomography in malignant primary bone tumours is examined in a strictly defined study group of 13 patients (six Ewing's sarcomas, five osteosarcomas, one chondrosarcoma and one spindle-cell sarcoma). Computed tomography is judged superior to plain radiographs in recognizing bone marrow infiltration, in depicting parosteal tumour extension and in analysing the tissue components of tumours; CT is especially suitable for therapy planning and for evaluating response to therapy. CT does not, however, provide sufficient diagnostic information to determine the dignity (benign or malignant nature) or exact diagnosis of bone tumours. (orig.)

  17. Computational models can predict response to HIV therapy without a genotype and may reduce treatment failure in different resource-limited settings.

    Science.gov (United States)

    Revell, A D; Wang, D; Wood, R; Morrow, C; Tempelman, H; Hamers, R L; Alvarez-Uria, G; Streinu-Cercel, A; Ene, L; Wensing, A M J; DeWolf, F; Nelson, M; Montaner, J S; Lane, H C; Larder, B A

    2013-06-01

    Genotypic HIV drug-resistance testing is typically 60%-65% predictive of response to combination antiretroviral therapy (ART) and is valuable for guiding treatment changes. Genotyping is unavailable in many resource-limited settings (RLSs). We aimed to develop models that can predict response to ART without a genotype and evaluated their potential as a treatment support tool in RLSs. Random forest models were trained to predict the probability of response to ART (≤400 copies HIV RNA/mL) using the following data from 14 891 treatment change episodes (TCEs) after virological failure, from well-resourced countries: viral load and CD4 count prior to treatment change, treatment history, drugs in the new regimen, time to follow-up and follow-up viral load. Models were assessed by cross-validation during development, with an independent set of 800 cases from well-resourced countries, plus 231 cases from Southern Africa, 206 from India and 375 from Romania. The area under the receiver operating characteristic curve (AUC) was the main outcome measure. The models achieved an AUC of 0.74-0.81 during cross-validation and 0.76-0.77 with the 800 test TCEs. They achieved AUCs of 0.58-0.65 (Southern Africa), 0.63 (India) and 0.70 (Romania). Models were more accurate for data from the well-resourced countries than for cases from Southern Africa and India (P < 0.001), but not Romania. The models identified alternative, available drug regimens predicted to result in virological response for 94% of virological failures in Southern Africa, 99% of those in India and 93% of those in Romania. We developed computational models that predict virological response to ART without a genotype with comparable accuracy to genotyping with rule-based interpretation. These models have the potential to help optimize antiretroviral therapy for patients in RLSs where genotyping is not generally available.
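
    A toy version of the modelling pipeline, using scikit-learn's random forest on synthetic TCE-like features. The feature panel, effect sizes and sample size below are invented stand-ins, not the study's 14,891 TCEs or its variable set.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 2000
# Hypothetical TCE features: baseline log10 viral load, baseline CD4,
# prior regimen failures, new drugs in regimen, weeks to follow-up.
X = np.column_stack([
    rng.uniform(2, 6, n),
    rng.uniform(0, 800, n),
    rng.integers(0, 5, n),
    rng.integers(1, 4, n),
    rng.uniform(4, 48, n),
])
# Synthetic "truth": suppression (<400 copies/mL) more likely with low
# viral load, high CD4, few prior failures and more new drugs.
logit = -1.0 * X[:, 0] + 0.004 * X[:, 1] - 0.5 * X[:, 2] + 0.8 * X[:, 3] + 2.0
y = rng.random(n) < 1 / (1 + np.exp(-logit))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_tr, y_tr)
print(roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]))  # AUC, as in the study
```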

  18. Fog Computing and Edge Computing Architectures for Processing Data From Diabetes Devices Connected to the Medical Internet of Things.

    Science.gov (United States)

    Klonoff, David C

    2017-07-01

    The Internet of Things (IoT) is generating an immense volume of data. With cloud computing, medical sensor and actuator data can be stored and analyzed remotely by distributed servers. The results can then be delivered via the Internet. The devices in the IoT include such wireless diabetes devices as blood glucose monitors, continuous glucose monitors, insulin pens, insulin pumps, and closed-loop systems. The cloud model for data storage and analysis is increasingly unable to process the data avalanche, and processing is being pushed out to the edge of the network, closer to where the data-generating devices are. Fog computing and edge computing are two architectures for data handling that can offload data from the cloud, process it near the patient, and transmit information machine-to-machine or machine-to-human in milliseconds or seconds. Sensor data can be processed near the sensing and actuating devices with fog computing (with local nodes) and with edge computing (within the sensing devices). Compared to cloud computing, fog computing and edge computing offer five advantages: (1) greater data transmission speed, (2) less dependence on limited bandwidths, (3) greater privacy and security, (4) greater control over data generated in foreign countries, where laws may limit use or permit unwanted governmental access, and (5) lower costs, because more sensor-derived data are used locally and less data are transmitted remotely. Connected diabetes devices almost all use fog computing or edge computing, because diabetes patients require a very rapid response to sensor input and cannot tolerate delays for cloud computing.

  19. Computer-associated health complaints and sources of ergonomic instructions in computer-related issues among Finnish adolescents: A cross-sectional study

    Science.gov (United States)

    2010-01-01

    Background. The use of computers has increased among adolescents, as have musculoskeletal symptoms. There is evidence that these symptoms can be reduced through an ergonomics approach and through education. The purpose of this study was to examine where adolescents had received ergonomic instructions related to computer use, and whether receiving these instructions was associated with a reduced prevalence of computer-associated health complaints. Methods. Mailed survey with a nationally representative sample of 12 to 18-year-old Finns in 2001 (n = 7292, response rate 70%). In total, 6961 youths reported using a computer. We tested the associations of computer use time and received ergonomic instructions (predictor variables) with computer-associated health complaints (outcome variables) using logistic regression analysis. Results. To prevent computer-associated complaints, 61.2% reported having been instructed to arrange their desk/chair/screen in the right position, 71.5% to take rest breaks. The older age group (16-18 years) reported receiving instructions or being self-instructed more often than the 12- to 14-year-olds (p < ...). Conclusions. Ergonomic instructions on how to prevent computer-related musculoskeletal problems fail to reach a substantial number of children. Furthermore, the reported sources of instructions vary greatly in terms of reliability. PMID:20064250

  20. Usage of super high speed computer for clarification of complex phenomena

    International Nuclear Information System (INIS)

    Sekiguchi, Tomotsugu; Sato, Mitsuhisa; Nakata, Hideki; Tatebe, Osami; Takagi, Hiromitsu

    1999-01-01

    This study aims at the construction of an efficient application environment for super high speed computers, suited to parallel distributed systems and easily portable across different computer systems and numbers of processors, by conducting research and development on the super high speed computing technology required for the elucidation of complicated phenomena in the nuclear power field by computational science methods. In order to realize such an environment, the Electrotechnical Laboratory has developed Ninf, a network numerical information library. The Ninf system can supply a global network infrastructure for worldwide, high-performance computing over wide-area distributed networks. (G.K.)

  1. Fluid-Induced Vibration Analysis for Reactor Internals Using Computational FSI Method

    Energy Technology Data Exchange (ETDEWEB)

    Moon, Jong Sung; Yi, Kun Woo; Sung, Ki Kwang; Im, In Young; Choi, Taek Sang [KEPCO E and C, Daejeon (Korea, Republic of)

    2013-10-15

    Reactor coolant flow makes the Reactor Vessel Internals (RVI) vibrate and may affect their structural integrity. U.S. NRC Regulatory Guide 1.20 requires the Comprehensive Vibration Assessment Program (CVAP) to verify the structural integrity of the RVI against Fluid-Induced Vibration (FIV). This paper introduces a fluid-induced vibration analysis method which calculates the response of the RVI to both deterministic and random loads at once and utilizes a more realistic pressure distribution obtained with the computational Fluid Structure Interaction (FSI) method. The hydraulic forces on the RVI of OPR1000 and APR1400 were computed from hydraulic formulas and the CVAP measurements in Palo Verde Unit 1 and Yonggwang Unit 4. The hydraulic forces were divided into deterministic and random turbulence loads and used as the excitation forces of separate structural analyses; these forces are applied to the finite element model, and the responses to them are combined into the resultant stresses. This is a simple and integrative way to obtain the structural dynamic responses of reactor internals to various flow-induced loads. Because the analysis in this paper omitted the bypass flow region and the Inner Barrel Assembly (IBA) owing to limited computer resources, an effective way to consider all regions in the RV in future FIV analyses remains to be found.

  2. Privacy-Preserving Computation with Trusted Computing via Scramble-then-Compute

    OpenAIRE

    Dang Hung; Dinh Tien Tuan Anh; Chang Ee-Chien; Ooi Beng Chin

    2017-01-01

    We consider privacy-preserving computation of big data using trusted computing primitives with limited private memory. Simply ensuring that the data remains encrypted outside the trusted computing environment is insufficient to preserve data privacy, for data movement observed during computation could leak information. While it is possible to thwart such leakage using a generic solution such as ORAM [42], designing efficient privacy-preserving algorithms is challenging. Besides computation effi...

  3. CONTEMPT-LT/028: a computer program for predicting containment pressure-temperature response to a loss-of-coolant accident

    International Nuclear Information System (INIS)

    Hargroves, D.W.; Metcalfe, L.J.; Wheat, L.L.; Niederauer, G.F.; Obenchain, C.F.

    1979-03-01

    CONTEMPT-LT is a digital computer program, written in FORTRAN IV, developed to describe the long-term behavior of water-cooled nuclear reactor containment systems subjected to postulated loss-of-coolant accident (LOCA) conditions. The program calculates the time variation of compartment pressures, temperatures, mass and energy inventories, heat structure temperature distributions, and energy exchange with adjacent compartments. The program is capable of describing the effects of leakage on containment response. Models are provided to describe fan cooler and cooling spray engineered safety systems. An annular fan model is also provided to model pressure control in the annular region of dual containment systems. Up to four compartments can be modeled with CONTEMPT-LT, and any compartment except the reactor system may have both a liquid pool region and an air--vapor atmosphere region above the pool. Each region is assumed to have a uniform temperature, but the temperatures of the two regions may be different

  4. The basics of item response theory using R

    CERN Document Server

    Baker, Frank B

    2017-01-01

    This graduate-level textbook is a tutorial for item response theory that covers both the basics of item response theory and the use of R for preparing graphical presentation in writings about the theory. Item response theory has become one of the most powerful tools used in test construction, yet one of the barriers to learning and applying it is the considerable amount of sophisticated computational effort required to illustrate even the simplest concepts. This text provides the reader access to the basic concepts of item response theory freed of the tedious underlying calculations. It is intended for those who possess limited knowledge of educational measurement and psychometrics. Rather than presenting the full scope of item response theory, this textbook is concise and practical and presents basic concepts without becoming enmeshed in underlying mathematical and computational complexities. Clearly written text and succinct R code allow anyone familiar with statistical concepts to explore and apply item re...
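
    The book's examples are in R; for consistency with the other sketches in this document, a Python rendering of the central object, the two-parameter logistic item response function, is shown below (the item parameters are arbitrary).

```python
import numpy as np

def p_correct_2pl(theta, a, b):
    """Two-parameter logistic item response function: probability that an
    examinee of ability theta answers correctly an item with
    discrimination a and difficulty b."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

theta = np.linspace(-3, 3, 7)
print(p_correct_2pl(theta, a=1.2, b=0.5))  # item characteristic curve values
```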

  5. Numerical discrepancy between serial and MPI parallel computations

    Directory of Open Access Journals (Sweden)

    Sang Bong Lee

    2016-09-01

    Numerical simulations of the 1D Burgers equation and a 2D sloshing problem were carried out to study the numerical discrepancy between serial and parallel computations. The numerical domain was decomposed into 2 and 4 subdomains for parallel computations with the Message Passing Interface (MPI). The numerical solution of the Burgers equation revealed that the fully explicit boundary conditions used on the subdomains of the parallel computation were responsible for the numerical discrepancy in the transient solution between serial and parallel computations. Two-dimensional sloshing problems in a rectangular domain were solved using OpenFOAM. After a lapse of initial transient time, the sloshing patterns of water differed significantly between serial and parallel computations although the same numerical conditions were given. Based on histograms of pressure measured at two points near the wall, the statistical characteristics of the numerical solution were not affected by the number of subdomains as much as the transient solution was.
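
    The mechanism can be reproduced in miniature: solve a 1D diffusion problem (a linear stand-in for Burgers) implicitly on the whole domain, and again on two subdomains whose interface values are exchanged explicitly, i.e. lagged by one time step. The discrepancy printed below is a splitting effect, not round-off. All numerical parameters are arbitrary.

```python
import numpy as np

def implicit_step(u, r, left, right):
    """One backward-Euler step of u_t = u_xx on a segment's interior
    points, with Dirichlet boundary values `left`/`right`; r = dt/dx^2."""
    n = u.size
    A = ((1 + 2 * r) * np.eye(n)
         - r * np.eye(n, k=1) - r * np.eye(n, k=-1))
    rhs = u.copy()
    rhs[0] += r * left
    rhs[-1] += r * right
    return np.linalg.solve(A, rhs)

nx, r, steps = 40, 2.0, 50
u_serial = np.sin(np.pi * np.arange(1, nx + 1) / (nx + 1))
u_par = u_serial.copy()
half = nx // 2
for _ in range(steps):
    u_serial = implicit_step(u_serial, r, 0.0, 0.0)
    # "Parallel" emulation: each subdomain sees the neighbour's interface
    # value from the PREVIOUS step -- a fully explicit coupling.
    iface_left, iface_right = u_par[half - 1], u_par[half]
    u_par = np.concatenate([
        implicit_step(u_par[:half], r, 0.0, iface_right),
        implicit_step(u_par[half:], r, iface_left, 0.0),
    ])
print(np.abs(u_serial - u_par).max())   # transient discrepancy, not round-off
```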

  6. Computable Frames in Computable Banach Spaces

    Directory of Open Access Journals (Sweden)

    S.K. Kaushik

    2016-06-01

    We develop some parts of frame theory in Banach spaces from the point of view of Computable Analysis. We define a computable M-basis and use it to construct a computable Banach space of scalar-valued sequences. Computable Xd-frames and computable Banach frames are also defined, and computable versions of sufficient conditions for their existence are obtained.

  7. Specialized computer architectures for computational aerodynamics

    Science.gov (United States)

    Stevenson, D. K.

    1978-01-01

    In recent years, computational fluid dynamics has made significant progress in modelling aerodynamic phenomena. Currently, one of the major barriers to further development lies in the compute-intensive nature of the numerical formulations and the relatively high cost of performing these computations on commercially available general-purpose computers, a cost that is high both in dollar expenditure and in elapsed time. Today's computing technology will support a program designed to create specialized computing facilities dedicated to the important problems of computational aerodynamics. One of the still unresolved questions is the organization of the computing components in such a facility. The characteristics of fluid dynamic problems which will have significant impact on the choice of computer architecture for a specialized facility are reviewed.

  8. Strengthening Capacity to Respond to Computer Security Incidents ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    ... in the form of spam, improper access to confidential data and cyber theft. ... These teams are usually known as computer security incident response teams ... regional capacity for preventing and responding to cyber security incidents in Latin ...

  9. Symbolic Computing in Probabilistic and Stochastic Analysis

    Directory of Open Access Journals (Sweden)

    Kamiński Marcin

    2015-12-01

    The main aim is to present recent developments in applications of symbolic computing in probabilistic and stochastic analysis, using the example of the well-known MAPLE system. The key theoretical methods discussed are (i) analytical derivations, (ii) the classical Monte-Carlo simulation approach, (iii) the stochastic perturbation technique, and (iv) some semi-analytical approaches. It is demonstrated in particular how to engage the basic symbolic tools implemented in any such system to derive the basic equations for the stochastic perturbation technique, and how to make an efficient implementation of the semi-analytical methods using the automatic differentiation and integration provided by the computer algebra program itself. The second important illustration is a probabilistic extension of the finite element and finite difference methods coded in MAPLE, showing how to solve boundary value problems with random parameters in a symbolic computing environment. The response function method belongs to the third group, where the interplay of classical deterministic software with the non-linear fitting numerical techniques available in various symbolic environments is displayed. In this context we recover the probabilistic structural response of engineering systems and show how to solve partial differential equations including Gaussian randomness in their coefficients.
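
    A small taste of what such symbolic derivations look like, here in Python/SymPy rather than MAPLE: the second-order stochastic perturbation formulas for the mean and variance of a response f(X), with X = mu + epsilon, E[epsilon] = 0 and E[epsilon**2] = sigma**2.

```python
import sympy as sp

mu, sig = sp.symbols('mu sigma')
e = sp.Symbol('epsilon')          # zero-mean random perturbation
f = sp.Function('f')

# Second-order Taylor expansion of the response f(mu + epsilon).
expansion = f(mu + e).series(e, 0, 3).removeO().doit()

# Take expectations termwise, using E[eps] = 0 and E[eps**2] = sigma**2:
mean_f = expansion.coeff(e, 0) + expansion.coeff(e, 2) * sig**2
var_f = expansion.coeff(e, 1) ** 2 * sig**2   # first-order variance

print(mean_f)   # f(mu) + sigma**2 * f''(mu) / 2
print(var_f)    # sigma**2 * f'(mu)**2
```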

  10. The numerical computation of seismic fragility of base-isolated Nuclear Power Plants buildings

    International Nuclear Information System (INIS)

    Perotti, Federico; Domaneschi, Marco; De Grandis, Silvia

    2013-01-01

    Highlights: • Seismic fragility of structural components in base isolated NPP is computed. • Dynamic integration, Response Surface, FORM and Monte Carlo Simulation are adopted. • Refined approach for modeling the non-linearities behavior of isolators is proposed. • Beyond-design conditions are addressed. • The preliminary design of the isolated IRIS is the application of the procedure. -- Abstract: The research work here described is devoted to the development of a numerical procedure for the computation of seismic fragilities for equipment and structural components in Nuclear Power Plants; in particular, reference is made, in the present paper, to the case of isolated buildings. The proposed procedure for fragility computation makes use of the Response Surface Methodology to model the influence of the random variables on the dynamic response. To account for stochastic loading, the latter is computed by means of a simulation procedure. Given the Response Surface, the Monte Carlo method is used to compute the failure probability. The procedure is here applied to the preliminary design of the Nuclear Power Plant reactor building within the International Reactor Innovative and Secure international project; the building is equipped with a base isolation system based on the introduction of High Damping Rubber Bearing elements showing a markedly non linear mechanical behavior. The fragility analysis is performed assuming that the isolation devices become the critical elements in terms of seismic risk and that, once base-isolation is introduced, the dynamic behavior of the building can be captured by low-dimensional numerical models
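
    A stripped-down sketch of the response-surface/Monte-Carlo chain described here: fit a cheap surrogate to a handful of expensive dynamic analyses, then estimate the failure probability by simulation on the surrogate. The "dynamic analysis" below is a stand-in for the FE/SSI solver, and all distributions, capacities and coefficients are invented.

```python
import numpy as np

rng = np.random.default_rng(1)

# Stage 1: a handful of expensive dynamic analyses at design points.
# Hypothetical "true" peak isolator displacement vs. (PGA, stiffness).
def dynamic_analysis(pga, k):
    return 0.8 * pga / k + 0.05 * pga**2 + rng.normal(0, 0.01)

pga_pts = np.linspace(0.1, 1.0, 6)
k_pts = np.linspace(0.8, 1.2, 6)
P, K = [a.ravel() for a in np.meshgrid(pga_pts, k_pts)]
D = np.array([dynamic_analysis(p, k) for p, k in zip(P, K)])

# Stage 2: quadratic response surface fitted by least squares.
X = np.column_stack([np.ones_like(P), P, K, P * K, P**2, K**2])
beta, *_ = np.linalg.lstsq(X, D, rcond=None)

def surface(p, k):
    return np.column_stack([np.ones_like(p), p, k, p * k, p**2, k**2]) @ beta

# Stage 3: Monte Carlo on the cheap surface -> fragility at a given PGA.
def fragility(pga, capacity=0.55, n=100_000):
    k = rng.normal(1.0, 0.08, n)            # random isolator stiffness
    return np.mean(surface(np.full(n, pga), k) > capacity)

for a in (0.3, 0.5, 0.7):
    print(a, fragility(a))
```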

  11. Blinded prospective evaluation of computer-based mechanistic schizophrenia disease model for predicting drug response.

    Directory of Open Access Journals (Sweden)

    Hugo Geerts

    The tremendous advances in understanding the neurobiological circuits involved in schizophrenia have not translated into more effective treatments. An alternative strategy is to use a recently published 'Quantitative Systems Pharmacology' computer-based mechanistic disease model of cortical/subcortical and striatal circuits based upon preclinical physiology, human pathology and pharmacology. The physiology of 27 relevant dopamine, serotonin, acetylcholine, norepinephrine, gamma-aminobutyric acid (GABA) and glutamate-mediated targets is calibrated using retrospective clinical data on 24 different antipsychotics. The model was challenged to predict quantitatively, in a blinded fashion, the clinical outcome of two experimental antipsychotic drugs: JNJ37822681, a highly selective low-affinity dopamine D(2) antagonist, and ocaperidone, a very high affinity dopamine D(2) antagonist, using only pharmacology and human positron emission tomography (PET) imaging data. The model correctly predicted the lower performance of JNJ37822681 on the positive and negative syndrome scale (PANSS) total score and the higher extra-pyramidal symptom (EPS) liability compared to olanzapine, and the relative performance of ocaperidone against olanzapine, but did not predict the absolute PANSS total score outcome and EPS liability for ocaperidone, possibly due to placebo responses and EPS assessment methods. Because of its virtual nature, this modeling approach can support central nervous system research and development by accounting for unique human drug properties, such as human metabolites, exposure, genotypes and off-target effects, and can be a helpful tool for drug discovery and development.

  12. Voice Response Systems Technology.

    Science.gov (United States)

    Gerald, Jeanette

    1984-01-01

    Examines two methods of generating synthetic speech in voice response systems, which allow computers to communicate in human terms (speech), using human interface devices (ears): phoneme and reconstructed voice systems. Considerations prior to implementation, current and potential applications, glossary, directory, and introduction to Input Output…

  13. Computational compliance criteria in water hammer modelling

    Science.gov (United States)

    Urbanowicz, Kamil

    2017-10-01

    Among the many numerical methods (finite difference, finite element, finite volume, etc.) used to solve the system of partial differential equations describing unsteady pipe flow, the method of characteristics (MOC) is the most appreciated. With its help, it is possible to examine the effect of the numerical discretisation carried out over the pipe length. Based on the tests performed in this study, it was noticed that convergence of the calculation results occurred on a rectangular grid with each pipe of the analysed system divided into at least 10 elements. Therefore, it is advisable to introduce computational compliance criteria (CCC) responsible for optimal discretisation of the examined system. The results of this study, based on the assumption of various values of the Courant-Friedrichs-Lewy (CFL) number, also indicate that the CFL number should be equal to one for optimum computational results. Application of the CCC criterion to in-house and commercial computer programmes based on the method of characteristics will guarantee fast simulations and the necessary computational coherence.
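
    A minimal MOC water-hammer solver illustrating the CFL = 1 recommendation: with dt = dx/a the characteristics pass exactly through grid points, so no interpolation (and none of its numerical damping) is introduced. Friction is omitted and all pipe data are invented; the peak head at the valve approaches the Joukowsky surge H0 + a*V0/g.

```python
import numpy as np

# Frictionless water hammer in a single pipe: reservoir upstream, valve
# at the downstream end closed instantaneously at t = 0.
a, L, n = 1000.0, 500.0, 10      # wave speed [m/s], pipe length [m], reaches
g, H0, V0 = 9.81, 50.0, 1.0      # gravity, reservoir head [m], initial velocity
dx = L / n
dt = dx / a                      # Courant number a*dt/dx = 1 exactly
B = a / g                        # characteristic impedance (head per velocity)

H = np.full(n + 1, H0)           # steady initial head (no friction)
V = np.full(n + 1, V0)
peak = H0
for _ in range(200):
    Hn, Vn = H.copy(), V.copy()
    # Interior nodes: intersection of the C+ and C- characteristics.
    Hn[1:-1] = 0.5 * (H[:-2] + H[2:] + B * (V[:-2] - V[2:]))
    Vn[1:-1] = 0.5 * (V[:-2] + V[2:] + (H[:-2] - H[2:]) / B)
    # Upstream reservoir: head fixed; C- characteristic gives the velocity.
    Hn[0] = H0
    Vn[0] = V[1] + (H0 - H[1]) / B
    # Downstream closed valve: velocity zero; C+ characteristic gives head.
    Vn[-1] = 0.0
    Hn[-1] = H[-2] + B * V[-2]
    H, V = Hn, Vn
    peak = max(peak, H[-1])

print(peak, H0 + a * V0 / g)     # computed surge vs. Joukowsky estimate
```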

  14. Computer based training for oil spill management

    International Nuclear Information System (INIS)

    Goodman, R.

    1993-01-01

    Large oil spills are infrequent occurrences, which poses a particular problem for training oil spill response staff and for maintaining a high level of response readiness. Conventional training methods involve table-top simulations to develop tactical and strategic response skills and boom-deployment exercises to maintain operational readiness. Both forms of training are quite effective, but they are very time-consuming to organize, are expensive to conduct, and tend to become repetitious. To provide a variety of response experiences, a computer-based system of oil spill response training has been developed which can supplement a table-top training program. Using a graphic interface, a realistic and challenging computerized oil spill response simulation has been produced. Integral to the system is a program editing tool which allows the teacher to develop a custom training exercise for the area of interest to the student. 1 ref

  15. Computer programming and computer systems

    CERN Document Server

    Hassitt, Anthony

    1966-01-01

    Computer Programming and Computer Systems imparts a "reading knowledge" of computer systems. This book describes the aspects of machine-language programming, monitor systems, computer hardware, and advanced programming that every thorough programmer should be acquainted with. This text discusses the automatic electronic digital computers, symbolic language, Reverse Polish Notation, and Fortran into assembly language. The routine for reading blocked tapes, dimension statements in subroutines, general-purpose input routine, and efficient use of memory are also elaborated. This publication is inten

  16. Assessment of health and economic effects by PM2.5 pollution in Beijing: a combined exposure-response and computable general equilibrium analysis.

    Science.gov (United States)

    Wang, Guizhi; Gu, SaiJu; Chen, Jibo; Wu, Xianhua; Yu, Jun

    2016-12-01

    Assessment of the health and economic impacts of PM2.5 pollution is of great importance for urban air pollution prevention and control. In this study, we evaluate the damage caused by PM2.5 pollution using Beijing as an example. First, we use exposure-response functions to estimate the adverse health effects due to PM2.5 pollution. Then, the corresponding labour loss and excess medical expenditure are computed as two conducting variables. Finally, departing from conventional valuation methods, this paper introduces the two conducting variables into a computable general equilibrium (CGE) model to assess the impacts on individual sectors and on the whole economic system caused by PM2.5 pollution. The results show that PM2.5 pollution caused substantial health effects among Beijing residents in 2013, including 20,043 premature deaths and about one million other related medical cases. Correspondingly, using the 2010 social accounting data, the Beijing gross domestic product loss due to the health impact of PM2.5 pollution is estimated at 1286.97 (95% CI: 488.58-1936.33) million RMB. This demonstrates that PM2.5 pollution not only has adverse health effects, but also brings huge economic loss.
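
    Exposure-response calculations of this kind typically use a log-linear relative-risk function; a sketch under that assumption (the coefficient, baseline rate and population below are illustrative placeholders, not the study's estimates):

```python
import numpy as np

def excess_cases(beta, delta_c, baseline_rate, population):
    """Log-linear exposure-response function: excess annual cases
    attributable to a concentration increase delta_c (ug/m3) above the
    reference level, given an annual baseline incidence rate."""
    rr = np.exp(beta * delta_c)                  # relative risk
    attributable_fraction = (rr - 1.0) / rr
    return baseline_rate * population * attributable_fraction

# Illustrative only: hypothetical coefficient and Beijing-like numbers.
print(excess_cases(beta=0.00038, delta_c=60.0,
                   baseline_rate=0.006, population=2.0e7))
```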

  17. Thinking Outside the Button Box: EMG as a Computer Input Device for Psychological Research

    Directory of Open Access Journals (Sweden)

    L. Elizabeth Crawford

    2017-07-01

    Experimental psychology research commonly has participants respond to stimuli by pressing buttons or keys. Standard computer input devices constrain the range of motoric responses participants can make, even as the field advances theory about the importance of the motor system in cognitive and social information processing. Here we describe an inexpensive way to use an electromyographic (EMG) signal as a computer input device, enabling participants to control a computer by contracting muscles that are not usually used for that purpose, but which may be theoretically relevant. We tested this approach in a study of facial mimicry, a well-documented phenomenon in which viewing emotional faces elicits automatic activation of corresponding muscles in the face of the viewer. Participants viewed happy and angry faces and were instructed to indicate the emotion on each face as quickly as possible by either furrowing their brow or contracting their cheek. The mapping of motor response to judgment was counterbalanced, so that one block of trials required a congruent mapping (contract brow to respond “angry,” cheek to respond “happy”) and the other block required an incongruent mapping (brow for “happy,” cheek for “angry”). EMG sensors placed over the left corrugator supercilii muscle and left zygomaticus major muscle fed readings of muscle activation to a microcontroller, which sent a response to a computer when activation reached a pre-determined threshold. Response times were faster when the motor-response mapping was congruent than when it was incongruent, extending prior studies on facial mimicry. We discuss further applications of the method for research that seeks to expand the range of human-computer interaction beyond the button box.
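
    A software analogue of the microcontroller's job, assuming a sampled raw EMG trace: rectify, smooth to an envelope, and emit one "button press" per supra-threshold contraction. The window length, threshold and refractory period are arbitrary choices, not the authors' settings.

```python
import numpy as np

def emg_events(raw, fs, threshold, refractory=0.3):
    """Turn a raw EMG trace into discrete response events: rectify,
    smooth with a 50 ms moving average, and emit an event at each
    threshold crossing, with a refractory period so a single sustained
    contraction produces a single response."""
    win = max(1, int(0.05 * fs))
    envelope = np.convolve(np.abs(raw), np.ones(win) / win, mode='same')
    events, dead_until = [], -1.0
    for i, v in enumerate(envelope):
        t = i / fs
        if v > threshold and t >= dead_until:
            events.append(t)
            dead_until = t + refractory
    return events

# Synthetic trace: baseline noise with a burst of muscle activity at ~1 s.
fs = 1000
rng = np.random.default_rng(0)
raw = 0.05 * rng.standard_normal(2 * fs)
raw[1000:1200] += 0.8 * rng.standard_normal(200)   # contraction
print(emg_events(raw, fs, threshold=0.3))           # -> one event near t = 1.0
```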

  18. The Interactive Computer: Authors and Readers Online.

    Science.gov (United States)

    Saccardi, Marianne

    1991-01-01

    Describes a computer-literature project for middle school and high school students that was developed through the Fairfield-Westchester Children's Reading Project (CT) to promote online discussions between students and authors. Classroom activities are described, project financing is discussed, and teacher responses that indicate positive effects…

  19. Evaluation of patient dose during computed tomography angiography

    International Nuclear Information System (INIS)

    Dafalla, Elamam Yagoob Taha

    2015-10-01

    Computed tomography (CT) is an x-ray procedure that generates high-quality cross-sectional images of the body; by comparison with other radiological diagnostic techniques, CT is responsible for higher doses to patients. This study evaluated patient dose from computed tomography pulmonary examinations. The radiation dose was measured in three hospitals in Khartoum State during March 2015-October 2015 using different CT modalities. The radiation dose was higher at Alzytouna hospital than at Daralelaj hospital, and was lowest at Alatebaa hospital. In this study, the mean effective dose for the first hospital was 23.83±3.93 mSv, the mean effective dose for the second hospital was 8.94±1.64 mSv, and the mean effective dose for the third hospital was 2.96±0.79 mSv. (author)

  20. Merging Technology and Emotions: Introduction to Affective Computing.

    Science.gov (United States)

    Brigham, Tara J

    2017-01-01

    Affective computing technologies are designed to sense and respond based on human emotions. This technology allows a computer system to process the information gathered from various sensors to assess the emotional state of an individual. The system then offers a distinct response based on what it "felt." While this is completely unlike how most people interact with electronics today, this technology is likely to trickle into future everyday life. This column will explain what affective computing is, some of its benefits, and concerns with its adoption. It will also provide an overview of its implications in the library setting and offer selected examples of how and where it is currently being used.

  1. Consolidation of cloud computing in ATLAS

    Science.gov (United States)

    Taylor, Ryan P.; Domingues Cordeiro, Cristovao Jose; Giordano, Domenico; Hover, John; Kouba, Tomas; Love, Peter; McNab, Andrew; Schovancova, Jaroslava; Sobie, Randall; ATLAS Collaboration

    2017-10-01

    Throughout the first half of LHC Run 2, ATLAS cloud computing has undergone a period of consolidation, characterized by building upon previously established systems, with the aim of reducing operational effort, improving robustness, and reaching higher scale. This paper describes the current state of ATLAS cloud computing. Cloud activities are converging on a common contextualization approach for virtual machines, and cloud resources are sharing monitoring and service discovery components. We describe the integration of Vacuum resources, streamlined usage of the Simulation at Point 1 cloud for offline processing, extreme scaling on Amazon compute resources, and procurement of commercial cloud capacity in Europe. Finally, building on the previously established monitoring infrastructure, we have deployed a real-time monitoring and alerting platform which coalesces data from multiple sources, provides flexible visualization via customizable dashboards, and issues alerts and carries out corrective actions in response to problems.

  2. SMACS: a system of computer programs for probabilistic seismic analysis of structures and subsystems. Volume I. User's manual

    International Nuclear Information System (INIS)

    Maslenikov, O.R.; Johnson, J.J.; Tiong, L.W.; Mraz, M.J.; Bumpus, S.; Gerhard, M.A.

    1985-03-01

    The SMACS (Seismic Methodology Analysis Chain with Statistics) system of computer programs, one of the major computational tools of the Seismic Safety Margins Research Program (SSMRP), links the seismic input with the calculation of soil-structure interaction, major structure response, and subsystem response. The seismic input is defined by ensembles of acceleration time histories in three orthogonal directions. Soil-structure interaction and detailed structural response are then determined simultaneously, using the substructure approach to SSI as implemented in the CLASSI family of computer programs. The modus operandi of SMACS is to perform repeated deterministic analyses, each analysis simulating an earthquake occurrence. Parameter values for each simulation are sampled from assumed probability distributions according to a Latin hypercube experimental design. The user may specify values of the coefficients of variation (COV) for the distributions of the input variables. At the heart of the SMACS system is the computer program SMAX, which performs the repeated SSI response calculations for major structure and subsystem response. This report describes SMAX and the pre- and post-processor codes, used in conjunction with it, that comprise the SMACS system
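
    The sampling step that drives SMACS's repeated deterministic analyses can be illustrated with a small Latin hypercube routine; the two input variables and their distributions below are hypothetical stand-ins, not SMACS's actual parameter set:

```python
import numpy as np
from scipy.stats import lognorm, norm

rng = np.random.default_rng(42)

def latin_hypercube(n_samples, n_params, rng):
    """One random point per equal-probability stratum, permuted per column."""
    u = (rng.random((n_samples, n_params))
         + np.arange(n_samples)[:, None]) / n_samples
    for j in range(n_params):
        u[:, j] = rng.permutation(u[:, j])
    return u  # uniform [0,1) samples with Latin hypercube structure

# Hypothetical input variables: lognormal soil shear modulus (MPa) and
# normally distributed structural damping ratio, each with an assumed COV.
u = latin_hypercube(30, 2, rng)
shear_modulus = lognorm(s=0.3, scale=200.0).ppf(u[:, 0])
damping_ratio = norm(loc=0.05, scale=0.05 * 0.2).ppf(u[:, 1])

for g, d in zip(shear_modulus[:3], damping_ratio[:3]):
    print(f"simulation: G = {g:.1f} MPa, damping = {d:.4f}")
```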

  3. Computation: A New Open Access Journal of Computational Chemistry, Computational Biology and Computational Engineering

    OpenAIRE

    Karlheinz Schwarz; Rainer Breitling; Christian Allen

    2013-01-01

    Computation (ISSN 2079-3197; http://www.mdpi.com/journal/computation) is an international scientific open access journal focusing on fundamental work in the field of computational science and engineering. Computational science has become essential in many research areas by contributing to solving complex problems in fundamental science all the way to engineering. The very broad range of application domains suggests structuring this journal into three sections, which are briefly characterized ...

  4. Proceedings of computational methods in materials science

    International Nuclear Information System (INIS)

    Mark, J.E.; Glicksman, M.E.; Marsh, S.P.

    1992-01-01

    The Symposium on which this volume is based was conceived as a timely expression of some of the fast-paced developments occurring throughout materials science and engineering. It focuses particularly on those involving modern computational methods applied to model and predict the response of materials under a diverse range of physico-chemical conditions. The current easy access of many materials scientists in industry, government laboratories, and academe to high-performance computers has opened many new vistas for predicting the behavior of complex materials under realistic conditions. Some have even argued that modern computational methods in materials science and engineering are literally redefining the bounds of our knowledge from which we predict structure-property relationships, perhaps forever changing the historically descriptive character of the science and much of the engineering

  5. Digging deeper on "deep" learning: A computational ecology approach.

    Science.gov (United States)

    Buscema, Massimo; Sacco, Pier Luigi

    2017-01-01

    We propose an alternative approach to "deep" learning that is based on computational ecologies of structurally diverse artificial neural networks, and on dynamic associative memory responses to stimuli. Rather than focusing on massive computation of many different examples of a single situation, we opt for model-based learning and adaptive flexibility. Cross-fertilization of learning processes across multiple domains is the fundamental feature of human intelligence that must inform "new" artificial intelligence.

  6. Helix Nebula: sunshine and clouds on the CERN computing horizon

    CERN Multimedia

    Joannah Caborn Wengler

    2012-01-01

    23 petabytes is how much data CERN recorded during 2011, and this number will rise in 2012. In order to respond to the challenge, the IT department is upping its game, amongst other things by participating in the Helix Nebula project, a public-private partnership to create a European cloud-computing platform, as announced in a recent CERN press release.   “We’re not replacing the Grid,” clarifies Bob Jones, responsible for CERN openlab who is also responsible for EC-funded projects in IT, “but looking at three complementary ways of increasing CERN’s computing capacity, so that as demand goes up we can continue to satisfy our users.” “First we are upgrading the electrical and cooling infrastructure of the computer centre in order to increase the availability of critical IT services needed for the Laboratory. This will also provide more floor space in the area called The Barn, allowing for more servers to fit in.”...

  7. Learning Style and Attitude toward Computer among Iranian Medical Students

    Directory of Open Access Journals (Sweden)

    Seyedeh Shohreh Alavi

    2016-02-01

    Background and purpose: Presently, the method of medical teaching has shifted from lecture-based to computer-based. Learning style may play a key role in the attitude toward learning with computers. The goal of this study was to examine the relationship between learning style and attitude toward computers among Iranian medical students. Methods: This cross-sectional study included 400 medical students. The Barsch learning style inventory and a questionnaire on attitude toward computers were sent to each student. Enthusiasm, anxiety, and overall attitude toward computers were compared among the different learning styles. Results: The response rate to the questionnaire was 91.8%. The distribution of learning styles was 181 (49.3%) visual, 106 (28.9%) auditory, 27 (7.4%) tactual, and 53 (14.4%) overall. Visual learners were less anxious about computer use and showed a more positive attitude toward computers. Sex, age, and academic grade were not associated with students' attitude toward computers. Conclusions: Learning style is an important factor in medical students' attitude toward computers, and should be considered in planning computer-based learning programs. Keywords: LEARNING STYLE, ATTITUDE, COMPUTER, MEDICAL STUDENT, ANXIETY, ENTHUSIASM

  8. Emergency Response Equipment and Related Training: Airborne Radiological Computer System (Model II)

    Energy Technology Data Exchange (ETDEWEB)

    David P. Colton

    2007-02-28

    The materials included in the Airborne Radiological Computer System, Model-II (ARCS-II) were assembled with several considerations in mind. First, the system was designed to measure and record the airborne gamma radiation levels and the corresponding latitude and longitude coordinates, and to provide a first overview look of the extent and severity of an accident's impact. Second, the portable system had to be light enough and durable enough that it could be mounted in an aircraft, ground vehicle, or watercraft. Third, the system must control the collection and storage of the data, as well as provide a real-time display of the data collection results to the operator. The notebook computer and color graphics printer components of the system would only be used for analyzing and plotting the data. In essence, the provided equipment is composed of an acquisition system and an analysis system. The data can be transferred from the acquisition system to the analysis system at the end of the data collection or at some other agreeable time.

  9. MoCog1: A computer simulation of recognition-primed human decision making, considering emotions

    Science.gov (United States)

    Gevarter, William B.

    1992-01-01

    The successful results of the first stage of a research effort to develop a versatile computer model of motivated human cognitive behavior are reported. Most human decision making appears to be an experience-based, relatively straightforward, largely automatic response to situations, utilizing cues and opportunities perceived from the current environment. The development, considering emotions, of the architecture and computer program associated with such 'recognition-primed' decision-making is described. The resultant computer program (MoCog1) was successfully utilized as a vehicle to simulate earlier findings that relate how an individual's implicit theories orient the individual toward particular goals, with resultant cognitions, affects, and behavior in response to their environment.

  10. Cloud computing development in Armenia

    Directory of Open Access Journals (Sweden)

    Vazgen Ghazaryan

    2014-10-01

    Purpose – The purpose of the research is to clarify the benefits and risks, with regard to data protection and cost, that businesses can gain by using these new technologies for the implementation and management of an organization's information systems. Design/methodology/approach – Qualitative case study of the results obtained via interviews. Three research questions were raised: Q1: How can a company benefit from using Cloud Computing compared to other solutions?; Q2: What are possible issues that occur with Cloud Computing?; Q3: How would Cloud Computing change an organization's IT infrastructure? Findings – The calculations provided in the interview section prove the financial advantages, even though the precise degree of flexibility and performance has not been assessed. Cloud Computing offers great scalability. Another benefit that Cloud Computing offers, in addition to better performance and flexibility, is reliable and simple backup data storage, physically distributed and so almost invulnerable to damage. Although the advantages of Cloud Computing more than compensate for the difficulties associated with it, the latter must be carefully considered. Since the cloud architecture is relatively new, so far the best guarantee against all risks it entails, from a single company's perspective, is a well-formulated service-level agreement, where the terms of service and the shared responsibility and security roles between the client and the provider are defined. Research limitations/implications – The study was carried out on the basis of two companies, which gives a deeper view, but for more widely applicable results a wider analysis is necessary. Practical implications/Originality/Value – The novelty of the research lies in the fact that existing approaches to this problem mainly focus on the technical side of computing. Research type: case study

  11. The computer code EURDYN-1M (release 2). User's manual

    International Nuclear Information System (INIS)

    1982-01-01

    EURDYN-1M is a finite element computer code developed at J.R.C. Ispra to compute the response of two-dimensional coupled fluid-structure configurations to transient dynamic loading for reactor safety studies. This report gives instructions for preparing input data to EURDYN-1M, release 2, and describes a test problem in order to illustrate both the input and the output of the code

  12. MMPI Profiles and Code Types of Responsible and Non-Responsible Criminal Defendants.

    Science.gov (United States)

    Kurlychek, Robert T.; Jordan, L.

    1980-01-01

    Compared MMPI profiles and two-point code types of criminal defendants (N=50) pleading a defense of "not responsible due to mental disease or defect." A sign test was computed, treating the clinical scales as matched pairs, and a significant difference was found; the nonresponsible group profile was more elevated. (Author)

  13. Computer forensics an essential guide for accountants, lawyers, and managers

    CERN Document Server

    Sheetz, Michael

    2013-01-01

    Would your company be prepared in the event of: * Computer-driven espionage * A devastating virus attack * A hacker's unauthorized access * A breach of data security? As the sophistication of computer technology has grown, so has the rate of computer-related criminal activity. Subsequently, American corporations now lose billions of dollars a year to hacking, identity theft, and other computer attacks. More than ever, businesses and professionals responsible for the critical data of countless customers and employees need to anticipate and safeguard against computer intruders and attacks. The first book to successfully speak to the nontechnical professional in the fields of business and law on the topic of computer crime, Computer Forensics: An Essential Guide for Accountants, Lawyers, and Managers provides valuable advice on the hidden difficulties that can blindside companies and result in damaging costs. Written by industry expert Michael Sheetz, this important book provides readers with an honest look at t...

  14. Paper-Based and Computer-Based Concept Mappings: The Effects on Computer Achievement, Computer Anxiety and Computer Attitude

    Science.gov (United States)

    Erdogan, Yavuz

    2009-01-01

    The purpose of this paper is to compare the effects of paper-based and computer-based concept mappings on computer hardware achievement, computer anxiety and computer attitude of the eight grade secondary school students. The students were randomly allocated to three groups and were given instruction on computer hardware. The teaching methods used…

  15. Cloud Computing: The Future of Computing

    OpenAIRE

    Aggarwal, Kanika

    2013-01-01

    Cloud computing has recently emerged as a new paradigm for hosting and delivering services over the Internet. Cloud computing is attractive to business owners as it eliminates the requirement for users to plan ahead for provisioning, and allows enterprises to start small and increase resources only when there is a rise in service demand. The basic principle of cloud computing is to distribute the computation across a great number of distributed computers, rather than local computers ...

  16. Computational Pipeline for NIRS-EEG Joint Imaging of tDCS-Evoked Cerebral Responses-An Application in Ischemic Stroke.

    Science.gov (United States)

    Guhathakurta, Debarpan; Dutta, Anirban

    2016-01-01

    Transcranial direct current stimulation (tDCS) modulates cortical neural activity and hemodynamics. Electrophysiological methods (electroencephalography-EEG) measure neural activity while optical methods (near-infrared spectroscopy-NIRS) measure hemodynamics coupled through neurovascular coupling (NVC). Assessment of NVC requires development of NIRS-EEG joint-imaging sensor montages that are sensitive to the tDCS affected brain areas. In this methods paper, we present a software pipeline incorporating freely available software tools that can be used to target vascular territories with tDCS and develop a NIRS-EEG probe for joint imaging of tDCS-evoked responses. We apply this software pipeline to target primarily the outer convexity of the brain territory (superficial divisions) of the middle cerebral artery (MCA). We then present a computational method based on Empirical Mode Decomposition of NIRS and EEG time series into a set of intrinsic mode functions (IMFs), and then perform a cross-correlation analysis on those IMFs from NIRS and EEG signals to model NVC at the lesional and contralesional hemispheres of an ischemic stroke patient. For the contralesional hemisphere, a strong positive correlation between IMFs of regional cerebral hemoglobin oxygen saturation and the log-transformed mean-power time-series of IMFs for EEG with a lag of about -15 s was found after a cumulative 550 s stimulation of anodal tDCS. It is postulated that system identification, for example using a continuous-time autoregressive model, of this coupling relation under tDCS perturbation may provide spatiotemporal discriminatory features for the identification of ischemia. Furthermore, portable NIRS-EEG joint imaging can be incorporated into brain computer interfaces to monitor tDCS-facilitated neurointervention as well as cortical reorganization.
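
    The IMF cross-correlation step can be sketched as follows, assuming the IMFs have already been extracted (for example with a third-party EMD package such as PyEMD); the signals, sampling rate, and 15 s delay below are synthetic stand-ins, not the patient data:

```python
import numpy as np

def lag_of_max_xcorr(x, y, fs):
    """Lag (s) at which x is most strongly correlated with y (positive lag
    means x is delayed relative to y), via normalized cross-correlation."""
    x = (x - x.mean()) / (x.std() * len(x))
    y = (y - y.mean()) / y.std()
    corr = np.correlate(x, y, mode="full")
    lags = np.arange(-len(x) + 1, len(x)) / fs
    k = np.argmax(np.abs(corr))
    return lags[k], corr[k]

# Illustrative stand-ins for one NIRS IMF and one log-power EEG IMF
fs = 1.0  # Hz, an assumed NIRS sampling rate
t = np.arange(0.0, 600.0, 1.0 / fs)
eeg_imf = np.sin(2 * np.pi * 0.02 * t)
nirs_imf = np.sin(2 * np.pi * 0.02 * (t - 15.0))  # hemodynamics delayed ~15 s

lag, r = lag_of_max_xcorr(nirs_imf, eeg_imf, fs)
print(f"peak correlation r = {r:.2f} at lag {lag:+.1f} s")
```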

  17. A Multi-agent Supply Chain Information Coordination Mode Based on Cloud Computing

    OpenAIRE

    Wuxue Jiang; Jing Zhang; Junhuai Li

    2013-01-01

    In order to improve the efficiency and security of supply chain information coordination in a cloud computing environment, this paper proposes a supply chain information coordination mode based on cloud computing. This mode has two basic statuses: online and offline. In the online status, the cloud computing center is responsible for coordinating the whole supply chain's information. In the offline status, information exchange can be realized among different nodes by u...

  18. Metamodels for Computer-Based Engineering Design: Survey and Recommendations

    Science.gov (United States)

    Simpson, Timothy W.; Peplinski, Jesse; Koch, Patrick N.; Allen, Janet K.

    1997-01-01

    The use of statistical techniques to build approximations of expensive computer analysis codes pervades much of today's engineering design. These statistical approximations, or metamodels, are used to replace the actual expensive computer analyses, facilitating multidisciplinary, multiobjective optimization and concept exploration. In this paper we review several of these techniques including design of experiments, response surface methodology, Taguchi methods, neural networks, inductive learning, and kriging. We survey their existing application in engineering design and then address the dangers of applying traditional statistical techniques to approximate deterministic computer analysis codes. We conclude with recommendations for the appropriate use of statistical approximation techniques in given situations and how common pitfalls can be avoided.
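
    As one concrete instance of the surveyed techniques, a kriging (Gaussian process) metamodel can be fit to a handful of runs of an "expensive" code using scikit-learn; the test function, kernel, and design points below are illustrative only:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

# Stand-in for an expensive analysis code: a cheap 1-D test function
def expensive_code(x):
    return np.sin(3 * x) + 0.5 * x

X_train = np.linspace(0.0, 2.0, 8).reshape(-1, 1)   # small design of experiments
y_train = expensive_code(X_train).ravel()

kernel = ConstantKernel(1.0) * RBF(length_scale=0.5)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
gp.fit(X_train, y_train)

X_new = np.array([[0.37], [1.42]])
mean, std = gp.predict(X_new, return_std=True)
for x, m, s in zip(X_new.ravel(), mean, std):
    print(f"x={x:.2f}: metamodel {m:.3f} +/- {s:.3f}, true {expensive_code(x):.3f}")
```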

  19. Viable tumor volume: Volume of interest within segmented metastatic lesions, a pilot study of proposed computed tomography response criteria for urothelial cancer

    International Nuclear Information System (INIS)

    Folio, Les Roger; Turkbey, Evrim B.; Steinberg, Seth M.; Apolo, Andrea B.

    2015-01-01

    Highlights: • It is clear that 2D axial measurements are incomplete assessments in metastatic disease, especially in light of evolving antiangiogenic therapies that can result in tumor necrosis. • Our pilot study demonstrates that taking volumetric density into account can better predict overall survival when compared to RECIST, volumetric size, MASS and Choi criteria. • Although volumetric segmentation and further density analysis may not yet be feasible within routine workflows, the authors believe that technology advances may soon make this possible. - Abstract: Objectives: To evaluate the ability of new computed tomography (CT) response criteria for solid tumors such as urothelial cancer (VTV; viable tumor volume) to predict overall survival (OS) in patients with metastatic bladder cancer treated with cabozantinib. Materials and methods: We compared the relative capabilities of VTV, RECIST, MASS (morphology, attenuation, size, and structure), and Choi criteria, as well as volume measurements, to predict OS using serial follow-up contrast-enhanced CT exams in patients with metastatic urothelial carcinoma. Kaplan–Meier curves and 2-tailed log-rank tests compared OS based on early RECIST 1.1 response against each of the other criteria. A Cox proportional hazards model assessed response at follow-up exams as a time-varying covariate for OS. Results: We assessed 141 lesions in 55 CT scans from 17 patients with urothelial metastasis, comparing VTV, RECIST, MASS, and Choi criteria, and volumetric measurements, for response assessment. Median follow-up was 4.5 months, range was 2–14 months. Only the VTV criteria demonstrated a statistical association with OS (p = 0.019; median OS 9.7 vs. 3.5 months). Conclusion: This pilot study suggests that VTV is a promising tool for assessing tumor response and predicting OS, using criteria that incorporate tumor volume and density in patients receiving antiangiogenic therapy for urothelial cancer. Larger studies are warranted to
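
    A minimal sketch of the volume-plus-density idea behind VTV: count segmented voxels whose attenuation falls inside a "viable" Hounsfield-unit window and scale by voxel volume. The HU window, mask, and data below are hypothetical; the published criteria define their own cutoffs:

```python
import numpy as np

def viable_tumor_volume(hu, mask, voxel_mm3, hu_window=(25, 300)):
    """Volume (mL) of voxels inside a segmented lesion whose density falls
    in a 'viable' HU window. The window here is illustrative only."""
    lo, hi = hu_window
    viable = mask & (hu >= lo) & (hu <= hi)
    return viable.sum() * voxel_mm3 / 1000.0  # mm^3 -> mL

# Toy example with synthetic densities and a cubic "lesion" segmentation
rng = np.random.default_rng(0)
hu = rng.normal(80.0, 60.0, size=(30, 30, 30))
mask = np.zeros_like(hu, dtype=bool)
mask[10:20, 10:20, 10:20] = True
print(f"VTV = {viable_tumor_volume(hu, mask, voxel_mm3=0.7 * 0.7 * 3.0):.1f} mL")
```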

  20. Computers and School Nurses in a Financially Stressed School System: The Case of St. Louis

    Science.gov (United States)

    Cummings, Scott

    2013-01-01

    This article describes the incorporation of computer technology into the professional lives of school nurses. St. Louis, Missouri, a major urban school system, is the site of the study. The research describes several major impacts computer technology has on the professional responsibilities of school nurses. Computer technology not only affects…

  1. Automated Analysis of Short Responses in an Interactive Synthetic Tutoring System for Introductory Physics

    Science.gov (United States)

    Nakamura, Christopher M.; Murphy, Sytil K.; Christel, Michael G.; Stevens, Scott M.; Zollman, Dean A.

    2016-01-01

    Computer-automated assessment of students' text responses to short-answer questions represents an important enabling technology for online learning environments. We have investigated the use of machine learning to train computer models capable of automatically classifying short-answer responses and assessed the results. Our investigations are part…
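
    A toy version of such a classifier can be assembled with scikit-learn; the responses, labels, and model choice below are illustrative, not the system the authors trained:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Made-up training data; the study used real student responses
responses = [
    "the ball accelerates because gravity pulls it down",
    "it speeds up due to the force of gravity",
    "the ball moves at constant speed",
    "velocity stays the same the whole time",
]
labels = ["correct", "correct", "incorrect", "incorrect"]

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(responses, labels)

print(model.predict(["gravity makes it go faster"]))  # expected: ['correct']
```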

  2. Application of analogue computers to radiotracer data processing

    International Nuclear Information System (INIS)

    Chmielewski, A.G.

    1979-01-01

    Some applications of analogue computers for processing flow-system radiotracer-investigation data are presented. Analysis of the impulse response, shaped to obtain the frequency response of the system under consideration, can be performed on the basis of an estimated transfer function. Furthermore, simulation of the system behaviour for other excitation functions is discussed. A simple approach is presented for estimating the model parameters in situations where the input signal is not approximated by the unit impulse function. (author)
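
    The digital analogue of that analysis: given a sampled impulse response (here a first-order perfectly mixed vessel with time constant tau, an illustrative choice), the frequency response is its Fourier transform:

```python
import numpy as np

fs = 100.0                       # sampling rate, Hz (illustrative)
t = np.arange(0.0, 60.0, 1 / fs)
tau = 4.0                        # residence time of a perfectly mixed vessel
impulse_response = np.exp(-t / tau) / tau

H = np.fft.rfft(impulse_response) / fs    # frequency response estimate
freqs = np.fft.rfftfreq(len(t), 1 / fs)

# For a first-order system |H| rolls off past f_c = 1/(2*pi*tau)
k = np.searchsorted(freqs, 1 / (2 * np.pi * tau))
print(f"|H| at DC: {abs(H[0]):.2f}, near corner frequency: {abs(H[k]):.2f}")
```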

  3. Accelerating Climate Simulations Through Hybrid Computing

    Science.gov (United States)

    Zhou, Shujia; Sinno, Scott; Cruz, Carlos; Purcell, Mark

    2009-01-01

    Unconventional multi-core processors (e.g., IBM Cell B/E and NVIDIA GPU) have emerged as accelerators in climate simulation. However, climate models typically run on parallel computers with conventional processors (e.g., Intel and AMD) using MPI. Connecting accelerators to this architecture efficiently and easily becomes a critical issue. When using MPI for connection, we identified two challenges: (1) an identical MPI implementation is required in both systems; and (2) existing MPI code must be modified to accommodate the accelerators. In response, we have extended and deployed IBM Dynamic Application Virtualization (DAV) in a hybrid computing prototype system (one blade with two Intel quad-core processors, two IBM QS22 Cell blades, connected with Infiniband), allowing for seamlessly offloading compute-intensive functions to remote, heterogeneous accelerators in a scalable, load-balanced manner. Currently, a climate solar radiation model running with multiple MPI processes has been offloaded to multiple Cell blades with approx. 10% network overhead.

  4. Validation of a solid-fluid interaction computer program for the earthquake analysis of nuclear power reactors

    International Nuclear Information System (INIS)

    Dubois, J.; Descleve, P.; Dupont, Y.

    1978-01-01

    This paper evaluates a numerical method for the analysis of the mechanical response of nuclear reactor components composed of steel structures and fluids, during normal or accidental conditions. The method consists of computing the mode shapes and frequencies of the coupled system, with the assumption of small acoustic movements and incompressibility for the fluid. The paper validates the theory and its implementation in the computer program NOVAX (axisymmetric geometry, non axisymmetric loads and response for earthquake response studies) by comparison with known theoretical and experimental results. (author)

  5. Automated emergency meteorological response system

    International Nuclear Information System (INIS)

    Pepper, D.W.

    1980-01-01

    A sophisticated emergency response system was developed to aid in the evaluation of accidental releases of hazardous materials from the Savannah River Plant to the environment. A minicomputer system collects and archives data from both onsite meteorological towers and the National Weather Service. In the event of an accidental release, the computer rapidly calculates the trajectory and dispersion of pollutants in the atmosphere. Computer codes have been developed which provide a graphic display of predicted concentration profiles downwind from the source, as functions of time and distance
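
    Systems of this kind typically evaluate a Gaussian plume model downwind of the source; a textbook version with ground reflection is sketched below, with purely illustrative release and dispersion parameters:

```python
import math

def plume_concentration(q, u, y, z, h, sigma_y, sigma_z):
    """Gaussian plume concentration (g/m^3) with ground reflection.
    q: release rate (g/s), u: wind speed (m/s), h: effective release
    height (m); sigma_y, sigma_z: dispersion coefficients (m) evaluated
    at the downwind distance of interest."""
    lateral = math.exp(-y**2 / (2 * sigma_y**2))
    vertical = (math.exp(-(z - h)**2 / (2 * sigma_z**2))
                + math.exp(-(z + h)**2 / (2 * sigma_z**2)))
    return q / (2 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

# Illustrative only: 10 g/s release, 3 m/s wind, receptor ~1 km downwind
c = plume_concentration(q=10.0, u=3.0, y=0.0, z=1.5, h=30.0,
                        sigma_y=80.0, sigma_z=40.0)
print(f"concentration = {c:.2e} g/m^3")
```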

  6. Computational Design of Batteries from Materials to Systems

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Kandler A [National Renewable Energy Laboratory (NREL), Golden, CO (United States)]; Santhanagopalan, Shriram [National Renewable Energy Laboratory (NREL), Golden, CO (United States)]; Yang, Chuanbo [National Renewable Energy Laboratory (NREL), Golden, CO (United States)]; Graf, Peter A [National Renewable Energy Laboratory (NREL), Golden, CO (United States)]; Usseglio Viretta, Francois L [National Renewable Energy Laboratory (NREL), Golden, CO (United States)]; Li, Qibo [National Renewable Energy Laboratory (NREL), Golden, CO (United States)]; Finegan, Donal [National Renewable Energy Laboratory (NREL), Golden, CO (United States)]; Pesaran, Ahmad A [National Renewable Energy Laboratory (NREL), Golden, CO (United States)]; Yao, Koffi (Pierre) [Argonne National Laboratory]; Abraham, Daniel [Argonne National Laboratory]; Dees, Dennis [Argonne National Laboratory]; Jansen, Andy [Argonne National Laboratory]; Mukherjee, Partha [Texas A&M University]; Mistry, Aashutosh [Texas A&M University]; Verma, Ankit [Texas A&M University]; Lamb, Josh [Sandia National Laboratories]; Darcy, Eric [NASA]

    2017-09-01

    Computer models are helping to accelerate the design and validation of next generation batteries and provide valuable insights not possible through experimental testing alone. Validated 3-D physics-based models exist for predicting electrochemical performance, thermal and mechanical response of cells and packs under normal and abuse scenarios. The talk describes present efforts to make the models better suited for engineering design, including improving their computation speed, developing faster processes for model parameter identification including under aging, and predicting the performance of a proposed electrode material recipe a priori using microstructure models.

  7. Hybrid Human-Computing Distributed Sense-Making: Extending the SOA Paradigm for Dynamic Adjudication and Optimization of Human and Computer Roles

    Science.gov (United States)

    Rimland, Jeffrey C.

    2013-01-01

    In many evolving systems, inputs can be derived from both human observations and physical sensors. Additionally, many computation and analysis tasks can be performed by either human beings or artificial intelligence (AI) applications. For example, weather prediction, emergency event response, assistive technology for various human sensory and…

  8. Computational and experimental investigation of dynamic shock reflection phenomena

    CSIR Research Space (South Africa)

    Naidoo, K

    2007-07-01

    ... wedge are used to analyse dynamic flow field phenomena and the response of the triple point below and within the dual solution domain. Computed, unsteady pressure traces on the reflection plane are also analysed...

  9. Neural computation and the computational theory of cognition.

    Science.gov (United States)

    Piccinini, Gualtiero; Bahar, Sonya

    2013-04-01

    We begin by distinguishing computationalism from a number of other theses that are sometimes conflated with it. We also distinguish between several important kinds of computation: computation in a generic sense, digital computation, and analog computation. Then, we defend a weak version of computationalism: neural processes are computations in the generic sense. After that, we reject on empirical grounds the common assimilation of neural computation to either analog or digital computation, concluding that neural computation is sui generis. Analog computation requires continuous signals; digital computation requires strings of digits. But current neuroscientific evidence indicates that typical neural signals, such as spike trains, are graded like continuous signals but are constituted by discrete functional elements (spikes); thus, typical neural signals are neither continuous signals nor strings of digits. It follows that neural computation is sui generis. Finally, we highlight three important consequences of a proper understanding of neural computation for the theory of cognition. First, understanding neural computation requires a specially designed mathematical theory (or theories) rather than the mathematical theories of analog or digital computation. Second, several popular views about neural computation turn out to be incorrect. Third, computational theories of cognition that rely on non-neural notions of computation ought to be replaced or reinterpreted in terms of neural computation. Copyright © 2012 Cognitive Science Society, Inc.

  10. Dynamic computer simulation of the Fort St. Vrain steam turbines

    International Nuclear Information System (INIS)

    Conklin, J.C.

    1983-01-01

    A computer simulation is described for the dynamic response of the Fort St. Vrain nuclear reactor regenerative intermediate- and low-pressure steam turbines. The fundamental computer-modeling assumptions for the turbines and feedwater heaters are developed. A turbine heat balance specifying steam and feedwater conditions at a given generator load and the volumes of the feedwater heaters are all that are necessary as descriptive input parameters. Actual plant data for a generator load reduction from 100 to 50% power (which occurred as part of a plant transient on November 9, 1981) are compared with computer-generated predictions, with reasonably good agreement

  11. Development of the oil spill response cost-effectiveness analytical tool

    International Nuclear Information System (INIS)

    Etkin, D.S.; Welch, J.

    2005-01-01

    Decision-making during oil spill response operations or contingency planning requires balancing the need to remove as much oil as possible from the environment with the desire to minimize the impact of response operations on the environment they are intended to protect. This paper discussed the creation of a computer tool developed to help in planning and decision-making during response operations. The Oil Spill Response Cost-Effectiveness Analytical Tool (OSRCEAT) was developed to compare the costs of response with the benefits of response in both hypothetical and actual oil spills. The computer-based analytical tool can assist responders and contingency planners in decision-making processes as well as act as a basis of discussion in the evaluation of response options. Using inputs on spill parameters, location and response options, OSRCEAT can calculate response cost, costs of environmental and socioeconomic impacts of the oil spill and response impacts. Oil damages without any response are contrasted to oil damages with response, with expected improvements. Response damages are subtracted from the difference in damages with and without response in order to derive a more accurate response benefit. An OSRCEAT user can test various response options to compare potential benefits in order to maximize response benefit. OSRCEAT is best used to compare and contrast the relative benefits and costs of various response options. 50 refs., 19 tabs., 2 figs
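
    The benefit arithmetic described above reduces to a single expression; the cost figures below are hypothetical, for illustration only:

```python
def response_benefit(damages_no_response, damages_with_response, response_damages):
    """Net benefit of a response option, per the logic described above:
    damages avoided by responding, minus damages caused by the response."""
    return (damages_no_response - damages_with_response) - response_damages

# Hypothetical costs in millions of dollars for two response options,
# against $60M in damages if no response is mounted
for name, with_resp, resp_dmg in [("mechanical recovery", 40.0, 3.0),
                                  ("dispersants", 35.0, 8.0)]:
    b = response_benefit(60.0, with_resp, resp_dmg)
    print(f"{name}: net benefit = ${b:.1f}M")
```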

  12. Extreme Scale Computing to Secure the Nation

    Energy Technology Data Exchange (ETDEWEB)

    Brown, D L; McGraw, J R; Johnson, J R; Frincke, D

    2009-11-10

    Since the dawn of modern electronic computing in the mid-1940s, U.S. national security programs have been dominant users of every new generation of high-performance computer. Indeed, the first general-purpose electronic computer, ENIAC (the Electronic Numerical Integrator and Computer), was used to calculate the expected explosive yield of early thermonuclear weapons designs. Even the U.S. numerical weather prediction program, another early application for high-performance computing, was initially funded jointly by sponsors that included the U.S. Air Force and Navy, agencies interested in accurate weather predictions to support U.S. military operations. For the decades of the cold war, national security requirements continued to drive the development of high performance computing (HPC), including advancement of the computing hardware and development of sophisticated simulation codes to support weapons and military aircraft design, numerical weather prediction, as well as data-intensive applications such as cryptography and cybersecurity. U.S. national security concerns continue to drive the development of high-performance computers and software in the U.S., and in fact, events following the end of the cold war have driven an increase in the growth rate of computer performance at the high end of the market. This mainly derives from our nation's observance of a moratorium on underground nuclear testing beginning in 1992, followed by our voluntary adherence to the Comprehensive Test Ban Treaty (CTBT) beginning in 1995. The CTBT prohibits further underground nuclear tests, which in the past had been a key component of the nation's science-based program for assuring the reliability, performance and safety of U.S. nuclear weapons. In response to this change, the U.S. Department of Energy (DOE) initiated the Science-Based Stockpile Stewardship (SBSS) program in response to the Fiscal Year 1994 National Defense Authorization Act, which requires, 'in the

  13. Computed neutron response of spherical moderator-detector systems for radiation protection monitoring

    International Nuclear Information System (INIS)

    Dhairyawan, M.P.

    1979-01-01

    Neutrons of energies below 500 keV are important from the point of view of radiation protection of personnel working around reactors. However, as no neutron sources are available at lower energies, no measured values of neutron energy response are available between thermal and 0.5 MeV (except for the Sb-Be source at 24 keV). The response functions in this range are, therefore, arrived at theoretically. After giving a comprehensive review of the work done in the field of the response of moderated neutron detectors, a Monte Carlo method developed for this purpose is described and used to calculate the energy response functions of two spherical moderator-detector systems, namely, one using a central BF3 counter and the other using a 6LiI(Eu) scintillator of 0.490 dia crystal. The polythene sphere diameter ranged from 2'' to 12''. The results obtained follow the trend predicted by other calculations and experiments, but are a definite improvement over them, because the most recent data on cross sections and angular distributions are used and the opacity of the detector, i.e. the presence and size of the detector within the moderator, is taken into account in the present calculations. The reasons for the discrepancies between the present results and those obtained earlier by other methods are discussed. The response of the Leake counter arrived at by the present method agrees very well with experimental calibration. (M.G.B.)

  14. Green Computing in Local Governments and Information Technology Companies

    Directory of Open Access Journals (Sweden)

    Badar Agung Nugroho

    2013-06-01

    Green computing is the study and practice of designing, manufacturing, using, and disposing of information and communication devices efficiently and effectively with minimum impact on the environment. If the green computing concept is implemented, it will help agencies and companies to reduce the energy and capital costs of their IT infrastructure. The goal of this research is to explore the current efforts of local governments and IT companies in West Java to implement the green computing concept in their working environments. The primary data were collected through a focus group discussion, inviting representatives of the local governments and IT companies who are responsible for managing their IT infrastructure. The secondary data were then collected through brief observations in order to see the real extent of green computing implementation at each institution. The result shows that there are many different perspectives on, and efforts toward, green computing implementation between local governments and IT companies.

  15. An approach to unfold the response of a multi-element system using an artificial neural network

    International Nuclear Information System (INIS)

    Cordes, E.; Fehrenbacher, G.; Schuetz, R.; Sprunck, M.; Hahn, K.; Hofmann, R.; Wahl, W.

    1998-01-01

    An unfolding procedure is proposed which aims at obtaining spectral information of a neutron radiation field by the analysis of the response of a multi-element system consisting of converter type semiconductors. For the unfolding procedure an artificial neural network (feed forward network), trained by the back-propagation method, was used. The response functions of the single elements to neutron radiation were calculated by application of a computational model for an energy range from 10^-2 eV to 10 MeV. The training of the artificial neural network was based on the computation of responses of a six-element system for a set of 300 neutron spectra and the application of the back-propagation method. The validation was performed by the unfolding of 100 computed responses. Two unfolding examples were pointed out for the determination of the neutron spectra. The spectra resulting from the unfolding procedure agree well with the original spectra used for the response computation
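
    The train-on-computed-responses idea can be sketched with a modern stand-in for the 1998 feed-forward network (scikit-learn's MLPRegressor, trained by back-propagation); the response matrix and spectra below are synthetic, not the paper's computed response functions:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)

n_elements, n_bins = 6, 12
# Synthetic response matrix R (detector element x energy bin), standing in
# for the computed response functions of the six-element system
R = rng.random((n_elements, n_bins))

# Training set: random spectra and the detector readings they would produce
spectra = rng.random((300, n_bins))
readings = spectra @ R.T

net = MLPRegressor(hidden_layer_sizes=(32,), max_iter=5000, random_state=0)
net.fit(readings, spectra)           # learn readings -> spectrum (unfolding)

test = rng.random((1, n_bins))
recovered = net.predict(test @ R.T)
print("mean abs error:", float(np.abs(recovered - test).mean()))
```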

  16. Parallel R-matrix computation

    International Nuclear Information System (INIS)

    Heggarty, J.W.

    1999-06-01

    In response to the challenge of achieving parallel R-matrix computation, the primary objective was to develop parallel codes, targeted at multicomputers, that are capable of performing R-matrix calculations hitherto intractable using classic supercomputers. In particular, Fortran implementations of two internal region methods (the R-matrix Floquet method and the two-dimensional R-matrix propagation method) and three external region methods (the Light-Walker propagation method, the Baluja, Burke and Morgan propagation method and the Variable Phase Method) from four widely utilised R-matrix packages were investigated to ascertain whether, in these cases, parallel R-matrix computation was practicable and, if so, to determine the most effective way to port such codes to contemporary multicomputers. When attempting to develop the parallel codes, a number of computer-aided automatic parallelization tools were investigated. These were found to be inadequate. Consequently, a parallelization approach was developed to provide simple guidelines for manual parallelization. This parallelization approach proved effective, and efficient parallel versions of the five R-matrix codes were successfully developed. (author)

  17. Noninvasive photoacoustic computed tomography of mouse brain metabolism in vivo

    OpenAIRE

    Yao, Junjie; Xia, Jun; Maslov, Konstantin I.; Nasiriavanaki, Mohammadreza; Tsytsarev, Vassiliy; Demchenko, Alexei V.; Wang, Lihong V.

    2012-01-01

    We have demonstrated the feasibility of imaging mouse brain metabolism using photoacoustic computed tomography (PACT), a fast, noninvasive and functional imaging modality with optical contrast and acoustic resolution. Brain responses to forepaw stimulations were imaged transdermally and transcranially. 2-NBDG, which diffuses well across the blood–brain barrier, provided exogenous contrast for photoacoustic imaging of glucose response. Concurrently, hemoglobin provided endogenous contrast for ...

  18. SONATINA-1: a computer program for seismic response analysis of column in HTGR core

    International Nuclear Information System (INIS)

    Ikushima, Takeshi

    1980-11-01

    A computer program, SONATINA-1, for predicting the behavior of a prismatic high-temperature gas-cooled reactor (HTGR) core under seismic excitation has been developed. In this analytical method, blocks are treated as rigid bodies and are constrained by dowel pins which restrict relative horizontal movement but allow vertical and rocking motions. Coulomb friction between blocks and between dowel holes and pins is also considered. A spring-dashpot model is used for the collision process between adjacent blocks and between blocks and boundary walls. Analytical results are compared with experimental results and are found to be in good agreement. The computer program can be used to predict the behavior of the HTGR core under seismic excitation. (author)

  19. Computational compliance criteria in water hammer modelling

    Directory of Open Access Journals (Sweden)

    Urbanowicz Kamil

    2017-01-01

    Among the many numerical methods (finite difference, finite element, finite volume, etc.) used to solve the system of partial differential equations describing unsteady pipe flow, the method of characteristics (MOC) is the most appreciated. With its help, it is possible to examine the effect of the numerical discretisation carried out over the pipe length. It was noticed, based on the tests performed in this study, that convergence of the calculation results occurred on a rectangular grid with the division of each pipe of the analysed system into at least 10 elements. Therefore, it is advisable to introduce computational compliance criteria (CCC), which will be responsible for optimal discretisation of the examined system. The results of this study, based on the assumption of various values of the Courant–Friedrichs–Lewy (CFL) number, indicate also that the CFL number should be equal to one for optimum computational results. Application of the CCC criterion to our own and commercial computer programmes based on the method of characteristics will guarantee fast simulations and the necessary computational coherence.
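
    The two criteria reported, at least about 10 reaches per pipe and a Courant number of one, translate directly into a grid rule; a minimal sketch with an illustrative pipe:

```python
def moc_grid(length_m, wave_speed_ms, n_reaches=10):
    """Rectangular MOC grid for one pipe: dx from the reach count,
    dt chosen so the Courant number Cr = a*dt/dx equals exactly 1."""
    dx = length_m / n_reaches
    dt = dx / wave_speed_ms
    return dx, dt

# Illustrative pipe: 100 m long, pressure wave speed 1200 m/s
dx, dt = moc_grid(100.0, 1200.0)
print(f"dx = {dx:.2f} m, dt = {dt * 1000:.3f} ms, Cr = {1200.0 * dt / dx:.1f}")
```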

  20. Proceedings of the International Computer Music Conference

    DEFF Research Database (Denmark)

    It is a pleasure to welcome everyone to the 2007 International Computer Music Conference, hosted in Copenhagen from the 27th to the 31st of August. ICMC2007 is organized by Re:New - Digital arts forum, in collaboration with the International Computer Music Association and Medialogy at Aalborg University... and interactive music and circus. The cross-disciplinary nature of ICMC is well represented by the Medialogy education, a new initiative started in Esbjerg and now well established in Copenhagen. ... efficient in providing feedback and comments on the numerous papers submitted. The response to the call for participation was very positive. We received 290 paper submissions and 554 music submissions. We are extremely grateful for the interest and support from around the world. It is an honor to welcome...

  1. Solving eigenvalue response matrix equations with nonlinear techniques

    International Nuclear Information System (INIS)

    Roberts, Jeremy A.; Forget, Benoit

    2014-01-01

    Highlights: • High performance solvers were applied within ERMM for the first time. • Accelerated fixed-point methods were developed that reduce computational times by a factor of 2–3. • A nonlinear, Newton-based ERMM led to similar improvement and more robustness. • A 3-D, SN-based ERMM shows how ERMM can apply fine-mesh methods to full-core analysis. - Abstract: This paper presents new algorithms for use in the eigenvalue response matrix method (ERMM) for reactor eigenvalue problems. ERMM spatially decomposes a domain into independent nodes linked via boundary conditions approximated as truncated orthogonal expansions, the coefficients of which are response functions. In its simplest form, ERMM consists of a two-level eigenproblem: an outer Picard iteration updates the k-eigenvalue via balance, while the inner λ-eigenproblem imposes neutron balance between nodes. Efficient methods are developed for solving the inner λ-eigenvalue problem within the outer Picard iteration. Based on results from several diffusion and transport benchmark models, it was found that the Krylov–Schur method applied to the λ-eigenvalue problem reduces Picard solver times (excluding response generation) by a factor of 2–5. Furthermore, alternative methods, including Picard acceleration schemes, Steffensen’s method, and Newton’s method, are developed in this paper. These approaches often yield faster k-convergence and a need for fewer k-dependent response function evaluations, which is important because response generation is often the primary cost for problems using responses computed online (i.e., not from a precomputed database). Accelerated Picard iteration was found to reduce total computational times by a factor of 2–3 compared to the unaccelerated case for problems dominated by response generation. In addition, Newton’s method was found to provide nearly the same performance with improved robustness

  2. PUMN: part I of the WINERY radiation damage computer simulation system

    International Nuclear Information System (INIS)

    Kuspa, J.P.; Edwards, D.R.; Tsoulfanidis, N.

    1976-01-01

    Results of computer work to simulate the response of crystalline materials to radiation are presented. To organize this and future work into a long range program of investigation, the WINERY Radiation Damage Computer Simulation System is proposed. The WINERY system is designed to solve the entire radiation damage problem from the incident radiation to the property changes which occur in the material, using a set of interrelated computer programs. One portion of the system, the PUMN program, has been used to obtain important radiation damage results with Fe3Al crystal. PUMN simulates the response of the atoms in a crystal to a knock-on atom. It yields the damage configuration of the crystal by considering the dynamic interaction of all the atoms of the computational cell, up to 1000 atoms. The PUMN program provides the WINERY system with results for the number of displacements, N/sub d/, due to knock-on atoms with various energies. The values of N/sub d/ for Fe3Al were obtained at two different energies, 100 and 500 eV, for a variety of initial directions. These values are to be used to form a table of results for use in WINERY

  3. Neural Computation and the Computational Theory of Cognition

    Science.gov (United States)

    Piccinini, Gualtiero; Bahar, Sonya

    2013-01-01

    We begin by distinguishing computationalism from a number of other theses that are sometimes conflated with it. We also distinguish between several important kinds of computation: computation in a generic sense, digital computation, and analog computation. Then, we defend a weak version of computationalism--neural processes are computations in the…

  4. A Matchmaking Strategy Of Mixed Resource On Cloud Computing Environment

    Directory of Open Access Journals (Sweden)

    Wisam Elshareef

    2015-08-01

    Today cloud computing has become a key technology for the online allotment of computing resources and online storage of user data at lower cost, where computing resources are available all the time over the Internet on a pay-per-use basis. Recently there is a growing need for resource management strategies in a cloud computing environment that encompass both end-user satisfaction and high job submission throughput with appropriate scheduling. One of the major and essential issues in resource management is matching incoming tasks to suitable virtual machines. The main objective of this paper is to propose a matchmaking strategy between the incoming requests and the various resources in the cloud environment, to satisfy the requirements of users and to balance the workload on resources. Load balancing is an important aspect of resource management in a cloud computing environment. This paper therefore proposes a dynamic weight active monitor (DWAM) load balancing algorithm, which allocates incoming requests on the fly to all available virtual machines in an efficient manner in order to achieve better performance parameters such as response time, processing time and resource utilization. The feasibility of the proposed algorithm is analysed using the CloudSim simulator, which proves the superiority of the proposed DWAM algorithm over its counterparts in the literature. Simulation results demonstrate that the proposed algorithm dramatically improves response time and data processing time and makes better use of resources compared with the Active Monitor and VM-assign algorithms.
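
    The general dynamic-weight idea (not the published DWAM algorithm itself) can be sketched as follows: each VM carries a capacity weight, and each incoming request goes to the VM with the lowest load-to-weight ratio. Weights and request costs below are hypothetical:

```python
import heapq

class WeightedBalancer:
    """Toy dynamic-weight balancer: each VM has a capacity weight; requests
    go to the VM with the lowest load-to-weight ratio. This illustrates the
    general idea only; it is not the published DWAM algorithm."""

    def __init__(self, weights):
        # Heap of (load/weight, vm_id, load, weight)
        self.heap = [(0.0, vm, 0.0, w) for vm, w in enumerate(weights)]
        heapq.heapify(self.heap)

    def assign(self, cost):
        ratio, vm, load, weight = heapq.heappop(self.heap)
        load += cost
        heapq.heappush(self.heap, (load / weight, vm, load, weight))
        return vm

balancer = WeightedBalancer(weights=[4.0, 2.0, 1.0])  # faster VMs get more work
for i, cost in enumerate([3, 1, 2, 5, 1, 4, 2]):
    print(f"request {i} (cost {cost}) -> VM {balancer.assign(cost)}")
```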

  5. Expansion of the TFTR neutral beam computer system

    International Nuclear Information System (INIS)

    McEnerney, J.; Chu, J.; Davis, S.; Fitzwater, J.; Fleming, G.; Funk, P.; Hirsch, J.; Lagin, L.; Locasak, V.; Randerson, L.; Schechtman, N.; Silber, K.; Skelly, G.; Stark, W.

    1992-01-01

    Previous TFTR Neutral Beam computing support was based primarily on an Encore Concept 32/8750 computer within the TFTR Central Instrumentation, Control and Data Acquisition System (CICADA). The resources of this machine were 90% utilized during a 2.5 minute duty cycle. Both interactive and automatic processes were supported, with interactive response suffering at lower priority. Further, there were additional computing requirements and no cost effective path for expansion within the Encore framework. Two elements provided a solution to these problems: improved price performance for computing and a high speed bus link to the SELBUS. The purchase of a Sun SPARCstation and a VME/SELBUS bus link, allowed offloading the automatic processing to the workstation. This paper describes the details of the system including the performance of the bus link and Sun SPARCstation, raw data acquisition and data server functions, application software conversion issues, and experiences with the UNIX operating system in the mixed platform environment

  6. Computed radiography systems performance evaluation

    International Nuclear Information System (INIS)

    Xavier, Clarice C.; Nersissian, Denise Y.; Furquim, Tania A.C.

    2009-01-01

    The performance of a computed radiography system was evaluated according to AAPM Report No. 93. The evaluation tests proposed by the publication were performed, and the following nonconformities were found: imaging plate (IP) dark noise, which compromises the clinical image acquired using the IP; an uncalibrated exposure indicator, which can cause underexposure of the IP; nonlinearity of the system response, which causes overexposure; a resolution limit below that declared by the manufacturer and uncalibrated erasure thoroughness, impairing the visualization of structures; a Moire pattern visible in the grid response; and IP throughput above that specified by the manufacturer. These nonconformities indicate that a lack of calibration in digital imaging systems can cause an increase in dose in order to resolve image problems. (author)

  7. Computational Design Modelling : Proceedings of the Design Modelling Symposium

    CERN Document Server

    Kilian, Axel; Palz, Norbert; Scheurer, Fabian

    2012-01-01

    This book publishes the peer-reviewed proceedings of the third Design Modelling Symposium Berlin. The conference constitutes a platform for dialogue on experimental practice and research within the field of computationally informed architectural design. More than 60 leading experts examine the computational processes within this field, working to develop a broader and less exotic building practice that bears more subtle but powerful traces of the complex tool set and approaches developed and studied over recent years. The outcome is new strategies for a reasonable and innovative implementation of digital potential in truly innovative and radical design, guided by responsibility towards both the processes and the consequences they initiate.

  8. CLOUD COMPUTING ADOPTION STRATEGIES AT PT TASPEN INDONESIA, Tbk

    Directory of Open Access Journals (Sweden)

    Julirzal Sarmedy

    2014-10-01

    PT. Taspen, as an Indonesian institution, is responsible for managing the social insurance programs of civil servants. With branch offices and business partners who are geographically dispersed throughout Indonesia, information technology is very important for supporting the business processes. Cloud computing is a model of information technology services that could potentially increase the effectiveness and efficiency of PT. Taspen's information system. This study examines the conditions at PT. Taspen relevant to adopting cloud computing in its information system, using the Technology-Organization-Environment framework, Diffusion of Innovation theory, and the Partial Least Squares method. The organizational factor is the most dominant one for PT. Taspen in adopting cloud computing. Referring to these findings, a SWOT analysis and TOWS matrix were performed, which in this study recommend the gradual implementation of a private cloud computing service model.

  9. Deterministic sensitivity and uncertainty analysis for large-scale computer models

    International Nuclear Information System (INIS)

    Worley, B.A.; Pin, F.G.; Oblow, E.M.; Maerker, R.E.; Horwedel, J.E.; Wright, R.Q.

    1988-01-01

    The fields of sensitivity and uncertainty analysis have traditionally been dominated by statistical techniques when large-scale modeling codes are being analyzed. These methods are able to estimate sensitivities, generate response surfaces, and estimate response probability distributions given the input parameter probability distributions. Because the statistical methods are computationally costly, they are usually applied only to problems with relatively small parameter sets. Deterministic methods, on the other hand, are very efficient and can handle large data sets, but generally require simpler models because of the considerable programming effort required for their implementation. The first part of this paper reports on the development and availability of two systems, GRESS and ADGEN, that make use of computer calculus compilers to automate the implementation of deterministic sensitivity analysis capability into existing computer models. This automation removes the traditional limitation of deterministic sensitivity methods. The second part of the paper describes a deterministic uncertainty analysis method (DUA) that uses derivative information as a basis to propagate parameter probability distributions to obtain result probability distributions. The methods described are applicable to low-level radioactive waste disposal system performance assessment
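
    The DUA step can be illustrated in its simplest first-order form, var(f) ≈ Σᵢ (∂f/∂xᵢ)² var(xᵢ). In the sketch below the derivatives come from finite differences on a toy model, whereas GRESS/ADGEN would generate them automatically from the model source code:

```python
import math

def first_order_uncertainty(f, x0, sigmas, h=1e-6):
    """First-order propagation: var(f) ~ sum_i (df/dx_i)^2 * var(x_i).
    Derivatives approximated by forward finite differences."""
    f0 = f(x0)
    var = 0.0
    for i, s in enumerate(sigmas):
        xp = list(x0)
        xp[i] += h
        dfdx = (f(xp) - f0) / h
        var += (dfdx * s) ** 2
    return f0, math.sqrt(var)

# Toy model: dose ~ source / distance^2 * attenuation factor (illustrative)
model = lambda x: x[0] / x[1] ** 2 * math.exp(-x[2])
mean, sigma = first_order_uncertainty(model, [100.0, 5.0, 0.3], [10.0, 0.25, 0.05])
print(f"result = {mean:.3f} +/- {sigma:.3f}")
```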

  10. Digital Ethics: Computers, Photographs, and the Manipulation of Pixels.

    Science.gov (United States)

    Mercedes, Dawn

    1996-01-01

    Summarizes negative aspects of computer technology and problems inherent in the field of digital imaging. Considers the postmodernist response that borrowing and alteration are essential characteristics of the technology. Discusses the implications of this for education and research. (MJP)

  11. Activity report of Computing Research Center

    Energy Technology Data Exchange (ETDEWEB)

    1997-07-01

    In April 1997, the National Laboratory for High Energy Physics (KEK), the Institute of Nuclear Study, University of Tokyo (INS), and the Meson Science Laboratory, Faculty of Science, University of Tokyo were reorganized into the High Energy Accelerator Research Organization, with the aim of further developing the wide field of accelerator science based on high energy accelerators. Within this Research Organization, the Applied Research Laboratory is composed of four Centers, formed by integrating the previous four centers and the related sections in Tanashi, to support research activities common to the whole Organization and to carry out the related research and development (R and D). This support covers not only general assistance but also the preparation, and the R and D, of the systems required for the promotion and future planning of the research. Computer technology is essential to the development of this research and can be shared among the various research programs of the Organization. In response to these expectations, the new Computing Research Center is required to carry out its duties in collaboration and cooperation with researchers, over a range extending from R and D on the data analysis of various experiments to computational physics driven by powerful computing capacity such as supercomputers. This report describes the work and present state of the Data Processing Center of KEK in the first chapter and of the computer room of INS in the second chapter, together with future problems for the Computing Research Center. (G.K.)

  12. Detailed comparison between computed and measured FBR core seismic responses

    International Nuclear Information System (INIS)

    Forni, M.; Martelli, A.; Melloni, R.; Bonacina, G.

    1988-01-01

    This paper presents a detailed comparison between seismic calculations and measurements performed for various mock-ups consisting of groups of seven and nineteen simplified elements of the Italian PEC fast reactor core. Experimental tests had been performed on shaking tables in air and water (simulating sodium) with excitations increasing up to above the Safe Shutdown Earthquake. The PEC core-restraint ring had been simulated in some tests. All the experimental tests have been analysed by use of both the one-dimensional computer program CORALIE and the two-dimensional program CLASH. Comparisons have been made for all the instrumented elements, in both the time and the frequency domains. The good agreement between calculations and measurements has confirmed the adequacy of the fluid-structure interaction model used for PEC core seismic design verification

  13. Operational Circular nr 5 - October 2000 USE OF CERN COMPUTING FACILITIES

    CERN Multimedia

    Division HR

    2000-01-01

    New rules covering the use of CERN computing facilities have been drawn up. All users of CERN’s computing facilities are subject to these rules, as well as to the subsidiary rules of use. The Computing Rules explicitly address your responsibility for taking reasonable precautions to protect computing equipment and accounts. In particular, passwords must not be easily guessed or obtained by others. Given the difficulty of completely separating work and personal use of computing facilities, the rules define the conditions under which limited personal use is tolerated. For example, limited personal use of e-mail, news groups or web browsing is tolerated in your private time, provided CERN resources and your official duties are not adversely affected. The full conditions governing use of CERN’s computing facilities are contained in Operational Circular N° 5, which you are requested to read. Full details are available at: http://www.cern.ch/ComputingRules Copies of the circular are also available in the Divis...

  14. Towards Computing Ratcheting and Training in Superconducting Magnets

    International Nuclear Information System (INIS)

    Ferracin, Paolo; Caspi, Shlomo; Lietzke, A.F.

    2007-01-01

    The Superconducting Magnet Group at Lawrence Berkeley National Laboratory (LBNL) has been developing 3D finite element models to predict the behavior of high field Nb3Sn superconducting magnets. The models track the coil response during assembly, cool-down, and excitation, with particular interest in displacements where frictional forces arise. As Lorentz forces were cycled, irreversible displacements were computed and compared with strain gauge measurements. Additional analysis was done on the local frictional energy released during magnet excitation, and the resulting temperature rise. Magnet quenching and training were correlated with the level of energy release during such mechanical displacements under frictional forces. We report in this paper the computational results of the ratcheting process, the impact of friction, and the path-dependent energy release leading to a computed magnet training curve

  15. USSR orders computers to improve nuclear safety

    International Nuclear Information System (INIS)

    Anon.

    1990-01-01

    Control Data Corp (CDC) has received an order valued at $32-million from the Soviet Union for six Cyber 962 mainframe computer systems to be used to increase the safety of civilian nuclear powerplants. The firm is now waiting for approval of the contract by the US government and Western Allies. The computers, ordered by the Soviet Research and Development Institute of Power Engineering (RDIPE), will analyze safety factors in the operation of nuclear reactors over a wide range of conditions. The Soviet Union's civilian nuclear program is one of the largest in the world, with over 50 plants in operation. Types of safety analyses the computers perform include: neutron-physics calculations, radiation-protection studies, stress analysis, reliability analysis of equipment and systems, ecological-impact calculations, transient analysis, and support activities for emergency response. They also include a simulator with realistic mathematical models of Soviet nuclear powerplants to improve operator training

  16. A computer-based measure of resultant achievement motivation.

    Science.gov (United States)

    Blankenship, V

    1987-08-01

    Three experiments were conducted to develop a computer-based measure of individual differences in resultant achievement motivation (RAM) on the basis of level-of-aspiration, achievement motivation, and dynamics-of-action theories. In Experiment 1, the number of atypical shifts and greater responsiveness to incentives on 21 trials with choices among easy, intermediate, and difficult levels of an achievement-oriented game were positively correlated and were found to differentiate the 62 subjects (31 men, 31 women) on the amount of time they spent at a nonachievement task (watching a color design) 1 week later. In Experiment 2, test-retest reliability was established with the use of 67 subjects (15 men, 52 women). Point and no-point trials were offered in blocks, with point trials first for half the subjects and no-point trials first for the other half. Reliability was higher for the atypical-shift measure than for the incentive-responsiveness measure and was higher when points were offered first. In Experiment 3, computer anxiety was manipulated by creating a simulated computer breakdown in the experimental condition. Fifty-nine subjects (13 men, 46 women) were randomly assigned to the experimental condition or to one of two control conditions (an interruption condition and a no-interruption condition). Subjects with low RAM, as demonstrated by a low number of typical shifts, took longer to choose the achievement-oriented task, as predicted by the dynamics-of-action theory. The difference was evident in all conditions and most striking in the computer-breakdown condition. A change of focus from atypical to typical shifts is discussed.

  17. Computer-aided design and computer science technology

    Science.gov (United States)

    Fulton, R. E.; Voigt, S. J.

    1976-01-01

    A description is presented of computer-aided design requirements and the resulting computer science advances needed to support aerospace design. The aerospace design environment is examined, taking into account problems of data handling and aspects of computer hardware and software. The interactive terminal is normally the primary interface between the computer system and the engineering designer. Attention is given to user aids, interactive design, interactive computations, the characteristics of design information, data management requirements, hardware advancements, and computer science developments.

  18. Integral transport computation of gamma detector response with the CPM2 code

    International Nuclear Information System (INIS)

    Jones, D.B.

    1989-12-01

    CPM-2 Version 3 is an enhanced version of the CPM-2 lattice physics computer code which supports the capabilities to (1) perform a two-dimensional gamma flux calculation and (2) perform Restart/Data file maintenance operations. The Gamma Calculation Module implemented in CPM-2 was first developed for EPRI in the CASMO-1 computer code by Studsvik Energiteknik under EPRI Agreement RP2352-01. The gamma transport calculation uses the CPM-HET code module to calculate the transport of gamma rays in two dimensions in a mixed cylindrical-rectangular geometry, where the basic fuel assembly and component regions are maintained in a rectangular geometry, but the fuel pins are represented as cylinders within a square pin cell mesh. Such a capability is needed to represent gamma transport in an essentially transparent medium containing spatially distributed ''black'' cylindrical pins. Under a subcontract to RP2352-01, RPI developed the gamma production and gamma interaction library used for gamma calculation. The CPM-2 gamma calculation was verified against reference results generated by Studsvik using the CASMO-1 program. The CPM-2 Restart/Data file maintenance capabilities provide the user with options to copy files between Restart/Data tapes and to purge files from the Restart/Data tapes

  19. Morphologic and Metabolic Comparison of Treatment Responsiveness with 18F-Fludeoxyglucose-Positron Emission Tomography/Computed Tomography According to Lung Cancer Type

    Directory of Open Access Journals (Sweden)

    Mehmet Fatih Börksüz

    2016-06-01

    Full Text Available Objective: The aim of the present study was to evaluate the response to treatment by histopathologic type in patients with lung cancer under follow-up with 18F-fluoro-2-deoxy-glucose-positron emission tomography/computed tomography (18F-FDG PET/CT) imaging, using the Response Evaluation Criteria in Solid Tumors (RECIST), which evaluate morphologic parameters, and the European Organisation for Research and Treatment of Cancer (EORTC) criteria, which evaluate metabolic parameters. Methods: On two separate (pre- and post-treatment) 18F-FDG PET/CT images, the longest dimension of the primary tumor as well as of secondary lesions was measured, and the sum of these two measurements was recorded as the total dimension in 40 patients. PET parameters such as the maximum standardized uptake value (SUVmax), metabolic volume and total lesion glycolysis (TLG) were also recorded for these target lesions on the two separate 18F-FDG PET/CT images. The percent (%) change was calculated for all these parameters. Morphologic evaluation was based on RECIST 1.1 and metabolic evaluation was based on the EORTC criteria. Results: When evaluated before and after treatment, a statistically significant change was observed (p<0.05). In histopathologic typing, comparing the post-treatment change with the treatment responses under the RECIST 1.1 and EORTC criteria: by RECIST 1.1, in the squamous cell lung cancer group, progression was observed in sixteen patients (57%), stability in seven patients (25%) and partial response in five patients (18%); by EORTC, progression was detected in four patients (14%), stability in thirteen patients (47%) and partial response in eleven patients (39%), with an increase in stage in 12 of these patients (43%), a decrease in stage in 4 (14%), and stable stage in 12 (43%). In the adenocancer patients (n=7), by RECIST 1.1, progression was determined in four patients (57%), stability in two patients (29%) and partial response in one patient (14%); by EORTC, progression in one patient (14

  20. Two- and three-input TALE-based AND logic computation in embryonic stem cells.

    Science.gov (United States)

    Lienert, Florian; Torella, Joseph P; Chen, Jan-Hung; Norsworthy, Michael; Richardson, Ryan R; Silver, Pamela A

    2013-11-01

    Biological computing circuits can enhance our ability to control cellular functions and have potential applications in tissue engineering and medical treatments. Transcriptional activator-like effectors (TALEs) represent attractive components of synthetic gene regulatory circuits, as they can be designed de novo to target a given DNA sequence. We here demonstrate that TALEs can perform Boolean logic computation in mammalian cells. Using a split-intein protein-splicing strategy, we show that a functional TALE can be reconstituted from two inactive parts, thus generating two-input AND logic computation. We further demonstrate three-piece intein splicing in mammalian cells and use it to perform three-input AND computation. Using methods for random as well as targeted insertion of these relatively large genetic circuits, we show that TALE-based logic circuits are functional when integrated into the genome of mouse embryonic stem cells. Comparing construct variants in the same genomic context, we modulated the strength of the TALE-responsive promoter to improve the output of these circuits. Our work establishes split TALEs as a tool for building logic computation with the potential of controlling expression of endogenous genes or transgenes in response to a combination of cellular signals.

  1. Numerical computation of aeroacoustic transfer functions for realistic airfoils

    NARCIS (Netherlands)

    De Santana, Leandro Dantas; Miotto, Renato Fuzaro; Wolf, William Roberto

    2017-01-01

    Based on Amiet's theory formalism, we propose a numerical framework to compute the aeroacoustic transfer function of realistic airfoil geometries. The aeroacoustic transfer function relates the amplitude and phase of an incoming periodic gust to the respective unsteady lift response permitting,

  2. Computers, business, and security the new role for security

    CERN Document Server

    Schweitzer, James A

    1987-01-01

    Computers, Business, and Security: The New Role for Security addresses the professional security manager's responsibility to protect all business resources, with operating environments and high technology in mind. This book discusses the technological aspects of the total security programs.Organized into three parts encompassing 10 chapters, this book begins with an overview of how the developing information age is affecting business management, operations, and organization. This text then examines a number of vulnerabilities that arise in the process of using business computing and communicat

  3. Advanced Simulation and Computing Co-Design Strategy

    Energy Technology Data Exchange (ETDEWEB)

    Ang, James A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hoang, Thuc T. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kelly, Suzanne M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); McPherson, Allen [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Neely, Rob [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-11-01

    This ASC Co-design Strategy lays out the full continuum and components of the co-design process, based on what we have experienced thus far and what we wish to do more in the future to meet the program’s mission of providing high performance computing (HPC) and simulation capabilities for NNSA to carry out its stockpile stewardship responsibility.

  4. The myth of secure computing.

    Science.gov (United States)

    Austin, Robert D; Darby, Christopher A

    2003-06-01

    Few senior executives pay a whole lot of attention to computer security. They either hand off responsibility to their technical people or bring in consultants. But given the stakes involved, an arm's-length approach is extremely unwise. According to industry estimates, security breaches affect 90% of all businesses every year and cost some $17 billion. Fortunately, the authors say, senior executives don't need to learn about the more arcane aspects of their company's IT systems in order to take a hands-on approach. Instead, they should focus on the familiar task of managing risk. Their role should be to assess the business value of their information assets, determine the likelihood that those assets will be compromised, and then tailor a set of risk abatement processes to their company's particular vulnerabilities. This approach, which views computer security as an operational rather than a technical challenge, is akin to a classic quality assurance program in that it attempts to avoid problems rather than fix them and involves all employees, not just IT staffers. The goal is not to make computer systems completely secure--that's impossible--but to reduce the business risk to an acceptable level. This article looks at the types of threats a company is apt to face. It also examines the processes a general manager should spearhead to lessen the likelihood of a successful attack. The authors recommend eight processes in all, ranging from deciding how much protection each digital asset deserves to insisting on secure software to rehearsing a response to a security breach. The important thing to realize, they emphasize, is that decisions about digital security are not much different from other cost-benefit decisions. The tools general managers bring to bear on other areas of the business are good models for what they need to do in this technical space.

  5. A Computational Model of Cellular Response to Modulated Radiation Fields

    Energy Technology Data Exchange (ETDEWEB)

    McMahon, Stephen J., E-mail: stephen.mcmahon@qub.ac.uk [Centre for Cancer Research and Cell Biology, Queen's University Belfast, Belfast, Northern Ireland (United Kingdom); Butterworth, Karl T. [Centre for Cancer Research and Cell Biology, Queen's University Belfast, Belfast, Northern Ireland (United Kingdom); McGarry, Conor K. [Centre for Cancer Research and Cell Biology, Queen's University Belfast, Belfast, Northern Ireland (United Kingdom); Radiotherapy Physics, Northern Ireland Cancer Centre, Belfast Health and Social Care Trust, Northern Ireland (United Kingdom); Trainor, Colman [Centre for Cancer Research and Cell Biology, Queen's University Belfast, Belfast, Northern Ireland (United Kingdom); O'Sullivan, Joe M. [Centre for Cancer Research and Cell Biology, Queen's University Belfast, Belfast, Northern Ireland (United Kingdom); Clinical Oncology, Northern Ireland Cancer Centre, Belfast Health and Social Care Trust, Belfast, Northern Ireland (United Kingdom); Hounsell, Alan R. [Centre for Cancer Research and Cell Biology, Queen's University Belfast, Belfast, Northern Ireland (United Kingdom); Radiotherapy Physics, Northern Ireland Cancer Centre, Belfast Health and Social Care Trust, Northern Ireland (United Kingdom); Prise, Kevin M. [Centre for Cancer Research and Cell Biology, Queen's University Belfast, Belfast, Northern Ireland (United Kingdom)

    2012-09-01

    Purpose: To develop a model to describe the response of cell populations to spatially modulated radiation exposures of relevance to advanced radiotherapies. Materials and Methods: A Monte Carlo model of cellular radiation response was developed. This model incorporated damage from both direct radiation and intercellular communication including bystander signaling. The predictions of this model were compared to previously measured survival curves for a normal human fibroblast line (AGO1522) and prostate tumor cells (DU145) exposed to spatially modulated fields. Results: The model was found to be able to accurately reproduce cell survival both in populations which were directly exposed to radiation and those which were outside the primary treatment field. The model predicts that the bystander effect makes a significant contribution to cell killing even in uniformly irradiated cells. The bystander effect contribution varies strongly with dose, falling from a high of 80% at low doses to 25% and 50% at 4 Gy for AGO1522 and DU145 cells, respectively. This was verified using the inducible nitric oxide synthase inhibitor aminoguanidine to inhibit the bystander effect in cells exposed to different doses, which showed significantly larger reductions in cell killing at lower doses. Conclusions: The model presented in this work accurately reproduces cell survival following modulated radiation exposures, both in and out of the primary treatment field, by incorporating a bystander component. In addition, the model suggests that the bystander effect is responsible for a significant portion of cell killing in uniformly irradiated cells, 50% and 70% at doses of 2 Gy in AGO1522 and DU145 cells, respectively. This description is a significant departure from accepted radiobiological models and may have a significant impact on optimization of treatment planning approaches if proven to be applicable in vivo.
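
    The abstract describes the model only qualitatively. Purely as a hedged illustration of the structure such a Monte Carlo might take (not the authors' calibrated model), the sketch below combines a linear-quadratic direct-damage term with a saturating bystander term; all coefficients are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative parameters only; the published model calibrates its terms
# against measured AGO1522 and DU145 survival curves.
ALPHA, BETA = 0.15, 0.05      # linear-quadratic coefficients (Gy^-1, Gy^-2)
BYST_MAX, BYST_K = 0.30, 1.0  # saturating bystander killing probability

def survives(dose):
    """One Monte Carlo trial: independent chances of being killed by direct
    damage or by a bystander signal, so P(kill) = 1 - (1-p1)(1-p2)."""
    p_direct = 1.0 - np.exp(-(ALPHA * dose + BETA * dose**2))
    p_byst = BYST_MAX * dose / (BYST_K + dose)  # saturates at high dose
    killed = rng.random() < p_direct or rng.random() < p_byst
    return not killed

for dose in (0.5, 1.0, 2.0, 4.0):
    n = 20000
    sf = sum(survives(dose) for _ in range(n)) / n
    print(f"{dose:>4} Gy: surviving fraction ~ {sf:.3f}")
```

    Because the bystander term saturates while the direct term keeps growing, the bystander share of total killing falls with dose, which is the qualitative trend the paper reports.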

  6. A Computational Model of Cellular Response to Modulated Radiation Fields

    International Nuclear Information System (INIS)

    McMahon, Stephen J.; Butterworth, Karl T.; McGarry, Conor K.; Trainor, Colman; O’Sullivan, Joe M.; Hounsell, Alan R.; Prise, Kevin M.

    2012-01-01

    Purpose: To develop a model to describe the response of cell populations to spatially modulated radiation exposures of relevance to advanced radiotherapies. Materials and Methods: A Monte Carlo model of cellular radiation response was developed. This model incorporated damage from both direct radiation and intercellular communication including bystander signaling. The predictions of this model were compared to previously measured survival curves for a normal human fibroblast line (AGO1522) and prostate tumor cells (DU145) exposed to spatially modulated fields. Results: The model was found to be able to accurately reproduce cell survival both in populations which were directly exposed to radiation and those which were outside the primary treatment field. The model predicts that the bystander effect makes a significant contribution to cell killing even in uniformly irradiated cells. The bystander effect contribution varies strongly with dose, falling from a high of 80% at low doses to 25% and 50% at 4 Gy for AGO1522 and DU145 cells, respectively. This was verified using the inducible nitric oxide synthase inhibitor aminoguanidine to inhibit the bystander effect in cells exposed to different doses, which showed significantly larger reductions in cell killing at lower doses. Conclusions: The model presented in this work accurately reproduces cell survival following modulated radiation exposures, both in and out of the primary treatment field, by incorporating a bystander component. In addition, the model suggests that the bystander effect is responsible for a significant portion of cell killing in uniformly irradiated cells, 50% and 70% at doses of 2 Gy in AGO1522 and DU145 cells, respectively. This description is a significant departure from accepted radiobiological models and may have a significant impact on optimization of treatment planning approaches if proven to be applicable in vivo.

  7. Multi-scale analysis of lung computed tomography images

    CERN Document Server

    Gori, I; Fantacci, M E; Preite Martinez, A; Retico, A; De Mitri, I; Donadio, S; Fulcheri, C

    2007-01-01

    A computer-aided detection (CAD) system for the identification of lung internal nodules in low-dose multi-detector helical Computed Tomography (CT) images was developed in the framework of the MAGIC-5 project. The three modules of our lung CAD system, a segmentation algorithm for lung internal region identification, a multi-scale dot-enhancement filter for nodule candidate selection and a multi-scale neural technique for false positive finding reduction, are described. The results obtained on a dataset of low-dose and thin-slice CT scans are shown in terms of free response receiver operating characteristic (FROC) curves and discussed.

  8. Military computer games and the new American militarism: what computer games teach us about war

    OpenAIRE

    Thomson, Matthew Ian Malcolm

    2009-01-01

    Military computer games continue to evoke a uniquely contradictory public, intellectual, and critical response. Whilst denigrated as child’s play, they are played by millions of adults; whilst dismissed as simplistic, they are used in education, therapy, and military training; and whilst classed as meaningless, they arouse fears over media effects and the propagandist influence of their representations of combat. They remain the object of intense suspicion, and as part of a new and growing ma...

  9. Computation: A New Open Access Journal of Computational Chemistry, Computational Biology and Computational Engineering

    Directory of Open Access Journals (Sweden)

    Karlheinz Schwarz

    2013-09-01

    Full Text Available Computation (ISSN 2079-3197; http://www.mdpi.com/journal/computation) is an international scientific open access journal focusing on fundamental work in the field of computational science and engineering. Computational science has become essential in many research areas by contributing to solving complex problems, from fundamental science all the way to engineering. The very broad range of application domains suggests structuring this journal into three sections, which are briefly characterized below. In each section, a further focus will be provided by occasionally organizing special issues on topics of high interest, collecting papers on fundamental work in the field. More applied papers should be submitted to their corresponding specialist journals. To help us achieve our goal with this journal, we have an excellent editorial board to advise us on the exciting current and future trends in computation from methodology to application. We very much look forward to hearing all about the research going on across the world. [...

  10. Computing handbook computer science and software engineering

    CERN Document Server

    Gonzalez, Teofilo; Tucker, Allen

    2014-01-01

    Overview of Computer Science: Structure and Organization of Computing (Peter J. Denning); Computational Thinking (Valerie Barr). Algorithms and Complexity: Data Structures (Mark Weiss); Basic Techniques for Design and Analysis of Algorithms (Edward Reingold); Graph and Network Algorithms (Samir Khuller and Balaji Raghavachari); Computational Geometry (Marc van Kreveld); Complexity Theory (Eric Allender, Michael Loui, and Kenneth Regan); Formal Models and Computability (Tao Jiang, Ming Li, and Bala

  11. A personal computer code for seismic evaluations of nuclear power plant facilities

    International Nuclear Information System (INIS)

    Xu, J.; Graves, H.

    1991-01-01

    In the process of review and evaluation of licensing issues related to nuclear power plants, it is essential to understand the behavior of seismic loading, foundation and structural properties and their impact on the overall structural response. In most cases, such knowledge could be obtained by using simplified engineering models which, when properly implemented, can capture the essential parameters describing the physics of the problem. Such models do not require execution on large computer systems and can be implemented through a personal computer (PC) based capability. Recognizing the need for a PC software package that can perform the structural response computations required for typical licensing reviews, the US Nuclear Regulatory Commission sponsored the development of the PC-operated computer software package CARES (Computer Analysis for Rapid Evaluation of Structures). This development was undertaken by Brookhaven National Laboratory (BNL) during FY 1988 and FY 1989. A wide range of computer programs and modeling approaches are often used to justify the safety of nuclear power plants. It is often difficult to assess the validity and accuracy of the results submitted by various utilities without developing comparable computer solutions. Taking this into consideration, CARES was designed as an integrated computational system which can perform rapid evaluations of structural behavior and examine the capability of nuclear power plant facilities; thus CARES may be used by the NRC to determine the validity and accuracy of analysis methodologies employed for structural safety evaluations of nuclear power plants. CARES has been designed to operate on a PC, to have a user-friendly input/output interface, and to provide quick turnaround. This paper describes the various features which have been implemented into the seismic module of CARES version 1.0

  12. Complex computation in the retina

    Science.gov (United States)

    Deshmukh, Nikhil Rajiv

    Elucidating the general principles of computation in neural circuits is a difficult problem requiring both a tractable model circuit and sophisticated measurement tools. This thesis advances our understanding of complex computation in the salamander retina and its underlying circuitry and furthers the development of advanced tools to enable detailed study of neural circuits. The retina provides an ideal model system for neural circuits in general because it is capable of producing complex representations of the visual scene, and both its inputs and outputs are accessible to the experimenter. Chapter 2 describes the biophysical mechanisms that give rise to the omitted stimulus response in retinal ganglion cells described in Schwartz et al. (2007) and Schwartz and Berry (2008). The extra response to omitted flashes is generated at the input to bipolar cells, and is separable from the characteristic latency shift of the OSR apparent in ganglion cells, which must occur downstream in the circuit. Chapter 3 characterizes the nonlinearities at the first synapse of the ON pathway in response to high contrast flashes and develops a phenomenological model that captures the effect of synaptic activation and intracellular signaling dynamics on flash responses. This work is the first attempt to model the dynamics of the poorly characterized mGluR6 transduction cascade unique to ON bipolar cells, and explains the second lobe of the biphasic flash response. Complementary to the study of neural circuits, recent advances in wafer-scale photolithography have made possible new devices to measure the electrical and mechanical properties of neurons. Chapter 4 reports a novel piezoelectric sensor that facilitates the simultaneous measurement of electrical and mechanical signals in neural tissue. This technology could reveal the relationship between the electrical activity of neurons and their local mechanical environment, which is critical to the study of mechanoreceptors

  13. Solving a Hamiltonian Path Problem with a bacterial computer

    Directory of Open Access Journals (Sweden)

    Treece Jessica

    2009-07-01

    Full Text Available Abstract Background The Hamiltonian Path Problem asks whether there is a route in a directed graph from a beginning node to an ending node, visiting each node exactly once. The Hamiltonian Path Problem is NP complete, achieving surprising computational complexity with modest increases in size. This challenge has inspired researchers to broaden the definition of a computer. DNA computers have been developed that solve NP complete problems. Bacterial computers can be programmed by constructing genetic circuits to execute an algorithm that is responsive to the environment and whose result can be observed. Each bacterium can examine a solution to a mathematical problem and billions of them can explore billions of possible solutions. Bacterial computers can be automated, made responsive to selection, and reproduce themselves so that more processing capacity is applied to problems over time. Results We programmed bacteria with a genetic circuit that enables them to evaluate all possible paths in a directed graph in order to find a Hamiltonian path. We encoded a three node directed graph as DNA segments that were autonomously shuffled randomly inside bacteria by a Hin/hixC recombination system we previously adapted from Salmonella typhimurium for use in Escherichia coli. We represented nodes in the graph as linked halves of two different genes encoding red or green fluorescent proteins. Bacterial populations displayed phenotypes that reflected random ordering of edges in the graph. Individual bacterial clones that found a Hamiltonian path reported their success by fluorescing both red and green, resulting in yellow colonies. We used DNA sequencing to verify that the yellow phenotype resulted from genotypes that represented Hamiltonian path solutions, demonstrating that our bacterial computer functioned as expected. Conclusion We successfully designed, constructed, and tested a bacterial computer capable of finding a Hamiltonian path in a three node
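
    For comparison with the in-vivo search, a conventional computer solves the same problem by brute force over node orderings. A minimal sketch for a hypothetical three-node directed graph follows; the edge set is illustrative, not the one encoded in the plasmids.

```python
from itertools import permutations

def hamiltonian_paths(nodes, edges):
    """Return every ordering of nodes that walks only along directed edges,
    visiting each node exactly once."""
    return [p for p in permutations(nodes)
            if all((a, b) in edges for a, b in zip(p, p[1:]))]

# A three-node directed graph, comparable in size to the one the bacteria explore.
nodes = ("A", "B", "C")
edges = {("A", "B"), ("B", "C"), ("C", "A")}
print(hamiltonian_paths(nodes, edges))
# -> [('A', 'B', 'C'), ('B', 'C', 'A'), ('C', 'A', 'B')]
```

    Brute force is fine at three nodes but grows factorially, which is exactly why the massive parallelism of a bacterial population is an interesting alternative for larger instances.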

  14. Solving a Hamiltonian Path Problem with a bacterial computer

    Science.gov (United States)

    Baumgardner, Jordan; Acker, Karen; Adefuye, Oyinade; Crowley, Samuel Thomas; DeLoache, Will; Dickson, James O; Heard, Lane; Martens, Andrew T; Morton, Nickolaus; Ritter, Michelle; Shoecraft, Amber; Treece, Jessica; Unzicker, Matthew; Valencia, Amanda; Waters, Mike; Campbell, A Malcolm; Heyer, Laurie J; Poet, Jeffrey L; Eckdahl, Todd T

    2009-01-01

    Background The Hamiltonian Path Problem asks whether there is a route in a directed graph from a beginning node to an ending node, visiting each node exactly once. The Hamiltonian Path Problem is NP complete, achieving surprising computational complexity with modest increases in size. This challenge has inspired researchers to broaden the definition of a computer. DNA computers have been developed that solve NP complete problems. Bacterial computers can be programmed by constructing genetic circuits to execute an algorithm that is responsive to the environment and whose result can be observed. Each bacterium can examine a solution to a mathematical problem and billions of them can explore billions of possible solutions. Bacterial computers can be automated, made responsive to selection, and reproduce themselves so that more processing capacity is applied to problems over time. Results We programmed bacteria with a genetic circuit that enables them to evaluate all possible paths in a directed graph in order to find a Hamiltonian path. We encoded a three node directed graph as DNA segments that were autonomously shuffled randomly inside bacteria by a Hin/hixC recombination system we previously adapted from Salmonella typhimurium for use in Escherichia coli. We represented nodes in the graph as linked halves of two different genes encoding red or green fluorescent proteins. Bacterial populations displayed phenotypes that reflected random ordering of edges in the graph. Individual bacterial clones that found a Hamiltonian path reported their success by fluorescing both red and green, resulting in yellow colonies. We used DNA sequencing to verify that the yellow phenotype resulted from genotypes that represented Hamiltonian path solutions, demonstrating that our bacterial computer functioned as expected. Conclusion We successfully designed, constructed, and tested a bacterial computer capable of finding a Hamiltonian path in a three node directed graph. This proof

  15. Accurate and efficient calculation of response times for groundwater flow

    Science.gov (United States)

    Carr, Elliot J.; Simpson, Matthew J.

    2018-03-01

    We study measures of the amount of time required for transient flow in heterogeneous porous media to effectively reach steady state, also known as the response time. Here, we develop a new approach that extends the concept of mean action time. Previous applications of the theory of mean action time to estimate the response time use the first two central moments of the probability density function associated with the transition from the initial condition, at t = 0, to the steady state condition that arises in the long time limit, as t → ∞. This previous approach leads to a computationally convenient estimation of the response time, but the accuracy can be poor. Here, we outline a powerful extension using the first k raw moments, showing how to produce an extremely accurate estimate by making use of asymptotic properties of the cumulative distribution function. Results are validated using an existing laboratory-scale data set describing flow in a homogeneous porous medium. In addition, we demonstrate how the results also apply to flow in heterogeneous porous media. Overall, the new method is: (i) extremely accurate; and (ii) computationally inexpensive. In fact, the computational cost of the new method is orders of magnitude less than the computational effort required to study the response time by solving the transient flow equation. Furthermore, the approach provides a rigorous mathematical connection with the heuristic argument that the response time for flow in a homogeneous porous medium is proportional to L²/D, where L is a relevant length scale, and D is the aquifer diffusivity. Here, we extend such heuristic arguments by providing a clear mathematical definition of the proportionality constant.
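
    A back-of-the-envelope version of the L²/D heuristic that the paper makes rigorous (this is not the authors' moment-based estimator): for 1D diffusion with fixed-head boundaries, the slowest eigenmode decays like exp(-π²Dt/L²), which supplies an explicit proportionality constant. All numbers below are illustrative.

```python
import numpy as np

def response_time(L, D, eps=0.01):
    """Time for the slowest mode of 1D diffusion (fixed-head ends) to decay
    to a fraction eps of its initial amplitude:
        t = (L^2 / (pi^2 * D)) * ln(1/eps),
    making the heuristic t ~ L^2 / D concrete with an explicit constant."""
    return (L**2 / (np.pi**2 * D)) * np.log(1.0 / eps)

L = 100.0   # length scale [m] (illustrative)
D = 5.0     # aquifer diffusivity [m^2/s] (illustrative)
for eps in (0.05, 0.01, 0.001):
    print(f"eps = {eps:<6} t ~ {response_time(L, D, eps):,.0f} s")
```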

  16. High-speed LWR transients simulation for optimizing emergency response

    International Nuclear Information System (INIS)

    Wulff, W.; Cheng, H.S.; Lekach, S.V.; Mallen, A.N.; Stritar, A.

    1984-01-01

    The purpose of computer-assisted emergency response in nuclear power plants, and the requirements for achieving such a response, are presented. An important requirement is the attainment of realistic high-speed plant simulations at the reactor site. Currently pursued development programs for plant simulations are reviewed. Five modeling principles are established and a criterion is presented for selecting numerical procedures and efficient computer hardware to achieve high-speed simulations. A newly developed technology for high-speed power plant simulation is described and results are presented. It is shown that simulation speeds ten times greater than real-time process-speeds are possible, and that plant instrumentation can be made part of the computational loop in a small, on-site minicomputer. Additional technical issues are presented which must still be resolved before the newly developed technology can be implemented in a nuclear power plant

  17. Comparison of computer simulated and observed force deformation characteristics of anti-seismic devices and isolated structures

    International Nuclear Information System (INIS)

    Bhoje, S.B.; Chellapandi, P.; Chetal, S.; Muralikrishna, R.; Salvaraj, T.

    2002-01-01

    The paper discusses the finite element analysis of the force deformation characteristics of high damping rubber bearings, lead rubber bearings and natural rubber bearings. The dynamic response of structures isolated using bearings is also presented. The general purpose finite element program ABAQUS has been used for the numerical predictions under monotonic loads. For computing the dynamic response, a simplified model of the rubber bearing in the form of elasto-plastic system is used. This equivalent model is implemented using the computer code CASTEM-2000 and the dynamic response is obtained. The numerical results are found to match well with the experimental results. (author)

  18. The Challenges and Benefits of Using Computer Technology for Communication and Teaching in the Geosciences

    Science.gov (United States)

    Fairley, J. P.; Hinds, J. J.

    2003-12-01

    The advent of the World Wide Web in the early 1990s not only revolutionized the exchange of ideas and information within the scientific community, but also provided educators with a new array of teaching, informational, and promotional tools. Use of computer graphics and animation to explain concepts and processes can stimulate classroom participation and student interest in the geosciences, which has historically attracted students with strong spatial and visualization skills. In today's job market, graduates are expected to have knowledge of computers and the ability to use them for acquiring, processing, and visually analyzing data. Furthermore, in addition to promoting visibility and communication within the scientific community, computer graphics and the Internet can be informative and educational for the general public. Although computer skills are crucial for earth science students and educators, many pitfalls exist in implementing computer technology and web-based resources into research and classroom activities. Learning to use these new tools effectively requires a significant time commitment and careful attention to the source and reliability of the data presented. Furthermore, educators have a responsibility to ensure that students and the public understand the assumptions and limitations of the materials presented, rather than allowing them to be overwhelmed by "gee-whiz" aspects of the technology. We present three examples of computer technology in the earth sciences classroom: 1) a computer animation of water table response to well pumping, 2) a 3-D fly-through animation of a fault controlled valley, and 3) a virtual field trip for an introductory geology class. These examples demonstrate some of the challenges and benefits of these new tools, and encourage educators to expand the responsible use of computer technology for teaching and communicating scientific results to the general public.

  19. Heat dissipation computations of a HVDC ground electrode using a supercomputer

    International Nuclear Information System (INIS)

    Greiss, H.; Mukhedkar, D.; Lagace, P.J.

    1990-01-01

    This paper reports on the temperature of the soil surrounding a High Voltage Direct Current (HVDC) toroidal ground electrode of practical dimensions, in both homogeneous and non-homogeneous soils, computed at incremental points in time using finite difference methods on a supercomputer. Curves of the response were computed and plotted at several locations within the soil in the vicinity of the ground electrode for various values of the soil parameters
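
    As a much-reduced illustration of the finite difference approach (a 1D slab instead of the paper's toroidal geometry, with invented soil properties), an explicit time-stepping sketch:

```python
import numpy as np

alpha = 1e-6              # soil thermal diffusivity [m^2/s] (illustrative)
L, nx = 10.0, 101         # domain length [m] and number of grid points
dx = L / (nx - 1)
dt = 0.4 * dx**2 / alpha  # within the explicit stability limit dt <= dx^2/(2*alpha)

T = np.full(nx, 15.0)     # ambient soil temperature [C]
T[0] = 60.0               # electrode surface held at a fixed temperature (illustrative)

for step in range(5000):
    # Forward-Euler update of the interior nodes with a centered Laplacian.
    T[1:-1] += alpha * dt * (T[:-2] - 2*T[1:-1] + T[2:]) / dx**2
    T[-1] = 15.0          # far-field boundary stays at ambient

print(f"after {5000*dt/86400:.1f} days, T at 1 m from the electrode: {T[10]:.2f} C")
```

    The production computation replaces this slab with the full electrode geometry and inhomogeneous soil properties, which is what drives the problem onto a supercomputer.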

  20. Improving personality facet scores with multidimensional computer adaptive testing

    DEFF Research Database (Denmark)

    Makransky, Guido; Mortensen, Erik Lykke; Glas, Cees A W

    2013-01-01

    personality tests contain many highly correlated facets. This article investigates the possibility of increasing the precision of the NEO PI-R facet scores by scoring items with multidimensional item response theory and by efficiently administering and scoring items with multidimensional computer adaptive...

  1. ON-BOARD COMPUTER SYSTEM FOR KITSAT-1 AND 2

    Directory of Open Access Journals (Sweden)

    H. S. Kim

    1996-06-01

    Full Text Available KITSAT-1 and 2 are microsatellites weighing 50 kg, and all on-board data are processed by the on-board computer system. These on-board computers are therefore required to be highly reliable and to be designed under tight power-consumption, mass and size constraints. The on-board computer (OBC) systems for KITSAT-1 and 2 are designed with simple, flexible hardware for reliability, with software taking more responsibility than hardware. The KITSAT-1 and 2 on-board computer systems consist of OBC186 as the primary OBC and OBC80 as its backup. OBC186 runs the spacecraft operating system (SCOS), which has real-time multi-tasking capability. Since their launch, OBC186 and OBC80 have been operating successfully to this day. In this paper, we describe the development of the OBC186 hardware and software and analyze its in-orbit operation performance.

  2. Copyright and personal use of CERN’s computing infrastructure

    CERN Multimedia

    IT Department

    2009-01-01

    (The French version will be online shortly) The rules covering the personal use of CERN’s computing infrastructure are defined in Operational Circular No. 5 and its Subsidiary Rules (see http://cern.ch/ComputingRules). All users of CERN’s computing infrastructure must comply with these rules, whether they access CERN’s computing facilities from within the Organization’s site or at another location. In particular, OC5 clause 17 requires that proprietary rights (the rights in software, music, video, etc.) must be respected. The user is liable for damages resulting from non-compliance. Recently, there have been several violations of OC5, where copyrighted material was discovered on public world-readable disk space. Please ensure that all material under your responsibility (in particular in files owned by your account) respects proprietary rights, including with respect to the restriction of access by third parties. CERN Security Team

  3. Emerging Nanophotonic Applications Explored with Advanced Scientific Parallel Computing

    Science.gov (United States)

    Meng, Xiang

    The domain of nanoscale optical science and technology is a combination of the classical world of electromagnetics and the quantum mechanical regime of atoms and molecules. Recent advancements in fabrication technology allow optical structures to be scaled down to nanoscale size or even to the atomic level, far smaller than the wavelength they are designed for. These nanostructures can have unique, controllable, and tunable optical properties, and their interactions with quantum materials can have important near-field and far-field optical responses. Undoubtedly, these optical properties can have many important applications, ranging from efficient and tunable light sources, detectors, filters, modulators, and high-speed all-optical switches to next-generation classical and quantum computation and biophotonic medical sensors. This emerging field of nanoscience, known as nanophotonics, is a highly interdisciplinary one requiring expertise in materials science, physics, electrical engineering, and scientific computing, modeling and simulation. It has also become an important research field for investigating the science and engineering of light-matter interactions that take place on wavelength and subwavelength scales, where the nature of the nanostructured matter controls the interactions. In addition, fast advancements in computing capabilities, such as parallel computing, have also become a critical element in investigating advanced nanophotonic devices. This role has taken on even greater urgency with the scale-down of device dimensions, as the design of these devices requires extensive memory and extremely long core hours. Thus, distributed computing platforms associated with parallel computing are required for faster design processes. Scientific parallel computing constructs mathematical models and quantitative analysis techniques, and uses computing machines to analyze and solve otherwise intractable scientific challenges. In

  4. Computer architecture fundamentals and principles of computer design

    CERN Document Server

    Dumas II, Joseph D

    2005-01-01

    Introduction to Computer Architecture: What is Computer Architecture?; Architecture vs. Implementation; Brief History of Computer Systems; The First Generation; The Second Generation; The Third Generation; The Fourth Generation; Modern Computers - The Fifth Generation. Types of Computer Systems: Single Processor Systems; Parallel Processing Systems; Special Architectures. Quality of Computer Systems: Generality and Applicability; Ease of Use; Expandability; Compatibility; Reliability. Success and Failure of Computer Architectures and Implementations: Quality and the Perception of Quality; Cost Issues; Architectural Openness, Market Timi

  5. Concurrent performance in a three-alternative choice situation: response allocation in a Rock/Paper/Scissors game.

    Science.gov (United States)

    Kangas, Brian D; Berry, Meredith S; Cassidy, Rachel N; Dallery, Jesse; Vaidya, Manish; Hackenberg, Timothy D

    2009-10-01

    Adult human subjects engaged in a simulated Rock/Paper/Scissors game against a computer opponent. The computer opponent's responses were determined by programmed probabilities that differed across 10 blocks of 100 trials each. Response allocation in Experiment 1 was well described by a modified version of the generalized matching equation, with undermatching observed in all subjects. To assess the effects of instructions on response allocation, accurate probability-related information on how the computer was programmed to respond was provided to subjects in Experiment 2. Five of 6 subjects near-exclusively played the counter to the computer's dominant programmed response (e.g., subjects played paper almost exclusively if the probability of rock was high), resulting in minor overmatching and higher reinforcement rates relative to Experiment 1. On the whole, the study shows that the generalized matching law provides a good description of complex human choice in a gaming context, and illustrates a promising set of laboratory methods and analytic techniques that capture important features of human choice outside the laboratory.
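
    The generalized matching analysis reported here fits a power function to response and reinforcer ratios. A minimal two-alternative sketch with invented block totals (not the study's data) shows the form of the fit and how undermatching appears as a slope below 1:

```python
import numpy as np

# Generalized matching law: log(B1/B2) = s * log(R1/R2) + log(b),
# where s is sensitivity and b is bias. Counts below are invented.
B1 = np.array([120, 90, 60, 40, 25])   # responses on alternative 1 per block
B2 = np.array([ 30, 45, 55, 70, 95])   # responses on alternative 2 per block
R1 = np.array([ 40, 30, 20, 10,  5])   # reinforcers earned on alternative 1
R2 = np.array([  5, 12, 20, 32, 48])   # reinforcers earned on alternative 2

x = np.log10(R1 / R2)
y = np.log10(B1 / B2)
s, log_b = np.polyfit(x, y, 1)   # slope = sensitivity, intercept = log bias
print(f"sensitivity s = {s:.2f} (s < 1 indicates undermatching), "
      f"bias b = {10**log_b:.2f}")
```

    A three-alternative analysis like the study's proceeds the same way, fitting the equation to each pairwise ratio of alternatives.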

  6. Shaping of neuronal activity through a Brain Computer Interface

    OpenAIRE

    Valero-Aguayo, Luis; Silva-Sauer, Leandro; Velasco-Alvarez, Ricardo; Ron-Angevin, Ricardo

    2014-01-01

    Neuronal responses are human actions which can be measured by an EEG, and which imply changes in waves when neurons are synchronized. This activity could be changed by principles of behaviour analysis. This research tests the efficacy of the behaviour shaping procedure to progressively change neuronal activity, so that those brain responses are adapted according to the differential reinforcement of visual feedback. The Brain Computer Interface (BCI) enables us to record the EEG in real ti...

  7. Studies on modeling to failed fuel detection system response in LMFBR

    International Nuclear Information System (INIS)

    Miyazawa, T.; Saji, G.; Mitsuzuku, N.; Hikichi, T.; Odo, T.; Rindo, H.

    1981-05-01

    A Failed Fuel Detection (FFD) system based on Fission Product (FP) detection is considered the most promising method, since FPs provide direct information on fuel element failure. For designing an FFD system and for evaluating FFD signals, an adequate model of the FFD signal response to fuel failure is required, but few such models are available. The Power Reactor and Nuclear Fuel Development Corporation (PNC) has therefore developed an FFD response model with computer codes, based on several fundamental investigations of FP release and FP behavior, and on foreign experience with fuel failure. In developing the model, the release and behavior of noble gas and halogen FPs were considered, since the FFD system would comprise both cover gas monitoring and delayed neutron monitoring. The developed model can provide the typical fuel failure response and the detection limit, which depends on the various background signals in cover gas monitoring and delayed neutron monitoring. Using the FFD response model, we estimated the fuel failure response and detection limit of the Japanese experimental fast reactor "JOYO". The detection limit of the JOYO FFD system was estimated by measuring the background signals. Following these studies, a complete computer code has now been produced with some improvements. This paper presents the details of the model, an outline of the developed computer code, the status of the JOYO FFD system, and the estimated JOYO FFD response and detection limit. (author)

  8. Robot-Arm Dynamic Control by Computer

    Science.gov (United States)

    Bejczy, Antal K.; Tarn, Tzyh J.; Chen, Yilong J.

    1987-01-01

    Feedforward and feedback schemes linearize responses to control inputs. A method for control of a robot arm is based on computed nonlinear feedback and state transformations to linearize the system and decouple the robot end-effector motions along each of the Cartesian axes, augmented with an optimal scheme for correction of errors in the workspace. A major new feature of the control method is that the optimal error-correction loop operates directly at the task level and not at the joint-servocontrol level.
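
    A hedged sketch of the computed-torque idea for a single-link arm (not the paper's multi-axis formulation; parameters and gains are invented): cancel the nonlinear dynamics, then impose linear error dynamics with proportional-derivative gains.

```python
import numpy as np

# Single-link arm dynamics: m*l^2 * th'' + b*th' + m*g*l*sin(th) = tau.
m, l, b, g = 1.0, 0.5, 0.1, 9.81   # illustrative plant parameters
Kp, Kd = 100.0, 20.0               # illustrative PD gains (critically damped here)

def tau(th, om, th_d, om_d, al_d):
    """Computed torque: cancel the nonlinear terms, then impose the linear
    error dynamics e'' + Kd*e' + Kp*e = 0."""
    e, edot = th_d - th, om_d - om
    return m*l*l*(al_d + Kd*edot + Kp*e) + b*om + m*g*l*np.sin(th)

# Forward-Euler simulation of a step to th_d = 1 rad; the simulated plant
# uses the same model as the controller, so cancellation is exact.
th, om, dt = 0.0, 0.0, 1e-3
for _ in range(3000):
    u = tau(th, om, 1.0, 0.0, 0.0)
    al = (u - b*om - m*g*l*np.sin(th)) / (m*l*l)
    th, om = th + om*dt, om + al*dt
print(f"angle after 3 s: {th:.4f} rad (target 1.0)")
```

    With model mismatch the cancellation is only approximate, which is where the paper's task-level error-correction loop earns its keep.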

  9. Calculating buoy response for a wave energy converter—A comparison of two computational methods and experimental results

    Directory of Open Access Journals (Sweden)

    Linnea Sjökvist

    2017-05-01

    Full Text Available When designing a wave power plant, reliable and fast simulation tools are required. Computational fluid dynamics (CFD software provides high accuracy but with a very high computational cost, and in operational, moderate sea states, linear potential flow theories may be sufficient to model the hydrodynamics. In this paper, a model is built in COMSOL Multiphysics to solve for the hydrodynamic parameters of a point-absorbing wave energy device. The results are compared with a linear model where the hydrodynamical parameters are computed using WAMIT, and to experimental results from the Lysekil research site. The agreement with experimental data is good for both numerical models.
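
    For a single heave mode, the linear potential-flow model reduces to a forced mass-spring-damper whose added mass and radiation damping would come from a solver such as WAMIT. Below is a sketch with invented coefficients (not the Lysekil buoy's):

```python
import numpy as np
from scipy.integrate import solve_ivp

# Linear heave model: (m + A) z'' + B z' + C z = F0 * cos(w t),
# with A, B from a potential-flow solver and C = rho * g * Aw.
m, A, B = 3000.0, 1500.0, 800.0   # mass, added mass, radiation damping (illustrative)
rho, g, Aw = 1025.0, 9.81, 7.0    # water density, gravity, waterplane area
C = rho * g * Aw                  # hydrostatic stiffness
F0, w = 5000.0, 1.2               # excitation amplitude [N] and frequency [rad/s]

def rhs(t, y):
    z, v = y
    return [v, (F0*np.cos(w*t) - B*v - C*z) / (m + A)]

sol = solve_ivp(rhs, (0, 120), [0.0, 0.0], max_step=0.05)
# After the transient decays, read off the steady-state heave amplitude.
print(f"steady-state heave amplitude ~ {np.abs(sol.y[0][-1200:]).max():.3f} m")
```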

  10. ANL site response for the DOE FY1994 information resources management long-range plan

    Energy Technology Data Exchange (ETDEWEB)

    Boxberger, L.M.

    1992-03-01

    Argonne National Laboratory's ANL Site Response for the DOE FY1994 Information Resources Management (IRM) Long-Range Plan (ANL/TM 500) is one of many contributions to the DOE information resources management long-range planning process and, as such, is an integral part of the DOE policy and program planning system. The Laboratory has constructed this response according to instructions in a Call issued in September 1991 by the DOE Office of IRM Policy, Plans and Oversight. As one of a continuing series, this Site Response is an update and extension of the Laboratory's previous submissions. The response contains both narrative and tabular material. It covers an eight-year period consisting of the base year (FY1991), the current year (FY1992), the budget year (FY1993), the plan year (FY1994), and the out years (FY1995-FY1998). This Site Response was compiled by Argonne National Laboratory's Computing and Telecommunications Division (CTD), which has the responsibility to provide leadership in optimizing computing and information services and disseminating computer-related technologies throughout the Laboratory. The Site Response consists of five parts: (1) a site overview, which describes the ANL mission, overall organization structure, the strategic approach to meeting information resource needs, the planning process, major issues, and points of contact; (2) a software plan for DOE contractors and, as Part 2B, an FMS plan for DOE organizations; (3) computing resources; (4) telecommunications; (5) printing and publishing.

  11. ANL site response for the DOE FY1994 information resources management long-range plan

    Energy Technology Data Exchange (ETDEWEB)

    Boxberger, L.M.

    1992-03-01

    Argonne National Laboratory's ANL Site Response for the DOE FY1994 Information Resources Management (IRM) Long-Range Plan (ANL/TM 500) is one of many contributions to the DOE information resources management long-range planning process and, as such, is an integral part of the DOE policy and program planning system. The Laboratory has constructed this response according to instructions in a Call issued in September 1991 by the DOE Office of IRM Policy, Plans and Oversight. As one of a continuing series, this Site Response is an update and extension of the Laboratory's previous submissions. The response contains both narrative and tabular material. It covers an eight-year period consisting of the base year (FY1991), the current year (FY1992), the budget year (FY1993), the plan year (FY1994), and the out years (FY1995-FY1998). This Site Response was compiled by Argonne National Laboratory's Computing and Telecommunications Division (CTD), which has the responsibility to provide leadership in optimizing computing and information services and disseminating computer-related technologies throughout the Laboratory. The Site Response consists of five parts: (1) a site overview, which describes the ANL mission, overall organization structure, the strategic approach to meeting information resource needs, the planning process, major issues, and points of contact; (2) a software plan for DOE contractors and, as Part 2B, an FMS plan for DOE organizations; (3) computing resources; (4) telecommunications; (5) printing and publishing.

  12. Design an optimal controller for nuclear reactor using a digital computer

    International Nuclear Information System (INIS)

    Saleh, F.M.A.

    1986-01-01

    An attempt is made to design an optimal controller for a model nuclear reactor on the one hand, and for a model nuclear power plant on the other, using a digital computer. The design philosophy adopted was to specify the system dynamics in terms of a desired system transfer function, and to realize the design synthesis through a state-variable feedback technique, thus ensuring both stability and optimization in the state-space sense. The control design was also tested by carrying out digital simulation transient response runs (step, ramp, impulse, etc.), and agreement between the predicted desirable response and the actual response of the overall design was achieved. Furthermore, the performance of the controller was verified against a reference non-linear model in order to assess the accuracy of the linearized approximation model. The results show that state-variable feedback can rank as an effective optimal technique for designing the control algorithm of an on-line computer in a nuclear power plant. 41 figs., 43 refs.
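
    The state-variable feedback design the abstract describes can be sketched as a pole-placement computation; the plant matrices and pole locations below are hypothetical stand-ins, not taken from the paper.

```python
# Sketch of state-variable feedback design by pole placement, assuming a
# hypothetical linearized two-state plant (matrices are illustrative only).
import numpy as np
from scipy.signal import place_poles, lti

A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])   # linearized plant dynamics (hypothetical)
B = np.array([[0.0], [1.0]])   # control input matrix (hypothetical)

# Choose closed-loop poles that realize the desired transfer function.
K = place_poles(A, B, [-4.0, -5.0]).gain_matrix

# Verify the closed-loop step response is stable and well damped.
cl = lti(A - B @ K, B, np.array([[1.0, 0.0]]), np.array([[0.0]]))
t, y = cl.step()
print("closed-loop poles:", np.linalg.eigvals(A - B @ K))
```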

  13. Non-linear least squares curve fitting of a simple theoretical model to radioimmunoassay dose-response data using a mini-computer

    International Nuclear Information System (INIS)

    Wilkins, T.A.; Chadney, D.C.; Bryant, J.; Palmstroem, S.H.; Winder, R.L.

    1977-01-01

    Using the simple univalent antigen univalent-antibody equilibrium model the dose-response curve of a radioimmunoassay (RIA) may be expressed as a function of Y, X and the four physical parameters of the idealised system. A compact but powerful mini-computer program has been written in BASIC for rapid iterative non-linear least squares curve fitting and dose interpolation with this function. In its simplest form the program can be operated in an 8K byte mini-computer. The program has been extensively tested with data from 10 different assay systems (RIA and CPBA) for measurement of drugs and hormones ranging in molecular size from thyroxine to insulin. For each assay system the results have been analysed in terms of (a) curve fitting biases and (b) direct comparison with manual fitting. In all cases the quality of fitting was remarkably good in spite of the fact that the chemistry of each system departed significantly from one or more of the assumptions implicit in the model used. A mathematical analysis of departures from the model's principal assumption has provided an explanation for this somewhat unexpected observation. The essential features of this analysis are presented in this paper together with the statistical analyses of the performance of the program. From these and the results obtained to date in the routine quality control of these 10 assays, it is concluded that the method of curve fitting and dose interpolation presented in this paper is likely to be of general applicability. (orig.)
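
    A minimal sketch of this kind of iterative least-squares fitting and dose interpolation, substituting a standard four-parameter logistic for the paper's univalent antigen-antibody model (all data and starting values below are invented):

```python
# Non-linear least-squares fit of an RIA dose-response curve, using a
# four-parameter logistic in place of the paper's mass-action model.
import numpy as np
from scipy.optimize import curve_fit

def four_pl(x, a, b, c, d):
    """Bound counts as a function of dose x (4PL curve)."""
    return d + (a - d) / (1.0 + (x / c) ** b)

dose = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0])       # standards (hypothetical)
counts = np.array([9500, 8700, 6900, 4300, 2100, 900])  # measured B (hypothetical)

popt, _ = curve_fit(four_pl, dose, counts, p0=[9500, 1.0, 2.0, 500])

# Dose interpolation for an unknown sample: invert the fitted curve.
def dose_of(y, a, b, c, d):
    return c * ((a - d) / (y - d) - 1.0) ** (1.0 / b)

print("interpolated dose:", dose_of(5000.0, *popt))
```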

  14. Computed tomographic detection of sinusitis responsible for intracranial and extracranial infections

    International Nuclear Information System (INIS)

    Carter, B.L.; Bankoff, M.S.; Fisk, J.D.

    1983-01-01

    Computed tomography (CT) is now used extensively for the evaluation of orbital, facial, and intracranial infections. Nine patients are presented to illustrate the importance of detecting underlying and unsuspected sinusitis. Prompt treatment of the sinusitis is essential to minimize the morbidity and mortality associated with complications such as brain abscess, meningitis, orbital cellulitis, and osteomyelitis. A review of the literature documents the persistence of these complications despite the widespread use of antibiotic therapy. Recognition of the underlying sinusitis is now possible with CT if the region of the sinuses is included and bone-window settings are used during the examination of patients with orbital and intracranial infection

  15. Nuclear forces and high-performance computing: The perfect match

    International Nuclear Information System (INIS)

    Luu, T; Walker-Loud, A

    2009-01-01

    High-performance computing is now enabling the calculation of certain hadronic interaction parameters directly from Quantum Chromodynamics, the quantum field theory that governs the behavior of quarks and gluons and is ultimately responsible for the nuclear strong force. In this paper we briefly describe the state of the field and show how other aspects of hadronic interactions will be ascertained in the near future. We give estimates of computational requirements needed to obtain these goals, and outline a procedure for incorporating these results into the broader nuclear physics community.

  16. Presence and biofeedback in first-person perspective computer games

    DEFF Research Database (Denmark)

    Grimshaw-Aagaard, Mark Nicholas

    2019-01-01

    … Following the line taken by presence theorists, I differentiate between immersion, an objective measure such that computer game technology can be less or more immersive, and presence, a subjective, human response to that technology. The third section looks at current possibilities for biofeedback … in relation to sound design for first-person perspective computer games; in line with the first section, biofeedback devices are treated as an immersive technology. I close the chapter by suggesting ways in which sound design in such games might make use of biofeedback to enhance the perception of presence …

  17. Impulse response measurements with an off-line cross correlator

    International Nuclear Information System (INIS)

    Corran, E.R.; Cummins, J.D.

    1963-11-01

    The impulse responses of simulated systems have been computed by off-line cross-correlation of the system input and output signals. The input test signal consisted of a discrete interval binary code whose autocorrelation was a triangular pulse at zero lag. The main object of the experiments was to study the inaccuracies introduced in ideal, noise free systems by determining the impulse response digitally from sampled versions of the system input and output signals. A second object was to determine the error introduced by adding controlled amounts of uncorrelated noise at the system outputs. The experimental results showed that for signal to noise ratios greater than 10:1 in the mean square sense, the impulse responses may be determined with reasonable accuracy using only one cycle of the binary code. The method lends itself to on-line computation of system impulse responses. The latter could be used to monitor the stability of the system or to determine control parameters in an adaptive control system. (author)
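
    The procedure lends itself to a few lines of code: generate one cycle of a maximum-length binary code, pass it through a known test system, and recover the impulse response by cross-correlation. The sketch below assumes a simple exponential test response; nothing here reproduces the authors' hardware setup.

```python
# Off-line impulse-response estimation by cross-correlating a discrete-interval
# binary test signal with the system output, using a maximum-length (PRBS)
# code whose autocorrelation is a narrow pulse at zero lag.
import numpy as np
from scipy.signal import lfilter, max_len_seq

n = 10
u = 2.0 * max_len_seq(n)[0] - 1.0          # +/-1 binary code, one full cycle
h_true = np.exp(-np.arange(50) / 10.0)     # example system impulse response
y = lfilter(h_true, [1.0], np.tile(u, 2))[len(u):]  # steady-state output cycle

# Circular cross-correlation of input and output, normalized by code length,
# approximates the impulse response; the small residual error comes from the
# code's off-peak autocorrelation of -1/N.
N = len(u)
h_est = np.array([np.dot(np.roll(u, k), y) for k in range(50)]) / N

print("max estimation error:", np.abs(h_est - h_true).max())
```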

  19. Globalisation, Responsibility and Virtual Schools

    Science.gov (United States)

    Russell, Glenn

    2006-01-01

    The intersection of globalisation and information technology influences ethical positions and notions of responsibility within businesses and in distance education for school students. As the spatial and temporal distance between student and teacher increases, and is mediated by computers, there have been changes to the ways in which individuals…

  20. Muscle fatigue in relation to forearm pain and tenderness among professional computer users

    DEFF Research Database (Denmark)

    Thomsen, GF; Johnson, PW; Svendsen, Susanne Wulff

    2007-01-01

    ABSTRACT: BACKGROUND: To examine the hypothesis that forearm pain with palpation tenderness in computer users is associated with increased extensor muscle fatigue. METHODS: Eighteen persons with pain and moderate to severe palpation tenderness in the extensor muscle group of the right forearm and twenty gender- and age-matched referents without such complaints were enrolled from the Danish NUDATA study of neck and upper extremity disorders among technical assistants and machine technicians. Fatigue of the right forearm extensor muscles was assessed by muscle twitch forces in response to low … response was not explained by differences in the MVC or body mass index. CONCLUSION: Computer users with forearm pain and moderate to severe palpation tenderness had diminished forearm extensor muscle fatigue response. Additional studies are necessary to determine whether this result reflects an adaptive …

  1. Response variance in functional maps: neural darwinism revisited.

    Directory of Open Access Journals (Sweden)

    Hirokazu Takahashi

    Full Text Available The mechanisms by which functional maps and map plasticity contribute to cortical computation remain controversial. Recent studies have revisited the theory of neural Darwinism to interpret the learning-induced map plasticity and neuronal heterogeneity observed in the cortex. Here, we hypothesize that the Darwinian principle provides a substrate to explain the relationship between neuron heterogeneity and cortical functional maps. We demonstrate in the rat auditory cortex that the degree of response variance is closely correlated with the size of its representational area. Further, we show that the response variance within a given population is altered through training. These results suggest that larger representational areas may help to accommodate heterogeneous populations of neurons. Thus, functional maps and map plasticity are likely to play essential roles in Darwinian computation, serving as effective, but not absolutely necessary, structures to generate diverse response properties within a neural population.

  3. Computer surety: computer system inspection guidance. [Contains glossary

    Energy Technology Data Exchange (ETDEWEB)

    1981-07-01

    This document discusses computer surety in NRC-licensed nuclear facilities from the perspective of physical protection inspectors. It gives background information and a glossary of computer terms, along with threats and computer vulnerabilities, methods used to harden computer elements, and computer audit controls.

  4. Activities of the Research Institute for Advanced Computer Science

    Science.gov (United States)

    Oliger, Joseph

    1994-01-01

    The Research Institute for Advanced Computer Science (RIACS) was established by the Universities Space Research Association (USRA) at the NASA Ames Research Center (ARC) on June 6, 1983. RIACS is privately operated by USRA, a consortium of universities with research programs in the aerospace sciences, under contract with NASA. The primary mission of RIACS is to provide research and expertise in computer science and scientific computing to support the scientific missions of NASA ARC. The research carried out at RIACS must change its emphasis from year to year in response to NASA ARC's changing needs and technological opportunities. Research at RIACS is currently being done in the following areas: (1) parallel computing; (2) advanced methods for scientific computing; (3) high performance networks; and (4) learning systems. RIACS technical reports are usually preprints of manuscripts that have been submitted to research journals or conference proceedings. A list of these reports for the period January 1, 1994 through December 31, 1994 is in the Reports and Abstracts section of this report.

  5. BONFIRE: benchmarking computers and computer networks

    OpenAIRE

    Bouckaert, Stefan; Vanhie-Van Gerwen, Jono; Moerman, Ingrid; Phillips, Stephen; Wilander, Jerker

    2011-01-01

    The benchmarking concept is not new in the field of computing or computer networking. With “benchmarking tools”, one usually refers to a program or set of programs, used to evaluate the performance of a solution under certain reference conditions, relative to the performance of another solution. Since the 1970s, benchmarking techniques have been used to measure the performance of computers and computer networks. Benchmarking of applications and virtual machines in an Infrastructure-as-a-Servi...

  6. Transcription-based prediction of response to IFNbeta using supervised computational methods.

    Directory of Open Access Journals (Sweden)

    Sergio E Baranzini

    2005-01-01

    Full Text Available Changes in cellular functions in response to drug therapy are mediated by specific transcriptional profiles resulting from the induction or repression in the activity of a number of genes, thereby modifying the preexisting gene activity pattern of the drug-targeted cell(s). Recombinant human interferon beta (rIFNbeta) is routinely used to control exacerbations in multiple sclerosis patients with only partial success, mainly because of adverse effects and a relatively large proportion of nonresponders. We applied advanced data-mining and predictive modeling tools to a longitudinal 70-gene expression dataset generated by kinetic reverse-transcription PCR from 52 multiple sclerosis patients treated with rIFNbeta to discover higher-order predictive patterns associated with treatment outcome and to define the molecular footprint that rIFNbeta engraves on peripheral blood mononuclear cells. We identified nine sets of gene triplets whose expression, when tested before the initiation of therapy, can predict the response to interferon beta with up to 86% accuracy. In addition, time-series analysis revealed potential key players involved in a good or poor response to interferon beta. Statistical testing of a random outcome class and tolerance to noise was carried out to establish the robustness of the predictive models. Large-scale kinetic reverse-transcription PCR, coupled with advanced data-mining efforts, can effectively reveal preexisting and drug-induced gene expression signatures associated with therapeutic effects.
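
    As a schematic of the prediction task (not the authors' data-mining pipeline), one can train a small classifier on the pretreatment expression of a gene triplet; the data and responder labels below are entirely synthetic.

```python
# Toy outcome prediction from pretreatment expression of one gene triplet,
# using a generic scikit-learn classifier (genes and labels are made up).
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
n_patients = 52
X = rng.normal(size=(n_patients, 3))           # expression of one gene triplet
y = (X[:, 0] + 0.5 * X[:, 2] > 0).astype(int)  # synthetic responder labels

clf = DecisionTreeClassifier(max_depth=3, random_state=0)
acc = cross_val_score(clf, X, y, cv=5).mean()
print(f"cross-validated accuracy: {acc:.2f}")
```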

  7. A survey of computational physics introductory computational science

    CERN Document Server

    Landau, Rubin H; Bordeianu, Cristian C

    2008-01-01

    Computational physics is a rapidly growing subfield of computational science, in large part because computers can solve previously intractable problems or simulate natural processes that do not have analytic solutions. The next step beyond Landau's First Course in Scientific Computing and a follow-up to Landau and Páez's Computational Physics, this text presents a broad survey of key topics in computational physics for advanced undergraduates and beginning graduate students, including new discussions of visualization tools, wavelet analysis, molecular dynamics, and computational fluid dynamics

  8. Application of computer assisted tomography in gynaecological oncology

    International Nuclear Information System (INIS)

    Pickel, H.; Schreithofer, H.; Sager, W.D.; Graz Univ.

    1980-01-01

    The non-invasive radiologic technique of computed tomography has been employed since 1978 at the University Women's Clinic and Radiologic Clinic, Graz. One hundred and forty-six examinations of the pelvis, abdomen and chest were performed on 63 oncologic patients. The method was employed for the preoperative detection and measurement of the size of benign and malignant neoplasms, in tumour staging, and in assessment of therapeutic response. The results suggest that CT might be the best method for the assessment of response to cytotoxic therapy of ovarian cancer. (orig.)

  9. Computer Security: Cryptography and authentication (2/4)

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    Remi Mollon studied computer security at university and first worked on Grids, with the EGEE project, for a French bioinformatics institute. Information security being crucial in that field, he developed an encrypted file management system on top of Grid middleware, and he contributed to integrating legacy applications with Grids. He was then hired by CERN as a Grid Data Management developer, and he joined the Grid Operational Security Coordination Team. Remi has now moved to the CERN Computer Security Team. Remi is involved in the daily security operations, in addition to being responsible for designing the Team's computer infrastructure and for participating in several projects, like multi-factor authentication at CERN. With the prevalence of modern information technologies and their increasing integration into our daily lives, digital systems become more and more a playground for evil people. While in the past attacks were driven by fame and kudos, nowadays money is the motivating factor. Just the recent months have s...

  10. Computational Methods in Stochastic Dynamics Volume 2

    CERN Document Server

    Stefanou, George; Papadopoulos, Vissarion

    2013-01-01

    The considerable influence of inherent uncertainties on structural behavior has led the engineering community to recognize the importance of a stochastic approach to structural problems. Issues related to uncertainty quantification and its influence on the reliability of the computational models are continuously gaining in significance. In particular, the problems of dynamic response analysis and reliability assessment of structures with uncertain system and excitation parameters have been the subject of continuous research over the last two decades as a result of the increasing availability of powerful computing resources and technology.   This book is a follow up of a previous book with the same subject (ISBN 978-90-481-9986-0) and focuses on advanced computational methods and software tools which can highly assist in tackling complex problems in stochastic dynamic/seismic analysis and design of structures. The selected chapters are authored by some of the most active scholars in their respective areas and...

  11. Planning is not sufficient - Reliable computers need good requirements specifications

    International Nuclear Information System (INIS)

    Matras, J.R.

    1992-01-01

    Computer system reliability is the assurance that a computer system will perform its functions when required to do so. To ensure such reliability, it is important to plan the activities needed for computer system development. These development activities, in turn, require a Computer Quality Assurance Plan (CQAP) that provides the following: a Configuration Management Plan, a Verification and Validation (V and V) Plan, documentation requirements, a defined life cycle, review requirements, and organizational responsibilities. These items are necessary for system reliability; ultimately, however, they are not enough. Development of a reliable system is dependent on the requirements specification. This paper discusses how to use existing industry standards to develop a CQAP. In particular, the paper emphasizes the importance of the requirements specification and of methods for establishing reliability goals. The paper also describes how the revision of ANSI/IEEE-ANS-7-4.3.2, Application Criteria for Digital Computer Systems of Nuclear Power Generating Stations, has addressed these issues

  12. Riemannian computing in computer vision

    CERN Document Server

    Srivastava, Anuj

    2016-01-01

    This book presents a comprehensive treatise on Riemannian geometric computations and related statistical inferences in several computer vision problems. This edited volume includes chapter contributions from leading figures in the field of computer vision who are applying Riemannian geometric approaches in problems such as face recognition, activity recognition, object detection, biomedical image analysis, and structure-from-motion. Some of the mathematical entities that necessitate a geometric analysis include rotation matrices (e.g. in modeling camera motion), stick figures (e.g. for activity recognition), subspace comparisons (e.g. in face recognition), symmetric positive-definite matrices (e.g. in diffusion tensor imaging), and function-spaces (e.g. in studying shapes of closed contours). The book illustrates Riemannian computing theory through applications in computer vision, machine learning, and robotics, with an emphasis on algorithmic advances that will allow re-application in other...

  13. Aquatic emergency response model at the Savannah River Plant

    International Nuclear Information System (INIS)

    Hayes, D.W.

    1987-01-01

    The Savannah River Plant emergency response plans include a stream/river emergency response model to predict travel times, maximum concentrations, and concentration distributions as a function of time at selected downstream river locations from each of the major SRP installations. The menu-driven model can be operated from any of the terminals that are linked to the real-time computer monitoring system for emergency response

  14. Linear response coupled cluster theory with the polarizable continuum model within the singles approximation for the solvent response

    Science.gov (United States)

    Caricato, Marco

    2018-04-01

    We report the theory and the implementation of the linear response function of the coupled cluster (CC) with the single and double excitations method combined with the polarizable continuum model of solvation, where the correlation solvent response is approximated with the perturbation theory with energy and singles density (PTES) scheme. The singles name is derived from retaining only the contribution of the CC single excitation amplitudes to the correlation density. We compare the PTES working equations with those of the full-density (PTED) method. We then test the PTES scheme on the evaluation of excitation energies and transition dipoles of solvated molecules, as well as of the isotropic polarizability and specific rotation. Our results show a negligible difference between the PTED and PTES schemes, while the latter affords a significantly reduced computational cost. This scheme is general and can be applied to any solvation model that includes mutual solute-solvent polarization, including explicit models. Therefore, the PTES scheme is a competitive approach to compute response properties of solvated systems using CC methods.

  15. Programmable full-adder computations in communicating three-dimensional cell cultures.

    Science.gov (United States)

    Ausländer, David; Ausländer, Simon; Pierrat, Xavier; Hellmann, Leon; Rachid, Leila; Fussenegger, Martin

    2018-01-01

    Synthetic biologists have advanced the design of trigger-inducible gene switches and their assembly into input-programmable circuits that enable engineered human cells to perform arithmetic calculations reminiscent of electronic circuits. By designing a versatile plug-and-play molecular-computation platform, we have engineered nine different cell populations with genetic programs, each of which encodes a defined computational instruction. When assembled into 3D cultures, these engineered cell consortia execute programmable multicellular full-adder logics in response to three trigger compounds.
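
    The underlying three-input logic the consortia implement is the ordinary full adder; a plain software rendering of that truth table, for reference:

```python
# Full-adder logic: the three inputs correspond to the three trigger compounds
# in the engineered cell consortia; the outputs are the sum and carry bits.
def full_adder(a: int, b: int, c_in: int) -> tuple[int, int]:
    s = a ^ b ^ c_in                      # sum bit
    c_out = (a & b) | (c_in & (a ^ b))    # carry bit
    return s, c_out

for a in (0, 1):
    for b in (0, 1):
        for c in (0, 1):
            print(a, b, c, "->", full_adder(a, b, c))
```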

  16. Equivalency of Paper versus Tablet Computer Survey Data

    Science.gov (United States)

    Ravert, Russell D.; Gomez-Scott, Jessica; Donnellan, M. Brent

    2015-01-01

    Survey responses collected via paper surveys and computer tablets were compared to test for differences between those methods of obtaining self-report data. College students (N = 258) were recruited in public campus locations and invited to complete identical surveys on either paper or iPad tablet. Only minor homogeneity differences were found…

  17. Solving wood chip transport problems with computer simulation.

    Science.gov (United States)

    Dennis P. Bradley; Sharon A. Winsauer

    1976-01-01

    Efficient chip transport operations are difficult to achieve due to frequent and often unpredictable changes in distance to market, chipping rate, time spent at the mill, and equipment costs. This paper describes a computer simulation model that allows a logger to design an efficient transport system in response to these changing factors.

  18. Digital tomosynthesis parallel imaging computational analysis with shift and add and back projection reconstruction algorithms.

    Science.gov (United States)

    Chen, Ying; Balla, Apuroop; Rayford II, Cleveland E; Zhou, Weihua; Fang, Jian; Cong, Linlin

    2010-01-01

    Digital tomosynthesis is a novel technology that has been developed for various clinical applications. Parallel imaging configuration is utilised in a few tomosynthesis imaging areas such as digital chest tomosynthesis. Recently, parallel imaging configuration for breast tomosynthesis began to appear too. In this paper, we present the investigation on computational analysis of impulse response characterisation as the start point of our important research efforts to optimise the parallel imaging configurations. Results suggest that impulse response computational analysis is an effective method to compare and optimise imaging configurations.
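
    A toy version of shift-and-add reconstruction for a parallel geometry, applied to an impulse object, looks as follows; the per-view shifts are hypothetical and chosen so that the plane of interest refocuses.

```python
# Minimal shift-and-add tomosynthesis sketch: each projection is shifted in
# proportion to the height of the plane being reconstructed, then the shifted
# projections are averaged (toy 1D-projection example).
import numpy as np

def shift_and_add(projections, shifts_px):
    """Reconstruct one plane from 1D projections with per-view pixel shifts."""
    acc = np.zeros_like(projections[0], dtype=float)
    for proj, s in zip(projections, shifts_px):
        acc += np.roll(proj, s)          # integer-pixel shift for simplicity
    return acc / len(projections)

# Impulse object: one bright pixel seen from 5 views (hypothetical shifts).
views = [np.roll(np.eye(1, 64, 32)[0], s) for s in (-4, -2, 0, 2, 4)]
plane = shift_and_add(views, [4, 2, 0, -2, -4])  # shifts that refocus the plane
print("impulse response peak at:", plane.argmax(), "value:", plane.max())
```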

  19. Attitudes of health care students about computer-aided neuroanatomy instruction.

    Science.gov (United States)

    McKeough, D Michael; Bagatell, Nancy

    2009-01-01

    This study examined students' attitudes toward computer-aided instruction (CAI), specifically neuroanatomy learning modules, to assess which components were primary in establishing these attitudes and to discuss the implications of these attitudes for successfully incorporating CAI in the preparation of health care providers. Seventy-seven entry-level master's-degree health care professional students matriculated in an introductory neuroanatomy course volunteered as subjects for this study. Students independently reviewed the modules as supplements to lecture and completed a survey to evaluate teaching effectiveness. Responses to survey statements were compared across the learning modules to determine if students viewed the modules differently. Responses to individual survey statements were averaged to measure the strength of agreement or disagreement with the statement. Responses to open-ended questions were theme coded, and frequencies and percentages were calculated for each. Students saw no differences between the learning modules. Students perceived the learning modules as valuable; they enjoyed using the modules but did not prefer CAI over traditional lecture format. The modules were useful in learning or reinforcing neuroanatomical concepts and improving clinical problem-solving skills. Students reported that the visual representation of the neuroanatomical systems, computer animation, ability to control the use of the modules, and navigational fidelity were key factors in determining attitudes. The computer-based learning modules examined in this study were effective as adjuncts to lecture in helping entry-level health care students learn and make clinical applications of neuroanatomy information.

  20. DrugSig: A resource for computational drug repositioning utilizing gene expression signatures.

    Directory of Open Access Journals (Sweden)

    Hongyu Wu

    Full Text Available Computational drug repositioning has proved to be an effective approach to developing new drug uses. However, currently existing strategies rely strongly on drug-response gene signatures that are scattered across separate experimental datasets, which makes them inefficient. A comprehensive database of drug-response gene signatures would therefore be very helpful to these methods. We collected drug-response microarray data and annotated related drug and target information from public databases and the scientific literature. By selecting the top 500 up-regulated and top 500 down-regulated genes as drug signatures, we manually established the DrugSig database. Currently DrugSig contains more than 1300 drugs, 7000 microarrays and 800 targets. Moreover, we developed signature-based and target-based functions to aid drug repositioning. The constructed database can serve as a resource to quicken computational drug repositioning. Database URL: http://biotechlab.fudan.edu.cn/database/drugsig/.
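
    The signature-construction step can be sketched in a few lines: rank genes by differential expression between treated and control arrays, then keep the top 500 in each direction. All arrays and gene identifiers below are synthetic, not DrugSig content.

```python
# Sketch of building a DrugSig-style signature from expression microarrays.
import numpy as np

rng = np.random.default_rng(1)
genes = np.array([f"gene_{i}" for i in range(20000)])
control = rng.normal(5.0, 1.0, size=(3, genes.size))   # 3 control arrays
treated = rng.normal(5.0, 1.0, size=(3, genes.size))   # 3 drug-treated arrays

log_fc = treated.mean(axis=0) - control.mean(axis=0)   # log fold change
order = np.argsort(log_fc)
signature = {
    "down": genes[order[:500]].tolist(),   # top 500 down-regulated genes
    "up": genes[order[-500:]].tolist(),    # top 500 up-regulated genes
}
print(len(signature["up"]), len(signature["down"]))
```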

  1. Noise-constrained switching times for heteroclinic computing

    Science.gov (United States)

    Neves, Fabio Schittler; Voit, Maximilian; Timme, Marc

    2017-03-01

    Heteroclinic computing offers a novel paradigm for universal computation by collective system dynamics. In such a paradigm, input signals are encoded as complex periodic orbits approaching specific sequences of saddle states. Without inputs, the relevant states together with the heteroclinic connections between them form a network of states—the heteroclinic network. Systems of pulse-coupled oscillators or spiking neurons naturally exhibit such heteroclinic networks of saddles, thereby providing a substrate for general analog computations. Several challenges need to be resolved before it becomes possible to effectively realize heteroclinic computing in hardware. The time scales on which computations are performed crucially depend on the switching times between saddles, which in turn are jointly controlled by the system's intrinsic dynamics and the level of external and measurement noise. The nonlinear dynamics of pulse-coupled systems often strongly deviate from that of time-continuously coupled (e.g., phase-coupled) systems. The factors impacting switching times in pulse-coupled systems are still not well understood. Here we systematically investigate switching times in dependence of the levels of noise and intrinsic dissipation in the system. We specifically reveal how local responses to pulses coact with external noise. Our findings confirm that, like in time-continuous phase-coupled systems, piecewise-continuous pulse-coupled systems exhibit switching times that transiently increase exponentially with the number of switches up to some order of magnitude set by the noise level. Complementarily, we show that switching times may constitute a good predictor for the computation reliability, indicating how often an input signal must be reiterated. By characterizing switching times between two saddles in conjunction with the reliability of a computation, our results provide a first step beyond the coding of input signal identities toward a complementary coding for
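
    The scaling of switching times with noise can be illustrated with a one-dimensional caricature of a saddle's unstable direction: escape times grow roughly like log(1/noise). This is a generic sketch, not the pulse-coupled model studied in the paper.

```python
# Escape time from a linearly unstable direction x' = lam*x driven by noise of
# amplitude eps; mean escape time scales like log(1/eps)/lam.
import numpy as np

lam, threshold, dt = 1.0, 1.0, 1e-2
rng = np.random.default_rng(2)

def escape_time(eps, n_trials=100):
    times = []
    for _ in range(n_trials):
        x, t = 0.0, 0.0
        while abs(x) < threshold:
            x += lam * x * dt + eps * np.sqrt(dt) * rng.normal()
            t += dt
        times.append(t)
    return np.mean(times)

for eps in (1e-2, 1e-4, 1e-6):
    print(f"noise {eps:.0e}: mean escape time {escape_time(eps):.2f}")
```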

  2. Static Memory Deduplication for Performance Optimization in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Gangyong Jia

    2017-04-01

    Full Text Available In a cloud computing environment, the number of virtual machines (VMs) on a single physical server and the number of applications running on each VM are continuously growing. This has led to an enormous increase in the demand of memory capacity and subsequent increase in the energy consumption in the cloud. Lack of enough memory has become a major bottleneck for scalability and performance of virtualization interfaces in cloud computing. To address this problem, memory deduplication techniques which reduce memory demand through page sharing are being adopted. However, such techniques suffer from overheads in terms of number of online comparisons required for the memory deduplication. In this paper, we propose a static memory deduplication (SMD) technique which can reduce memory capacity requirement and provide performance optimization in cloud computing. The main innovation of SMD is that the process of page detection is performed offline, thus potentially reducing the performance cost, especially in terms of response time. In SMD, page comparisons are restricted to the code segment, which has the highest shared content. Our experimental results show that SMD efficiently reduces memory capacity requirement and improves performance. We demonstrate that, compared to other approaches, the cost in terms of the response time is negligible.
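
    The offline page-detection idea can be sketched with content hashing: identical pages collapse onto one stored copy. This is an illustration of the concept only, not the authors' implementation.

```python
# Minimal sketch of offline page deduplication: hash fixed-size pages and map
# identical pages to a single stored copy.
import hashlib

PAGE = 4096

def dedup(memory: bytes):
    store, mapping = {}, []        # hash -> page bytes, page index -> hash
    for off in range(0, len(memory), PAGE):
        page = memory[off:off + PAGE]
        h = hashlib.sha256(page).hexdigest()
        store.setdefault(h, page)  # keep one physical copy per content hash
        mapping.append(h)
    return store, mapping

mem = b"\x00" * PAGE * 3 + b"\x01" * PAGE * 2  # 5 pages, 2 distinct contents
store, mapping = dedup(mem)
print(f"{len(mapping)} logical pages -> {len(store)} physical pages")
```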

  4. A Feasibility Study of Implementing a Bring-Your-Own-Computing-Device Policy

    Science.gov (United States)

    2013-12-01

    …telecom charges is applicable to a corporate environment that allows for telecommuting or where employees require data access to their devices while… do not want to try to control their students' computers, but the focus of BYOD in education is generally on educational outcomes (Sweeney, 2012). …of the computer system, while application software is responsible for controlling the specific command tasks. Therefore, the relationship between…

  5. Quantifying uncertainties in the structural response of SSME blades

    Science.gov (United States)

    Nagpal, Vinod K.

    1987-01-01

    To quantify the uncertainties associated with the geometry and material properties of a Space Shuttle Main Engine (SSME) turbopump blade, a computer code known as STAEBL was used. A finite element model of the blade used 80 triangular shell elements with 55 nodes and five degrees of freedom per node. The whole study was simulated on the computer and no real experiments were conducted. The structural response has been evaluated in terms of three variables which are natural frequencies, root (maximum) stress, and blade tip displacements. The results of the study indicate that only the geometric uncertainties have significant effects on the response. Uncertainties in material properties have insignificant effects.
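
    The kind of uncertainty propagation described here can be caricatured with a Monte Carlo sweep of one geometric variable through a closed-form frequency formula; the blade is idealized as a uniform cantilever beam and all property values are invented.

```python
# Monte Carlo propagation of a geometric uncertainty (blade thickness) to a
# structural response: first bending frequency of a uniform cantilever,
# f1 = (1.875^2 / 2*pi) * sqrt(E*I / (rho*A*L^4)).
import numpy as np

rng = np.random.default_rng(3)
E, rho, L, width = 2.0e11, 8.2e3, 0.05, 0.02  # steel-ish blade (hypothetical)
t = rng.normal(2.0e-3, 0.05e-3, size=10000)   # thickness with ~2.5% scatter

I = width * t**3 / 12.0                       # area moment of inertia
A = width * t                                 # cross-sectional area
f1 = (1.875**2 / (2 * np.pi)) * np.sqrt(E * I / (rho * A * L**4))
print(f"f1 = {f1.mean():.0f} Hz +/- {f1.std():.0f} Hz "
      f"({100 * f1.std() / f1.mean():.1f}% scatter)")
```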

  6. WE-FG-207B-02: Material Reconstruction for Spectral Computed Tomography with Detector Response Function

    International Nuclear Information System (INIS)

    Liu, J; Gao, H

    2016-01-01

    Purpose: Different from the conventional computed tomography (CT), spectral CT based on energy-resolved photon-counting detectors is able to provide the unprecedented material composition. However, an important missing piece for accurate spectral CT is to incorporate the detector response function (DRF), which is distorted by factors such as pulse pileup and charge-sharing. In this work, we propose material reconstruction methods for spectral CT with DRF. Methods: The polyenergetic X-ray forward model takes the DRF into account for accurate material reconstruction. Two image reconstruction methods are proposed: a direct method based on the nonlinear data fidelity from the DRF-based forward model; and a linear-data-fidelity based method that relies on spectral rebinning so that the corresponding DRF matrix is invertible. The image reconstruction problem is then regularized with an isotropic TV term and solved by the alternating direction method of multipliers. Results: The simulation results suggest that the proposed methods provided more accurate material compositions than the standard method without DRF. Moreover, the proposed method with linear data fidelity had improved reconstruction quality over the proposed method with nonlinear data fidelity. Conclusion: We have proposed material reconstruction methods for spectral CT with DRF, which provided more accurate material compositions than the standard methods without DRF, with the linear-data-fidelity variant giving the better reconstruction quality. Jiulong Liu and Hao Gao were partially supported by the NSFC (#11405105), the 973 Program (#2015CB856000), and the Shanghai Pujiang Talent Program (#14PJ1404500).

  7. A general digital computer procedure for synthesizing linear automatic control systems

    International Nuclear Information System (INIS)

    Cummins, J.D.

    1961-10-01

    The fundamental concepts required for synthesizing a linear automatic control system are considered. A generalized procedure for synthesizing automatic control systems is demonstrated. This procedure has been programmed for the Ferranti Mercury and the IBM 7090 computers. Details of the programmes are given. The procedure uses the linearized set of equations which describe the plant to be controlled as the starting point. Subsequent computations determine the transfer functions between any desired variables. The programmes also compute the root and phase loci for any linear (and some non-linear) configurations in the complex plane, the open loop and closed loop frequency responses of a system, the residues of a function of the complex variable 's' and the time response corresponding to these residues. With these general programmes available the design of 'one point' automatic control systems becomes a routine scientific procedure. Also dynamic assessments of plant may be carried out. Certain classes of multipoint automatic control problems may also be solved with these procedures. Autonomous systems, invariant systems and orthogonal systems may also be studied. (author)
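
    Computations of this sort are routine in modern scientific libraries; for comparison, a sketch using scipy.signal to obtain an open-loop frequency response and a closed-loop step response for a hypothetical plant:

```python
# Frequency response and unity-feedback closed-loop step response for a
# hypothetical linearized plant G(s) = 1 / ((s + 1)(s + 2)).
import numpy as np
from scipy import signal

plant = signal.TransferFunction([1.0], [1.0, 3.0, 2.0])

# Open-loop frequency response (Bode magnitude and phase).
w, mag_db, phase_deg = signal.bode(plant, np.logspace(-1, 2, 200))

# Unity-feedback closed loop with gain K: G_cl = K*G / (1 + K*G).
K = 10.0
num = K * np.array(plant.num)
cl = signal.TransferFunction(num, np.polyadd(plant.den, num))
t, y = signal.step(cl)
print("closed-loop DC gain:", y[-1])
```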

  8. A Quantitative Investigation of Cloud Computing Adoption in Nigeria: Testing an Enhanced Technology Acceptance Model

    Science.gov (United States)

    Ishola, Bashiru Abayomi

    2017-01-01

    Cloud computing has recently emerged as a potential alternative to the traditional on-premise computing that businesses can leverage to achieve operational efficiencies. Consequently, technology managers are often tasked with the responsibilities to analyze the barriers and variables critical to organizational cloud adoption decisions. This…

  9. Privacy-Preserving Computation with Trusted Computing via Scramble-then-Compute

    Directory of Open Access Journals (Sweden)

    Dang Hung

    2017-07-01

    Full Text Available We consider privacy-preserving computation of big data using trusted computing primitives with limited private memory. Simply ensuring that the data remains encrypted outside the trusted computing environment is insufficient to preserve data privacy, for data movement observed during computation could leak information. While it is possible to thwart such leakage using generic solutions such as ORAM [42], designing efficient privacy-preserving algorithms is challenging. Besides computation efficiency, it is critical to keep trusted code bases lean, for large ones are unwieldy to vet and verify. In this paper, we advocate a simple approach wherein many basic algorithms (e.g., sorting) can be made privacy-preserving by adding a step that securely scrambles the data before feeding it to the original algorithms. We call this approach Scramble-then-Compute (StC), and give a sufficient condition whereby existing external memory algorithms can be made privacy-preserving via StC. This approach facilitates code-reuse, and its simplicity contributes to a smaller trusted code base. It is also general, allowing algorithm designers to leverage an extensive body of known efficient algorithms for better performance. Our experiments show that StC could offer up to 4.1× speedups over known, application-specific alternatives.
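
    The essence of StC, a cryptographically secure shuffle followed by the unmodified algorithm, can be sketched as follows; this toy in-memory version ignores the external-memory and enclave details that the paper addresses.

```python
# Scramble-then-Compute sketch: apply a secret random permutation to the
# records before running the ordinary algorithm, so the access pattern
# observed afterwards is independent of the input order.
import secrets

def secure_scramble(items):
    """Fisher-Yates shuffle driven by a cryptographic RNG."""
    items = list(items)
    for i in range(len(items) - 1, 0, -1):
        j = secrets.randbelow(i + 1)
        items[i], items[j] = items[j], items[i]
    return items

data = [("k%03d" % i, i) for i in range(10)]
scrambled = secure_scramble(data)   # step 1: scramble
result = sorted(scrambled)          # step 2: reuse the ordinary algorithm
print(result[:3])
```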

  10. Computational modeling of heterogeneity and function of CD4+ T cells

    Directory of Open Access Journals (Sweden)

    Adria eCarbo

    2014-07-01

    Full Text Available The immune system is composed of many different cell types and hundreds of intersecting molecular pathways and signals. This large biological complexity requires coordination between distinct pro-inflammatory and regulatory cell subsets to respond to infection while maintaining tissue homeostasis. CD4+ T cells play a central role in orchestrating immune responses and in maintaining a balance between pro- and anti-inflammatory responses. This tight balance between regulatory and effector reactions depends on the ability of CD4+ T cells to modulate distinct pathways within large molecular networks, since dysregulated CD4+ T cell responses may result in chronic inflammatory and autoimmune diseases. The CD4+ T cell differentiation process comprises an intricate interplay between cytokines, their receptors, adaptor molecules, signaling cascades and transcription factors that help delineate cell fate and function. Computational modeling can help to describe, simulate, analyze, and predict some of the behaviors in this complicated differentiation network. This review provides a comprehensive overview of existing computational immunology methods as well as novel strategies used to model immune responses with a particular focus on CD4+ T cell differentiation.

  11. Computational composites

    DEFF Research Database (Denmark)

    Vallgårda, Anna K. A.; Redström, Johan

    2007-01-01

    Computational composite is introduced as a new type of composite material. Arguing that this is not just a metaphorical maneuver, we provide an analysis of computational technology as material in design, which shows how computers share important characteristics with other materials used in design and architecture. We argue that the notion of computational composites provides a precise understanding of the computer as material, and of how computations need to be combined with other materials to come to expression as material. Besides working as an analysis of computers from a designer's point of view, the notion of computational composites may also provide a link for computer science and human-computer interaction to an increasingly rapid development and use of new materials in design and architecture.

  12. Integrin-Targeted Hybrid Fluorescence Molecular Tomography/X-ray Computed Tomography for Imaging Tumor Progression and Early Response in Non-Small Cell Lung Cancer

    Directory of Open Access Journals (Sweden)

    Xiaopeng Ma

    2017-01-01

    Full Text Available Integrins play an important role in tumor progression, invasion and metastasis. Therefore we aimed to evaluate a preclinical imaging approach applying ανβ3 integrin targeted hybrid Fluorescence Molecular Tomography/X-ray Computed Tomography (FMT-XCT) for monitoring tumor progression as well as early therapy response in a syngeneic murine Non-Small Cell Lung Cancer (NSCLC) model. Lewis Lung Carcinomas were grown orthotopically in C57BL/6 J mice and imaged in vivo using an ανβ3 targeted near-infrared fluorescence (NIRF) probe. ανβ3-targeted FMT-XCT was able to track tumor progression. Cilengitide was able to substantially block the binding of the NIRF probe and suppress the imaging signal. Additionally mice were treated with an established chemotherapy regimen of Cisplatin and Bevacizumab or with a novel MEK inhibitor (Refametinib) for 2 weeks. While μCT revealed only a moderate slowdown of tumor growth, the ανβ3 dependent signal decreased significantly compared to non-treated mice already at one week post treatment. ανβ3 targeted imaging might therefore become a promising tool for assessment of early therapy response in the future.

  13. Estimating Derived Response Levels at the Savannah River Site for Use with Emergency Response Models

    International Nuclear Information System (INIS)

    Simpkins, A.A.

    2002-01-01

    Emergency response computer models at the Savannah River Site (SRS) are coupled with real-time meteorological data to estimate dose to individuals downwind of accidental radioactive releases. Currently, these models estimate doses for inhalation and shine pathways, but do not consider dose due to ingestion of contaminated food products. The Food and Drug Administration (FDA) has developed derived intervention levels (DILs), which refer to the radionuclide-specific concentration in food, present throughout the relevant period of time with no intervention, that could lead to an individual receiving a radiation dose equal to the protective action guide. In the event of an emergency, concentrations in various food types are compared with these levels to make interdiction decisions. Before monitoring results become available, concentrations in the environmental media (i.e. soil), called derived response levels (DRLs), can be estimated from the DILs and directly compared with computer output to provide preliminary guidance as to whether intervention is necessary. Site-specific DRLs are developed for the ingestion pathways pertinent to SRS: milk, meat, fish, grain, produce, and beverage. This provides decision-makers with an additional tool for use immediately following an accident, prior to the acquisition of food monitoring data
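
    The DIL-to-DRL conversion amounts to dividing a food intervention level by a food-chain transfer factor; the sketch below uses the FDA's published Cs-137 DIL but a purely hypothetical soil-to-produce transfer factor, so the printed number is illustrative only.

```python
# Hedged sketch of deriving a soil response level (DRL) from a food-based
# derived intervention level (DIL) via a food-chain transfer factor.
CS137_DIL_BQ_PER_KG = 1200.0  # FDA DIL for the Cs-134+Cs-137 group in food
SOIL_TO_PRODUCE_TF = 0.05     # (Bq/kg produce) per (Bq/kg soil), hypothetical

def drl_soil(dil_food: float, transfer_factor: float) -> float:
    """Soil concentration that would lead to food at the DIL."""
    return dil_food / transfer_factor

print(f"DRL(soil) ~ {drl_soil(CS137_DIL_BQ_PER_KG, SOIL_TO_PRODUCE_TF):,.0f} Bq/kg")
```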

  14. Computer-Aided dispatching system design specification

    International Nuclear Information System (INIS)

    Briggs, M.G.

    1996-01-01

    This document defines the performance requirements for a graphic display dispatching system to support Hanford Patrol emergency response. This system is defined as a Commercial-Off-The-Shelf computer dispatching system providing both text and graphical display information while interfacing with the diverse reporting systems within the Hanford Facility. The system also provides expansion capabilities to integrate Hanford Fire and the Occurrence Notification Center, and provides back-up capabilities for the Plutonium Processing Facility

  15. Design of Intelligent Robot as A Tool for Teaching Media Based on Computer Interactive Learning and Computer Assisted Learning to Improve the Skill of University Student

    Science.gov (United States)

    Zuhrie, M. S.; Basuki, I.; Asto B, I. G. P.; Anifah, L.

    2018-01-01

    The focus of the research is a teaching module that incorporates manufacturing, mechanical design planning, control through microprocessor technology, and maneuverability of the robot. Computer-interactive and computer-assisted learning are strategies that emphasize the use of computers and learning aids in teaching and learning activity. This research applied the 4-D research and development model suggested by Thiagarajan et al. (1974), which consists of four stages: Define, Design, Develop, and Disseminate. The research was conducted by applying this development design with the objective of producing a learning tool in the form of intelligent robot modules and kits based on Computer Interactive Learning and Computer Assisted Learning. From the data of the Indonesia Robot Contest during the period 2009-2015, it can be seen that the modules that have been developed confirm the fourth stage of the development method, dissemination. The modules guide students to produce an intelligent robot as a tool for teaching based on Computer Interactive Learning and Computer Assisted Learning. Students' responses also showed positive feedback on the robotics module and computer-based interactive learning.

  16. Automatic temperature computation for realistic IR simulation

    Science.gov (United States)

    Le Goff, Alain; Kersaudy, Philippe; Latger, Jean; Cathala, Thierry; Stolte, Nilo; Barillot, Philippe

    2000-07-01

    Polygon temperature computation in 3D virtual scenes is fundamental for IR image simulation. This article describes in detail the temperature calculation software and its current extensions, briefly presented in [1]. This software, called MURET, is used by the simulation workshop CHORALE of the French DGA. MURET is a one-dimensional thermal software package which accurately takes into account the material thermal attributes of the three-dimensional scene and the variation of the environment characteristics (atmosphere) as a function of time. Concerning the environment, absorbed incident fluxes are computed wavelength by wavelength, every half hour, during the 24 hours preceding the time of the simulation. For each polygon, incident fluxes are composed of direct solar fluxes and sky illumination (including diffuse solar fluxes). Concerning the materials, classical thermal attributes are associated with several layers: conductivity, absorption, spectral emissivity, density, specific heat, thickness and convection coefficients are taken into account. In the future, MURET will be able to simulate permeable natural materials (water influence) and natural vegetation materials (woods). This model of thermal attributes induces a very accurate polygon temperature computation for the complex 3D databases often found in CHORALE simulations. The kernel of MURET consists of an efficient ray tracer, which computes the history (over 24 hours) of the shadowed parts of the 3D scene, and a library responsible for the thermal computations. The great originality concerns the way the heating fluxes are computed. Using ray tracing, the flux received at each 3D point of the scene accurately takes into account the masking (hidden surfaces) between objects. In this way, the library also supplies other thermal modules, such as a thermal shadow computation tool.
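
    The one-dimensional thermal computation performed per polygon can be caricatured with an explicit finite-difference march over a single material layer driven by an absorbed flux; all material values below are illustrative, not MURET's.

```python
# Explicit 1D finite-difference sketch of a wall-temperature computation:
# one material layer heated by a time-varying absorbed flux on the front face.
import numpy as np

k, rho, cp = 1.0, 2000.0, 900.0  # conductivity, density, specific heat (illustrative)
L, n = 0.10, 50                  # slab thickness [m], grid points
dx = L / (n - 1)
alpha = k / (rho * cp)
dt = 0.4 * dx**2 / alpha         # stable explicit time step
h, T_air = 10.0, 290.0           # convection coefficient, air temperature

T = np.full(n, 290.0)
for step in range(int(24 * 3600 / dt)):  # march over 24 hours
    t = step * dt
    q_abs = max(0.0, 600.0 * np.sin(2 * np.pi * t / 86400.0))  # solar-like flux
    Tn = T.copy()
    Tn[1:-1] = T[1:-1] + alpha * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])
    # Front-face energy balance: absorbed flux, convection, and conduction.
    Tn[0] = T[0] + dt / (rho * cp * dx) * (q_abs + h * (T_air - T[0])
                                           + k * (T[1] - T[0]) / dx)
    Tn[-1] = Tn[-2]              # insulated back face
    T = Tn
print(f"front-face temperature after 24 h: {T[0]:.1f} K")
```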

  17. Scintillation response of organic and inorganic scintillators

    CERN Document Server

    Papadopoulos, L M

    1999-01-01

    A method to evaluate the scintillation response of organic and inorganic scintillators to different heavy ionizing particles is suggested. A function describing the rate of the energy consumed as fluorescence emission is derived, i.e., the differential response with respect to time. This function is then integrated for each ion and scintillator (anthracene, stilbene and CsI(Tl)) to determine scintillation response. The resulting scintillation responses are compared to the previously reported measured responses. Agreement to within 2.5% is observed when these data are normalized to each other. In addition, conclusions regarding the quenching parameter kB dependence on the type of the particle and the computed values of kB for certain ions are included. (author)
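
    A common concrete choice for such a differential response is Birks' law, dL/dE = S / (1 + kB·dE/dx); integrating it over a hypothetical stopping-power curve gives a response like the following sketch (the paper's exact response function is not reproduced here).

```python
# Birks-law evaluation of scintillation response: integrate
# dL/dE = S / (1 + kB * dE/dx) along the particle track. The stopping-power
# trend and the kB value below are illustrative placeholders.
import numpy as np

S = 1.0    # scintillation efficiency (arbitrary units)
kB = 0.01  # Birks quenching parameter (illustrative)

# Hypothetical stopping power dE/dx vs. residual energy E.
E = np.linspace(0.1, 10.0, 500)
dEdx = 100.0 / E                 # crude 1/E trend typical of heavy ions

# Light output L(E0) = integral of S dE / (1 + kB * dE/dx) from ~0 to E0.
dL = S / (1.0 + kB * dEdx)
L = np.cumsum(dL) * (E[1] - E[0])
print(f"relative response at 10 MeV vs unquenched: {L[-1] / (S * (E[-1] - E[0])):.2f}")
```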

  18. Extreme-Scale Computing Project Aims to Advance Precision Oncology | Poster

    Science.gov (United States)

    Two government agencies and five national laboratories are collaborating to develop extremely high-performance computing capabilities that will analyze mountains of research and clinical data to improve scientific understanding of cancer, predict drug response, and improve treatments for patients.

  19. Investigating Nigerian Primary School Teachers' Preparedness to Adopt Personal Response System in ESL Classroom

    Science.gov (United States)

    Agbatogun, Alaba Olaoluwakotansibe

    2012-01-01

    This study investigated the extent to which computer literacy dimensions (computer general knowledge, documents and documentations, communication and surfing as well as data inquiry), computer use and academic qualification as independent variables predicted primary school teachers' attitude towards the integration of Personal Response System in…

  20. Regional Platform on Personal Computer Electronic Waste in Latin ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Regional Platform on Personal Computer Electronic Waste in Latin America and the Caribbean. This project aims to identify environmentally responsible and sustainable solutions to the problem of e-waste.

  1. Using computer simulations to improve concept formation in chemistry

    African Journals Online (AJOL)

    The goal of this research project was to investigate whether computer simulations used as a visually-supporting teaching strategy, can improve concept formation with regard to molecules and chemical bonding, as found in water. Both the qualitative and quantitative evaluation of responses supported the positive outcome ...

  2. Computational models as predictors of HIV treatment outcomes for ...

    African Journals Online (AJOL)

    Background: Selecting the optimal combination of HIV drugs for an individual in resource-limited settings is challenging because of the limited availability of drugs and genotyping. Objective: The evaluation as a potential treatment support tool of computational models that predict response to therapy without a genotype, ...

  3. Computer scientist looks at reliability computations

    International Nuclear Information System (INIS)

    Rosenthal, A.

    1975-01-01

    Results from the theory of computational complexity are applied to reliability computations on fault trees and networks. A well known class of problems which almost certainly have no fast solution algorithms is presented. It is shown that even approximately computing the reliability of many systems is difficult enough to be in this class. In the face of this result, which indicates that for general systems the computation time will be exponential in the size of the system, decomposition techniques which can greatly reduce the effective size of a wide variety of realistic systems are explored

  4. Computer Pure-Tone and Operator Stress: Report III.

    Science.gov (United States)

    Dow, Caroline; Covert, Douglas C.

    Pure-tone sound at 15,750 Hz generated by flyback transformers in many computer and video display terminal (VDT) monitors has stress-related productivity effects in some operators, especially women. College-age women in a controlled experiment simulating half a normal work day showed responses within the first half hour of exposure to a tone…

  5. Diagnosis of bilateral adrenocortical hemorrhage by computed tomography

    International Nuclear Information System (INIS)

    Liu, L.; Haskin, M.E.; Rose, L.I.; Bemis, C.E.

    1982-01-01

    Adrenocortical hemorrhage has been diagnosed on the basis of the clinical presentation and response to steroids or autopsy findings. Prompt recognition of the disease has been difficult because of its similarity to other disorders. We report the diagnosis of a bilateral adrenocortical hemorrhage by computed tomography (CT), followed by biochemical confirmation of the diagnosis

  6. Plasmonic nanostructures: local versus nonlocal response

    DEFF Research Database (Denmark)

    Toscano, Giuseppe; Wubs, Martijn; Xiao, Sanshui

    2010-01-01

    We study the importance of taking the nonlocal optical response of metals into account for accurate determination of optical properties of nanoplasmonic structures. Here we focus on the computational physics aspects of this problem, and in particular we report on the nonlocal-response package that we wrote for state-of-the-art numerical software, enabling us to take into account the nonlocal material response of metals for any arbitrarily shaped nanoplasmonic structures, without much numerical overhead as compared to the standard local response. Our method is a frequency-domain method, and hence it is sensitive to possible narrow resonances that may arise due to strong electronic quantum confinement in the metal. This feature allows us to accurately determine which geometries are strongly affected by nonlocal response, for example regarding applications based on electric field enhancement.

  7. Picture processing computer to control movement by computer provided vision

    Energy Technology Data Exchange (ETDEWEB)

    Graefe, V

    1983-01-01

    The author introduces a multiprocessor system which has been specially developed to enable mechanical devices to interpret pictures presented in real time. The separate processors within this system operate simultaneously and independently. By means of freely moveable windows the processors can concentrate on those parts of the picture that are relevant to the control problem. If a machine is to make a correct response to its observation of a picture of moving objects, it must be able to follow the picture sequence, step by step, in real time. As the usual serially operating processors are too slow for such a task, the author describes three models of a special picture processing computer which it has been necessary to develop. 3 references.

  8. Secure Data Access Control for Fog Computing Based on Multi-Authority Attribute-Based Signcryption with Computation Outsourcing and Attribute Revocation.

    Science.gov (United States)

    Xu, Qian; Tan, Chengxiang; Fan, Zhijie; Zhu, Wenye; Xiao, Ya; Cheng, Fujia

    2018-05-17

    Nowadays, fog computing provides computation, storage, and application services to end users in the Internet of Things. One of the major concerns in fog computing systems is how fine-grained access control can be imposed. As a logical combination of attribute-based encryption and attribute-based signature, Attribute-based Signcryption (ABSC) can provide confidentiality and anonymous authentication for sensitive data and is more efficient than the traditional "encrypt-then-sign" or "sign-then-encrypt" strategies. Thus, ABSC is suitable for fine-grained access control in a semi-trusted cloud environment and has been gaining increasing attention. However, in many existing ABSC systems, the computation cost required of end users in signcryption and designcryption is linear in the complexity of the signing and encryption access policy. Moreover, previously proposed ABSC schemes rely on a single authority responsible for attribute management and key generation, whereas in reality different authorities typically monitor different attributes of a user. In this paper, we propose OMDAC-ABSC, a novel data access control scheme based on Ciphertext-Policy ABSC, to provide data confidentiality, fine-grained control, and anonymous authentication in a multi-authority fog computing system. The signcryption and designcryption overhead for the user is significantly reduced by outsourcing the undesirable computation operations to fog nodes. The proposed scheme is proven to be secure in the standard model and can provide attribute revocation and public verifiability. The security analysis, asymptotic complexity comparison, and implementation results indicate that our construction can balance the security goals with practical efficiency in computation.

  9. High Performance Computing Facility Operational Assessment 2015: Oak Ridge Leadership Computing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Barker, Ashley D. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Bernholdt, David E. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Bland, Arthur S. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Gary, Jeff D. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Hack, James J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; McNally, Stephen T. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Rogers, James H. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Smith, Brian E. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Straatsma, T. P. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Sukumar, Sreenivas Rangan [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Thach, Kevin G. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Tichenor, Suzy [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Vazhkudai, Sudharshan S. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Wells, Jack C. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility

    2016-03-01

    The Titan system provides the largest extant heterogeneous architecture for computing and computational science. Usage is high, delivering on the promise of a system well-suited for capability simulations for science. This success is due in part to innovations in tracking and reporting the activity on the compute nodes, and using this information to further enable and optimize applications, extending and balancing workload across the entire node. The OLCF continues to invest in innovative processes, tools, and resources necessary to meet continuing user demand. The facility’s leadership in data analysis and workflows was featured at the Department of Energy (DOE) booth at SC15, for the second year in a row, highlighting work with researchers from the National Library of Medicine coupled with unique computational and data resources serving experimental and observational data across facilities. Effective operations of the OLCF play a key role in the scientific missions and accomplishments of its users. Building on the exemplary year of 2014, as shown by the 2014 Operational Assessment Report (OAR) review committee response in Appendix A, this OAR delineates the policies, procedures, and innovations implemented by the OLCF to continue delivering a multi-petaflop resource for cutting-edge research. This report covers CY 2015, which, unless otherwise specified, denotes January 1, 2015, through December 31, 2015.

  10. Heterotic computing: exploiting hybrid computational devices.

    Science.gov (United States)

    Kendon, Viv; Sebald, Angelika; Stepney, Susan

    2015-07-28

    Current computational theory deals almost exclusively with single models: classical, neural, analogue, quantum, etc. In practice, researchers use ad hoc combinations, realizing only recently that they can be fundamentally more powerful than the individual parts. A Theo Murphy meeting brought together theorists and practitioners of various types of computing, to engage in combining the individual strengths to produce powerful new heterotic devices. 'Heterotic computing' is defined as a combination of two or more computational systems such that they provide an advantage over either substrate used separately. This post-meeting collection of articles provides a wide-ranging survey of the state of the art in diverse computational paradigms, together with reflections on their future combination into powerful and practical applications.

  11. Definition, modeling and simulation of a grid computing system for high throughput computing

    CERN Document Server

    Caron, E; Tsaregorodtsev, A Yu

    2006-01-01

    In this paper, we study and compare grid and global computing systems and outline the benefits of having a hybrid system called DIRAC. To evaluate the DIRAC scheduling for high throughput computing, a new model is presented and a simulator was developed for many clusters of heterogeneous nodes belonging to a local network. These clusters are assumed to be connected to each other through a global network and each cluster is managed via a local scheduler which is shared by many users. We validate our simulator by comparing the experimental and analytical results of an M/M/4 queuing system. Next, we do the comparison with a real batch system and we obtain an average error of 10.5% for the response time and 12% for the makespan. We conclude that the simulator is realistic and well describes the behaviour of a large-scale system. Thus we can study the scheduling of our system called DIRAC in a high throughput context. We justify our decentralized, adaptive and opportunistic approach in comparison to a centralize...
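
    The M/M/4 validation step is small enough to reproduce in miniature. The sketch below is not the DIRAC simulator and its rates are arbitrary; it simply compares a hand-rolled M/M/4 FCFS simulation against the analytic Erlang C mean response time.

    ```python
    # Validate an M/M/4 FCFS simulation against the Erlang C formula.
    import math
    import random

    lam, mu, c = 3.0, 1.0, 4          # arrival rate, service rate, servers

    def erlang_c_response_time(lam, mu, c):
        a = lam / mu                  # offered load
        p_wait = (a**c / math.factorial(c)) / (
            (1 - a / c) * sum(a**k / math.factorial(k) for k in range(c))
            + a**c / math.factorial(c)
        )
        return p_wait / (c * mu - lam) + 1 / mu   # mean wait + mean service

    def simulate(lam, mu, c, n=200_000, seed=1):
        rng = random.Random(seed)
        free = [0.0] * c              # next-free time of each server
        t, total = 0.0, 0.0
        for _ in range(n):
            t += rng.expovariate(lam)             # next Poisson arrival
            k = min(range(c), key=free.__getitem__)
            start = max(t, free[k])               # FCFS: earliest-free server
            free[k] = start + rng.expovariate(mu)
            total += free[k] - t                  # response = wait + service
        return total / n

    print("analytic :", erlang_c_response_time(lam, mu, c))
    print("simulated:", simulate(lam, mu, c))
    ```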

  12. Optical Computing

    OpenAIRE

    Woods, Damien; Naughton, Thomas J.

    2008-01-01

    We consider optical computers that encode data using images and compute by transforming such images. We give an overview of a number of such optical computing architectures, including descriptions of the type of hardware commonly used in optical computing, as well as some of the computational efficiencies of optical devices. We go on to discuss optical computing from the point of view of computational complexity theory, with the aim of putting some old, and some very recent, re...

  13. Close the Gate, Lock the Windows, Bolt the Doors: Securing Library Computers. Online Treasures

    Science.gov (United States)

    Balas, Janet

    2005-01-01

    This article, written by a systems librarian at the Monroeville Public Library, discusses a major issue affecting all computer users: security. It indicates that while staying up-to-date on the latest security issues has become essential for all computer users, it is more critical for network managers who are responsible for securing computer…

  14. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  15. Music and natural sounds in an auditory steady-state response based brain-computer interface to increase user acceptance.

    Science.gov (United States)

    Heo, Jeong; Baek, Hyun Jae; Hong, Seunghyeok; Chang, Min Hye; Lee, Jeong Su; Park, Kwang Suk

    2017-05-01

    Patients with total locked-in syndrome are conscious; however, they cannot express themselves because most of their voluntary muscles are paralyzed, and many of these patients have lost their eyesight. To improve the quality of life of these patients, there is an increasing need for communication-supporting technologies that leverage the remaining senses of the patient along with physiological signals. The auditory steady-state response (ASSR) is an electro-physiologic response to auditory stimulation that is amplitude-modulated by a specific frequency. By leveraging the phenomenon whereby ASSR is modulated by mind concentration, a brain-computer interface paradigm was proposed to classify the selective attention of the patient. In this paper, we propose an auditory stimulation method to minimize auditory stress by replacing the monotone carrier with familiar music and natural sounds for an ergonomic system. Piano and violin instrumentals were employed in the music sessions; the sounds of water streaming and cicadas singing were used in the natural sound sessions. Six healthy subjects participated in the experiment. Electroencephalograms were recorded using four electrodes (Cz, Oz, T7 and T8). Seven sessions were performed using different stimuli. The spectral power at 38 and 42 Hz and their ratio for each electrode were extracted as features. Linear discriminant analysis was utilized to classify the selections for each subject. In offline analysis, the average classification accuracies with a modulation index of 1.0 were 89.67% and 87.67% using music and natural sounds, respectively. In online experiments, the average classification accuracies were 88.3% and 80.0% using music and natural sounds, respectively. Using the proposed method, we obtained significantly higher user-acceptance scores, while maintaining a high average classification accuracy.
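
    The feature and classifier pipeline is simple enough to sketch. The code follows the abstract (Welch band power at 38 and 42 Hz plus their ratio, fed to a linear discriminant), but the single synthetic pseudo-channel is an assumption standing in for the recorded four-electrode EEG.

    ```python
    # Sketch of the ASSR attention classifier on synthetic single-channel EEG.
    import numpy as np
    from scipy.signal import welch
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    fs, dur, n_trials = 250, 4, 60
    rng = np.random.default_rng(0)
    t = np.arange(fs * dur) / fs

    def trial(attended_hz):
        # Attended stimulus's modulation frequency dominates the response.
        other = 42 if attended_hz == 38 else 38
        return (1.0 * np.sin(2 * np.pi * attended_hz * t)
                + 0.4 * np.sin(2 * np.pi * other * t)
                + rng.normal(0, 2.0, t.size))

    X, y = [], []
    for label, f0 in enumerate((38, 42)):
        for _ in range(n_trials):
            # nperseg=fs gives a 1 Hz bin spacing, so 38 and 42 Hz bins exist.
            fxx, pxx = welch(trial(f0), fs=fs, nperseg=fs)
            p38, p42 = pxx[fxx == 38][0], pxx[fxx == 42][0]
            X.append([p38, p42, p38 / p42])
            y.append(label)

    clf = LinearDiscriminantAnalysis().fit(X[::2], y[::2])  # even trials: train
    print("accuracy:", clf.score(X[1::2], y[1::2]))         # odd trials: test
    ```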

  16. Date Sensitive Computing Problems: Understanding the Threat

    Science.gov (United States)

    1998-08-29

    equipment on Earth. It can also interfere with electromagnetic signals from such devices as cell phones, radio, television, and radar. By itself, the ... spacecraft. Debris from impacted satellites will add to the existing orbital debris problem, and could eventually cause damage to other satellites...

  17. Computational Intelligence, Cyber Security and Computational Models

    CERN Document Server

    Anitha, R; Lekshmi, R; Kumar, M; Bonato, Anthony; Graña, Manuel

    2014-01-01

    This book contains cutting-edge research material presented by researchers, engineers, developers, and practitioners from academia and industry at the International Conference on Computational Intelligence, Cyber Security and Computational Models (ICC3) organized by PSG College of Technology, Coimbatore, India during December 19–21, 2013. The materials in the book include theory and applications for design, analysis, and modeling of computational intelligence and security. The book will be useful material for students, researchers, professionals, and academicians. It will help in understanding current research trends and findings and future scope of research in computational intelligence, cyber security, and computational models.

  18. Soft computing in computer and information science

    CERN Document Server

    Fray, Imed; Pejaś, Jerzy

    2015-01-01

    This book presents a carefully selected and reviewed collection of papers presented during the 19th Advanced Computer Systems conference ACS-2014. The Advanced Computer Systems conference concentrated from its beginning on methods and algorithms of artificial intelligence. Later years brought new areas of interest concerning technical informatics related to soft computing and more technological aspects of computer science, such as multimedia and computer graphics, software engineering, web systems, information security and safety, and project management. These topics are represented in the present book under the categories Artificial Intelligence, Design of Information and Multimedia Systems, Information Technology Security and Software Technologies.

  19. Measuring caloric response: comparison of different analysis techniques.

    Science.gov (United States)

    Mallinson, A I; Longridge, N S; Pace-Asciak, P; Ngo, R

    2010-01-01

    Electronystagmography (ENG) testing has been supplanted by newer techniques of measuring eye movement with infrared cameras (VNG). Most techniques of quantifying caloric induced nystagmus measure the slow phase velocity in some manner. Although our analysis is carried out by very experienced assessors, some systems have computer algorithms that have been "taught" to locate and quantify maximum responses. We wondered what differences in measurement might show up when measuring calorics using different techniques and systems, the relevance of this being that if there was a change in slow phase velocity between ENG and VNG testing when measuring caloric response, then normative data would have to be changed. There are also some subjective but important aspects of ENG interpretation which comment on the nature of the response (e.g. responses which might be "sporadic" or "scant"). Our experiment compared caloric responses in 100 patients analyzed four different ways. Each caloric was analyzed by our old ENG system, our new VNG system, an inexperienced assessor and the computer algorithm, and data was compared. All four systems made similar measurements but our inexperienced assessor failed to recognize responses as sporadic or scant, and we feel this is a limitation to be kept in mind in the rural setting, as it is an important aspect of assessment in complex patients. Assessment of complex VNGs should be left to an experienced assessor.

  20. Bibliography for Verification and Validation in Computational Simulation

    International Nuclear Information System (INIS)

    Oberkampf, W.L.

    1998-01-01

    A bibliography has been compiled dealing with the verification and validation of computational simulations. The references listed in this bibliography are concentrated in the field of computational fluid dynamics (CFD). However, references from the following fields are also included: operations research, heat transfer, solid dynamics, software quality assurance, software accreditation, military systems, and nuclear reactor safety. This bibliography, containing 221 references, is not meant to be comprehensive. It was compiled during the last ten years in response to the author's interest and research in the methodology for verification and validation. The emphasis in the bibliography is in the following areas: philosophy of science underpinnings, development of terminology and methodology, high accuracy solutions for CFD verification, experimental datasets for CFD validation, and the statistical quantification of model validation. This bibliography should provide a starting point for individual researchers in many fields of computational simulation in science and engineering

  1. Bibliography for Verification and Validation in Computational Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, W.L.

    1998-10-01

    A bibliography has been compiled dealing with the verification and validation of computational simulations. The references listed in this bibliography are concentrated in the field of computational fluid dynamics (CFD). However, references from the following fields are also included: operations research, heat transfer, solid dynamics, software quality assurance, software accreditation, military systems, and nuclear reactor safety. This bibliography, containing 221 references, is not meant to be comprehensive. It was compiled during the last ten years in response to the author's interest and research in the methodology for verification and validation. The emphasis in the bibliography is in the following areas: philosophy of science underpinnings, development of terminology and methodology, high accuracy solutions for CFD verification, experimental datasets for CFD validation, and the statistical quantification of model validation. This bibliography should provide a starting point for individual researchers in many fields of computational simulation in science and engineering.

  2. Electrostatic resonances and optical responses of cylindrical clusters

    International Nuclear Information System (INIS)

    Choy, C W; Xiao, J J; Yu, K W

    2008-01-01

    We developed a Green function formalism (GFF) for computing the electrostatic resonance in clusters of cylindrical particles. In the GFF, we take advantage of a surface integral equation to avoid matching the complicated boundary conditions on the surfaces of the particles. Numerical solutions of the eigenvalue equation yield a pole spectrum in the spectral representation. The pole spectrum can in turn be used to compute the optical response of these particles. For two cylindrical particles, the results are in excellent agreement with the exact results from the multiple image method and the normal mode expansion method. The results of this work can be extended to investigate the enhanced nonlinear optical responses of metal-dielectric composites, as well as optical switching in plasmonic waveguides.

  3. Computing the Pareto-Nash equilibrium set in finite multi-objective mixed-strategy games

    Directory of Open Access Journals (Sweden)

    Victoria Lozan

    2013-10-01

    The Pareto-Nash equilibrium set (PNES) is described as the intersection of graphs of efficient response mappings. The problem of computing the PNES in finite multi-objective mixed-strategy games (Pareto-Nash games) is considered, and a method for computing it is studied. Mathematics Subject Classification 2010: 91A05, 91A06, 91A10, 91A43, 91A44.
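
    For orientation, the single-objective special case is the familiar bimatrix Nash equilibrium computation; the Pareto-Nash setting generalizes it by intersecting graphs of efficient (Pareto-optimal) response mappings. A sketch of the special case, assuming the third-party nashpy library is available:

    ```python
    # Nash equilibria of a bimatrix game via support enumeration (nashpy).
    import numpy as np
    import nashpy as nash

    A = np.array([[3, 0], [5, 1]])   # row player's payoffs
    B = np.array([[3, 5], [0, 1]])   # column player's payoffs

    game = nash.Game(A, B)
    for sigma_r, sigma_c in game.support_enumeration():
        print("row strategy:", sigma_r, "column strategy:", sigma_c)
    ```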

  4. Secure authentication mechanisms for the management interface in cloud computing environments

    OpenAIRE

    Soares, Liliana Filipa Baptista

    2013-01-01

    For a handful of years, cloud computing has been a hot catchphrase. The industry has massively adopted it and academia is focusing on improving the technology, which has been evolving at a quick pace. The cloud computing paradigm consists in adopting solutions, provisioned by cloud providers, that are hosted in data centers. Customers are therefore tied to those third-party entities, since these become involved in their businesses by being responsible for the Information Technologi...

  5. Fast Simulation of Large-Scale Floods Based on GPU Parallel Computing

    OpenAIRE

    Qiang Liu; Yi Qin; Guodong Li

    2018-01-01

    Computing speed is a significant issue in large-scale flood simulation for real-time disaster prevention and mitigation response. Even today, most large-scale flood simulations are run on supercomputers due to the massive amounts of data and computation involved. In this work, a two-dimensional shallow water model based on an unstructured Godunov-type finite volume scheme was proposed for flood simulation. To realize a fast simulation of large-scale floods on a personal...
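
    The building block of such a model is easy to show in one dimension. The sketch below uses NumPy on the CPU rather than GPU code, and a Rusanov (local Lax-Friedrichs) flux as one simple Godunov-type choice; the paper's exact scheme and mesh handling differ.

    ```python
    # 1-D shallow water dam-break with a first-order Godunov-type FV scheme.
    import numpy as np

    g = 9.81

    def flux(U):
        h, hu = U
        u = hu / h
        return np.array([hu, hu * u + 0.5 * g * h**2])

    def rusanov(UL, UR):
        # Local Lax-Friedrichs numerical flux at each interface.
        cL = np.abs(UL[1] / UL[0]) + np.sqrt(g * UL[0])
        cR = np.abs(UR[1] / UR[0]) + np.sqrt(g * UR[0])
        c = np.maximum(cL, cR)
        return 0.5 * (flux(UL) + flux(UR)) - 0.5 * c * (UR - UL)

    # Dam-break initial condition on [0, 1]: deep water left, shallow right.
    N = 400
    x = np.linspace(0, 1, N)
    U = np.array([np.where(x < 0.5, 2.0, 1.0), np.zeros(N)])  # [h, hu]

    t, t_end, cfl, dx = 0.0, 0.05, 0.45, x[1] - x[0]
    while t < t_end:
        c = np.abs(U[1] / U[0]) + np.sqrt(g * U[0])
        dt = min(cfl * dx / c.max(), t_end - t)
        F = rusanov(U[:, :-1], U[:, 1:])          # interior interface fluxes
        U[:, 1:-1] -= dt / dx * (F[:, 1:] - F[:, :-1])  # end cells stay fixed
        t += dt

    print("depth range:", U[0].min(), "-", U[0].max())
    ```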

  6. Use of the computer program in a cloud computing

    Directory of Open Access Journals (Sweden)

    Radovanović Sanja

    2013-01-01

    Cloud computing represents a specific form of networking, in which a computer program simulates the operation of one or more server computers. In terms of copyright, all technological processes that take place within cloud computing are covered by the notion of copying computer programs, and the exclusive right of reproduction. However, this right suffers some limitations in order to allow normal use of a computer program by users. Based on the fact that cloud computing is a virtualized network, the issue of normal use of a computer program requires putting all aspects of the permitted copying into the context of a specific computing environment and specific processes within the cloud. In this sense, the paper points out that the user of a computer program in cloud computing needs to obtain the consent of the right holder for any act undertaken using the program. In other words, copyright applies in full in cloud computing, and with it the freedom of contract (in the case of this particular restriction) as well.

  7. Reciprocity in computer-human interaction: source-based, norm-based, and affect-based explanations.

    Science.gov (United States)

    Lee, Seungcheol Austin; Liang, Yuhua Jake

    2015-04-01

    Individuals often apply social rules when they interact with computers, and this is known as the Computers Are Social Actors (CASA) effect. Following previous work, one approach to understanding the mechanism responsible for CASA is to utilize computer agents and have the agents attempt to gain human compliance (e.g., completing a pattern recognition task). The current study focuses on three key factors frequently cited to influence traditional notions of compliance: evaluations toward the source (competence and warmth), normative influence (reciprocity), and affective influence (mood). Structural equation modeling assessed the effects of these factors on human compliance with a computer's request. The final model shows that norm-based influence (reciprocity) increased the likelihood of compliance, while evaluations toward the computer agent did not significantly influence compliance.

  8. Quantum Computing and the Limits of the Efficiently Computable

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    I'll discuss how computational complexity---the study of what can and can't be feasibly computed---has been interacting with physics in interesting and unexpected ways. I'll first give a crash course about computer science's P vs. NP problem, as well as about the capabilities and limits of quantum computers. I'll then touch on speculative models of computation that would go even beyond quantum computers, using (for example) hypothetical nonlinearities in the Schrodinger equation. Finally, I'll discuss BosonSampling ---a proposal for a simple form of quantum computing, which nevertheless seems intractable to simulate using a classical computer---as well as the role of computational complexity in the black hole information puzzle.

  9. Consolidation of Cloud Computing in ATLAS

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00224309; The ATLAS collaboration; Cordeiro, Cristovao; Di Girolamo, Alessandro; Hover, John; Kouba, Tomas; Love, Peter; Mcnab, Andrew; Schovancova, Jaroslava; Sobie, Randall

    2016-01-01

    Throughout the first year of LHC Run 2, ATLAS Cloud Computing has undergone a period of consolidation, characterized by building upon previously established systems, with the aim of reducing operational effort, improving robustness, and reaching higher scale. This paper describes the current state of ATLAS Cloud Computing. Cloud activities are converging on a common contextualization approach for virtual machines, and cloud resources are sharing monitoring and service discovery components. We describe the integration of Vac resources, streamlined usage of the High Level Trigger cloud for simulation and reconstruction, extreme scaling on Amazon EC2, and procurement of commercial cloud capacity in Europe. Building on the previously established monitoring infrastructure, we have deployed a real-time monitoring and alerting platform which coalesces data from multiple sources, provides flexible visualization via customizable dashboards, and issues alerts and carries out corrective actions in response to problems. ...

  10. COMPARATIVE STUDY OF CLOUD COMPUTING AND MOBILE CLOUD COMPUTING

    OpenAIRE

    Nidhi Rajak*, Diwakar Shukla

    2018-01-01

    The present era is that of Information and Communication Technology (ICT), and much research is ongoing on Cloud Computing and Mobile Cloud Computing, addressing issues such as security, data management, load balancing, and so on. Cloud computing provides services to the end user over the Internet, and the primary objectives of this computing are resource sharing and pooling among end users. Mobile Cloud Computing is a combination of Cloud Computing and Mobile Computing. Here, data is stored in...

  11. Analog computing

    CERN Document Server

    Ulmann, Bernd

    2013-01-01

    This book is a comprehensive introduction to analog computing. As most textbooks about this powerful computing paradigm date back to the 1960s and 1970s, it fills a void and forges a bridge from the early days of analog computing to future applications. The idea of analog computing is not new. In fact, this computing paradigm is nearly forgotten, although it offers a path to both high-speed and low-power computing, which are in even more demand now than they were back in the heyday of electronic analog computers.

  12. Partitioned Fluid-Structure Interaction for Full Rotor Computations Using CFD

    DEFF Research Database (Denmark)

    Heinz, Joachim Christian

    Aero-elastic simulations of wind turbines commonly rely on a blade element momentum (BEM) based aerodynamic model, which is computationally cheap but includes several limitations and corrections in order to account for three-dimensional and unsteady effects. The present work discusses the development of an aero-elastic simulation tool where high-fidelity computational fluid dynamics (CFD) is used to model the aerodynamics of the flexible wind turbine rotor. Respective CFD computations are computationally expensive but do not show the limitations of the BEM-based models. It is one of the first times that high-fidelity fluid-structure interaction (FSI) simulations are used to model the aero-elastic response of an entire wind turbine rotor. The work employs a partitioned FSI coupling between the multi-body-based structural model of the aero-elastic solver HAWC2 and the finite volume CFD solver EllipSys3D. In order to establish an FSI coupling of sufficient time accuracy and sufficient numerical...
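
    At its core, a partitioned FSI coupling is a fixed-point iteration over the interface state. The toy sketch below uses single-degree-of-freedom stand-ins for the fluid and structure solvers and a constant under-relaxation factor; production couplings such as HAWC2-EllipSys3D exchange full 3-D fields and typically use dynamic (e.g. Aitken) relaxation.

    ```python
    # Toy partitioned FSI coupling: iterate fluid and structure "solvers"
    # to a converged interface state with under-relaxation.
    def fluid_load(d):
        # Stand-in for the CFD solver: aerodynamic load depends on deflection.
        return 100.0 - 30.0 * d

    def structure_deflection(f):
        # Stand-in for the structural solver: static deflection under load.
        return 0.01 * f

    d, omega = 0.0, 0.5          # initial deflection, constant relaxation factor
    for k in range(50):
        d_new = structure_deflection(fluid_load(d))
        r = d_new - d            # interface residual
        if abs(r) < 1e-10:
            break
        d += omega * r           # under-relaxed update keeps iteration stable
    print(f"converged deflection: {d:.6f} after {k + 1} iterations")
    ```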

  13. Evaluation of the SCANAIR Computer Code

    International Nuclear Information System (INIS)

    Jernkvist, Lars Olof; Massih, Ali

    2001-11-01

    The SCANAIR computer code, version 3.2, has been evaluated from the standpoint of its capability to analyze, simulate and predict nuclear fuel behavior during severe power transients. SCANAIR calculates the thermal and mechanical behavior of a pressurized water reactor (PWR) fuel rod during a postulated reactivity initiated accident (RIA), and our evaluation indicates that SCANAIR is a state of the art computational tool for this purpose. Our evaluation starts by reviewing the basic theoretical models in SCANAIR, namely the governing equations for heat transfer, the mechanical response of fuel and clad, and the fission gas release behavior. The numerical methods used to solve the governing equations are briefly reviewed, and the range of applicability of the models and their limitations are discussed and illustrated with examples. Next, the main features of the SCANAIR user interface are delineated. The code requires an extensive amount of input data, in order to define burnup-dependent initial conditions to the simulated RIA. These data must be provided in a special format by a thermal-mechanical fuel rod analysis code. The user also has to supply the transient power history under RIA as input, which requires a code for neutronics calculation. The programming structure and documentation of the code are also addressed in our evaluation. SCANAIR is programmed in Fortran-77, and makes use of several general Fortran-77 libraries for handling input/output, data storage and graphical presentation of computed results. The documentation of SCANAIR and its helping libraries is generally of good quality. A drawback with SCANAIR in its present form, is that the code and its pre- and post-processors are tied to computers running the Unix or Linux operating systems. As part of our evaluation, we have performed a large number of computations with SCANAIR, some of which are documented in this report. The computations presented here include a hypothetical RIA in a high

  14. Quantum Computing's Classical Problem, Classical Computing's Quantum Problem

    OpenAIRE

    Van Meter, Rodney

    2013-01-01

    Tasked with the challenge to build better and better computers, quantum computing and classical computing face the same conundrum: the success of classical computing systems. Small quantum computing systems have been demonstrated, and intermediate-scale systems are on the horizon, capable of calculating numeric results or simulating physical systems far beyond what humans can do by hand. However, to be commercially viable, they must surpass what our wildly successful, highly advanced classica...

  15. NECAP 4.1: NASA's Energy Cost Analysis Program thermal response factor routine

    Science.gov (United States)

    Weise, M. R.

    1982-08-01

    A thermal response factor is described, and calculation sequences and flowcharts for RESFAC2 are provided. RESFAC is used by NASA's Energy Cost Analysis Program (NECAP) to calculate hourly heat transfer coefficients (thermal response factors) for each unique delayed surface. NECAP uses these response factors to compute each space's hourly heat gain/loss.
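
    A response factor series reduces the wall conduction calculation to a discrete convolution: the hourly gain is a weighted sum of current and past temperature differences. A simplified single-series sketch (the factor values and temperature profile are illustrative, not NECAP data):

    ```python
    # Hourly conductive heat gain as a convolution of response factors with
    # the temperature-difference history: q[n] = sum_k Y[k] * dT[n - k].
    import numpy as np

    # Illustrative response factors for a delayed (massive) wall: an
    # instantaneous term followed by a decaying tail of history terms.
    Y = np.array([2.5, -1.1, -0.6, -0.3, -0.15, -0.07])   # W/(m^2*K)

    # Illustrative 24-hour sol-air minus room temperature differences (K).
    dT = 10.0 + 8.0 * np.sin(2 * np.pi * (np.arange(24) - 14) / 24)

    # Pre-history is assumed zero, so the first few hours show a transient.
    q = np.convolve(dT, Y)[:24]
    print(np.round(q, 2))
    ```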

  16. Research Institute for Advanced Computer Science

    Science.gov (United States)

    Gross, Anthony R. (Technical Monitor); Leiner, Barry M.

    2000-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center. It currently operates under a multiple year grant/cooperative agreement that began on October 1, 1997 and is up for renewal in the year 2002. Ames has been designated NASA's Center of Excellence in Information Technology. In this capacity, Ames is charged with the responsibility to build an Information Technology Research Program that is preeminent within NASA. RIACS serves as a bridge between NASA Ames and the academic community, and RIACS scientists and visitors work in close collaboration with NASA scientists. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. RIACS research focuses on the three cornerstones of information technology research necessary to meet the future challenges of NASA missions: (1) Automated Reasoning for Autonomous Systems. Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth; (2) Human-Centered Computing. Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities; (3) High Performance Computing and Networking. Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to data analysis of large datasets to collaborative engineering, planning and execution. In addition, RIACS collaborates with NASA scientists to apply information technology research to a

  17. Parallel computing solution of Boltzmann neutron transport equation

    International Nuclear Information System (INIS)

    Ansah-Narh, T.

    2010-01-01

    The focus of the research was on developing a parallel computing algorithm for solving eigenvalues of the Boltzmann Neutron Transport Equation (BNTE) in a slab geometry using a multi-grid approach. In response to the slow execution of serial computing when solving large problems such as the BNTE, the study focused on the design of parallel computing systems, an evolution of serial computing that uses multiple processing elements simultaneously to solve complex physical and mathematical problems. The finite element method (FEM) was used for the spatial discretization scheme, while angular discretization was accomplished by expanding the angular dependence in terms of Legendre polynomials. The eigenvalues representing the multiplication factors in the BNTE were determined by the power method. MATLAB Compiler Version 4.1 (R2009a) was used to compile the MATLAB codes of the BNTE. The implemented parallel algorithms were enabled with matlabpool, a Parallel Computing Toolbox function. The option UseParallel was set to 'always'; the default value of the option is 'never'. When those conditions held, the solvers computed estimated gradients in parallel. The parallel computing system was used to handle all the bottlenecks in the matrix generated from the finite element scheme and each domain generated by the power method. The parallel algorithm was implemented on a Symmetric Multi-Processor (SMP) cluster machine with Intel 32-bit quad-core x86 processors. Convergence rates and timings for the algorithm on the SMP cluster machine were obtained. Numerical experiments indicated the designed parallel algorithm could reach perfect speedup and had good stability and scalability. (au)
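
    The power method itself is a few lines in any array language. A sketch in Python, with a small symmetric stand-in matrix in place of the fission-source operator assembled from the finite element discretization:

    ```python
    # Dominant eigenvalue (multiplication factor analogue) by power iteration.
    import numpy as np

    def power_method(A, tol=1e-10, max_iter=10000):
        x = np.random.default_rng(0).random(A.shape[0])
        lam = 0.0
        for _ in range(max_iter):
            y = A @ x
            lam_new = np.linalg.norm(y)   # converges to |lambda_max|; fine
            x = y / lam_new               # here since the matrix is positive
            if abs(lam_new - lam) < tol:
                break
            lam = lam_new
        return lam, x

    # Small symmetric stand-in for the assembled operator.
    A = np.array([[4.0, 1.0, 0.0],
                  [1.0, 3.0, 1.0],
                  [0.0, 1.0, 2.0]])
    lam, x = power_method(A)
    print("dominant eigenvalue ~", lam)
    ```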

  18. Computer Music

    Science.gov (United States)

    Cook, Perry R.

    This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and the study of perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.).

  19. A Scheme for Verification on Data Integrity in Mobile Multicloud Computing Environment

    Directory of Open Access Journals (Sweden)

    Laicheng Cao

    2016-01-01

    In order to verify the data integrity in a mobile multicloud computing environment, a MMCDIV (mobile multicloud data integrity verification) scheme is proposed. First, the computability and nondegeneracy of verification are obtained by adopting the BLS (Boneh-Lynn-Shacham) short signature scheme. Second, communication overhead is reduced based on HVR (Homomorphic Verifiable Response) with random masking and sMHT (sequence-enforced Merkle hash tree) construction. Finally, considering the resource constraints of mobile devices, data integrity is verified by lightweight computing and low data transmission. The scheme addresses the limited communication and computing power of mobile devices, supports dynamic data operations in a mobile multicloud environment, and verifies data integrity without using the source file blocks directly. Experimental results also demonstrate that this scheme can achieve a lower cost of computing and communications.
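
    The Merkle hash tree component can be illustrated without the sequence enforcement, BLS signatures, or random masking of the actual scheme: the verifier keeps only the root, and any block is checked against a logarithmic sibling path. A minimal sketch:

    ```python
    # Minimal Merkle hash tree: root computation and sibling-path verification.
    import hashlib

    def h(b: bytes) -> bytes:
        return hashlib.sha256(b).digest()

    def merkle_root(blocks):
        level = [h(block) for block in blocks]
        while len(level) > 1:
            if len(level) % 2:       # duplicate last node on odd-sized levels
                level.append(level[-1])
            level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        return level[0]

    blocks = [f"block-{i}".encode() for i in range(8)]  # 8 keeps levels even
    root = merkle_root(blocks)

    # Build the sibling path for block 3 (recomputes the tree; fine for a demo).
    sibling_path, level, idx = [], [h(b) for b in blocks], 3
    while len(level) > 1:
        sib = idx ^ 1
        sibling_path.append((level[sib], sib < idx))    # (hash, sibling-is-left)
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        idx //= 2

    # Verifier: recompute upward from the block using only the path and root.
    node = h(blocks[3])
    for sib, sib_is_left in sibling_path:
        node = h(sib + node) if sib_is_left else h(node + sib)
    print("proof valid:", node == root)
    ```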

  20. Computer-aided sperm analysis: a useful tool to evaluate patient's response to varicocelectomy

    OpenAIRE

    Ariagno, Julia I; Mendeluk, Gabriela R; Furlan, María J; Sardi, M; Chenlo, P; Curi, Susana M; Pugliese, Mercedes N; Repetto, Herberto E; Cohen, Mariano

    2016-01-01

    Preoperative and postoperative sperm parameter values from infertile men with varicocele were analyzed by computer-aided sperm analysis (CASA) to assess if sperm characteristics improved after varicocelectomy. Semen samples of men with proven fertility (n = 38) and men with varicocele-related infertility (n = 61) were also analyzed. Conventional semen analysis was performed according to WHO (2010) criteria and a CASA system was employed to assess kinetic parameters and sperm concentration. Se...