WorldWideScience

Sample records for fct functional computed

  1. A systematic review of Functional Communication Training (FCT) interventions involving augmentative and alternative communication in school settings.

    Science.gov (United States)

    Walker, Virginia L; Lyon, Kristin J; Loman, Sheldon L; Sennott, Samuel

    2018-06-01

    The purpose of this meta-analysis was to summarize single-case intervention studies in which Functional Communication Training (FCT) involving augmentative and alternative communication (AAC) was implemented in school settings. Overall, the findings suggest that FCT involving AAC was effective in reducing challenging behaviour and promoting aided or unaided AAC use among participants with disability. FCT was more effective for the participants who engaged in less severe forms of challenging behaviour prior to intervention. Additionally, FCT was more effective when informed by a descriptive functional behaviour assessment and delivered within inclusive school settings. Implications for practice and directions for future research related to FCT for students who use AAC are addressed.

  2. Numerical computation of FCT equilibria by inverse equilibrium method

    International Nuclear Information System (INIS)

    Tokuda, Shinji; Tsunematsu, Toshihide; Takeda, Tatsuoki

    1986-11-01

    FCT (Flux Conserving Tokamak) equilibria were obtained numerically by the inverse equilibrium method. The high-beta tokamak ordering was used to obtain explicit boundary conditions for FCT equilibria. The partial differential equation was reduced to simultaneous quasi-linear ordinary differential equations using the moment method. The regularity conditions for solutions at the singular point of the equations can be expressed correctly by this reduction, and the problem becomes a tractable boundary value problem for the quasi-linear ordinary differential equations. This boundary value problem was solved by the method of quasi-linearization, one of the shooting methods. Test calculations show that this method provides high-beta tokamak equilibria with sufficiently high accuracy for MHD stability analysis. (author)
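
The quasi-linearization/shooting approach described above can be illustrated on a toy two-point boundary value problem. This is a generic sketch, not the authors' code; the test equation y'' = -y with y(0) = 0, y(pi/2) = 1 is an assumption chosen because its exact solution sin(x) makes the recovered initial slope (y'(0) = 1) easy to check.

```python
import math

def integrate(slope, n=1000):
    """RK4-integrate y'' = -y from x = 0 to pi/2 with y(0) = 0, y'(0) = slope;
    returns y at the far boundary."""
    h = (math.pi / 2.0) / n
    y, v = 0.0, slope                       # y and its derivative v = y'
    for _ in range(n):
        k1y, k1v = v, -y
        k2y, k2v = v + 0.5 * h * k1v, -(y + 0.5 * h * k1y)
        k3y, k3v = v + 0.5 * h * k2v, -(y + 0.5 * h * k2y)
        k4y, k4v = v + h * k3v, -(y + h * k3y)
        y += h / 6.0 * (k1y + 2 * k2y + 2 * k3y + k4y)
        v += h / 6.0 * (k1v + 2 * k2v + 2 * k3v + k4v)
    return y

def shoot(target, lo=0.0, hi=5.0, tol=1e-10):
    """Bisect on the unknown initial slope (the 'shooting') until the far
    boundary condition y(pi/2) = target is satisfied."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if integrate(mid) < target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

For this linear test problem the far-boundary value grows monotonically with the initial slope, so simple bisection suffices; quasi-linearization generalizes the same idea to the nonlinear equilibrium equations.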

  3. Purification, crystallization and preliminary crystallographic analysis of the minor pilin FctB from Streptococcus pyogenes

    International Nuclear Information System (INIS)

    Linke, Christian; Young, Paul G.; Kang, Hae Joo; Proft, Thomas; Baker, Edward N.

    2010-01-01

    The minor pilin FctB from S. pyogenes strain 90/306S was expressed in E. coli, purified and crystallized. The hexagonal FctB crystals diffracted to 2.9 Å resolution. The minor pilin FctB is an integral part of the pilus assembly expressed by Streptococcus pyogenes. Since it is located at the cell wall, it can be hypothesized that it functions as a cell-wall anchor for the streptococcal pilus. In order to elucidate its structure, the genes for FctB from the S. pyogenes strains 90/306S and SF370 were cloned for overexpression in Escherichia coli. FctB from strain 90/306S was crystallized by the sitting-drop vapour-diffusion method using sodium citrate as a precipitant. The hexagonal FctB crystals belonged to space group P6₁ or P6₅, with unit-cell parameters a = b = 95.15, c = 100.25 Å, and diffracted to 2.9 Å resolution

  4. FCT (functional computed tomography) evaluation of the lung volumes at different PEEP (positive-end expiratory pressure) ventilation pattern, in mechanical ventilated patients

    International Nuclear Information System (INIS)

    Papi, M.G.; Di Segni, R.; Mazzetti, G.; Staffa, F.; Conforto, F.; Calimici, R.; Salvi, A.; Matteucci, G.

    2007-01-01

    Purpose: To evaluate with FCT (functional computed tomography) total and fractional lung volumes at different PEEP (positive end-expiratory pressure) values in acute mechanically ventilated patients. Methods: Nine ICU (intensive care unit) patients (1 lung pneumonia, 2 polytrauma, 2 sepsis, 3 brain surgery, 1 pulmonary embolism); mean age 48 ± 15 years, 6 male, 3 female. A GE 16-slice MDCT scan was performed, with acquisition from apex to diaphragm in seven seconds, at different PEEP values. Raw CT data were analysed on an Advantage workstation to obtain volume density masks and histograms of both lungs together and of each lung, applying the following density ranges: -1000 to -950 HU = hyperventilated lung; -900 to -650 HU = well-aerated lung; -950 to -500 HU = all aerated lung; -500 to +200 HU = lung tissue. Total and fractional lung volumes and Hounsfield units (HU) were calculated and compared at different PEEP values (0, 5, 10, 15 cm H2O). In four patients lung volumes were compared between the more and the less involved lung at increased PEEP. Statistical analysis: comparison of means-medians tests. Results: Data calculated at five PEEP showed an unexpected decrease of total lung volume and an increase of lung density (HU), with no proportional improvement of oxygenation. (orig.)
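
The density-mask step in the Methods can be sketched with the quoted HU thresholds. The synthetic array, voxel size and variable names below are assumptions for illustration, not the workstation software used in the study:

```python
import numpy as np

# Synthetic HU volume standing in for a segmented-lung CT acquisition
# (assumed shape and voxel size, for illustration only).
rng = np.random.default_rng(0)
hu = rng.uniform(-1000, 200, size=(16, 128, 128))   # slices x rows x cols, HU
voxel_ml = 0.8 * 0.8 * 5.0 / 1000.0                 # assumed voxel volume, ml

# Density ranges quoted in the abstract (HU). Note the ranges overlap:
# "all aerated" contains "well aerated".
ranges = {
    "hyperventilated": (-1000, -950),
    "well_aerated":    (-900, -650),
    "all_aerated":     (-950, -500),
    "lung_tissue":     (-500, 200),
}

# A density mask is just a threshold test; its volume is the voxel count
# times the voxel volume.
volumes_ml = {
    name: np.count_nonzero((hu >= lo) & (hu < hi)) * voxel_ml
    for name, (lo, hi) in ranges.items()
}
total_ml = hu.size * voxel_ml
fractions = {name: v / total_ml for name, v in volumes_ml.items()}
```

Comparing these fractional volumes across the PEEP settings (0, 5, 10, 15 cm H2O) is then a per-acquisition repetition of the same computation.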

  5. FCT (functional computed tomography) evaluation of the lung volumes at different PEEP (positive-end expiratory pressure) ventilation pattern, in mechanical ventilated patients

    Energy Technology Data Exchange (ETDEWEB)

    Papi, M.G.; Di Segni, R.; Mazzetti, G.; Staffa, F. [Dept. of Radiology, S. Giovanni HS, Rome (Italy); Conforto, F.; Calimici, R.; Salvi, A. [Dept. of Anesthesiology, S. Giovanni HS, Rome (Italy); Matteucci, G. [Dept. of Pneumology, S. Giovanni HS, Rome (Italy)

    2007-06-15

    Purpose: To evaluate with FCT (functional computed tomography) total and fractional lung volumes at different PEEP (positive end-expiratory pressure) values in acute mechanically ventilated patients. Methods: Nine ICU (intensive care unit) patients (1 lung pneumonia, 2 polytrauma, 2 sepsis, 3 brain surgery, 1 pulmonary embolism); mean age 48 ± 15 years, 6 male, 3 female. A GE 16-slice MDCT scan was performed, with acquisition from apex to diaphragm in seven seconds, at different PEEP values. Raw CT data were analysed on an Advantage workstation to obtain volume density masks and histograms of both lungs together and of each lung, applying the following density ranges: -1000 to -950 HU = hyperventilated lung; -900 to -650 HU = well-aerated lung; -950 to -500 HU = all aerated lung; -500 to +200 HU = lung tissue. Total and fractional lung volumes and Hounsfield units (HU) were calculated and compared at different PEEP values (0, 5, 10, 15 cm H2O). In four patients lung volumes were compared between the more and the less involved lung at increased PEEP. Statistical analysis: comparison of means-medians tests. Results: Data calculated at five PEEP showed an unexpected decrease of total lung volume and an increase of lung density (HU), with no proportional improvement of oxygenation. (orig.)

  6. Assembly mechanism of FCT region type 1 pili in serotype M6 Streptococcus pyogenes.

    Science.gov (United States)

    Nakata, Masanobu; Kimura, Keiji Richard; Sumitomo, Tomoko; Wada, Satoshi; Sugauchi, Akinari; Oiki, Eiji; Higashino, Miharu; Kreikemeyer, Bernd; Podbielski, Andreas; Okahashi, Nobuo; Hamada, Shigeyuki; Isoda, Ryutaro; Terao, Yutaka; Kawabata, Shigetada

    2011-10-28

    The human pathogen Streptococcus pyogenes produces diverse pili depending on the serotype. We investigated the assembly mechanism of FCT type 1 pili in a serotype M6 strain. The pili were found to be assembled from two precursor proteins, the backbone protein T6 and ancillary protein FctX, and anchored to the cell wall in a manner that requires both a housekeeping sortase enzyme (SrtA) and pilus-associated sortase enzyme (SrtB). SrtB is primarily required for efficient formation of the T6 and FctX complex and subsequent polymerization of T6, whereas proper anchoring of the pili to the cell wall is mainly mediated by SrtA. Because motifs essential for polymerization of pilus backbone proteins in other Gram-positive bacteria are not present in T6, we sought to identify the functional residues involved in this process. Our results showed that T6 encompasses the novel VAKS pilin motif conserved in streptococcal T6 homologues and that the lysine residue (Lys-175) within the motif and cell wall sorting signal of T6 are prerequisites for isopeptide linkage of T6 molecules. Because Lys-175 and the cell wall sorting signal of FctX are indispensable for substantial incorporation of FctX into the T6 pilus shaft, FctX is suggested to be located at the pilus tip, which was also implied by immunogold electron microscopy findings. Thus, the elaborate assembly of FCT type 1 pili is potentially organized by sortase-mediated cross-linking between sorting signals and the amino group of Lys-175 positioned in the VAKS motif of T6, thereby displaying T6 and FctX in a temporospatial manner.

  7. Assessment Report Sandia National Laboratories Fuel Cycle Technologies Quality Assurance Evaluation of FY15 SNL FCT M2 Milestone Deliverables

    International Nuclear Information System (INIS)

    Appel, Gordon John

    2016-01-01

    Sandia National Laboratories (SNL) Fuel Cycle Technologies (FCT) program activities are conducted in accordance with FCT Quality Assurance Program Document (FCT-QAPD) requirements. The FCT-QAPD interfaces with the SNL-approved Quality Assurance Program Description (SNL-QAPD) as explained in the Sandia National Laboratories QA Program Interface Document for FCT Activities (Interface Document). This plan describes SNL's FY16 assessment of the compliance of SNL's FY15 FCT M2 milestone deliverables with program QA requirements, including SNL R&A requirements. The assessment is intended to confirm that SNL's FY15 milestone deliverables contain the appropriate authenticated review documentation and that there is a copy marked with SNL R&A numbers.

  8. Assessment Report Sandia National Laboratories Fuel Cycle Technologies Quality Assurance Evaluation of FY15 SNL FCT M2 Milestone Deliverables

    Energy Technology Data Exchange (ETDEWEB)

    Appel, Gordon John [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-05-01

    Sandia National Laboratories (SNL) Fuel Cycle Technologies (FCT) program activities are conducted in accordance with FCT Quality Assurance Program Document (FCT-QAPD) requirements. The FCT-QAPD interfaces with the SNL-approved Quality Assurance Program Description (SNL-QAPD) as explained in the Sandia National Laboratories QA Program Interface Document for FCT Activities (Interface Document). This plan describes SNL's FY16 assessment of the compliance of SNL's FY15 FCT M2 milestone deliverables with program QA requirements, including SNL R&A requirements. The assessment is intended to confirm that SNL's FY15 milestone deliverables contain the appropriate authenticated review documentation and that there is a copy marked with SNL R&A numbers.

  9. FCT: a fully-distributed context-aware trust model for location based service recommendation

    Institute of Scientific and Technical Information of China (English)

    Zhiquan LIU; Jianfeng MA; Zhongyuan JIANG; Yinbin MIAO

    2017-01-01

    With the popularity of location based service (LBS), a vast number of trust models for LBS recommendation (LBSR) have been proposed. These trust models are centralized in essence, and the trusted third party may collude with malicious service providers or cause the single-point failure problem. This work improves the classic certified reputation (CR) model and proposes a novel fully-distributed context-aware trust (FCT) model for LBSR. Recommendation operations are conducted by service providers directly, and the trusted third party is no longer required in our FCT model. Besides, our FCT model also supports the movements of service providers due to its self-certified characteristic. Moreover, to ease the collusion attack and value imbalance attack, we comprehensively consider four kinds of factor weights, namely number, time decay, preference and context weights. Finally, a fully-distributed service recommendation scenario is deployed, and comprehensive experiments and analysis are conducted. The results indicate that our FCT model significantly outperforms the CR model in terms of robustness against the collusion attack and value imbalance attack, as well as in service recommendation performance: it improves the successful trading rates of honest service providers and reduces the risks of trading with malicious service providers.

  10. Magnetoacoustic heating and FCT-equilibria in the belt pinch

    International Nuclear Information System (INIS)

    Erckmann, V.

    1983-02-01

    In the HECTOR belt pinch a high-β plasma is produced by magnetic compression in a tokamak geometry. After compression the initial β value can be varied between 0.2 and 0.8. During 5 μs the plasma is further heated by a fast magnetoacoustic wave with a frequency near the first harmonic of the ion cyclotron frequency. For the first time the β value of a pinch plasma could be increased further, from 0.34 after compression to 0.46 at the end of the rf-heating cycle. By proper selection of the final β value the region of resonance absorption of the heating wave can be shifted. Strong heating (200 MW) has been observed in the cases where the resonance region is located in the centre of the plasma. In deuterium discharges an increase in ion temperature is observed during the heating process, whereas the electrons are energetically decoupled, showing no temperature increase. Strong plasma losses are found in the 200 MW range after the rf-heating process; the dominant mechanism is charge exchange collisions with neutral gas atoms. During rf heating and the subsequent cooling phase the magnetic flux is frozen due to the high conductivity of the plasma. The observed equilibria could be identified as flux-conserving tokamak (FCT) equilibria. Based on a two-dimensional code, the time evolution of the equilibria has been calculated. The q-profiles are time-independent; with increasing β the magnetic axis of the plasma is shifted towards the outer boundary of the torus, and finally the linear relation between β and β_pol, which is characteristic of low-β equilibria, is no longer valid. Thus for the first time the existence of FCT equilibria at high β has been demonstrated experimentally, together with a qualitative agreement with FCT theory. (orig./AH)

  11. Toxic heavy metal concentration in local and foreign brands of lipsticks in FCT, Abuja, Nigeria

    African Journals Online (AJOL)


    This study determined the toxic heavy metal concentration in local and foreign brands of lipsticks sold in FCT ... ten (10) local and ten (10) foreign brands for lead using flame atomic absorption ... flame atomic absorption spectrometric.

  12. Assessing Mand Topography Preference When Developing a Functional Communication Training Intervention.

    Science.gov (United States)

    Kunnavatana, S Shanun; Wolfe, Katie; Aguilar, Alexandra N

    2018-05-01

    Functional communication training (FCT) is a common function-based behavioral intervention used to decrease problem behavior by teaching an alternative communication response. Therapists often arbitrarily select the topography of the alternative response, which may influence long-term effectiveness of the intervention. Assessing individual mand topography preference may increase treatment effectiveness and promote self-determination in the development of interventions. This study sought to reduce arbitrary selection of FCT mand topography by determining preference during response training and acquisition for two adults with autism who had no functional communication skills. Both participants demonstrated a clear preference for one mand topography during choice probes, and the preferred topography was then reinforced during FCT to reduce problem behavior and increase independent communication. The implications of the results for future research on mand selection during FCT are discussed.

  13. Silicide induced surface defects in FePt nanoparticle fcc-to-fct thermally activated phase transition

    International Nuclear Information System (INIS)

    Chen, Shu; Lee, Stephen L.; André, Pascal

    2016-01-01

    Magnetic nanoparticles (MnPs) are relevant to a wide range of applications, including high-density information storage and magnetic resonance imaging, to name but a few. Among the materials available to prepare MnPs, FePt is attracting growing attention. However, to harvest the strongest magnetic properties of FePt MnPs, thermal annealing is often required to convert as-synthesized face-centered cubic particles into the tetragonal phase. Rarely addressed are the potential side effects of such treatments on the magnetic properties. In this study, we focus on the impact of the silica shells often used to overcome MnP coalescence during thermal annealing. While we show that this shell does prevent sintering, and that fcc-to-fct conversion does occur, we also reveal the formation of silicide, which can prevent the stronger magnetic properties of fct-FePt MnPs from being fully realised. This report therefore sheds light on poorly investigated and understood interfacial phenomena occurring during the thermal annealing of MnPs and, in doing so, also highlights the benefits of developing new strategies to avoid silicide formation.

  14. Silicide induced surface defects in FePt nanoparticle fcc-to-fct thermally activated phase transition

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Shu; Lee, Stephen L. [School of Physics and Astronomy, SUPA, University of St Andrews, St Andrews KY16 9SS (United Kingdom); André, Pascal, E-mail: pjpandre@riken.jp [School of Physics and Astronomy, SUPA, University of St Andrews, St Andrews KY16 9SS (United Kingdom); RIKEN, Wako 351-0198 (Japan); Department of Physics, CNRS-Ewha International Research Center (CERC), Ewha W. University, Seoul 120-750 (Korea, Republic of)

    2016-11-01

    Magnetic nanoparticles (MnPs) are relevant to a wide range of applications, including high-density information storage and magnetic resonance imaging, to name but a few. Among the materials available to prepare MnPs, FePt is attracting growing attention. However, to harvest the strongest magnetic properties of FePt MnPs, thermal annealing is often required to convert as-synthesized face-centered cubic particles into the tetragonal phase. Rarely addressed are the potential side effects of such treatments on the magnetic properties. In this study, we focus on the impact of the silica shells often used to overcome MnP coalescence during thermal annealing. While we show that this shell does prevent sintering, and that fcc-to-fct conversion does occur, we also reveal the formation of silicide, which can prevent the stronger magnetic properties of fct-FePt MnPs from being fully realised. This report therefore sheds light on poorly investigated and understood interfacial phenomena occurring during the thermal annealing of MnPs and, in doing so, also highlights the benefits of developing new strategies to avoid silicide formation.

  15. Functional Communication Training: A Contemporary Behavior Analytic Intervention for Problem Behaviors.

    Science.gov (United States)

    Durand, V. Mark; Merges, Eileen

    2001-01-01

    This article describes functional communication training (FCT) with students who have autism. FCT involves teaching alternative communication strategies to replace problem behaviors. The article reviews the conditions under which this intervention is successful and compares the method with other behavioral approaches. It concludes that functional…

  16. On the effectiveness of and preference for punishment and extinction components of function-based interventions.

    Science.gov (United States)

    Hanley, Gregory P; Piazza, Cathleen C; Fisher, Wayne W; Maglieri, Kristen A

    2005-01-01

    The current study describes an assessment sequence that may be used to identify individualized, effective, and preferred interventions for severe problem behavior in lieu of relying on a restricted set of treatment options that are assumed to be in the best interest of consumers. The relative effectiveness of functional communication training (FCT) with and without a punishment component was evaluated with 2 children for whom functional analyses demonstrated behavioral maintenance via social positive reinforcement. The results showed that FCT plus punishment was more effective than FCT in reducing problem behavior. Subsequently, participants' relative preference for each treatment was evaluated in a concurrent-chains arrangement, and both participants demonstrated a clear preference for FCT with punishment. These findings suggest that the treatment-selection process may be guided by person-centered and evidence-based values.

  17. New Method to Synthesize Highly Active and Durable Chemically Ordered fct-PtCo Cathode Catalyst for PEMFCs.

    Science.gov (United States)

    Jung, Won Suk; Popov, Branko N

    2017-07-19

    In the bottom-up synthesis strategy performed in this study, the Co-catalyzed pyrolysis of a chelate complex and activated carbon black at high temperatures triggers a graphitization reaction that introduces Co particles into the N-doped graphitic carbon matrix and immobilizes N-modified active sites for the oxygen reduction reaction (ORR) on the carbon surface. The Co particles encapsulated within the N-doped graphitic carbon shell diffuse up to the Pt surface under the polymer protective layer and form a chemically ordered face-centered tetragonal (fct) PtCo/CCCS catalyst, as evidenced by structural and compositional studies. The fct-structured PtCo/CCCS at low Pt loading (0.1 mg Pt cm⁻²) shows 6% higher power density than the state-of-the-art commercial Pt/C catalyst. After an MEA durability test of 30 000 potential cycles, the performance loss of the catalyst is negligible. The electrochemical surface area loss is less than 40%, while that of commercial Pt/C is nearly 80%. After the accelerated stress test, the uniform catalyst distribution is retained and the mean particle size increases by approximately 1 nm. The results obtained in this study indicate that the highly stable compositional and structural properties of the chemically ordered PtCo/CCCS catalyst contribute to its exceptional durability.

  18. Solitary pulmonary nodules: impact of functional CT on the cost-effectiveness of FDG-PET

    International Nuclear Information System (INIS)

    Miles, K.A.; Keith, C.J.; Wong, D.C.; Griffiths, M.R.

    2002-01-01

    Full text: FDG-PET has been shown to be cost-effective for the evaluation of solitary pulmonary nodules (SPNs) in Australia. This study evaluates the impact on cost-effectiveness of incorporating a novel CT technique, functional CT, into diagnostic algorithms for the characterisation of SPNs. Four diagnostic strategies were evaluated using decision-tree sensitivity analysis. The first strategy comprised patients undergoing conventional CT alone (CT). The second comprised conventional CT followed by a functional CT study (FCT) when the SPN was not benign on conventional CT. The third comprised conventional CT which, if positive, was followed by FDG-PET (PET); in the fourth strategy, patients with a positive conventional CT underwent functional CT and, if this was also positive, FDG-PET (FCT+PET). Values for disease prevalence and the diagnostic accuracy of PET, CT and functional CT were obtained from a literature review, using Australian values where available. Procedure costs were derived from the Medicare Benefits Schedule and DRG Cost Weights for Australian public hospitals. The cost per patient, accuracy and Incremental Cost-Accuracy Ratio (ICAR) were determined for each strategy. Sensitivity analysis evaluated the effect of disease prevalence on cost-effectiveness. Results: At the prevalence of malignancy reported in Australian series (54%), the FCT strategy incurs the least cost ($5560/patient), followed by FCT+PET ($5910/patient). The FCT+PET strategy is the most cost-effective, with an ICAR of $12059/patient, followed by the PET strategy with an ICAR of $12300/patient. At levels of disease prevalence below 54% these cost-effectiveness rankings are unchanged. At high levels of disease prevalence, CT or FCT is more cost-effective. At a typical prevalence of malignancy the cost-effectiveness of PET is enhanced by the addition of functional CT, but at high prevalence functional CT alone is most cost-effective.
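
The ICAR used to rank the strategies is a simple ratio of incremental cost to incremental accuracy. A minimal sketch, with placeholder figures rather than the study's actual inputs:

```python
def icar(cost, accuracy, base_cost, base_accuracy):
    """Incremental cost-accuracy ratio: extra dollars paid per unit of extra
    diagnostic accuracy, relative to a baseline strategy."""
    return (cost - base_cost) / (accuracy - base_accuracy)

# Placeholder comparison (invented numbers): a strategy costing $500 more per
# patient that adds 5 percentage points of accuracy.
extra_per_accuracy = icar(cost=6000.0, accuracy=0.90,
                          base_cost=5500.0, base_accuracy=0.85)
```

In a decision-tree sensitivity analysis, this ratio is recomputed as prevalence varies, which is why the ranking of strategies can flip at high prevalence.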

  19. Trial-Based Functional Analysis and Functional Communication Training in an Early Childhood Setting

    Science.gov (United States)

    Lambert, Joseph M.; Bloom, Sarah E.; Irvin, Jennifer

    2012-01-01

    Problem behavior is common in early childhood special education classrooms. Functional communication training (FCT; Carr & Durand, 1985) may reduce problem behavior but requires identification of its function. The trial-based functional analysis (FA) is a method that can be used to identify problem behavior function in schools. We conducted…

  20. Resolution function in deep inelastic neutron scattering using the Foil Cycling Technique

    International Nuclear Information System (INIS)

    Pietropaolo, A.; Andreani, C.; Filabozzi, A.; Pace, E.; Senesi, R.

    2007-01-01

    New perspectives for epithermal neutron spectroscopy are being opened up by the development of the Resonance Detector (RD) and its use on inverse-geometry time-of-flight (TOF) spectrometers at spallation sources. The most recent result is the Foil Cycling Technique (FCT), which has been developed and applied on the VESUVIO spectrometer operating in the RD configuration. This technique has demonstrated its capability to improve the resolution function of the spectrometer and to provide an effective neutron and gamma background subtraction method. This paper reports a detailed analysis of the line shape of the resolution function in Deep Inelastic Neutron Scattering (DINS) measurements on the VESUVIO spectrometer, operating in the RD configuration and employing the FCT. The aim is to provide an analytical approximation for the analyzer energy transfer function, a useful tool for data analysis on VESUVIO. Simulated and experimental results of DINS measurements on a lead sample are compared. The line shape analysis shows that the most reliable analytical approximation of the energy transfer function is the sum of a Gaussian and a power of a Lorentzian. A comparison with the Double Difference Method (DDM) is also discussed. It is shown that the energy resolution improvement for the FCT and the DDM is almost the same, while the counting efficiency is a factor of about 1.4 higher for the FCT
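
The proposed approximation, a Gaussian plus a power of a Lorentzian, can be written down directly. The parameter names and default values below are assumptions for illustration; the fitted VESUVIO values are not reproduced here:

```python
import math

def resolution(E, E0=0.0, A_g=1.0, sigma=1.0, A_l=0.5, gamma=1.0, n=2.0):
    """Analytical line-shape model: Gaussian plus a power of a Lorentzian,
    both centred at E0 (energies in arbitrary units)."""
    gauss = A_g * math.exp(-0.5 * ((E - E0) / sigma) ** 2)
    lorentz_pow = A_l * (gamma ** 2 / ((E - E0) ** 2 + gamma ** 2)) ** n
    return gauss + lorentz_pow
```

Raising the Lorentzian to a power n > 1 suppresses its heavy tails while keeping a sharper core than the Gaussian alone, which is the practical appeal of this composite form for fitting resolution functions.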

  1. 18 December 2012 - Portuguese President of FCT M. Seabra visiting the Computing Centre with IT Department Head F. Hemmer, ATLAS experimental area with Collaboration Spokesperson F. Gianotti and A. Henriques Correia, in the LHC tunnel at Point 2 and CMS experimental area with Deputy Spokesperson J. Varela, signing an administrative agreement with Director-General R. Heuer; LIP President J. M. Gago and Delegate to CERN Council G. Barreia present.

    CERN Multimedia

    Samuel Morier-Genoud

    2012-01-01

    18 December 2012 - Portuguese President of FCT M. Seabra visiting the Computing Centre with IT Department Head F. Hemmer, ATLAS experimental area with Collaboration Spokesperson F. Gianotti and A. Henriques Correia, in the LHC tunnel at Point 2 and CMS experimental area with Deputy Spokesperson J. Varela, signing an administrative agreement with Director-General R. Heuer; LIP President J. M. Gago and Delegate to CERN Council G. Barreia present.

  2. Discrete-Trial Functional Analysis and Functional Communication Training with Three Individuals with Autism and Severe Problem Behavior

    Science.gov (United States)

    Schmidt, Jonathan D.; Drasgow, Erik; Halle, James W.; Martin, Christian A.; Bliss, Sacha A.

    2014-01-01

    Discrete-trial functional analysis (DTFA) is an experimental method for determining the variables maintaining problem behavior in the context of natural routines. Functional communication training (FCT) is an effective method for replacing problem behavior, once identified, with a functionally equivalent response. We implemented these procedures…

  3. Evaluating the Treatment Fidelity of Parents Who Conduct In-Home Functional Communication Training with Coaching via Telehealth

    Science.gov (United States)

    Suess, Alyssa N.; Romani, Patrick W.; Wacker, David P.; Dyson, Shannon M.; Kuhle, Jennifer L.; Lee, John F.; Lindgren, Scott D.; Kopelman, Todd G.; Pelzel, Kelly E.; Waldron, Debra B.

    2014-01-01

    We conducted a retrospective, descriptive evaluation of the fidelity with which parents of three children with autism spectrum disorders conducted functional communication training (FCT) in their homes. All training was provided to the parents via telehealth by a behavior consultant in a tertiary-level hospital setting. FCT trials coached by the…

  4. Functional Communication Training

    Science.gov (United States)

    Durand, V. Mark; Moskowitz, Lauren

    2015-01-01

    Thirty years ago, the first experimental demonstration was published showing that educators could improve significant challenging behavior in children with disabilities by replacing these behaviors with forms of communication that served the same purpose, a procedure called functional communication training (FCT). Since the publication of that…

  5. An analysis of functional communication training as an empirically supported treatment for problem behavior displayed by individuals with intellectual disabilities.

    Science.gov (United States)

    Kurtz, Patricia F; Boelter, Eric W; Jarmolowicz, David P; Chin, Michelle D; Hagopian, Louis P

    2011-01-01

    This paper examines the literature on the use of functional communication training (FCT) as a treatment for problem behavior displayed by individuals with intellectual disabilities (ID). Criteria for empirically supported treatments developed by Divisions 12 and 16 of the American Psychological Association (Kratochwill & Stoiber, 2002; Task Force, 1995) and adapted by Jennett and Hagopian (2008) for evaluation of single-case research studies were used to examine the support for FCT. Results indicated that FCT far exceeds criteria to be designated as a well-established treatment for problem behavior exhibited by children with ID and children with autism spectrum disorder, and can be characterized as probably efficacious with adults. Copyright © 2011 Elsevier Ltd. All rights reserved.

  6. Flux-corrected transport principles, algorithms, and applications

    CERN Document Server

    Löhner, Rainald; Turek, Stefan

    2012-01-01

    Many modern high-resolution schemes for Computational Fluid Dynamics trace their origins to the Flux-Corrected Transport (FCT) paradigm. FCT maintains monotonicity using a nonoscillatory low-order scheme to determine the bounds for a constrained high-order approximation. This book begins with historical notes by J.P. Boris and D.L. Book who invented FCT in the early 1970s. The chapters that follow describe the design of fully multidimensional FCT algorithms for structured and unstructured grids, limiting for systems of conservation laws, and the use of FCT as an implicit subgrid scale model. The second edition presents 200 pages of additional material. The main highlights of the three new chapters include: FCT-constrained interpolation for Arbitrary Lagrangian-Eulerian methods, an optimization-based approach to flux correction, and FCT simulations of high-speed flows on overset grids. Addressing students and researchers, as well as CFD practitioners, the book is focused on computational aspects and contains m...
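
The FCT idea the book describes, a monotone low-order flux plus a limited antidiffusive correction, can be sketched in one dimension for linear advection. This follows the classic Boris-Book construction with a Lax-Wendroff high-order flux; it is an illustrative sketch under those assumptions, not code from the book:

```python
import numpy as np

def fct_advect(u, c):
    """One step of 1D periodic linear advection (Courant number 0 < c <= 1)
    using Boris-Book flux-corrected transport."""
    # Low-order (upwind) and high-order (Lax-Wendroff) fluxes at i+1/2,
    # pre-multiplied by dt/dx so they carry the units of u.
    f_lo = c * u
    f_hi = c * (u + 0.5 * (1.0 - c) * (np.roll(u, -1) - u))
    ad = f_hi - f_lo                              # antidiffusive flux

    # Transported-diffused (monotone, low-order) solution.
    utd = u - (f_lo - np.roll(f_lo, 1))

    # Boris-Book limiter: accept only as much antidiffusion as the neighbouring
    # gradients of utd allow, so no new extrema are created.
    d_right = np.roll(utd, -2) - np.roll(utd, -1)   # utd[i+2] - utd[i+1]
    d_left = utd - np.roll(utd, 1)                  # utd[i]   - utd[i-1]
    s = np.sign(ad)
    ad_c = s * np.maximum(
        0.0, np.minimum.reduce([s * d_right, np.abs(ad), s * d_left])
    )

    return utd - (ad_c - np.roll(ad_c, 1))
```

Because the update stays in flux form, the scheme conserves the total of u exactly, and the limiter keeps a square wave free of the overshoots a pure high-order scheme would produce.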

  7. Two dimensional numerical simulation of gas discharges: comparison between particle-in-cell and FCT techniques

    Energy Technology Data Exchange (ETDEWEB)

    Soria-Hoyo, C; Castellanos, A [Departamento de Electronica y Electromagnetismo, Facultad de Fisica, Universidad de Sevilla, Avda. Reina Mercedes s/n, 41012 Sevilla (Spain); Pontiga, F [Departamento de Fisica Aplicada II, EUAT, Universidad de Sevilla, Avda. Reina Mercedes s/n, 41012 Sevilla (Spain)], E-mail: cshoyo@us.es

    2008-10-21

    Two different numerical techniques have been applied to the numerical integration of equations modelling gas discharges: a finite-difference flux-corrected transport (FD-FCT) technique and a particle-in-cell (PIC) technique. The PIC technique implemented here has been specifically designed for the simulation of 2D electrical discharges using cylindrical coordinates. The development and propagation of a streamer between two parallel electrodes has been used as a convenient test to compare the performance of both techniques. In particular, the phase velocity of the cathode-directed streamer has been used to check the internal consistency of the numerical simulations. The results obtained from the two techniques are in reasonable agreement with each other, and both techniques have proved their ability to follow the high gradients of charge density and electric field present in this type of problem. Moreover, the streamer velocities predicted by the simulation are in accordance with typical experimental values.

  8. Two dimensional numerical simulation of gas discharges: comparison between particle-in-cell and FCT techniques

    International Nuclear Information System (INIS)

    Soria-Hoyo, C; Castellanos, A; Pontiga, F

    2008-01-01

    Two different numerical techniques have been applied to the numerical integration of equations modelling gas discharges: a finite-difference flux-corrected transport (FD-FCT) technique and a particle-in-cell (PIC) technique. The PIC technique implemented here has been specifically designed for the simulation of 2D electrical discharges using cylindrical coordinates. The development and propagation of a streamer between two parallel electrodes has been used as a convenient test to compare the performance of both techniques. In particular, the phase velocity of the cathode-directed streamer has been used to check the internal consistency of the numerical simulations. The results obtained from the two techniques are in reasonable agreement with each other, and both techniques have proved their ability to follow the high gradients of charge density and electric field present in this type of problem. Moreover, the streamer velocities predicted by the simulation are in accordance with typical experimental values.

  9. HCP to FCT + precipitate transformations in lamellar gamma-titanium aluminide alloys

    Science.gov (United States)

    Karadge, Mallikarjun Baburao

    Fully lamellar gamma-TiAl [alpha2(HCP) + gamma(FCT)] based alloys are potential structural materials for aerospace engine applications. Lamellar structure stabilization and additional strengthening mechanisms are major issues in the ongoing development of titanium aluminides due to the microstructural instability resulting from decomposition of the strengthening alpha2 phase. This work addresses characterization of multi-component TiAl systems to identify the mechanism of lamellar structure refinement and assess the effects of light element additions (C and Si) on creep deformation behavior. Transmission electron microscopy studies directly confirmed for the first time that fine lamellar structure is formed by the nucleation and growth of a large number of basal stacking faults on the 1/6 dislocations cross-slipping repeatedly into and out of basal planes. This lamellar structure can be tailored by modifying jog heights through chemistry and thermal processing. The alpha2 → gamma transformation during heating (investigated by differential scanning calorimetry and X-ray diffraction) is a two-step process involving the formation of a novel disordered FCC gamma' TiAl [with a(gamma') = c(gamma)] as an intermediate phase, followed by ordering. Addition of carbon and silicon induced Ti2AlC H-type carbide precipitation inside the alpha2 lath and Ti5(Al,Si)3 zeta-type silicide precipitation at the alpha2/gamma interface. The H-carbides preserve alpha2/gamma type interfaces, while zeta-silicide precipitates restrict ledge growth and interfacial sliding, enabling strong resistance to creep deformation.

  10. On the comparison of different multiphase flow kernels for gas pipeline real time advanced functions

    Energy Technology Data Exchange (ETDEWEB)

    Baptista, Renan Martins; Barbosa Figueiredo, Aline; Bodstein, Gustavo C. R. [Federal University of Rio de Janeiro - UFRJ, Rio de Janeiro, (Brazil)

    2010-07-01

    Two-fluid models identify and treat phases independently. These models could be useful for developing high-performance tools for leak detection, location and quantification. This paper reports the development of a simplified two-fluid model called SPM-4. Different computer methods were tested (Richtmyer, Force, FCT, TVD/LAX, Rusanov), from first-order centered schemes up to second-order characteristics-based schemes. A theoretical scenario was created based on field data. A typical two-phase gas pipeline was defined as a test scenario for comparing the simplified two-phase flow simulator based on SPM-4 with the latest version of OLGA, a commercial computerized flow simulator. The selected computer methods for SPM-4 were also compared with each other, using the OLGA 2PM-6 model as reference. The final results showed that Richtmyer and FCT are the most consistent methods in terms of accuracy and CPU performance when compared to the benchmark 2PM-6.

  11. Computational Methods and Function Theory

    CERN Document Server

    Saff, Edward; Salinas, Luis; Varga, Richard

    1990-01-01

    The volume is devoted to the interaction of modern scientific computation and classical function theory. Many problems in pure and applied function theory can be tackled using modern computing facilities: numerically as well as in the sense of computer algebra. On the other hand, computer algorithms are often based on complex function theory, and dedicated research on their theoretical foundations can lead to great enhancements in performance. The contributions - original research articles, a survey and a collection of problems - cover a broad range of such problems.

  12. Software For Computing Selected Functions

    Science.gov (United States)

    Grant, David C.

    1992-01-01

    Technical memorandum presents collection of software packages in Ada implementing mathematical functions used in science and engineering. Provides programmer with function support in Pascal and FORTRAN, plus support for extended-precision arithmetic and complex arithmetic. Valuable for testing new computers, writing computer code, or developing new computer integrated circuits.

  13. 76 FR 65197 - Statement of Organization, Functions, and Delegations of Authority

    Science.gov (United States)

    2011-10-20

    ... Information and Insurance Oversight (FCR) Office of Public Engagement (FCS) Office of Communications (FCT... for Medicaid and CHIP Services (CMCS), and (2) realign the governmental relations function from the.... In conjunction with the Office of Public Engagement, oversees all CMS interactions and collaboration...

  14. Functional programming for computer vision

    Science.gov (United States)

    Breuel, Thomas M.

    1992-04-01

    Functional programming is a style of programming that avoids the use of side effects (like assignment) and uses functions as first class data objects. Compared with imperative programs, functional programs can be parallelized better, and provide better encapsulation, type checking, and abstractions. This is important for building and integrating large vision software systems. In the past, efficiency has been an obstacle to the application of functional programming techniques in computationally intensive areas such as computer vision. We discuss and evaluate several 'functional' data structures for efficiently representing data structures and objects common in computer vision. In particular, we will address: automatic storage allocation and reclamation issues; abstraction of control structures; efficient sequential update of large data structures; representing images as functions; and object-oriented programming. Our experience suggests that functional techniques are feasible for high-performance vision systems, and that a functional approach greatly simplifies the implementation and integration of vision systems. Examples in C++ and SML are given.
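
    The "representing images as functions" idea mentioned above is simple to demonstrate. In the following illustrative Python sketch (ours; the paper's examples are in C++ and SML), an image is a function from coordinates to intensity, so geometric transforms compose lazily with no pixel buffers to copy or mutate:

```python
# An "image" is just a function (x, y) -> intensity; geometric
# transforms are then ordinary higher-order functions.
def shift(image, dx, dy):
    return lambda x, y: image(x - dx, y - dy)

def flip_x(image):
    return lambda x, y: image(-x, y)

ramp = lambda x, y: x + y          # a synthetic test image
moved = shift(flip_x(ramp), 3, 0)  # transforms compose without side effects
```

    Sampling `moved` evaluates the whole transform chain on demand; nothing is rasterized until a pixel value is actually needed.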

  15. Normal Functions As A New Way Of Defining Computable Functions

    Directory of Open Access Journals (Sweden)

    Leszek Dubiel

    2004-01-01

    Full Text Available This report sets out a new method of defining computable functions. It formalizes traditional function descriptions and so allows functions to be defined in a very intuitive way. The discovery of the Ackermann function proved that not every function that can be easily computed can be as easily described in Hilbert's system of recursive functions. Normal functions lack this disadvantage.
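
    For context, the Ackermann function cited above is short to state and compute, yet grows too fast to be primitive recursive. A standard Python rendering (illustrative, not from the report):

```python
import sys
sys.setrecursionlimit(20000)  # headroom for the deeply nested calls

def ackermann(m, n):
    """Total and computable, but not primitive recursive."""
    if m == 0:
        return n + 1
    if n == 0:
        return ackermann(m - 1, 1)
    return ackermann(m - 1, ackermann(m, n - 1))
```

    Even tiny arguments explode: ackermann(4, 2) already has 19,729 decimal digits.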

  16. Normal Functions as a New Way of Defining Computable Functions

    Directory of Open Access Journals (Sweden)

    Leszek Dubiel

    2004-01-01

    Full Text Available This report sets out a new method of defining computable functions. It formalizes traditional function descriptions and so allows functions to be defined in a very intuitive way. The discovery of the Ackermann function proved that not every function that can be easily computed can be as easily described in Hilbert's system of recursive functions. Normal functions lack this disadvantage.

  17. Indirect Effects of Functional Communication Training on Non-Targeted Disruptive Behavior

    Science.gov (United States)

    Schieltz, Kelly M.; Wacker, David P.; Harding, Jay W.; Berg, Wendy K.; Lee, John F.; Padilla Dalmau, Yaniz C.; Mews, Jayme; Ibrahimovic, Muska

    2011-01-01

    The purpose of this study was to evaluate the effects of functional communication training (FCT) on the occurrence of non-targeted disruptive behavior. The 10 participants were preschool-aged children with developmental disabilities who engaged in both destructive (property destruction, aggression, self-injury) and disruptive (hand flapping,…

  18. Using multiple schedules during functional communication training to promote rapid transfer of treatment effects.

    Science.gov (United States)

    Fisher, Wayne W; Greer, Brian D; Fuhrman, Ashley M; Querim, Angie C

    2015-12-01

    Multiple schedules with signaled periods of reinforcement and extinction have been used to thin reinforcement schedules during functional communication training (FCT) to make the intervention more practical for parents and teachers. We evaluated whether these signals would also facilitate rapid transfer of treatment effects across settings and therapists. With 2 children, we conducted FCT in the context of mixed (baseline) and multiple (treatment) schedules introduced across settings or therapists using a multiple baseline design. Results indicated that when the multiple schedules were introduced, the functional communication response came under rapid discriminative control, and problem behavior remained at near-zero rates. We extended these findings with another individual by using a more traditional baseline in which problem behavior produced reinforcement. Results replicated those of the previous participants and showed rapid reductions in problem behavior when multiple schedules were implemented across settings. © Society for the Experimental Analysis of Behavior.

  19. Structure of BRS-invariant local functionals

    International Nuclear Information System (INIS)

    Brandt, F.

    1993-01-01

    For a large class of gauge theories a nilpotent BRS-operator s is constructed and its cohomology in the space of local functionals of the off-shell fields is shown to be isomorphic to the cohomology of s̃ = s + d on functions f(C,T) of tensor fields T and of variables C which are constructed from the ghosts and the connection forms. The result allows general statements about the structure of invariant classical actions and anomaly candidates whose BRS-variation vanishes off-shell. The assumptions under which the result holds are thoroughly discussed. (orig.)

  20. Using Multiple Schedules during Functional Communication Training to Promote Rapid Transfer of Treatment Effects

    Science.gov (United States)

    Fisher, Wayne W.; Greer, Brian D.; Fuhrman, Ashley M.; Querim, Angie C.

    2015-01-01

    Multiple schedules with signaled periods of reinforcement and extinction have been used to thin reinforcement schedules during functional communication training (FCT) to make the intervention more practical for parents and teachers. We evaluated whether these signals would also facilitate rapid transfer of treatment effects across settings and…

  1. Functional Programming in Computer Science

    Energy Technology Data Exchange (ETDEWEB)

    Anderson, Loren James [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Davis, Marion Kei [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-01-19

    We explore functional programming through a 16-week internship at Los Alamos National Laboratory. Functional programming is a branch of computer science that has exploded in popularity over the past decade due to its high-level syntax, ease of parallelization, and abundant applications. First, we summarize functional programming by listing the advantages of functional programming languages over the usual imperative languages, and we introduce the concept of parsing. Second, we discuss the importance of lambda calculus in the theory of functional programming. Lambda calculus was invented by Alonzo Church in the 1930s to formalize the concept of effective computability, and every functional language is essentially some implementation of lambda calculus. Finally, we display the lasting products of the internship: additions to a compiler and runtime system for the pure functional language STG, including both a set of tests that indicate the validity of updates to the compiler and a compiler pass that checks for illegal instances of duplicate names.
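
    The claim that "every functional language is essentially some implementation of lambda calculus" can be made concrete with Church numerals, where a number n is simply the function that applies f n times. A small illustrative Python sketch (ours, not part of the internship work):

```python
# Church numerals: data encoded purely as functions.
zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))
plus = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

def to_int(n):
    """Decode a Church numeral by applying 'add one' to 0."""
    return n(lambda k: k + 1)(0)

two = succ(succ(zero))
three = succ(two)
```

    Arithmetic here is pure function composition; plus(two)(three) builds a function that applies f five times, with no mutable state anywhere.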

  2. Deterministic computation of functional integrals

    International Nuclear Information System (INIS)

    Lobanov, Yu.Yu.

    1995-09-01

    A new method of numerical integration in functional spaces is described. This method is based on the rigorous definition of a functional integral in complete separable metric space and on the use of approximation formulas which we constructed for this kind of integral. The method is applicable to solution of some partial differential equations and to calculation of various characteristics in quantum physics. No preliminary discretization of space and time is required in this method, and no simplifying assumptions like semi-classical or mean-field approximations, collective excitations, introduction of 'short-time' propagators, etc. are necessary in our approach. The constructed approximation formulas satisfy the condition of being exact on a given class of functionals, namely polynomial functionals of a given degree. The employment of these formulas replaces the evaluation of a functional integral by computation of an 'ordinary' (Riemannian) integral of low dimension, thus allowing the use of the more preferable deterministic algorithms (normally Gaussian quadratures) in computations rather than the traditional stochastic (Monte Carlo) methods which are commonly used for this kind of problem. The results of application of the method to computation of the Green function of the Schroedinger equation in imaginary time as well as the study of some models of Euclidean quantum mechanics are presented. The comparison with results of other authors shows that our method gives significant (by an order of magnitude) economy of computer time and memory versus other known methods while providing the results with the same or better accuracy. The functional measure of the Gaussian type is considered and some of its particular cases, namely conditional Wiener measure in quantum statistical mechanics and functional measure in a Schwartz distribution space in two-dimensional quantum field theory are studied in detail.
Numerical examples demonstrating the

  3. Automatic computation of transfer functions

    Science.gov (United States)

    Atcitty, Stanley; Watson, Luke Dale

    2015-04-14

    Technologies pertaining to the automatic computation of transfer functions for a physical system are described herein. The physical system is one of an electrical system, a mechanical system, an electromechanical system, an electrochemical system, or an electromagnetic system. A netlist in the form of a matrix comprises data that is indicative of elements in the physical system, values for the elements in the physical system, and structure of the physical system. Transfer functions for the physical system are computed based upon the netlist.
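
    The patent's netlist format is not reproduced in the abstract, so the following is only a minimal Python sketch of the general idea: assemble a nodal admittance matrix from an element list at a given frequency and solve for an output/input voltage ratio. The element-tuple format, the row-replacement trick for the forced input node, and the restriction to R and C elements are all our assumptions for illustration.

```python
def transfer_function(elements, in_node, out_node, omega):
    """Voltage transfer H(jw) = V(out_node)/V(in_node) by nodal analysis.
    elements: list of ('R'|'C', node_a, node_b, value); node 0 is ground.
    A unit voltage is forced at in_node by replacing its nodal equation."""
    s = 1j * omega
    nodes = {0}
    for _, a, b, _ in elements:
        nodes |= {a, b}
    idx = {node: i for i, node in enumerate(sorted(nodes - {0}))}
    size = len(idx)
    Y = [[0j] * size for _ in range(size)]
    rhs = [0j] * size
    for kind, a, b, val in elements:
        y = 1.0 / val if kind == 'R' else s * val   # element admittance
        for p, q in ((a, b), (b, a)):               # stamp both terminals
            if p != 0:
                i = idx[p]
                Y[i][i] += y
                if q != 0:
                    Y[i][idx[q]] -= y
    k = idx[in_node]                  # Dirichlet row: V(in_node) = 1
    Y[k] = [(1 + 0j) if j == k else 0j for j in range(size)]
    rhs[k] = 1 + 0j
    for col in range(size):           # Gaussian elimination, partial pivoting
        piv = max(range(col, size), key=lambda r: abs(Y[r][col]))
        Y[col], Y[piv] = Y[piv], Y[col]
        rhs[col], rhs[piv] = rhs[piv], rhs[col]
        for r in range(col + 1, size):
            f = Y[r][col] / Y[col][col]
            rhs[r] -= f * rhs[col]
            for c in range(col, size):
                Y[r][c] -= f * Y[col][c]
    v = [0j] * size                   # back substitution
    for r in range(size - 1, -1, -1):
        v[r] = (rhs[r] - sum(Y[r][c] * v[c]
                             for c in range(r + 1, size))) / Y[r][r]
    return v[idx[out_node]]
```

    For an RC low-pass (R = 1 kΩ into C = 1 µF to ground), evaluating at ω = 1/RC = 1000 rad/s gives |H| = 1/√2, the expected -3 dB point.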

  4. Functional requirements for gas characterization system computer software

    International Nuclear Information System (INIS)

    Tate, D.D.

    1996-01-01

    This document provides the Functional Requirements for the Computer Software operating the Gas Characterization System (GCS), which monitors the combustible gases in the vapor space of selected tanks. Necessary computer functions are defined to support design, testing, operation, and change control. The GCS requires several individual computers to address the control and data acquisition functions of instruments and sensors. These computers are networked for communication, and must multi-task to accommodate operation in parallel

  5. 21 CFR 870.1435 - Single-function, preprogrammed diagnostic computer.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Single-function, preprogrammed diagnostic computer... Single-function, preprogrammed diagnostic computer. (a) Identification. A single-function, preprogrammed diagnostic computer is a hard-wired computer that calculates a specific physiological or blood-flow parameter...

  6. The Long-Term Effects of Functional Communication Training Conducted in Young Children's Home Settings

    Science.gov (United States)

    Wacker, David P.; Schieltz, Kelly M.; Berg, Wendy K.; Harding, Jay W.; Padilla Dalmau, Yaniz C.; Lee, John F.

    2017-01-01

    This article describes the results of a series of studies that involved functional communication training (FCT) conducted in children's homes by their parents. The 103 children who participated were six years old or younger, had developmental delays, and engaged in destructive behaviors such as self-injury. The core procedures used in each study…

  7. RATGRAPH: Computer Graphing of Rational Functions.

    Science.gov (United States)

    Minch, Bradley A.

    1987-01-01

    Presents an easy-to-use Applesoft BASIC program that graphs rational functions and any asymptotes that the functions might have. Discusses the nature of rational functions, graphing them manually, employing a computer to graph rational functions, and describes how the program works. (TW)
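
    The asymptote-finding part of such a program reduces to comparing polynomial degrees: vertical asymptotes sit at real roots of the denominator (assuming no common factor with the numerator), while the end-behavior asymptote comes from the first terms of the polynomial quotient. A hypothetical Python sketch (the article's program is Applesoft BASIC; this helper and its names are ours):

```python
def linear_asymptote(p, q):
    """End-behavior asymptote of the rational function p(x)/q(x).
    p, q: coefficient lists, highest power first.
    Returns (a, b) meaning y = a*x + b, or None when the quotient's
    degree exceeds 1 (no linear asymptote)."""
    dp, dq = len(p) - 1, len(q) - 1
    if dp < dq:
        return (0.0, 0.0)              # horizontal asymptote y = 0
    if dp == dq:
        return (0.0, p[0] / q[0])      # y = ratio of leading coefficients
    if dp == dq + 1:                   # oblique: first two quotient terms
        a = p[0] / q[0]
        q1 = q[1] if dq >= 1 else 0.0
        b = (p[1] - a * q1) / q[0]
        return (a, b)
    return None
```

    For example, (x² + 1)/x has the oblique asymptote y = x, while (2x + 3)/(x - 1) levels off at y = 2.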

  8. A summary of numerical computation for special functions

    International Nuclear Information System (INIS)

    Zhang Shanjie

    1992-01-01

    In the paper, special functions frequently encountered in science and engineering calculations are introduced. The computation of the values of Bessel functions and elliptic integrals is taken as an example, and some common algorithms for computing most special functions, such as series expansion for small argument, asymptotic approximations for large argument, polynomial approximations, recurrence formulas and iteration methods, are discussed. In addition, the determination of zeros of some special functions, and other questions related to numerical computation, are also discussed
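
    The small-argument/large-argument split mentioned above can be sketched for the Bessel function J0 (illustrative Python, not from the paper): the power series converges quickly for small |x|, while the leading asymptotic form takes over for large x.

```python
import math

def j0_series(x):
    """J0(x) by its power series; each term is the previous one
    times -x^2/(4 k^2).  Accurate for small to moderate |x|."""
    term, total = 1.0, 1.0
    for k in range(1, 40):
        term *= -(x * x) / (4.0 * k * k)
        total += term
    return total

def j0_asymptotic(x):
    """Leading large-argument form: J0(x) ~ sqrt(2/(pi x)) cos(x - pi/4),
    with relative error of order 1/(8x)."""
    return math.sqrt(2.0 / (math.pi * x)) * math.cos(x - math.pi / 4.0)
```

    Around x ≈ 10 the two regimes already agree to a few parts in a thousand, which is the kind of crossover check such algorithms rely on.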

  9. Computing the zeros of analytic functions

    CERN Document Server

    Kravanja, Peter

    2000-01-01

    Computing all the zeros of an analytic function and their respective multiplicities, locating clusters of zeros of analytic functions, computing zeros and poles of meromorphic functions, and solving systems of analytic equations are problems in computational complex analysis that lead to a rich blend of mathematics and numerical analysis. This book treats these four problems in a unified way. It contains not only theoretical results (based on formal orthogonal polynomials or rational interpolation) but also numerical analysis and algorithmic aspects, implementation heuristics, and polished software (the package ZEAL) that is available via the CPC Program Library. Graduate students and researchers in numerical mathematics will find this book very readable.
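
    The starting point for such methods is the argument principle: the number of zeros of f inside a contour is (1/2πi) ∮ f'(z)/f(z) dz. A minimal Python sketch of contour-based zero counting (a background illustration, not the ZEAL algorithm):

```python
import cmath

def count_zeros(f, df, center, radius, n=2000):
    """Number of zeros of f inside |z - center| = radius, counted with
    multiplicity, via the argument principle.  Uses the trapezoidal rule
    on the circle (spectrally accurate for periodic integrands); assumes
    no zeros lie on the contour itself."""
    total = 0j
    for k in range(n):
        t = 2.0 * cmath.pi * k / n
        z = center + radius * cmath.exp(1j * t)
        dz = 1j * radius * cmath.exp(1j * t) * (2.0 * cmath.pi / n)
        total += df(z) / f(z) * dz
    return round((total / (2j * cmath.pi)).real)
```

    Replacing the integrand by z^k f'(z)/f(z) yields the Newton sums of the zeros, which is where the formal-orthogonal-polynomial machinery the book describes takes over.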

  10. Dynamics and computation in functional shifts

    Science.gov (United States)

    Namikawa, Jun; Hashimoto, Takashi

    2004-07-01

    We introduce a new type of shift dynamics as an extended model of symbolic dynamics, and investigate the characteristics of shift spaces from the viewpoints of both dynamics and computation. This shift dynamics is called a functional shift, which is defined by a set of bi-infinite sequences of some functions on a set of symbols. To analyse the complexity of functional shifts, we measure them in terms of topological entropy, and locate their languages in the Chomsky hierarchy. Through this study, we argue that considering functional shifts from the viewpoints of both dynamics and computation gives us opposite results about the complexity of systems. We also describe a new class of shift spaces whose languages are not recursively enumerable.

  11. BLUES function method in computational physics

    Science.gov (United States)

    Indekeu, Joseph O.; Müller-Nedebock, Kristian K.

    2018-04-01

    We introduce a computational method in physics that goes ‘beyond linear use of equation superposition’ (BLUES). A BLUES function is defined as a solution of a nonlinear differential equation (DE) with a delta source that is at the same time a Green’s function for a related linear DE. For an arbitrary source, the BLUES function can be used to construct an exact solution to the nonlinear DE with a different, but related source. Alternatively, the BLUES function can be used to construct an approximate piecewise analytical solution to the nonlinear DE with an arbitrary source. For this alternative use the related linear DE need not be known. The method is illustrated in a few examples using analytical calculations and numerical computations. Areas for further applications are suggested.

  12. Flux-corrected transport principles, algorithms, and applications

    CERN Document Server

    Kuzmin, Dmitri; Turek, Stefan

    2005-01-01

    Addressing students and researchers as well as CFD practitioners, this book describes the state of the art in the development of high-resolution schemes based on the Flux-Corrected Transport (FCT) paradigm. Intended for readers who have a solid background in Computational Fluid Dynamics, the book begins with historical notes by J.P. Boris and D.L. Book. Review articles that follow describe recent advances in the design of FCT algorithms as well as various algorithmic aspects. The topics addressed in the book and its main highlights include: the derivation and analysis of classical FCT schemes with special emphasis on the underlying physical and mathematical constraints; flux limiting for hyperbolic systems; generalization of FCT to implicit time-stepping and finite element discretizations on unstructured meshes and its role as a subgrid scale model for Monotonically Integrated Large Eddy Simulation (MILES) of turbulent flows. The proposed enhancements of the FCT methodology also comprise the prelimiting and '...

  13. On The Effectiveness Of And Preference For Punishment And Extinction Components Of Function-Based Interventions

    OpenAIRE

    Hanley, Gregory P; Piazza, Cathleen C; Fisher, Wayne W; Maglieri, Kristen A

    2005-01-01

    The current study describes an assessment sequence that may be used to identify individualized, effective, and preferred interventions for severe problem behavior in lieu of relying on a restricted set of treatment options that are assumed to be in the best interest of consumers. The relative effectiveness of functional communication training (FCT) with and without a punishment component was evaluated with 2 children for whom functional analyses demonstrated behavioral maintenance via social ...

  14. Accurate computation of Mathieu functions

    CERN Document Server

    Bibby, Malcolm M

    2013-01-01

    This lecture presents a modern approach for the computation of Mathieu functions. These functions find application in boundary value analysis such as electromagnetic scattering from elliptic cylinders and flat strips, as well as the analogous acoustic and optical problems, and many other applications in science and engineering. The authors review the traditional approach used for these functions, show its limitations, and provide an alternative "tuned" approach enabling improved accuracy and convergence. The performance of this approach is investigated for a wide range of parameters and mach

  15. Single-Case Analysis to Determine Reasons for Failure of Behavioral Treatment via Telehealth

    Science.gov (United States)

    Schieltz, Kelly M.; Romani, Patrick W.; Wacker, David P.; Suess, Alyssa N.; Huang, Pei; Berg, Wendy K.; Lindgren, Scott D.; Kopelman, Todd G.

    2018-01-01

    Functional communication training (FCT) is a widely used and effective function-based treatment for problem behavior. The purpose of this article is to present two cases in which FCT was unsuccessful in reducing the occurrence of problem behavior displayed by two young children with an autism spectrum disorder. Both children received the same…

  16. Accurate and efficient computation of synchrotron radiation functions

    International Nuclear Information System (INIS)

    MacLeod, Allan J.

    2000-01-01

    We consider the computation of three functions which appear in the theory of synchrotron radiation. These are F(x) = x ∫_x^∞ K_{5/3}(y) dy, F_p(x) = x K_{2/3}(x) and G_p(x) = x^{1/3} K_{1/3}(x), where K_ν denotes a modified Bessel function. Chebyshev series coefficients are given which enable the functions to be computed with an accuracy of up to 15 significant figures
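
    For orientation, these functions can be evaluated directly from the standard integral representation K_ν(x) = ∫₀^∞ exp(-x cosh t) cosh(νt) dt; the Chebyshev expansions of the paper are far faster, but the integral gives a simple reference. An illustrative Python sketch (ours):

```python
import math

def bessel_k(nu, x, n=4000, t_max=12.0):
    """K_nu(x) for x > 0 from K_nu(x) = int_0^inf exp(-x cosh t) cosh(nu t) dt,
    by Simpson's rule on [0, t_max]; the integrand underflows to zero well
    before the truncation point for moderate x."""
    h = t_max / n
    def f(t):
        return math.exp(-x * math.cosh(t)) * math.cosh(nu * t)
    total = f(0.0) + f(t_max)
    for i in range(1, n):
        total += (4.0 if i % 2 else 2.0) * f(i * h)
    return total * h / 3.0

def synchrotron_fp(x):
    """F_p(x) = x K_{2/3}(x)."""
    return x * bessel_k(2.0 / 3.0, x)

def synchrotron_gp(x):
    """G_p(x) = x^(1/3) K_{1/3}(x)."""
    return x ** (1.0 / 3.0) * bessel_k(1.0 / 3.0, x)
```

    The outer integral in F(x) could be handled with one more quadrature over y; a convenient check on the kernel is the closed form K_{1/2}(x) = sqrt(pi/(2x)) e^{-x}.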

  17. Anxiety, Family Functioning and Neuroendocrine Biomarkers in Obese Children

    OpenAIRE

    Inês Pinto; Simon Wilkinson; Daniel Virella; Marta Alves; Conceição Calhau; Rui Coelho

    2017-01-01

    The project was supported by the Research Support Scheme of the FMUP/doctoral program, grant no PEst-OE/SAU/UI0038/2011 and by FCT, SFRH/SINTD/60115/2009, FSE-UE. Introduction: This observational study explores potential links between obese children's cortisol and parental mental state, family functioning, and the children's symptoms of anxiety and depression. Material and Methods: A non-random sample of 104 obese children (55 boys), mean age 10.9 years (standard deviation 1.76), was recr...

  18. Discrete Wigner functions and quantum computation

    International Nuclear Information System (INIS)

    Galvao, E.

    2005-01-01

    Full text: Gibbons et al. have recently defined a class of discrete Wigner functions W to represent quantum states in a Hilbert space of finite dimension d. I characterize the set C_d of states having non-negative W simultaneously in all definitions of W in this class. I then argue that states in this set behave classically in a well-defined computational sense. I show that one-qubit states in C_2 do not provide for universal computation in a recent model proposed by Bravyi and Kitaev [quant-ph/0403025]. More generally, I show that the only pure states in C_d are stabilizer states, which have an efficient description using the stabilizer formalism. This result shows that two different notions of 'classical' states coincide: states with non-negative Wigner functions are those which have an efficient description. This suggests that negativity of W may be necessary for exponential speed-up in pure-state quantum computation. (author)

  19. Computation of hyperspherical Bessel functions

    OpenAIRE

    Tram, Thomas

    2013-01-01

    In this paper we present a fast and accurate numerical algorithm for the computation of hyperspherical Bessel functions of large order and real arguments. For the hyperspherical Bessel functions of closed type, no stable algorithm existed so far due to the lack of a backwards recurrence. We solved this problem by establishing a relation to Gegenbauer polynomials. All our algorithms are written in C and are publicly available at Github [https://github.com/lesgourg/class_public]. A Python wrapp...
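
    The Gegenbauer connection exploited above rests on the polynomials' stable three-term recurrence, n C_n^α(x) = 2x(n + α - 1) C_{n-1}^α(x) - (n + 2α - 2) C_{n-2}^α(x). A hypothetical Python sketch of that recurrence (the paper's actual implementation is the C code in the linked repository):

```python
def gegenbauer(n, alpha, x):
    """Gegenbauer (ultraspherical) polynomial C_n^alpha(x) by upward
    three-term recurrence; C_0 = 1, C_1 = 2*alpha*x."""
    if n == 0:
        return 1.0
    c_prev, c = 1.0, 2.0 * alpha * x
    for k in range(2, n + 1):
        c_prev, c = c, (2.0 * x * (k + alpha - 1.0) * c
                        - (k + 2.0 * alpha - 2.0) * c_prev) / k
    return c
```

    Setting alpha = 1 recovers the Chebyshev polynomials of the second kind, which gives an easy correctness check.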

  20. Quantum computing without wavefunctions: time-dependent density functional theory for universal quantum computation.

    Science.gov (United States)

    Tempel, David G; Aspuru-Guzik, Alán

    2012-01-01

    We prove that the theorems of TDDFT can be extended to a class of qubit Hamiltonians that are universal for quantum computation. The theorems of TDDFT applied to universal Hamiltonians imply that single-qubit expectation values can be used as the basic variables in quantum computation and information theory, rather than wavefunctions. From a practical standpoint this opens the possibility of approximating observables of interest in quantum computations directly in terms of single-qubit quantities (i.e. as density functionals). Additionally, we also demonstrate that TDDFT provides an exact prescription for simulating universal Hamiltonians with other universal Hamiltonians that have different, and possibly easier-to-realize two-qubit interactions. This establishes the foundations of TDDFT for quantum computation and opens the possibility of developing density functionals for use in quantum algorithms.

  1. Computer Forensic Function Testing: Media Preparation, Write Protection And Verification

    Directory of Open Access Journals (Sweden)

    Yinghua (David) Guo

    2010-06-01

    Full Text Available The growth in the computer forensic field has created a demand for new software (or increased functionality in existing software) and a means to verify that this software is truly forensic, i.e. capable of meeting the requirements of the trier of fact. In this work, we review our previous work: a function-oriented testing framework for validation and verification of computer forensic tools. This framework consists of three parts: function mapping, requirements specification and reference set development. Through function mapping, we give a scientific and systematic description of the fundamentals of the computer forensic discipline, i.e. what functions are needed in the computer forensic investigation process. We focus this paper on the functions of media preparation, write protection and verification. Specifically, we complete the function mapping of these functions and specify their requirements. Based on this work, future work can be conducted to develop corresponding reference sets to test any tools that possess these functions.

  2. Efficient and Flexible Computation of Many-Electron Wave Function Overlaps.

    Science.gov (United States)

    Plasser, Felix; Ruckenbauer, Matthias; Mai, Sebastian; Oppel, Markus; Marquetand, Philipp; González, Leticia

    2016-03-08

    A new algorithm for the computation of the overlap between many-electron wave functions is described. This algorithm allows for the extensive use of recurring intermediates and thus provides high computational efficiency. Because of the general formalism employed, overlaps can be computed for varying wave function types, molecular orbitals, basis sets, and molecular geometries. This paves the way for efficiently computing nonadiabatic interaction terms for dynamics simulations. In addition, other application areas can be envisaged, such as the comparison of wave functions constructed at different levels of theory. Aside from explaining the algorithm and evaluating the performance, a detailed analysis of the numerical stability of wave function overlaps is carried out, and strategies for overcoming potential severe pitfalls due to displaced atoms and truncated wave functions are presented.
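
    As background to the recurring-intermediates algorithm, the elementary building block of such overlaps is the determinant formula for two single Slater determinants built from non-orthogonal orbitals: ⟨Φ_A|Φ_B⟩ = det(C_A^T S_AO C_B), where S_AO is the atomic-orbital overlap matrix and C_A, C_B hold the occupied MO coefficients. A Python sketch of just this building block (ours; the paper's algorithm for many-determinant wave functions is considerably more involved):

```python
def det(mat):
    """Determinant of a small square matrix via Gaussian elimination
    with partial pivoting."""
    m = [row[:] for row in mat]
    n = len(m)
    d = 1.0
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(m[r][i]))
        if m[p][i] == 0.0:
            return 0.0
        if p != i:
            m[i], m[p] = m[p], m[i]
            d = -d
        d *= m[i][i]
        for r in range(i + 1, n):
            f = m[r][i] / m[i][i]
            for c in range(i, n):
                m[r][c] -= f * m[i][c]
    return d

def determinant_overlap(ca, cb, s_ao):
    """<Phi_A|Phi_B> for two single Slater determinants.
    ca, cb: occupied MO coefficients (rows = AOs, columns = MOs);
    s_ao: AO overlap matrix.  Overlap = det(ca^T s_ao cb)."""
    n_ao, n_occ = len(ca), len(ca[0])
    s_mo = [[sum(ca[mu][i] * s_ao[mu][nu] * cb[nu][j]
                 for mu in range(n_ao) for nu in range(n_ao))
             for j in range(n_occ)]
            for i in range(n_occ)]
    return det(s_mo)
```

    Any unitary rotation among the occupied orbitals leaves the overlap unchanged (the rotation's determinant is 1), which is a convenient sanity check.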

  3. Computation of the Complex Probability Function

    Energy Technology Data Exchange (ETDEWEB)

    Trainer, Amelia Jo [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Ledwith, Patrick John [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-22

    The complex probability function is important in many areas of physics, and many techniques have been developed in an attempt to compute it for some z quickly and efficiently. Most prominent are the methods that use Gauss-Hermite quadrature, which uses the roots of the nth degree Hermite polynomial and corresponding weights to approximate the complex probability function. This document serves as an overview and discussion of the use, shortcomings, and potential improvements to the Gauss-Hermite quadrature for the complex probability function.
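The quadrature being surveyed is compact enough to sketch. In the following hedged Python illustration (not the report's code), the Gauss-Hermite rule absorbs the exp(-t^2) weight in the defining integral of w(z), and the accuracy check against the known value on the imaginary axis, w(iy) = exp(y^2) erfc(y), is this sketch's own choice:

```python
import math
import numpy as np

def w_gauss_hermite(z, n=64):
    """Approximate the complex probability (Faddeeva) function
    w(z) = (i/pi) * Integral exp(-t^2) / (z - t) dt   (Im z > 0)
    by n-point Gauss-Hermite quadrature: the weight exp(-t^2) is
    absorbed into the rule, leaving a weighted sum over the roots
    x_k of the degree-n Hermite polynomial."""
    x, w = np.polynomial.hermite.hermgauss(n)
    return (1j / math.pi) * np.sum(w / (z - x))

# On the imaginary axis the exact value is w(iy) = exp(y^2) * erfc(y),
# a convenient check well away from the real axis.
approx = w_gauss_hermite(1j)
exact = math.e * math.erfc(1.0)
```

Accuracy collapses as Im z approaches 0, where the fixed nodes cannot resolve the near-pole of 1/(z - t); that regime is among the shortcomings such overviews discuss.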

  4. Computational network design from functional specifications

    KAUST Repository

    Peng, Chi Han; Yang, Yong Liang; Bao, Fan; Fink, Daniel; Yan, Dongming; Wonka, Peter; Mitra, Niloy J.

    2016-01-01

    Connectivity and layout of underlying networks largely determine agent behavior and usage in many environments. For example, transportation networks determine the flow of traffic in a neighborhood, whereas building floorplans determine the flow of people in a workspace. Designing such networks from scratch is challenging as even local network changes can have large global effects. We investigate how to computationally create networks starting from only high-level functional specifications.

  5. Computer assisted functional analysis. Computer gestuetzte funktionelle Analyse

    Energy Technology Data Exchange (ETDEWEB)

    Schmidt, H A.E.; Roesler, H

    1982-01-01

    The latest developments in computer-assisted functional analysis (CFA) in nuclear medicine are presented in about 250 papers of the 19th international annual meeting of the Society of Nuclear Medicine (Bern, September 1981). Apart from the mathematical and instrumental aspects of CFA, computerized emission tomography is given particular attention. Advances in nuclear medical diagnosis in the fields of radiopharmaceuticals, cardiology, angiology, neurology, ophthalmology, pulmonology, gastroenterology, nephrology, endocrinology, oncology and osteology are discussed.

  6. Inferring biological functions of guanylyl cyclases with computational methods

    KAUST Repository

    Alquraishi, May Majed; Meier, Stuart Kurt

    2013-01-01

    A number of studies have shown that functionally related genes are often co-expressed and that computational based co-expression analysis can be used to accurately identify functional relationships between genes and by inference, their encoded proteins. Here we describe how a computational based co-expression analysis can be used to link the function of a specific gene of interest to a defined cellular response. Using a worked example we demonstrate how this methodology is used to link the function of the Arabidopsis Wall-Associated Kinase-Like 10 gene, which encodes a functional guanylyl cyclase, to host responses to pathogens. © Springer Science+Business Media New York 2013.

  7. Inferring biological functions of guanylyl cyclases with computational methods

    KAUST Repository

    Alquraishi, May Majed

    2013-09-03

    A number of studies have shown that functionally related genes are often co-expressed and that computational based co-expression analysis can be used to accurately identify functional relationships between genes and by inference, their encoded proteins. Here we describe how a computational based co-expression analysis can be used to link the function of a specific gene of interest to a defined cellular response. Using a worked example we demonstrate how this methodology is used to link the function of the Arabidopsis Wall-Associated Kinase-Like 10 gene, which encodes a functional guanylyl cyclase, to host responses to pathogens. © Springer Science+Business Media New York 2013.

  8. Computational complexity of Boolean functions

    Energy Technology Data Exchange (ETDEWEB)

    Korshunov, Aleksei D [Sobolev Institute of Mathematics, Siberian Branch of the Russian Academy of Sciences, Novosibirsk (Russian Federation)

    2012-02-28

    Boolean functions are among the fundamental objects of discrete mathematics, especially in those of its subdisciplines which fall under mathematical logic and mathematical cybernetics. The language of Boolean functions is convenient for describing the operation of many discrete systems such as contact networks, Boolean circuits, branching programs, and some others. An important parameter of discrete systems of this kind is their complexity. This characteristic has been actively investigated starting from Shannon's works. There is a large body of scientific literature presenting many fundamental results. The purpose of this survey is to give an account of the main results over the last sixty years related to the complexity of computation (realization) of Boolean functions by contact networks, Boolean circuits, and Boolean circuits without branching. Bibliography: 165 titles.

  9. Function Package for Computing Quantum Resource Measures

    Science.gov (United States)

    Huang, Zhiming

    2018-05-01

    In this paper, we present a function package to calculate quantum resource measures and the dynamics of open systems. Our package includes common operators and operator lists, and frequently used functions for computing quantum entanglement, quantum correlation, quantum coherence, quantum Fisher information, and dynamics in noisy environments. We briefly explain the functions of the package and illustrate how to use it with several typical examples. We expect that this package will be a useful tool for future research and education.
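As a hedged illustration of one measure such a package typically provides (the package's actual function names are not given in the abstract), the l1-norm of coherence is just the summed magnitudes of the off-diagonal density-matrix elements:

```python
import numpy as np

def l1_coherence(rho):
    """l1-norm of coherence: sum of the absolute values of the
    off-diagonal elements of the density matrix rho."""
    return np.abs(rho).sum() - np.abs(np.diag(rho)).sum()

# |+> = (|0> + |1>)/sqrt(2) is maximally coherent for a qubit: C = 1.
plus = np.array([1.0, 1.0]) / np.sqrt(2)
rho_plus = np.outer(plus, plus.conj())

# The maximally mixed state carries no coherence: C = 0.
rho_mixed = np.eye(2) / 2
```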

  10. Spaceborne computer executive routine functional design specification. Volume 2: Computer executive design for space station/base

    Science.gov (United States)

    Kennedy, J. R.; Fitzpatrick, W. S.

    1971-01-01

    The computer executive functional system design concepts derived from study of the Space Station/Base are presented. Information Management System hardware configuration as directly influencing the executive design is reviewed. The hardware configuration and generic executive design requirements are considered in detail in a previous report (System Configuration and Executive Requirements Specifications for Reusable Shuttle and Space Station/Base, 9/25/70). This report defines basic system primitives and delineates processes and process control. Supervisor states are considered for describing basic multiprogramming and multiprocessing systems. A high-level computer executive including control of scheduling, allocation of resources, system interactions, and real-time supervisory functions is defined. The description is oriented to provide a baseline for a functional simulation of the computer executive system.

  11. Geometric optical transfer function and its computation method

    International Nuclear Information System (INIS)

    Wang Qi

    1992-01-01

    The geometric optical transfer function formula is derived after expounding some easily overlooked points, and the computation method is given using the zeroth-order Bessel function, numerical integration, and spline interpolation. The method helps ensure accuracy while saving computation

  12. Computer Games Functioning as Motivation Stimulants

    Science.gov (United States)

    Lin, Grace Hui Chin; Tsai, Tony Kung Wan; Chien, Paul Shih Chieh

    2011-01-01

    Numerous scholars have recommended that computer games can function as influential motivation stimulants for English learning, showing benefits as learning tools (Clarke and Dede, 2007; Dede, 2009; Klopfer and Squire, 2009; Liu and Chu, 2010; Mitchell, Dede & Dunleavy, 2009). This study aimed to further test and verify the above suggestion,…

  13. Computer-controlled mechanical lung model for application in pulmonary function studies

    NARCIS (Netherlands)

    A.F.M. Verbraak (Anton); J.E.W. Beneken; J.M. Bogaard (Jan); A. Versprille (Adrian)

    1995-01-01

    A computer-controlled mechanical lung model has been developed for testing lung function equipment, validation of computer programs and simulation of impaired pulmonary mechanics. The construction, function and some applications are described. The physical model is constructed from two

  14. Special software for computing the special functions of wave catastrophes

    Directory of Open Access Journals (Sweden)

    Andrey S. Kryukovsky

    2015-01-01

    Full Text Available The method of ordinary differential equations in the context of calculating the special functions of wave catastrophes is considered. Complementary numerical methods and algorithms are described. The paper shows approaches to accelerating such calculations using the capabilities of modern computing systems. Methods for calculating the special functions of wave catastrophes are considered in the framework of parallel computing and distributed systems. The paper covers the development process of special software for calculating special functions, and questions of portability, extensibility and interoperability.

  15. A large-scale evaluation of computational protein function prediction

    NARCIS (Netherlands)

    Radivojac, P.; Clark, W.T.; Oron, T.R.; Schnoes, A.M.; Wittkop, T.; Kourmpetis, Y.A.I.; Dijk, van A.D.J.; Friedberg, I.

    2013-01-01

    Automated annotation of protein function is challenging. As the number of sequenced genomes rapidly grows, the overwhelming majority of protein products can only be annotated computationally. If computational predictions are to be relied upon, it is crucial that the accuracy of these methods be

  16. Computational design of proteins with novel structure and functions

    International Nuclear Information System (INIS)

    Yang Wei; Lai Lu-Hua

    2016-01-01

    Computational design of proteins is a relatively new field, where scientists search the enormous sequence space for sequences that can fold into desired structure and perform desired functions. With the computational approach, proteins can be designed, for example, as regulators of biological processes, novel enzymes, or as biotherapeutics. These approaches not only provide valuable information for understanding of sequence–structure–function relations in proteins, but also hold promise for applications to protein engineering and biomedical research. In this review, we briefly introduce the rationale for computational protein design, then summarize the recent progress in this field, including de novo protein design, enzyme design, and design of protein–protein interactions. Challenges and future prospects of this field are also discussed. (topical review)

  17. Computer Software Configuration Item-Specific Flight Software Image Transfer Script Generator

    Science.gov (United States)

    Bolen, Kenny; Greenlaw, Ronald

    2010-01-01

    A K-shell UNIX script enables the International Space Station (ISS) Flight Control Team (FCT) operators in NASA's Mission Control Center (MCC) in Houston to transfer an entire or partial computer software configuration item (CSCI) from a flight software compact disk (CD) to the onboard Portable Computer System (PCS). The tool is designed to read the content stored on a flight software CD and generate individual CSCI transfer scripts that are capable of transferring the flight software content in a given subdirectory on the CD to the scratch directory on the PCS. The flight control team can then transfer the flight software from the PCS scratch directory to the Electronically Erasable Programmable Read Only Memory (EEPROM) of an ISS Multiplexer/Demultiplexer (MDM) via the Indirect File Transfer capability. The individual CSCI scripts and the CSCI-Specific Flight Software Image Transfer Script Generator (CFITSG), when executed a second time, will remove all components from their original execution. The tool will identify errors in the transfer process and create logs of the transferred software for the purposes of configuration management.

  18. Computer network defense through radial wave functions

    Science.gov (United States)

    Malloy, Ian J.

    The purpose of this research is to synthesize basic and fundamental findings in quantum computing, as applied to the attack and defense of conventional computer networks. The concept focuses on uses of radio waves as a shield for, and attack against, traditional computers. A logic bomb is analogous to a landmine in a computer network, and if one were to implement it as non-trivial mitigation, it would aid computer network defense. As has been seen in kinetic warfare, the use of landmines has been devastating to geopolitical regions in that they are severely difficult for a civilian to avoid triggering given the unknown position of a landmine. Thus, the importance of understanding a logic bomb is relevant and has corollaries to quantum mechanics as well. The research synthesizes quantum logic phase shifts in certain respects using the Dynamic Data Exchange protocol in software written for this work, as well as a C-NOT gate applied to a virtual quantum circuit environment by implementing a Quantum Fourier Transform. The research focus applies the principles of coherence and entanglement from quantum physics, the concept of expert systems in artificial intelligence, principles of prime-number-based cryptography with trapdoor functions, and modeling radio wave propagation against an event from unknown parameters. This comes as a program relying on the artificial intelligence concept of an expert system in conjunction with trigger events for a trapdoor function relying on infinite recursion, as well as system mechanics for elliptic curve cryptography along orbital angular momenta. Here trapdoor denotes both the form of cipher and the implied relationship to logic bombs.

  19. FCJ-131 Pervasive Computing and Prosopopoietic Modelling – Notes on computed function and creative action

    Directory of Open Access Journals (Sweden)

    Anders Michelsen

    2011-12-01

    Full Text Available This article treats the philosophical underpinnings of the notions of ubiquity and pervasive computing from a historical perspective. The current focus on these notions reflects the ever increasing impact of new media and the underlying complexity of computed function in the broad sense of ICT that have spread vertiginously since Mark Weiser coined the term 'pervasive', e.g., digitalised sensoring, monitoring, effectuation, intelligence, and display. Whereas Weiser's original perspective may seem fulfilled since computing is everywhere, in his and Seely Brown's (1997) terms, 'invisible', on the horizon, 'calm', it also points to a much more important and slightly different perspective: that of creative action upon novel forms of artifice. Most importantly for this article, ubiquity and pervasive computing are seen to point to the continuous existence throughout the computational heritage since the mid-20th century of a paradoxical distinction/complicity between the technical organisation of computed function and the human Being, in the sense of creative action upon such function. This paradoxical distinction/complicity promotes a chiastic (Merleau-Ponty) relationship of extension of one into the other. It also indicates a generative creation that itself points to important issues of ontology with methodological implications for the design of computing. In this article these implications will be conceptualised as prosopopoietic modeling on the basis of Bernward Joerges' introduction of the classical rhetorical term 'prosopopoeia' into the debate on large technological systems. First, the paper introduces the paradoxical distinction/complicity by debating Gilbert Simondon's notion of a 'margin of indeterminacy' vis-a-vis computing. Second, it debates the idea of prosopopoietic modeling, pointing to a principal role of the paradoxical distinction/complicity within the computational heritage in three cases: a. Prosopopoietic

  20. Function Follows Performance in Evolutionary Computational Processing

    DEFF Research Database (Denmark)

    Pasold, Anke; Foged, Isak Worre

    2011-01-01

    As the title ‘Function Follows Performance in Evolutionary Computational Processing’ suggests, this paper explores the potentials of employing multiple design and evaluation criteria within one processing model in order to account for a number of performative parameters desired within varied...

  1. Positive Wigner functions render classical simulation of quantum computation efficient.

    Science.gov (United States)

    Mari, A; Eisert, J

    2012-12-07

    We show that quantum circuits where the initial state and all the following quantum operations can be represented by positive Wigner functions can be classically efficiently simulated. This is true both for continuous-variable as well as discrete variable systems in odd prime dimensions, two cases which will be treated on entirely the same footing. Noting the fact that Clifford and Gaussian operations preserve the positivity of the Wigner function, our result generalizes the Gottesman-Knill theorem. Our algorithm provides a way of sampling from the output distribution of a computation or a simulation, including the efficient sampling from an approximate output distribution in the case of sampling imperfections for initial states, gates, or measurements. In this sense, this work highlights the role of the positive Wigner function as separating classically efficiently simulable systems from those that are potentially universal for quantum computing and simulation, and it emphasizes the role of negativity of the Wigner function as a computational resource.

  2. A computer program for the pointwise functions generation

    International Nuclear Information System (INIS)

    Caldeira, Alexandre D.

    1995-01-01

    A computer program that was developed with the objective of generating pointwise functions, by a combination of tabulated values and/or mathematical expressions, to be used as weighting functions for nuclear data is presented. This simple program can be an important tool for researchers involved in group constants generation. (author). 5 refs, 2 figs
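As a hedged sketch of the idea (names and numbers invented, not taken from the reported program), a pointwise weighting function can be built by interpolating a table onto a union grid and combining it with an analytic expression:

```python
import numpy as np

# Illustrative only: the table values and the 1/x "expression" are
# assumptions, not the data of the program described in the record.
table_x = np.array([1.0, 2.0, 5.0, 10.0])   # tabulated abscissas
table_y = np.array([0.5, 0.8, 1.1, 1.3])    # tabulated values

def expression(x):
    return 1.0 / x          # analytic factor, e.g. a 1/E weighting spectrum

# Union grid of the table points and a uniform mesh, then the pointwise
# combination (here: product) of the interpolated table and the expression.
grid = np.union1d(table_x, np.linspace(1.0, 10.0, 19))
weights = np.interp(grid, table_x, table_y) * expression(grid)
```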

  3. On computing special functions in marine engineering

    Science.gov (United States)

    Constantinescu, E.; Bogdan, M.

    2015-11-01

    Important modeling applications in marine engineering lead us to a special class of solutions of difficult differential equations with variable coefficients. In order to solve and implement such models (in wave theory, in acoustics, in hydrodynamics, in electromagnetic waves, but also in many other engineering fields), it is necessary to compute so-called special functions: Bessel functions, modified Bessel functions, spherical Bessel functions, and Hankel functions. The aim of this paper is to develop numerical solutions in Matlab for the above-mentioned special functions. Taking into account the main properties of Bessel and modified Bessel functions, we briefly present analytical solutions (where possible) in the form of series. In particular, the behavior of these special functions is studied using Matlab facilities: numerical solution and plotting. Finally, the behavior of the special functions is compared, and other directions for investigating properties of Bessel and spherical Bessel functions are pointed out. The asymptotic forms of Bessel functions and modified Bessel functions allow determination of important properties of these functions. The modified Bessel functions tend to look like decaying and growing exponentials.
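As a rough illustration of the paper's starting points, translated to Python rather than the paper's Matlab, two standard routes to the Bessel function J0 are the Maclaurin series and an integral representation:

```python
import math

def j0_series(x, terms=30):
    """Maclaurin series J0(x) = sum_k (-1)^k (x^2/4)^k / (k!)^2,
    accumulated with a term-to-term recurrence for stability."""
    q = x * x / 4.0
    term, total = 1.0, 1.0
    for k in range(1, terms):
        term *= -q / (k * k)
        total += term
    return total

def j0_integral(x, n=2000):
    """Integral representation J0(x) = (1/pi) * Int_0^pi cos(x sin t) dt,
    evaluated by the trapezoidal rule, which is highly accurate here
    because the integrand extends periodically and smoothly."""
    h = math.pi / n
    s = 0.5 * (math.cos(0.0) + math.cos(x * math.sin(math.pi)))
    for k in range(1, n):
        s += math.cos(x * math.sin(k * h))
    return s * h / math.pi
```

Agreement between the two routes (and the known first zero near x = 2.4048) is a simple cross-check of the kind the paper's Matlab experiments perform.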

  4. New Computer Simulations of Macular Neural Functioning

    Science.gov (United States)

    Ross, Muriel D.; Doshay, D.; Linton, S.; Parnas, B.; Montgomery, K.; Chimento, T.

    1994-01-01

    We use high performance graphics workstations and supercomputers to study the functional significance of the three-dimensional (3-D) organization of gravity sensors. These sensors have a prototypic architecture foreshadowing more complex systems. Scaled-down simulations run on a Silicon Graphics workstation and scaled-up, 3-D versions run on a Cray Y-MP supercomputer. A semi-automated method of reconstruction of neural tissue from serial sections studied in a transmission electron microscope has been developed to eliminate tedious conventional photography. The reconstructions use a mesh as a step in generating a neural surface for visualization. Two meshes are required to model calyx surfaces. The meshes are connected and the resulting prisms represent the cytoplasm and the bounding membranes. A finite volume analysis method is employed to simulate voltage changes along the calyx in response to synapse activation on the calyx or on calyceal processes. The finite volume method insures that charge is conserved at the calyx-process junction. These and other models indicate that efferent processes act as voltage followers, and that the morphology of some afferent processes affects their functioning. In a final application, morphological information is symbolically represented in three dimensions in a computer. The possible functioning of the connectivities is tested using mathematical interpretations of physiological parameters taken from the literature. Symbolic, 3-D simulations are in progress to probe the functional significance of the connectivities. This research is expected to advance computer-based studies of macular functioning and of synaptic plasticity.

  5. Numerical computation of aeroacoustic transfer functions for realistic airfoils

    NARCIS (Netherlands)

    De Santana, Leandro Dantas; Miotto, Renato Fuzaro; Wolf, William Roberto

    2017-01-01

    Based on Amiet's theory formalism, we propose a numerical framework to compute the aeroacoustic transfer function of realistic airfoil geometries. The aeroacoustic transfer function relates the amplitude and phase of an incoming periodic gust to the respective unsteady lift response permitting,

  6. Numerical computation of generalized importance functions

    International Nuclear Information System (INIS)

    Gomit, J.M.; Nasr, M.; Ngyuen van Chi, G.; Pasquet, J.P.; Planchard, J.

    1981-01-01

    Thus far, an important effort has been devoted to developing and applying generalized perturbation theory in reactor physics analysis. In this work we are interested in the calculation of importance functions by the method of A. Gandini. We have noted that in this method the convergence of the adopted iterative procedure is not rapid. Hence, to accelerate this convergence we have used the semi-iterative technique. Two computer codes have been developed for one- and two-dimensional calculations (SPHINX-1D and SPHINX-2D). The advantage of our calculation was confirmed by comparative tests in which the iteration number and the computing time were greatly reduced with respect to the classical calculation (CIAP-1D and CIAP-2D). (orig.)

  7. Computing complex Airy functions by numerical quadrature

    NARCIS (Netherlands)

    A. Gil (Amparo); J. Segura (Javier); N.M. Temme (Nico)

    2001-01-01

    Integral representations are considered of solutions of the Airy differential equation w'' - zw = 0 for computing Airy functions for complex values of z. In a first method contour integral representations of the Airy functions are written as non-oscillating

  8. Computation of Galois field expressions for quaternary logic functions on GPUs

    Directory of Open Access Journals (Sweden)

    Gajić Dušan B.

    2014-01-01

    Full Text Available Galois field (GF) expressions are polynomials used as representations of multiple-valued logic (MVL) functions. For this purpose, MVL functions are considered as functions defined over a finite (Galois) field of order p, GF(p). The problem of computing these functional expressions has an important role in areas such as digital signal processing and logic design. The time needed for computing GF-expressions increases exponentially with the number of variables in MVL functions and, as a result, it often represents a limiting factor in applications. This paper proposes a method for accelerated computation of GF(4)-expressions for quaternary (four-valued) logic functions using graphics processing units (GPUs). The method is based on the spectral interpretation of GF-expressions, permitting the use of fast Fourier transform (FFT)-like algorithms for their computation. These algorithms are then adapted for highly parallel processing on GPUs. The performance of the proposed solutions is compared with reference C/C++ implementations of the same algorithms processed on central processing units (CPUs). Experimental results confirm that the presented approach leads to significant reductions in processing time (up to 10.86 times when compared to CPU processing). Therefore, the proposed approach widens the set of problem instances which can be efficiently handled in practice. [Project of the Serbian Ministry of Science, nos. ON174026 and III44006]
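To make the computed object concrete, here is a hedged single-variable sketch: GF(4) arithmetic tables and a brute-force search for the unique polynomial expression of a quaternary function. The paper's contribution is the fast FFT-like, GPU-parallel computation of such coefficients, which this tiny reference computation does not attempt.

```python
from itertools import product

# GF(4) arithmetic: elements 0..3, addition is XOR (characteristic 2),
# multiplication from the field table for GF(2)[x]/(x^2 + x + 1).
MUL = [[0, 0, 0, 0],
       [0, 1, 2, 3],
       [0, 2, 3, 1],
       [0, 3, 1, 2]]

def gf4_pow(x, k):
    r = 1
    for _ in range(k):
        r = MUL[r][x]
    return r

def evaluate(coeffs, x):
    """Evaluate c0 + c1*x + c2*x^2 + c3*x^3 over GF(4)."""
    acc = 0
    for k, ck in enumerate(coeffs):
        acc ^= MUL[ck][gf4_pow(x, k)]
    return acc

def gf_expression(truth):
    """Find the (unique) GF(4)-expression of a one-variable quaternary
    function by exhaustive search over all 4^4 coefficient vectors."""
    for coeffs in product(range(4), repeat=4):
        if all(evaluate(coeffs, x) == truth[x] for x in range(4)):
            return coeffs
    raise ValueError("no expression found")

# Example: an 'increment mod 4'-style quaternary function (invented data).
f = [1, 2, 3, 0]
c = gf_expression(f)
```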

  9. Global sensitivity analysis of computer models with functional inputs

    International Nuclear Information System (INIS)

    Iooss, Bertrand; Ribatet, Mathieu

    2009-01-01

    Global sensitivity analysis is used to quantify the influence of uncertain model inputs on the response variability of a numerical model. The common quantitative methods are appropriate for computer codes with scalar model inputs. This paper aims at illustrating different variance-based sensitivity analysis techniques, based on the so-called Sobol indices, when some model inputs are functional, such as stochastic processes or random spatial fields. In this work, we focus on computer codes with large CPU times which need a preliminary metamodeling step before performing the sensitivity analysis. We propose the use of the joint modeling approach, i.e., modeling simultaneously the mean and the dispersion of the code outputs using two interlinked generalized linear models (GLMs) or generalized additive models (GAMs). The 'mean model' allows one to estimate the sensitivity indices of each scalar model input, while the 'dispersion model' allows one to derive the total sensitivity index of the functional model inputs. The proposed approach is compared to some classical sensitivity analysis methodologies on an analytical function. Lastly, the new methodology is applied to an industrial computer code that simulates nuclear fuel irradiation.
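The scalar-input Sobol indices that the paper generalizes can be sketched with a minimal pick-freeze Monte Carlo estimator (the toy model and sample sizes below are invented for the illustration, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

def model(x):
    # Additive toy model: both inputs contribute equal variance,
    # so each first-order Sobol index is exactly 0.5.
    return x[:, 0] + x[:, 1]

n = 200_000
A = rng.uniform(size=(n, 2))   # base sample
B = rng.uniform(size=(n, 2))   # independent resample

fA = model(A)
f_mean = fA.mean()
var = fA.var()

# Classic pick-freeze estimator of the first-order index S_i:
# freeze input i at its A values, redraw the other inputs from B.
S = []
for i in range(2):
    Ci = B.copy()
    Ci[:, i] = A[:, i]
    S.append((np.mean(fA * model(Ci)) - f_mean ** 2) / var)
```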

  10. Versatile Density Functionals for Computational Surface Science

    DEFF Research Database (Denmark)

    Wellendorff, Jess

    Density functional theory (DFT) emerged almost 50 years ago. Since then DFT has established itself as the central electronic structure methodology for simulating atomic-scale systems from a few atoms to a few hundred atoms. This success of DFT is due to a very favorable accuracy-to-computational cost ratio. ... resampling techniques, thereby systematically avoiding problems with overfitting. The first ever density functional presenting both reliable accuracy and convincing error estimation is generated. The methodology is general enough to be applied to more complex functional forms with higher-dimensional fitting...

  11. Computing three-point functions for short operators

    International Nuclear Information System (INIS)

    Bargheer, Till; Institute for Advanced Study, Princeton, NJ; Minahan, Joseph A.; Pereira, Raul

    2013-11-01

    We compute the three-point structure constants for short primary operators of N=4 super Yang-Mills theory to leading order in 1/√(λ) by mapping the problem to a flat-space string theory calculation. We check the validity of our procedure by comparing to known results for three chiral primaries. We then compute the three-point functions for any combination of chiral and non-chiral primaries, with the non-chiral primaries all dual to string states at the first massive level. Along the way we find many cancellations that leave us with simple expressions, suggesting that integrability is playing an important role.

  12. Computing three-point functions for short operators

    Energy Technology Data Exchange (ETDEWEB)

    Bargheer, Till [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); Institute for Advanced Study, Princeton, NJ (United States). School of Natural Sciences; Minahan, Joseph A.; Pereira, Raul [Uppsala Univ. (Sweden). Dept. of Physics and Astronomy

    2013-11-15

    We compute the three-point structure constants for short primary operators of N=4 super Yang-Mills theory to leading order in 1/√(λ) by mapping the problem to a flat-space string theory calculation. We check the validity of our procedure by comparing to known results for three chiral primaries. We then compute the three-point functions for any combination of chiral and non-chiral primaries, with the non-chiral primaries all dual to string states at the first massive level. Along the way we find many cancellations that leave us with simple expressions, suggesting that integrability is playing an important role.

  13. Variance computations for functional of absolute risk estimates.

    Science.gov (United States)

    Pfeiffer, R M; Petracci, E

    2011-07-01

    We present a simple influence function based approach to compute the variances of estimates of absolute risk and functions of absolute risk. We apply this approach to criteria that assess the impact of changes in the risk factor distribution on absolute risk for an individual and at the population level. As an illustration we use an absolute risk prediction model for breast cancer that includes modifiable risk factors in addition to standard breast cancer risk factors. Influence function based variance estimates for absolute risk and the criteria are compared to bootstrap variance estimates.
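As a hedged toy version of the approach (a made-up "risk" functional of the sample mean, not the breast-cancer model of the paper), an influence-function variance estimate can be checked against a nonparametric bootstrap:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.exponential(scale=2.0, size=500)   # toy data, invented for the sketch

# Toy "absolute risk" functional of the sample: r(F) = 1 - exp(-mean(F)).
def risk(sample):
    return 1.0 - np.exp(-sample.mean())

# Influence function of r at the empirical distribution, by the chain
# rule applied to the mean's influence function (x_i - mean):
# IF(x_i) = exp(-mean) * (x_i - mean).
mu = x.mean()
IF = np.exp(-mu) * (x - mu)
var_if = np.mean(IF ** 2) / len(x)

# Sanity check against a nonparametric bootstrap of the same functional.
boot = np.array([risk(rng.choice(x, size=len(x))) for _ in range(2000)])
var_boot = boot.var()
```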

  14. Computational network design from functional specifications

    KAUST Repository

    Peng, Chi Han

    2016-07-11

    Connectivity and layout of underlying networks largely determine agent behavior and usage in many environments. For example, transportation networks determine the flow of traffic in a neighborhood, whereas building floorplans determine the flow of people in a workspace. Designing such networks from scratch is challenging as even local network changes can have large global effects. We investigate how to computationally create networks starting from only high-level functional specifications. Such specifications can be in the form of network density, travel time versus network length, traffic type, destination location, etc. We propose an integer programming-based approach that guarantees that the resultant networks are valid by fulfilling all the specified hard constraints and that they score favorably in terms of the objective function. We evaluate our algorithm in two different design settings, street layout and floorplans to demonstrate that diverse networks can emerge purely from high-level functional specifications.
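A tiny, hedged stand-in for the integer program (node layout and constraints invented here; the paper's formulation also handles densities, travel times, traffic types, and so on): choose a subset of candidate segments that connects all sites as the hard constraint, while minimizing total length as the objective, by exhaustive search rather than an IP solver.

```python
from itertools import combinations

# Five sites on a small grid and eight candidate street segments.
sites = {0: (0, 0), 1: (2, 0), 2: (2, 2), 3: (0, 2), 4: (1, 1)}
candidates = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 4), (1, 4), (2, 4), (3, 4)]

def length(e):
    (x1, y1), (x2, y2) = sites[e[0]], sites[e[1]]
    return ((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5

def connected(edges):
    """Hard constraint: the chosen segments must connect all sites
    (checked with a simple union-find)."""
    parent = list(range(len(sites)))
    def find(a):
        while parent[a] != a:
            a = parent[a]
        return a
    for a, b in edges:
        parent[find(a)] = find(b)
    return len({find(v) for v in sites}) == 1

# Objective: among all valid subsets, minimize total network length.
best = min(
    (es for r in range(len(candidates) + 1)
        for es in combinations(candidates, r)
        if connected(es)),
    key=lambda es: sum(length(e) for e in es),
)
```

For this layout the optimum is the star of four diagonal spokes through the center site, total length 4√2; a real instance replaces the brute force with integer programming exactly because subsets explode combinatorially.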

  15. Symbolic Computation, Number Theory, Special Functions, Physics and Combinatorics

    CERN Document Server

    Ismail, Mourad

    2001-01-01

    These are the proceedings of the conference "Symbolic Computation, Number Theory, Special Functions, Physics and Combinatorics" held at the Department of Mathematics, University of Florida, Gainesville, from November 11 to 13, 1999. The main emphasis of the conference was Computer Algebra (i.e. symbolic computation) and how it related to the fields of Number Theory, Special Functions, Physics and Combinatorics. A subject that is common to all of these fields is q-series. We brought together those who do symbolic computation with q-series and those who need q-series, including workers in Physics and Combinatorics. The goal of the conference was to inform mathematicians and physicists who use q-series of the latest developments in the field of q-series and especially how symbolic computation has aided these developments. Over 60 people were invited to participate in the conference. We ended up having 45 participants at the conference, including six one hour plenary speakers and 28 half hour speakers. T...

  16. A FUNCTIONAL MODEL OF COMPUTER-ORIENTED LEARNING ENVIRONMENT OF A POST-DEGREE PEDAGOGICAL EDUCATION

    Directory of Open Access Journals (Sweden)

    Kateryna R. Kolos

    2014-06-01

    Full Text Available The study substantiates the need for a systematic study of the functioning of a computer-oriented learning environment in post-degree pedagogical education; defines the term "functional model of a computer-oriented learning environment of post-degree pedagogical education"; and builds such a functional model in accordance with the functions of business, information and communication technology, academic and administrative staff, and the peculiarities of training courses for teachers.

  17. Algebraic Functions, Computer Programming, and the Challenge of Transfer

    Science.gov (United States)

    Schanzer, Emmanuel Tanenbaum

    2015-01-01

    Students' struggles with algebra are well documented. Prior to the introduction of functions, mathematics is typically focused on applying a set of arithmetic operations to compute an answer. The introduction of functions, however, marks the point at which mathematics begins to focus on building up abstractions as a way to solve complex problems.…

  18. Computation of Value Functions in Nonlinear Differential Games with State Constraints

    KAUST Repository

    Botkin, Nikolai; Hoffmann, Karl-Heinz; Mayer, Natalie; Turova, Varvara

    2013-01-01

    Finite-difference schemes for the computation of value functions of nonlinear differential games with non-terminal payoff functional and state constraints are proposed. The solution method is based on the fact that the value function is a

  19. Fast computation of complete elliptic integrals and Jacobian elliptic functions

    Science.gov (United States)

    Fukushima, Toshio

    2009-12-01

    As a preparation step to compute Jacobian elliptic functions efficiently, we created a fast method to calculate the complete elliptic integrals of the first and second kinds, K(m) and E(m), for the standard domain of the elliptic parameter, 0 ≤ m < 1. We then developed a procedure to compute simultaneously three Jacobian elliptic functions, sn(u|m), cn(u|m), and dn(u|m), by repeated use of the double-argument formulae, starting from the Maclaurin series expansions with respect to the elliptic argument, u, after its domain is reduced to the standard range, 0 ≤ u < K(m). The new procedure is 25-70% faster than methods based on the Gauss transformation, such as Bulirsch's algorithm sncndn quoted in Numerical Recipes, even if the acceleration of the computation of K(m) is not taken into account.
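
    A minimal illustration of computing K(m) and E(m), using the classical arithmetic-geometric mean (AGM) iteration rather than the series-based method of the paper; the tolerance and the parameter convention (parameter m, not modulus k) are assumptions made for this sketch:

```python
import math

def ellip_KE(m, tol=1e-15):
    """Complete elliptic integrals K(m) and E(m) for 0 <= m < 1 via
    the classical arithmetic-geometric mean (AGM) iteration, shown
    here for contrast with the series-based method of the paper."""
    a, b = 1.0, math.sqrt(1.0 - m)
    c = math.sqrt(m)
    csum = 0.5 * c * c          # accumulates sum of 2^(n-1) * c_n^2
    pow2 = 1.0
    while abs(c) > tol:
        a, b, c = (a + b) / 2.0, math.sqrt(a * b), (a - b) / 2.0
        csum += pow2 * c * c
        pow2 *= 2.0
    K = math.pi / (2.0 * a)     # K(m) = pi / (2 * AGM(1, sqrt(1-m)))
    E = K * (1.0 - csum)        # E from the accumulated c_n terms
    return K, E
```

    The AGM converges quadratically, so only a handful of iterations are needed at double precision.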

  20. A hybrid method for the parallel computation of Green's functions

    International Nuclear Information System (INIS)

    Petersen, Dan Erik; Li Song; Stokbro, Kurt; Sorensen, Hans Henrik B.; Hansen, Per Christian; Skelboe, Stig; Darve, Eric

    2009-01-01

    Quantum transport models for nanodevices using the non-equilibrium Green's function method require the repeated calculation of the block tridiagonal part of the Green's and lesser Green's function matrices. This problem is related to the calculation of the inverse of a sparse matrix. Because of the large number of times this calculation needs to be performed, this is computationally very expensive even on supercomputers. The classical approach is based on recurrence formulas which cannot be efficiently parallelized. This practically prevents the solution of large problems with hundreds of thousands of atoms. We propose new recurrences for a general class of sparse matrices to calculate Green's and lesser Green's function matrices which extend formulas derived by Takahashi and others. We show that these recurrences may lead to a dramatically reduced computational cost because they only require computing a small number of entries of the inverse matrix. Then, we propose a parallelization strategy for block tridiagonal matrices which involves a combination of Schur complement calculations and cyclic reduction. It achieves good scalability even on problems of modest size.
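
    The classical serial recurrence that the paper sets out to parallelize can be sketched in its simplest, scalar (1x1 block) form: the diagonal of the inverse of a symmetric tridiagonal matrix obtained from forward/backward sweeps of principal minors. This is an illustrative analogue only, not the paper's parallel algorithm:

```python
def tridiag_inverse_diagonal(d, e):
    """Diagonal entries of the inverse of a symmetric tridiagonal
    matrix with diagonal d[0..n-1] and off-diagonal e[0..n-2], via
    the classical forward/backward (theta/phi) minor recurrences,
    the 1x1-block analogue of the serial block recurrences."""
    n = len(d)
    theta = [0.0] * (n + 1)      # leading principal minors
    theta[0] = 1.0
    theta[1] = d[0]
    for i in range(1, n):
        theta[i + 1] = d[i] * theta[i] - e[i - 1] ** 2 * theta[i - 1]
    phi = [0.0] * (n + 2)        # trailing principal minors
    phi[n + 1] = 1.0
    phi[n] = d[n - 1]
    for i in range(n - 1, 0, -1):
        phi[i] = d[i - 1] * phi[i + 1] - e[i - 1] ** 2 * phi[i + 2]
    det = theta[n]
    # (A^-1)_ii = theta_(i-1) * phi_(i+1) / det(A)
    return [theta[i] * phi[i + 2] / det for i in range(n)]
```

    Only O(n) work is needed for the diagonal, but each sweep is inherently sequential, which is exactly the bottleneck the cyclic-reduction strategy addresses.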

  1. Computational Methods for Large Spatio-temporal Datasets and Functional Data Ranking

    KAUST Repository

    Huang, Huang

    2017-07-16

    This thesis focuses on two topics, computational methods for large spatial datasets and functional data ranking. Both tackle the challenges of big, high-dimensional data. The first topic is motivated by the prohibitive computational burden in fitting Gaussian process models to large and irregularly spaced spatial datasets. Various approximation methods have been introduced to reduce the computational cost, but many rely on unrealistic assumptions about the process, and retaining statistical efficiency remains an issue. We propose a new scheme to approximate the maximum likelihood estimator and the kriging predictor when the exact computation is infeasible. The proposed method provides different types of hierarchical low-rank approximations that are both computationally and statistically efficient. We explore the improvement of the approximation theoretically and investigate the performance by simulations. For real applications, we analyze a soil moisture dataset of 2 million measurements using the hierarchical low-rank approximation and apply the proposed fast kriging to fill gaps in satellite images. The second topic is motivated by rank-based outlier detection methods for functional data. Compared to magnitude outliers, it is more challenging to detect shape outliers as they are often masked among samples. We develop a new notion of functional data depth by taking the integration of a univariate depth function. As a form of integrated depth, it shares many desirable features. Furthermore, the novel formulation leads to a useful decomposition for detecting both shape and magnitude outliers. Our simulation studies show the proposed outlier detection procedure outperforms competitors in various outlier models. We also illustrate our methodology using real datasets of curves, images, and video frames. 
Finally, we introduce the functional data ranking technique to spatio-temporal statistics for visualizing and assessing covariance properties, such as
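
    The "integration of a univariate depth function" idea can be sketched as follows; the particular univariate depth used here, 1/(1 + |x - median|), is a simple illustrative assumption, not necessarily the depth the authors integrate:

```python
def integrated_depth(curves):
    """Rank curves by integrating a univariate depth over the grid.
    The univariate depth here, 1/(1 + |x - median|), is a simple
    illustrative choice; any univariate depth can be plugged in."""
    n = len(curves)
    T = len(curves[0])
    depths = []
    for f in curves:
        total = 0.0
        for t in range(T):
            vals = sorted(g[t] for g in curves)
            med = 0.5 * (vals[(n - 1) // 2] + vals[n // 2])
            total += 1.0 / (1.0 + abs(f[t] - med))
        depths.append(total / T)   # average depth over the grid
    return depths

# Three similar curves plus one magnitude outlier: the outlier
# receives the lowest integrated depth.
grid = [i / 20.0 for i in range(21)]
sample = [[t + s for t in grid] for s in (-0.1, 0.0, 0.1)] + [[t + 5.0 for t in grid]]
depth_values = integrated_depth(sample)
```

    Ranking curves by these depth values gives a center-outward ordering in which outlying curves fall to the bottom.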

  2. Computational Models for Calcium-Mediated Astrocyte Functions

    Directory of Open Access Journals (Sweden)

    Tiina Manninen

    2018-04-01

    Full Text Available The computational neuroscience field has heavily concentrated on the modeling of neuronal functions, largely ignoring other brain cells, including one type of glial cell, the astrocytes. Despite the short history of modeling astrocytic functions, we were delighted about the hundreds of models developed so far to study the role of astrocytes, most often in calcium dynamics, synchronization, information transfer, and plasticity in vitro, but also in vascular events, hyperexcitability, and homeostasis. Our goal here is to present the state-of-the-art in computational modeling of astrocytes in order to facilitate better understanding of the functions and dynamics of astrocytes in the brain. Due to the large number of models, we concentrated on a hundred models that include biophysical descriptions for calcium signaling and dynamics in astrocytes. We categorized the models into four groups: single astrocyte models, astrocyte network models, neuron-astrocyte synapse models, and neuron-astrocyte network models to ease their use in future modeling projects. We characterized the models based on which earlier models were used for building the models and which type of biological entities were described in the astrocyte models. Features of the models were compared and contrasted so that similarities and differences were more readily apparent. We discovered that most of the models were basically generated from a small set of previously published models with small variations. However, neither citations to all the previous models with similar core structure nor explanations of what was built on top of the previous models were provided, which made it possible, in some cases, to have the same models published several times without an explicit intention to make new predictions about the roles of astrocytes in brain functions. 
    Furthermore, only a few of the models are available online, which makes it difficult to reproduce the simulation results and further develop

  4. Functions of the computer management games

    OpenAIRE

    Kočí, Josef

    2016-01-01

    This thesis discusses the possibilities of using managerial games, their purpose, meaning and functions, and focuses specifically on computer-based management games: how they differ from classic games and what their advantages and disadvantages are. The theoretical part of the thesis also examines why these games are discussed, why they are accepted or sometimes rejected, and why they have become so popular with some managers and public gamers. This is supported by a survey conducted in the 11 April 20...

  5. Spaceborne computer executive routine functional design specification. Volume 1: Functional design of a flight computer executive program for the reusable shuttle

    Science.gov (United States)

    Curran, R. T.

    1971-01-01

    A flight computer functional executive design for the reusable shuttle is presented. The design is given in the form of functional flowcharts and prose description. Techniques utilized in the regulation of process flow to accomplish activation, resource allocation, suspension, termination, and error masking based on process primitives are considered. Preliminary estimates of main storage utilization by the Executive are furnished. Conclusions and recommendations for timely, effective software-hardware integration in the reusable shuttle avionics system are proposed.

  6. Image reconstruction of computed tomograms using functional algebra

    International Nuclear Information System (INIS)

    Bradaczek, M.; Bradaczek, H.

    1997-01-01

    A detailed presentation of the process for calculating computed tomograms from the measured data by means of functional algebra is given and an attempt is made to demonstrate the relationships to those inexperienced in mathematics. Suggestions are also made to the manufacturers for improving tomography software although the authors cannot exclude the possibility that some of the recommendations may have already been realized. An interpolation in Fourier space to right-angled coordinates was not employed so that additional computer time and errors resulting from the interpolation are avoided. The savings in calculation time can only be estimated but should amount to about 25%. The error-correction calculation is merely a suggestion since it depends considerably on the apparatus used. Functional algebra is introduced here because it is not so well known but does provide appreciable simplifications in comparison to an explicit presentation. Didactic reasons as well as the possibility for reducing calculation time provided the foundation for this work. (orig.) [de

  7. Systemic functional grammar in natural language generation linguistic description and computational representation

    CERN Document Server

    Teich, Elke

    1999-01-01

    This volume deals with the computational application of systemic functional grammar (SFG) for natural language generation. In particular, it describes the implementation of a fragment of the grammar of German in the computational framework of KOMET-PENMAN for multilingual generation. The text also presents a specification of explicit well-formedness constraints on syntagmatic structure which are defined in the form of typed feature structures. It thus achieves a model of systemic functional grammar that unites both the strengths of systemics, such as stratification, functional diversification

  8. Optimized Kaiser-Bessel Window Functions for Computed Tomography.

    Science.gov (United States)

    Nilchian, Masih; Ward, John Paul; Vonesch, Cedric; Unser, Michael

    2015-11-01

    Kaiser-Bessel window functions are frequently used to discretize tomographic problems because they have two desirable properties: 1) their short support leads to a low computational cost and 2) their rotational symmetry makes their imaging transform independent of the direction. In this paper, we aim at optimizing the parameters of these basis functions. We present a formalism based on the theory of approximation and point out the importance of the partition-of-unity condition. While we prove that, for compact-support functions, this condition is incompatible with isotropy, we show that minimizing the deviation from the partition-of-unity condition is highly beneficial. The numerical results confirm that the proposed tuning of the Kaiser-Bessel window functions yields the best performance.
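
    For reference, a plain-Python sketch of a radially symmetric Kaiser-Bessel window; the support and shape-parameter values below are illustrative placeholders, not the optimized values derived in the paper:

```python
import math

def bessel_i0(x):
    """Modified Bessel function I0(x) via its power series."""
    term, total, k = 1.0, 1.0, 0
    while term > 1e-17 * total:
        k += 1
        term *= (x / 2.0) ** 2 / (k * k)
        total += term
    return total

def kaiser_bessel(r, support=2.0, alpha=10.4):
    """Radially symmetric Kaiser-Bessel window of radius `support`.
    alpha controls the trade-off between spatial and spectral
    concentration (the kind of parameter the paper tunes); the
    numeric defaults here are illustrative assumptions."""
    if abs(r) > support:
        return 0.0
    z = math.sqrt(1.0 - (r / support) ** 2)
    return bessel_i0(alpha * z) / bessel_i0(alpha)
```

    The window equals 1 at the center and decays smoothly to 1/I0(alpha) at the edge of its support, which is what makes it attractive as a compactly supported basis function.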

  9. Sequential designs for sensitivity analysis of functional inputs in computer experiments

    International Nuclear Information System (INIS)

    Fruth, J.; Roustant, O.; Kuhnt, S.

    2015-01-01

    Computer experiments are nowadays commonly used to analyze industrial processes aimed at achieving a desired outcome. Sensitivity analysis plays an important role in exploring the actual impact of adjustable parameters on the response variable. In this work we focus on sensitivity analysis of a scalar-valued output of a time-consuming computer code depending on scalar and functional input parameters. We investigate a sequential methodology, based on piecewise constant functions and sequential bifurcation, which is both economical and fully interpretable. The new approach is applied to a sheet metal forming problem in three sequential steps, resulting in new insights into the behavior of the forming process over time.
    Highlights:
    • A sensitivity analysis method for functional and scalar inputs is presented.
    • We focus on the discovery of the most influential parts of the functional domain.
    • We investigate an economical sequential methodology based on piecewise constant functions.
    • Normalized sensitivity indices are introduced and investigated theoretically.
    • Successful application to sheet metal forming on two functional inputs.
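
    Sequential bifurcation itself is easy to sketch on a toy additive model with binary low/high input levels; the functional-input machinery of the paper (piecewise constant perturbations of input curves) is omitted in this illustrative sketch:

```python
def sequential_bifurcation(run, n, threshold):
    """Screen n inputs of a (near-)additive model with few runs.
    run(high) evaluates the model with the inputs in `high` set to
    their high level and all others low.  Groups whose aggregated
    effect exceeds `threshold` are split in two and re-tested; the
    classic assumption is that effects are additive with known signs."""
    base = run(set())
    important = []
    groups = [list(range(n))]
    n_runs = 1
    while groups:
        g = groups.pop()
        effect = run(set(g)) - base       # aggregated group effect
        n_runs += 1
        if abs(effect) <= threshold:
            continue                      # whole group is unimportant
        if len(g) == 1:
            important.append(g[0])        # isolated an active input
        else:
            mid = len(g) // 2
            groups += [g[:mid], g[mid:]]  # bifurcate and recurse
    return sorted(important), n_runs

# Toy additive model: only inputs 2 and 5 matter.
beta = [0.0, 0.0, 5.0, 0.0, 0.0, 1.5, 0.0, 0.0]
model = lambda high: sum(b for i, b in enumerate(beta) if i in high)
found, runs_used = sequential_bifurcation(model, 8, threshold=0.5)
```

    Because inactive groups are discarded wholesale, the number of model runs stays well below one run per input when only a few inputs are active.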

  10. The Effect and Mechanism of Transdermal Penetration Enhancement of Fu's Cupping Therapy: New Physical Penetration Technology for Transdermal Administration with Traditional Chinese Medicine (TCM) Characteristics.

    Science.gov (United States)

    Xie, Wei-Jie; Zhang, Yong-Ping; Xu, Jian; Sun, Xiao-Bo; Yang, Fang-Fang

    2017-03-27

    the control group, the indomethacin skin percutaneous rate of the FCT low-intensity group (FCTL) was 35.52%, and the enhancement ratio (ER) at 9 h was 1.76X, roughly equivalent to the penetration enhancing effect of the CPEs and iontophoresis. Secondly, the indomethacin percutaneous ratio of the FCT middle-intensity group (FCTM) and FCT high-intensity group (FCTH) were 47.36% and 54.58%, respectively, while the ERs at 9 h were 3.58X and 8.39X, respectively. Thirdly, pharmacokinetic data showed that in vivo indomethacin percutaneous absorption of the FCT was much higher than that of the control, that of the FCTM was slightly higher than that of the CPE, and that of the FCTM group was significantly higher than all others. Meanwhile, variance analysis indicated that the combination of the FCT penetration enhancement method and the CPE method had beneficial effects in enhancing skin penetration: the significance level of the CPE method was 0.0004, which was lower than 0.001, meaning the difference was markedly significant; the significance level of the FCT was also below 0.0001 and its difference markedly significant. The significance level of factor interaction A × B was lower than 0.0001, indicating that the difference in synergism was markedly significant. Moreover, SEM and TEM images showed that the SC surfaces of Sprague-Dawley rats treated with FCT were damaged, and it was difficult to observe the complete surface structure, with SC pores growing larger and its special "brick structure" becoming looser. This indicated that the barrier function of the skin was broken, thus revealing a potentially major route of skin penetration. FCT, as a new form of transdermal penetration technology, has significant penetration effects with TCM characteristics and is of high clinical value. It is worth promoting its development.

  11. Computer-aided Nonlinear Control System Design Using Describing Function Models

    CERN Document Server

    Nassirharand, Amir

    2012-01-01

    A systematic computer-aided approach provides a versatile setting for the control engineer to overcome the complications of controller design for highly nonlinear systems. Computer-aided Nonlinear Control System Design provides such an approach based on the use of describing functions. The text deals with a large class of nonlinear systems without restrictions on the system order, the number of inputs and/or outputs or the number, type or arrangement of nonlinear terms. The strongly software-oriented methods detailed facilitate fulfillment of tight performance requirements and help the designer to think in purely nonlinear terms, avoiding the expedient of linearization which can impose substantial and unrealistic model limitations and drive up the cost of the final product. Design procedures are presented in a step-by-step algorithmic format each step being a functional unit with outputs that drive the other steps. This procedure may be easily implemented on a digital computer with example problems from mecha...
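
    The describing function at the heart of such methods is just the first Fourier harmonic of the nonlinearity's response to a sinusoid, which can be computed numerically; a minimal sketch (the relay nonlinearity and amplitude below are arbitrary illustrative choices):

```python
import math

def describing_function(f, A, steps=20000):
    """Sinusoidal-input describing function N(A) of a static
    nonlinearity f: the complex ratio of the first Fourier harmonic
    of f(A*sin(theta)) to the input amplitude A (midpoint rule)."""
    b1 = a1 = 0.0
    for k in range(steps):
        th = 2.0 * math.pi * (k + 0.5) / steps
        y = f(A * math.sin(th))
        b1 += y * math.sin(th)   # in-phase component
        a1 += y * math.cos(th)   # quadrature component
    b1 *= 2.0 / steps            # = (1/pi) * integral over one period
    a1 *= 2.0 / steps
    return complex(b1, a1) / A

# Ideal relay with output +/-M: the textbook result is N(A) = 4M/(pi*A).
M, A = 1.0, 2.0
N = describing_function(lambda x: M if x > 0 else -M, A)
```

    Replacing each nonlinear term by such an amplitude-dependent gain N(A) is what lets the designer reason about limit cycles and loop shaping in quasi-linear terms.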

  12. Computation of bessel functions in light scattering studies.

    Science.gov (United States)

    Ross, W D

    1972-09-01

    Computations of light scattering require finding Bessel functions of a series of orders. These are found most easily by recurrence, but excessive rounding errors may accumulate. Satisfactory procedures for cylinder and sphere functions are described. If argument z is real, find Y(n)(z) by recurrence to high orders. From two high orders of Y(n)(z) estimate J(n)(z). Use backward recurrence to maximum J(n)(z). Correct by forward recurrence to maximum. If z is complex, estimate high orders of J(n)(z) without Y(n)(z) and use backward recurrence.
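
    The backward-recurrence procedure described for real arguments can be sketched directly; the starting-order heuristic below is a simple assumption for illustration rather than the author's exact rule:

```python
import math

def bessel_j(nmax, x, extra=15):
    """J_0(x) ... J_nmax(x) for real x > 0 by backward recurrence
    (Miller's algorithm): recur downward from a high trial order with
    an arbitrary small seed, then normalize using the identity
    J_0(x) + 2*(J_2(x) + J_4(x) + ...) = 1."""
    m = nmax + extra + int(x)          # trial starting order (heuristic)
    jp, j = 0.0, 1e-30                 # J_{m+1} ~ 0, J_m ~ arbitrary seed
    out = [0.0] * (nmax + 1)
    norm = 0.0
    for n in range(m, 0, -1):
        jp, j = j, (2.0 * n / x) * j - jp   # downward three-term recurrence
        if n - 1 <= nmax:
            out[n - 1] = j
        if (n - 1) % 2 == 0:
            norm += 2.0 * j
    norm -= j                          # J_0 enters the sum only once
    return [v / norm for v in out]
```

    Downward recurrence is stable for J_n because J is the minimal solution of the recurrence, so errors introduced by the arbitrary seed decay rapidly; the same recurrence run upward would amplify them.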

  13. Computing the hadronic vacuum polarization function by analytic continuation

    Energy Technology Data Exchange (ETDEWEB)

    Feng, Xu [KEK National High Energy Physics, Tsukuba (Japan); Hashimoto, Shoji [KEK National High Energy Physics, Tsukuba (Japan); The Graduate Univ. for Advanced Studies, Tsukuba (Japan). School of High Energy Accelerator Science; Hotzel, Grit [Humboldt-Universitaet, Berlin (Germany). Inst. fuer Physik; Jansen, Karl [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany). John von Neumann-Inst. fuer Computing NIC; Cyprus Univ., Nicosia (Cyprus). Dept. of Physics; Petschlies, Marcus [The Cyprus Institute, Nicosia (Cyprus); Renner, Dru B. [Thomas Jefferson National Accelerator Facility, Newport News, VA (United States)

    2013-07-15

    We propose a method to compute the hadronic vacuum polarization function on the lattice at continuous values of photon momenta, bridging between the space-like and time-like regions. We provide two independent derivations of this method, showing that it leads to the desired hadronic vacuum polarization function in Minkowski space-time. We show, with the example of the leading-order QCD correction to the muon anomalous magnetic moment, that this approach can provide a valuable alternative method for calculations of physical quantities where the hadronic vacuum polarization function enters.

  14. Numerical computation of special functions with applications to physics

    CSIR Research Space (South Africa)

    Motsepe, K

    2008-09-01

    Full Text Available Students of mathematical physics, engineering, natural and biological sciences sometimes need to use special functions that are not found in ordinary mathematical software. In this paper a simple universal numerical algorithm is developed to compute...

  15. Involvement of T6 pili in biofilm formation by serotype M6 Streptococcus pyogenes.

    Science.gov (United States)

    Kimura, Keiji Richard; Nakata, Masanobu; Sumitomo, Tomoko; Kreikemeyer, Bernd; Podbielski, Andreas; Terao, Yutaka; Kawabata, Shigetada

    2012-02-01

    The group A streptococcus (GAS) Streptococcus pyogenes is known to cause self-limiting purulent infections in humans. The role of GAS pili in host cell adhesion and biofilm formation is likely fundamental in early colonization. Pilus genes are found in the FCT (fibronectin-binding protein, collagen-binding protein, and trypsin-resistant antigen) genomic region, which has been classified into nine subtypes based on the diversity of gene content and nucleotide sequence. Several epidemiological studies have indicated that FCT type 1 strains, including serotype M6, produce large amounts of monospecies biofilm in vitro. We examined the direct involvement of pili in biofilm formation by serotype M6 clinical isolates. In the majority of tested strains, deletion of the tee6 gene encoding pilus shaft protein T6 compromised the ability to form biofilm on an abiotic surface. Deletion of the fctX and srtB genes, which encode pilus ancillary protein and class C pilus-associated sortase, respectively, also decreased biofilm formation by a representative strain. Unexpectedly, these mutant strains showed increased bacterial aggregation compared with that of the wild-type strain. When the entire FCT type 1 pilus region was ectopically expressed in serotype M1 strain SF370, biofilm formation was promoted and autoaggregation was inhibited. These findings indicate that assembled FCT type 1 pili contribute to biofilm formation and also function as attenuators of bacterial aggregation. Taken together, our results show the potential role of FCT type 1 pili in the pathogenesis of GAS infections.

  16. On computation and use of Fourier coefficients for associated Legendre functions

    Science.gov (United States)

    Gruber, Christian; Abrykosov, Oleh

    2016-06-01

    The computation of spherical harmonic series at very high resolution is known to be delicate in terms of performance and numerical stability. A major problem is keeping results inside the numerical range of the data type used during calculations, as under-/overflow arises. Extended data types are currently not desirable, since the arithmetic complexity will grow exponentially with higher resolution levels. If the associated Legendre functions are computed in the spectral domain, then regular grid transformations can be applied that are highly efficient and convenient for derived quantities as well. In this article, we compare three recursive computations of the associated Legendre functions as trigonometric series, thereby ensuring a defined numerical range for each constituent wave number separately. The results to high degree and order show the numerical strength of the proposed method. First, the evaluation of the Fourier coefficients of the associated Legendre functions is examined with respect to floating-point precision requirements. Second, the numerical accuracy in the cases of standard double and long double precision arithmetic is demonstrated. Following Bessel's inequality, the obtained accuracy estimates of the Fourier coefficients transfer directly to the associated Legendre functions themselves and to derived functionals as well. They can therefore provide essential insight for modern geodetic applications that depend on efficient spherical harmonic analysis and synthesis beyond 5 × 5 arcmin resolution.
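
    For contrast with the spectral-domain recursions studied in the article, here is the standard spatial-domain three-term recurrence for the (unnormalized) associated Legendre functions, which is exactly the computation whose dynamic range becomes problematic at high degree and order:

```python
import math

def assoc_legendre(lmax, m, x):
    """P_m^m(x) ... P_lmax^m(x) by the standard three-term recurrence
    in degree l (Condon-Shortley phase included).  Unnormalized values
    over/underflow quickly as l and m grow, which is the numerical-range
    problem the article addresses with spectral-domain recursions."""
    # Seed: P_m^m = (-1)^m (2m-1)!! (1 - x^2)^(m/2)
    pmm = 1.0
    s = math.sqrt((1.0 - x) * (1.0 + x))
    for k in range(1, m + 1):
        pmm *= -(2 * k - 1) * s
    out = [pmm]
    if lmax == m:
        return out
    pm1 = x * (2 * m + 1) * pmm        # P_{m+1}^m
    out.append(pm1)
    for l in range(m + 2, lmax + 1):
        # (l - m) P_l^m = (2l - 1) x P_{l-1}^m - (l + m - 1) P_{l-2}^m
        pmm, pm1 = pm1, ((2 * l - 1) * x * pm1 - (l + m - 1) * pmm) / (l - m)
        out.append(pm1)
    return out
```

    Even at modest degrees the magnitudes of these values span many orders of magnitude, which is why high-resolution geodetic work needs range-controlled formulations.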

  17. Studies on the zeros of Bessel functions and methods for their computation

    Science.gov (United States)

    Kerimov, M. K.

    2014-09-01

    The zeros of Bessel functions play an important role in computational mathematics, mathematical physics, and other areas of natural sciences. Studies addressing these zeros (their properties, computational methods) can be found in various sources. This paper offers a detailed overview of the results concerning the real zeros of the Bessel functions of the first and second kinds and general cylinder functions. The author intends to publish several overviews on this subject. In this first publication, works dealing with real zeros are analyzed. Primary emphasis is placed on classical results, which are still important. Some of the most recent publications are also discussed.
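
    One classical computational method surveyed in such work combines McMahon's asymptotic expansion for an initial guess with Newton refinement using J_0'(x) = -J_1(x); a minimal sketch for the positive zeros of J_0 (series-based Bessel evaluations, adequate only for moderate arguments):

```python
import math

def j0(x):
    """J_0(x) by its Maclaurin series (adequate for moderate x)."""
    term, total, k = 1.0, 1.0, 0
    while abs(term) > 1e-17:
        k += 1
        term *= -(x * x) / (4.0 * k * k)
        total += term
    return total

def j1(x):
    """J_1(x) by its Maclaurin series; note J_0'(x) = -J_1(x)."""
    term, total, k = x / 2.0, x / 2.0, 0
    while abs(term) > 1e-17:
        k += 1
        term *= -(x * x) / (4.0 * k * (k + 1))
        total += term
    return total

def j0_zero(k, iters=30):
    """k-th positive zero of J_0: leading McMahon asymptotic guess
    (k - 1/4)*pi refined by Newton's method."""
    x = (k - 0.25) * math.pi
    for _ in range(iters):
        x += j0(x) / j1(x)       # Newton step with J_0' = -J_1
    return x
```

    The leading asymptotic term already lands within a few percent of each zero, so Newton's method converges in a handful of iterations.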

  18. Wigner functions and density matrices in curved spaces as computational tools

    International Nuclear Information System (INIS)

    Habib, S.; Kandrup, H.E.

    1989-01-01

    This paper contrasts two alternative approaches to statistical quantum field theory in curved spacetimes, namely (1) a canonical Hamiltonian approach, in which the basic object is a density matrix ρ characterizing the noncovariant, but globally defined, modes of the field; and (2) a Wigner function approach, in which the basic object is a Wigner function f defined quasilocally from the Hadamard, or correlation, function G_1(x_1, x_2). The key objective is to isolate the conceptual biases underlying each of these approaches and then to assess their utility and limitations in effecting concrete calculations. The following questions are therefore addressed and largely answered. What sorts of spacetimes (e.g., de Sitter or Friedmann-Robertson-Walker) are comparatively easy to consider? What sorts of objects (e.g., average fields or renormalized stress energies) are easy to compute approximately? What, if anything, can be computed exactly? What approximations are intrinsic to each approach or convenient as computational tools? What sorts of "field entropies" are natural to define? copyright 1989 Academic Press, Inc

  19. Application of computer-generated functional (parametric) maps in radionuclide renography

    International Nuclear Information System (INIS)

    Agress, H. Jr.; Levenson, S.M.; Gelfand, M.J.; Green, M.V.; Bailey, J.J.; Johnston, G.S.

    1975-01-01

    A functional (parametric) map is a single visual display of regional dynamic phenomena which facilitates interpretation of the nature of focal abnormalities in renal function. Methods for producing several kinds of functional maps based on computer calculations of radionuclide scan data are briefly described. Three abnormal cases are presented to illustrate the use of functional maps to separate focal lesions and to specify the dynamic nature of the abnormalities in a way which is difficult to achieve with conventional sequential renal scans and renograms alone

  20. Functional requirements for design of the Space Ultrareliable Modular Computer (SUMC) system simulator

    Science.gov (United States)

    Curran, R. T.; Hornfeck, W. A.

    1972-01-01

    The functional requirements for the design of an interpretive simulator for the space ultrareliable modular computer (SUMC) are presented. A review of applicable existing computer simulations is included along with constraints on the SUMC simulator functional design. Input requirements, output requirements, and language requirements for the simulator are discussed in terms of a SUMC configuration which may vary according to the application.

  1. PERFORMANCE OF A COMPUTER-BASED ASSESSMENT OF COGNITIVE FUNCTION MEASURES IN TWO COHORTS OF SENIORS

    Science.gov (United States)

    Espeland, Mark A.; Katula, Jeffrey A.; Rushing, Julia; Kramer, Arthur F.; Jennings, Janine M.; Sink, Kaycee M.; Nadkarni, Neelesh K.; Reid, Kieran F.; Castro, Cynthia M.; Church, Timothy; Kerwin, Diana R.; Williamson, Jeff D.; Marottoli, Richard A.; Rushing, Scott; Marsiske, Michael; Rapp, Stephen R.

    2013-01-01

    Background: Computer-administered assessment of cognitive function is being increasingly incorporated in clinical trials; however, its performance in these settings has not been systematically evaluated. Design: The Seniors Health and Activity Research Program (SHARP) pilot trial (N=73) developed a computer-based tool for assessing memory performance and executive functioning. The Lifestyle Interventions and Independence for Seniors (LIFE) investigators incorporated this battery in a full-scale multicenter clinical trial (N=1635). We describe the relationships that test scores have with those from interviewer-administered cognitive function tests and with risk factors for cognitive deficits, and we describe performance measures (completeness, intra-class correlations). Results: Computer-based assessments of cognitive function had consistent relationships across the pilot and full-scale trial cohorts with interviewer-administered assessments of cognitive function, age, and a measure of physical function. In the LIFE cohort, their external validity was further demonstrated by associations with other risk factors for cognitive dysfunction: education, hypertension, diabetes, and physical function. Acceptable levels of data completeness (>83%) were achieved on all computer-based measures; however, rates of missing data were higher among older participants (odds ratio=1.06 for each additional year; p<0.001) and those who reported no current computer use (odds ratio=2.71; p<0.001). Intra-class correlations among clinics were at least as low (ICC≤0.013) as for interviewer measures (ICC≤0.023), reflecting good standardization. All cognitive measures loaded onto the first principal component (global cognitive function), which accounted for 40% of the overall variance. Conclusion: Our results support the use of computer-based tools for assessing cognitive function in multicenter clinical trials of older individuals. PMID:23589390

  2. Computations of nuclear response functions with MACK-IV

    International Nuclear Information System (INIS)

    Abdou, M.A.; Gohar, Y.

    1978-01-01

    The MACK computer program calculates energy pointwise and multigroup nuclear response functions from basic nuclear data in ENDF/B format. The new version of the program, MACK-IV, incorporates major developments and improvements aimed at maximizing the utilization of available nuclear data and ensuring energy conservation in nuclear heating calculations. A new library, MACKLIB-IV, of nuclear response functions was generated in the CTR energy group structure of 171 neutron groups and 36 gamma groups. The library was prepared using MACK-IV and ENDF/B-IV and is suitable for fusion, fusion-fission hybrids, and fission applications.

  3. Computations of nuclear response functions with MACK-IV

    Energy Technology Data Exchange (ETDEWEB)

    Abdou, M A; Gohar, Y

    1978-01-01

    The MACK computer program calculates energy pointwise and multigroup nuclear response functions from basic nuclear data in ENDF/B format. The new version of the program, MACK-IV, incorporates major developments and improvements aimed at maximizing the utilization of available nuclear data and ensuring energy conservation in nuclear heating calculations. A new library, MACKLIB-IV, of nuclear response functions was generated in the CTR energy group structure of 171 neutron groups and 36 gamma groups. The library was prepared using MACK-IV and ENDF/B-IV and is suitable for fusion, fusion-fission hybrids, and fission applications.

  4. Interactive full channel teletext system for cable television nets

    Science.gov (United States)

    Vandenboom, H. P. A.

    1984-08-01

    A demonstration set-up of an interactive full channel teletext (FCT) system for cable TV networks with two-way data communication possibilities was designed and realized. In FCT, all image lines are used as teletext data lines. The FCT decoder was placed in the mini-star, and the FCT encoder, which provides the FCT signal, was placed in the local center. From the FCT signal a number of data lines are selected using an extra FCT decoder. They are placed on the image lines reserved for teletext so that a normal TV receiver equipped with a teletext decoder can process the selected data lines. For texts not available in the FCT signal, a command can be sent to the local center via the data communication path. A cheap and simple system is offered in which the number of requested pages or books is in principle unlimited, while the required waiting time and channel capacity remain limited.

  5. Effects of Computer-Based Training on Procedural Modifications to Standard Functional Analyses

    Science.gov (United States)

    Schnell, Lauren K.; Sidener, Tina M.; DeBar, Ruth M.; Vladescu, Jason C.; Kahng, SungWoo

    2018-01-01

    Few studies have evaluated methods for training decision-making when functional analysis data are undifferentiated. The current study evaluated computer-based training to teach 20 graduate students to arrange functional analysis conditions, analyze functional analysis data, and implement procedural modifications. Participants were exposed to…

  6. Functional requirements of computer systems for the U.S. Geological Survey, Water Resources Division, 1988-97

    Science.gov (United States)

    Hathaway, R.M.; McNellis, J.M.

    1989-01-01

    Investigating the occurrence, quantity, quality, distribution, and movement of the Nation's water resources is the principal mission of the U.S. Geological Survey's Water Resources Division. Reports of these investigations are published and available to the public. To accomplish this mission, the Division requires substantial computer technology to process, store, and analyze data from more than 57,000 hydrologic sites. The Division's computer resources are organized through the Distributed Information System Program Office that manages the nationwide network of computers. The contract that provides the major computer components for the Water Resources Division's Distributed Information System expires in 1991. Five work groups were organized to collect the information needed to procure a new generation of computer systems for the U.S. Geological Survey, Water Resources Division. Each group was assigned a major Division activity and asked to describe its functional requirements of computer systems for the next decade. The work groups and major activities are: (1) hydrologic information; (2) hydrologic applications; (3) geographic information systems; (4) reports and electronic publishing; and (5) administrative. The work groups identified 42 functions and described their functional requirements for 1988, 1992, and 1997. A few new functions, such as Decision Support Systems and Executive Information Systems, were identified, but most are the same as those performed today. Although the number of functions will remain about the same, steady growth in the size, complexity, and frequency of many functions is predicted for the next decade. No compensating increase in the Division's staff is anticipated during this period. To handle the increased workload and perform these functions, new approaches will be developed that use advanced computer technology. The advanced technology is required in a unified, tightly coupled system that will support all functions simultaneously.

  7. Fast and accurate computation of projected two-point functions

    Science.gov (United States)

    Grasshorn Gebhardt, Henry S.; Jeong, Donghui

    2018-01-01

    We present the two-point function from the fast and accurate spherical Bessel transformation (2-FAST) algorithm (our code is available at https://github.com/hsgg/twoFAST) for a fast and accurate computation of integrals involving one or two spherical Bessel functions. These types of integrals occur when projecting the galaxy power spectrum P(k) onto the configuration space, ξℓν(r), or spherical harmonic space, Cℓ(χ, χ'). First, we employ the FFTLog transformation of the power spectrum to divide the calculation into P(k)-dependent coefficients and P(k)-independent integrations of basis functions multiplied by spherical Bessel functions. We find analytical expressions for the latter integrals in terms of special functions, for which recursion provides a fast and accurate evaluation. The algorithm, therefore, circumvents direct integration of highly oscillating spherical Bessel functions.
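
As an illustration of the kind of integral this record concerns, the sketch below evaluates the ℓ = 0 projection ξ0(r) = (1/2π²) ∫ dk k² P(k) j0(kr) by direct quadrature for a toy Gaussian power spectrum with a known closed form. This is the brute-force baseline that 2-FAST-style algorithms are designed to avoid, not the 2-FAST algorithm itself; the toy P(k) and all names are illustrative assumptions.

```python
import math

def j0(x):
    """Spherical Bessel function of order zero: j0(x) = sin(x)/x, j0(0) = 1."""
    return 1.0 if x == 0.0 else math.sin(x) / x

def xi0(r, pk, kmax=10.0, n=4000):
    """Direct trapezoidal quadrature of
    xi_0(r) = 1/(2 pi^2) * int_0^inf dk k^2 P(k) j0(k r),
    i.e. the naive evaluation that fast transform methods sidestep."""
    h = kmax / n
    total = 0.0
    for i in range(n + 1):
        k = i * h
        w = 0.5 if i in (0, n) else 1.0  # trapezoidal endpoint weights
        total += w * k * k * pk(k) * j0(k * r)
    return total * h / (2.0 * math.pi ** 2)

# Toy spectrum with a closed form:
# P(k) = exp(-k^2)  =>  xi_0(r) = sqrt(pi)/(8 pi^2) * exp(-r^2/4)
pk = lambda k: math.exp(-k * k)
for r in (0.5, 1.0, 2.0):
    exact = math.sqrt(math.pi) / (8.0 * math.pi ** 2) * math.exp(-r * r / 4.0)
    print(r, xi0(r, pk), exact)
```

For a realistic, oscillatory P(k) and many r values this direct sum becomes expensive, which is exactly the cost the FFTLog-based decomposition removes.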

  8. A COMPUTATIONAL WORKBENCH ENVIRONMENT FOR VIRTUAL POWER PLANT SIMULATION

    Energy Technology Data Exchange (ETDEWEB)

    Mike Bockelie; Dave Swensen; Martin Denison; Adel Sarofim; Connie Senior

    2004-12-22

    This report describes the work effort to develop and demonstrate a software framework to support advanced process simulations to evaluate the performance of advanced power systems. Integrated into the framework are a broad range of models, analysis tools, and visualization methods that can be used for the plant evaluation. The framework provides a tightly integrated problem-solving environment, with plug-and-play functionality, and includes a hierarchy of models, ranging from fast running process models to detailed reacting CFD models. The framework places no inherent limitations on the type of physics that can be modeled, numerical techniques, or programming languages used to implement the equipment models, or the type or amount of data that can be exchanged between models. Tools are provided to analyze simulation results at multiple levels of detail, ranging from simple tabular outputs to advanced solution visualization methods. All models and tools communicate in a seamless manner. The framework can be coupled to other software frameworks that provide different modeling capabilities. Three software frameworks were developed during the course of the project. The first framework focused on simulating the performance of the DOE Low Emissions Boiler System Proof of Concept facility, an advanced pulverized-coal combustion-based power plant. The second framework targeted simulating the performance of an Integrated coal Gasification Combined Cycle - Fuel Cell Turbine (IGCC-FCT) plant configuration. The coal gasifier models included both CFD and process models for the commercially dominant systems. Interfacing models to the framework was performed using VES-Open, and tests were performed to demonstrate interfacing CAPE-Open compliant models to the framework. The IGCC-FCT framework was subsequently extended to support Virtual Engineering concepts in which plant configurations can be constructed and interrogated in a three-dimensional, user-centered, interactive…

  9. Renormalization group improved computation of correlation functions in theories with nontrivial phase diagram

    DEFF Research Database (Denmark)

    Codello, Alessandro; Tonero, Alberto

    2016-01-01

    We present a simple and consistent way to compute correlation functions in interacting theories with nontrivial phase diagram. As an example we show how to consistently compute the four-point function in three dimensional Z2-scalar theories. The idea is to perform the path integral by weighting...... the momentum modes that contribute to it according to their renormalization group (RG) relevance, i.e. we weight each mode according to the value of the running couplings at that scale. In this way, we are able to encode in a loop computation the information regarding the RG trajectory along which we...

  10. 76 FR 21908 - Statement of Organization, Functions, and Delegations of Authority

    Science.gov (United States)

    2011-04-19

    ... Public Engagement and the Office of Communications. The specific amendments to part F are described below... Communications (FCT): Office of Public Engagement (FCS) Serves as CMS' focal point for outreach to beneficiaries.... Coordinates stakeholder relations, community outreach, and public engagement with the CMS Regional Offices...

  11. Extended Krylov subspaces approximations of matrix functions. Application to computational electromagnetics

    Energy Technology Data Exchange (ETDEWEB)

    Druskin, V.; Lee, Ping [Schlumberger-Doll Research, Ridgefield, CT (United States); Knizhnerman, L. [Central Geophysical Expedition, Moscow (Russian Federation)

    1996-12-31

    There is now a growing interest in the area of using Krylov subspace approximations to compute the actions of matrix functions. The main application of this approach is the solution of ODE systems, obtained after discretization of partial differential equations by method of lines. In the event that the cost of computing the matrix inverse is relatively inexpensive, it is sometimes attractive to solve the ODE using the extended Krylov subspaces, originated by actions of both positive and negative matrix powers. Examples of such problems can be found frequently in computational electromagnetics.
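
The abstract mentions approximating actions of negative matrix powers within a Krylov subspace. A minimal, self-contained illustration of that general idea (a textbook sketch, not the extended-Krylov method of the paper) is the conjugate gradient method, which builds the solution of A x = b, i.e. the action A⁻¹b, inside the Krylov subspace span{b, Ab, A²b, …} for a symmetric positive definite A.

```python
def matvec(A, x):
    """Dense matrix-vector product for a list-of-lists matrix."""
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def cg(A, b, tol=1e-10, maxiter=200):
    """Conjugate gradients: approximates A^{-1} b for symmetric positive
    definite A using only matrix-vector products (a Krylov method)."""
    n = len(b)
    x = [0.0] * n
    r = b[:]          # residual b - A x
    p = r[:]          # search direction
    rs = dot(r, r)
    for _ in range(maxiter):
        Ap = matvec(A, p)
        alpha = rs / dot(p, Ap)
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * api for ri, api in zip(r, Ap)]
        rs_new = dot(r, r)
        if rs_new ** 0.5 < tol:
            break
        p = [ri + (rs_new / rs) * pi for ri, pi in zip(r, p)]
        rs = rs_new
    return x

# SPD test matrix: shifted discrete 1D Laplacian (5x5), as arises from
# method-of-lines discretizations like those mentioned in the abstract
n = 5
A = [[3.0 if i == j else (-1.0 if abs(i - j) == 1 else 0.0) for j in range(n)]
     for i in range(n)]
b = [1.0] * n
x = cg(A, b)
res = [bi - axi for bi, axi in zip(b, matvec(A, x))]
print(x, dot(res, res) ** 0.5)  # residual norm near machine precision
```

The point of extended Krylov subspaces is to enrich span{b, Ab, …} with vectors like A⁻¹b (computed as above) when the inverse action is cheap, accelerating convergence of matrix-function evaluations.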

  12. Efficient quantum algorithm for computing n-time correlation functions.

    Science.gov (United States)

    Pedernales, J S; Di Candia, R; Egusquiza, I L; Casanova, J; Solano, E

    2014-07-11

    We propose a method for computing n-time correlation functions of arbitrary spinorial, fermionic, and bosonic operators, consisting of an efficient quantum algorithm that encodes these correlations in an initially added ancillary qubit for probe and control tasks. For spinorial and fermionic systems, the reconstruction of arbitrary n-time correlation functions requires the measurement of two ancilla observables, while for bosonic variables time derivatives of the same observables are needed. Finally, we provide examples applicable to different quantum platforms in the frame of the linear response theory.

  13. Computing exact bundle compliance control charts via probability generating functions.

    Science.gov (United States)

    Chen, Binchao; Matis, Timothy; Benneyan, James

    2016-06-01

    Compliance with evidence-based practices, individually and in 'bundles', remains an important focus of healthcare quality improvement for many clinical conditions. The exact probability distribution of composite bundle compliance measures used to develop corresponding control charts and other statistical tests is based on a fairly large convolution whose direct calculation can be computationally prohibitive. Various series expansions and other approximation approaches have been proposed, each with computational and accuracy tradeoffs, especially in the tails. This same probability distribution also arises in other important healthcare applications, such as for risk-adjusted outcomes and bed demand prediction, with the same computational difficulties. As an alternative, we use probability generating functions to rapidly obtain exact results and illustrate the improved accuracy and detection over other methods. Numerical testing across a wide range of applications demonstrates the computational efficiency and accuracy of this approach.
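
The convolution referred to here is the distribution of a sum of independent, non-identically distributed Bernoulli indicators (a Poisson-binomial distribution). A minimal sketch of the probability-generating-function idea: expanding G(z) = ∏ᵢ(1 − pᵢ + pᵢz) by repeated polynomial multiplication yields the exact PMF; the bundle probabilities below are made up for illustration.

```python
def poisson_binomial_pmf(probs):
    """Exact PMF of S = sum of independent Bernoulli(p_i), obtained by
    expanding the probability generating function
    G(z) = prod_i (1 - p_i + p_i z); coeffs[k] = P(S = k)."""
    coeffs = [1.0]
    for p in probs:
        nxt = [0.0] * (len(coeffs) + 1)
        for k, c in enumerate(coeffs):
            nxt[k] += c * (1.0 - p)   # this element non-compliant
            nxt[k + 1] += c * p       # this element compliant
        coeffs = nxt
    return coeffs

# Example: a 4-element care bundle with different compliance rates
pmf = poisson_binomial_pmf([0.9, 0.8, 0.95, 0.7])
print(pmf)        # P(0 compliant), ..., P(4 compliant)
print(sum(pmf))   # total probability (should be 1 up to rounding)
```

The exact tail probabilities read off this PMF are what a bundle-compliance control chart's limits would be built from.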

  14. The Computational Processing of Intonational Prominence: A Functional Prosody Perspective

    OpenAIRE

    Nakatani, Christine Hisayo

    1997-01-01

    Intonational prominence, or accent, is a fundamental prosodic feature that is said to contribute to discourse meaning. This thesis outlines a new, computational theory of the discourse interpretation of prominence, from a FUNCTIONAL PROSODY perspective. Functional prosody makes the following two important assumptions: first, there is an aspect of prominence interpretation that centrally concerns discourse processes, namely the discourse focusing nature of prominence; and second, the role of p...

  15. A Functional Specification for a Programming Language for Computer Aided Learning Applications.

    Science.gov (United States)

    National Research Council of Canada, Ottawa (Ontario).

    In 1972 there were at least six different course authoring languages in use in Canada with little exchange of course materials between Computer Assisted Learning (CAL) centers. In order to improve facilities for producing "transportable" computer based course materials, a working panel undertook the definition of functional requirements of a user…

  16. Construction of renormalized coefficient functions of the Feynman diagrams by means of a computer

    International Nuclear Information System (INIS)

    Tarasov, O.V.

    1978-01-01

    An algorithm and a short description of a computer program, written in SCHOONSCHIP, are given. The program constructs the integrands of renormalized coefficient functions of Feynman diagrams in scalar theories for an arbitrary subtraction point. For a given Feynman graph, the computer completely realizes the R-operation of Bogolubov-Parasjuk and gives the result as an integral over Feynman parameters. With the help of the program, the construction of a whole renormalized coefficient function takes approximately 30 s on the CDC-6500 computer.

  17. FUNCTIONING FEATURES OF COMPUTER TECHNOLOGY WHILE FORMING PRIMARY SCHOOLCHILDREN’S COMMUNICATIVE COMPETENCE

    Directory of Open Access Journals (Sweden)

    Olena Beskorsa

    2017-04-01

    Full Text Available The article addresses the functioning features of computer technology in forming primary schoolchildren's communicative competence, a problem whose relevance follows from the increasing role of a foreign language as a means of communication and from the modernization of foreign language education. Many publications by N. Biriukevych, O. Kolominova, O. Metolkina, O. Petrenko, V. Redko, and S. Roman are devoted to foreign language learning at primary school. Implementing innovative technologies, including computer technology, is intended to intensify the language learning process and to improve young learners' communicative skills. The aim of the article is to identify the functioning features of computer technology in forming primary schoolchildren's communicative competence. In this study, computer technology is defined as an information technology whose implementation may involve a computer as one of its tools, excluding the use of audio and video equipment, projectors, and other technical aids. Computer technology is realized through a number of tools, which fall into two main groups: electronic learning materials and computer testing software. An analysis of current textbooks and learning and methodological complexes shows that teachers prefer authentic electronic materials to national ones. The most accessible English learning materials are freely available on the Internet. The author describes several online English learning tools and the opportunities to use them in forming primary schoolchildren's communicative competence. Special attention is also paid to multimedia technology, its functioning features, and the structure of a multimedia lesson. Computer testing software provides tools for ongoing and final assessment of mastery of language material and communicative skills, and for self-assessment, in an interactive way. For making tests for assessing English skill…

  18. Computing many-body wave functions with guaranteed precision: the first-order Møller-Plesset wave function for the ground state of helium atom.

    Science.gov (United States)

    Bischoff, Florian A; Harrison, Robert J; Valeev, Edward F

    2012-09-14

    We present an approach to compute accurate correlation energies for atoms and molecules using an adaptive discontinuous spectral-element multiresolution representation for the two-electron wave function. Because of the exponential storage complexity of the spectral-element representation with the number of dimensions, a brute-force computation of two-electron (six-dimensional) wave functions with high precision was not practical. To overcome the key storage bottlenecks we utilized (1) a low-rank tensor approximation (specifically, the singular value decomposition) to compress the wave function, and (2) explicitly correlated R12-type terms in the wave function to regularize the Coulomb electron-electron singularities of the Hamiltonian. All operations necessary to solve the Schrödinger equation were expressed so that the reconstruction of the full-rank form of the wave function is never necessary. Numerical performance of the method was highlighted by computing the first-order Møller-Plesset wave function of a helium atom. The computed second-order Møller-Plesset energy is precise to ~2 microhartrees, which is at the precision limit of the existing general atomic-orbital-based approaches. Our approach does not assume special geometric symmetries, hence application to molecules is straightforward.

  19. Three-dimensional computed tomographic volumetry precisely predicts the postoperative pulmonary function.

    Science.gov (United States)

    Kobayashi, Keisuke; Saeki, Yusuke; Kitazawa, Shinsuke; Kobayashi, Naohiro; Kikuchi, Shinji; Goto, Yukinobu; Sakai, Mitsuaki; Sato, Yukio

    2017-11-01

    It is important to accurately predict the patient's postoperative pulmonary function. The aim of this study was to compare the accuracy of predictions of the postoperative residual pulmonary function obtained with three-dimensional computed tomographic (3D-CT) volumetry with that of predictions obtained with the conventional segment-counting method. Fifty-three patients scheduled to undergo lung cancer resection, pulmonary function tests, and computed tomography were enrolled in this study. The postoperative residual pulmonary function was predicted based on the segment-counting and 3D-CT volumetry methods. The predicted postoperative values were compared with the results of postoperative pulmonary function tests. Regarding the linear correlation coefficients between the predicted postoperative values and the measured values, those obtained using the 3D-CT volumetry method tended to be higher than those acquired using the segment-counting method. In addition, the variations between the predicted and measured values were smaller with the 3D-CT volumetry method than with the segment-counting method. These results were more obvious in COPD patients than in non-COPD patients. Our findings suggested that the 3D-CT volumetry was able to predict the residual pulmonary function more accurately than the segment-counting method, especially in patients with COPD. This method might lead to the selection of appropriate candidates for surgery among patients with a marginal pulmonary function.
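
The conventional segment-counting prediction referred to in this record is commonly computed by scaling the preoperative value by the fraction of the (usually 19) bronchopulmonary segments that remain after resection. A minimal sketch under that assumption (the function name and example values are illustrative, not from the study):

```python
def ppo_segment_counting(preop_value, resected_segments, total_segments=19):
    """Predicted postoperative pulmonary function by segment counting:
    scale the preoperative value by the fraction of (assumed equally
    functioning) segments remaining after resection."""
    if not 0 <= resected_segments <= total_segments:
        raise ValueError("resected_segments out of range")
    return preop_value * (1.0 - resected_segments / total_segments)

# Example: FEV1 of 2.5 L before a right upper lobectomy (3 segments)
print(ppo_segment_counting(2.5, 3))  # ≈ 2.11 L predicted
```

3D-CT volumetry replaces the equal-segments assumption with the measured volume fraction of resected lung, which is why it can outperform segment counting when disease (e.g. COPD) makes segments unequal.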

  20. Supporting executive functions during children's preliteracy learning with the computer

    NARCIS (Netherlands)

    Sande, E. van de; Segers, P.C.J.; Verhoeven, L.T.W.

    2016-01-01

    The present study examined how embedded activities to support executive functions helped children to benefit from a computer intervention that targeted preliteracy skills. Three intervention groups were compared on their preliteracy gains in a randomized controlled trial design: an experimental

  1. Computational Benchmarking for Ultrafast Electron Dynamics: Wave Function Methods vs Density Functional Theory.

    Science.gov (United States)

    Oliveira, Micael J T; Mignolet, Benoit; Kus, Tomasz; Papadopoulos, Theodoros A; Remacle, F; Verstraete, Matthieu J

    2015-05-12

    Attosecond electron dynamics in small- and medium-sized molecules, induced by an ultrashort strong optical pulse, is studied computationally for a frozen nuclear geometry. The importance of exchange and correlation effects on the nonequilibrium electron dynamics induced by the interaction of the molecule with the strong optical pulse is analyzed by comparing the solution of the time-dependent Schrödinger equation based on the correlated field-free stationary electronic states computed with the equation-of-motion coupled cluster singles and doubles and the complete active space multi-configurational self-consistent field methodologies on one hand, and various functionals in real-time time-dependent density functional theory (TD-DFT) on the other. We aim to evaluate the performance of the latter approach, which is very widely used for nonlinear absorption processes and whose computational cost has a more favorable scaling with the system size. We focus on LiH as a toy model for a nontrivial molecule and show that our conclusions carry over to larger molecules, exemplified by ABCU (C10H19N). The molecules are probed with IR and UV pulses whose intensities are not strong enough to significantly ionize the system. By comparing the evolution of the time-dependent field-free electronic dipole moment, as well as its Fourier power spectrum, we show that TD-DFT performs qualitatively well in most cases. Contrary to previous studies, we find almost no changes in the TD-DFT excitation energies when excited states are populated. Transitions between states of different symmetries are induced using pulses polarized in different directions. We observe that the performance of TD-DFT does not depend on the symmetry of the states involved in the transition.

  2. Combining computer modelling and cardiac imaging to understand right ventricular pump function.

    Science.gov (United States)

    Walmsley, John; van Everdingen, Wouter; Cramer, Maarten J; Prinzen, Frits W; Delhaas, Tammo; Lumens, Joost

    2017-10-01

    Right ventricular (RV) dysfunction is a strong predictor of outcome in heart failure and is a key determinant of exercise capacity. Despite these crucial findings, the RV remains understudied in the clinical, experimental, and computer modelling literature. This review outlines how recent advances in using computer modelling and cardiac imaging synergistically help to understand RV function in health and disease. We begin by highlighting the complexity of interactions that make modelling the RV both challenging and necessary, and then summarize the multiscale modelling approaches used to date to simulate RV pump function in the context of these interactions. We go on to demonstrate how these modelling approaches in combination with cardiac imaging have improved understanding of RV pump function in pulmonary arterial hypertension, arrhythmogenic right ventricular cardiomyopathy, dyssynchronous heart failure and cardiac resynchronization therapy, hypoplastic left heart syndrome, and repaired tetralogy of Fallot. We conclude with a perspective on key issues to be addressed by computational models of the RV in the near future. Published on behalf of the European Society of Cardiology. All rights reserved. © The Author 2017. For permissions, please email: journals.permissions@oup.com.

  3. A hybrid method for the parallel computation of Green's functions

    DEFF Research Database (Denmark)

    Petersen, Dan Erik; Li, Song; Stokbro, Kurt

    2009-01-01

    of the large number of times this calculation needs to be performed, this is computationally very expensive even on supercomputers. The classical approach is based on recurrence formulas which cannot be efficiently parallelized. This practically prevents the solution of large problems with hundreds...... of thousands of atoms. We propose new recurrences for a general class of sparse matrices to calculate Green's and lesser Green's function matrices which extend formulas derived by Takahashi and others. We show that these recurrences may lead to a dramatically reduced computational cost because they only...... require computing a small number of entries of the inverse matrix. Then. we propose a parallelization strategy for block tridiagonal matrices which involves a combination of Schur complement calculations and cyclic reduction. It achieves good scalability even on problems of modest size....

  4. Target localization on standard axial images in computed tomography (CT) stereotaxis for functional neurosurgery - a technical note

    International Nuclear Information System (INIS)

    Patil, A.-A.

    1986-01-01

    A simple technique for marking functional neurosurgery target on computed tomography (CT) axial image is described. This permits the use of standard axial image for computed tomography (CT) stereotaxis in functional neurosurgery. (Author)

  5. Applications of computed nuclear structure functions to inclusive scattering, R-ratios and their moments

    International Nuclear Information System (INIS)

    Rinat, A.S.

    2000-01-01

    We discuss applications of previously computed nuclear structure functions (SF) to inclusive cross sections, compare predictions with recent CEBAF data and perform two scaling tests. We mention that the large Q² plateau of scaling functions may only in part be due to the asymptotic limit of SF, which prevents the extraction of the nucleon momentum distribution in a model-independent way. We show that there may be sizable discrepancies between computed and semi-heuristic estimates of SF ratios. We compute ratios of moments of nuclear SF and show these to be in reasonable agreement with data. We speculate that an effective theory may underlie the model for the nuclear SF, which produces overall agreement with several observables. (author)

  6. Computer algebra in quantum field theory integration, summation and special functions

    CERN Document Server

    Schneider, Carsten

    2013-01-01

    The book focuses on advanced computer algebra methods and special functions that have striking applications in the context of quantum field theory. It presents the state of the art and new methods for (infinite) multiple sums, multiple integrals, in particular Feynman integrals, difference and differential equations in the format of survey articles. The presented techniques emerge from interdisciplinary fields: mathematics, computer science and theoretical physics; the articles are written by mathematicians and physicists with the goal that both groups can learn from the other field, including

  7. Computer Processing and Display of Positron Scintigrams and Dynamic Function Curves

    Energy Technology Data Exchange (ETDEWEB)

    Wilensky, S.; Ashare, A. B.; Pizer, S. M.; Hoop, B. Jr.; Brownell, G. L. [Massachusetts General Hospital, Boston, MA (United States)

    1969-01-15

    A computer processing and display system for handling radioisotope data is described. The system has been used to upgrade and display brain scans and to process dynamic function curves. The hardware and software are described, and results are presented. (author)

  8. A fast computation method for MUSIC spectrum function based on circular arrays

    Science.gov (United States)

    Du, Zhengdong; Wei, Ping

    2015-02-01

    The large computational cost of the multiple signal classification (MUSIC) spectrum function seriously affects the timeliness of direction-finding systems using the MUSIC algorithm, especially in two-dimensional direction-of-arrival (DOA) estimation of azimuth and elevation with a large antenna array. This paper proposes a fast computation method for the MUSIC spectrum that is suitable for any circular array. First, the circular array is transformed into a virtual uniform circular array; then, exploiting the cyclic structure of the steering vector, the inner products in the spatial spectrum calculation are realised by cyclic convolution. The computational cost of the MUSIC spectrum is substantially lower than that of the conventional method, making this a very practical way to compute the MUSIC spectrum for circular arrays.
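
A hedged sketch of the structural fact such a method can exploit: for a uniform circular array scanned at N equally spaced azimuths, the steering vector at scan angle k is a cyclic shift of a single template, so all N inner products with a fixed vector form one cyclic convolution, computable via the DFT. The array parameters below are illustrative assumptions, and a naive O(N²) DFT stands in for an FFT.

```python
import cmath, math

def dft(x):
    n = len(x)
    return [sum(x[m] * cmath.exp(-2j * math.pi * k * m / n) for m in range(n))
            for k in range(n)]

def idft(X):
    n = len(X)
    return [sum(X[k] * cmath.exp(2j * math.pi * k * m / n) for k in range(n)) / n
            for m in range(n)]

N = 8          # elements of the uniform circular array (illustrative)
kappa = 2.0    # 2*pi*R/lambda, assumed array radius over wavelength
# Element m sits at azimuth 2*pi*m/N, so for scan angle theta_k = 2*pi*k/N
# the steering component a_m(theta_k) = exp(j*kappa*cos(2*pi*(k-m)/N))
# depends only on (k - m) mod N -- a cyclic shift of one template c:
c = [cmath.exp(1j * kappa * math.cos(2 * math.pi * d / N)) for d in range(N)]

# A fixed vector v (e.g. a column of the MUSIC noise subspace), made up here
v = [complex(math.sin(m + 1), math.cos(2 * m)) for m in range(N)]

# Direct scan: s_k = sum_m conj(a_m(theta_k)) v_m  (N separate inner products)
direct = [sum(c[(k - m) % N].conjugate() * v[m] for m in range(N))
          for k in range(N)]

# Same scan as ONE cyclic convolution: s = IDFT(DFT(h) * DFT(v)), h = conj(c)
h = [cd.conjugate() for cd in c]
H, V = dft(h), dft(v)
fast = idft([Hk * Vk for Hk, Vk in zip(H, V)])

print(max(abs(d - f) for d, f in zip(direct, fast)))  # agreement to rounding
```

With an FFT in place of the naive DFT, the whole azimuth scan costs O(N log N) instead of O(N²), which is the source of the speed-up the abstract claims.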

  9. International assessment of functional computer abilities

    OpenAIRE

    Anderson, Ronald E.; Collis, Betty

    1993-01-01

    After delineating the major rationale for computer education, data are presented from Stage 1 of the IEA Computers in Education Study showing international comparisons that may reflect differential priorities. Rapid technological change and the lack of consensus on goals of computer education impede the establishment of stable curricula for "general computer education" or computer literacy. In this context the construction of instruments for student assessment remains a challenge. Seeking to...

  10. Fast Computation of the Two-Point Correlation Function in the Age of Big Data

    Science.gov (United States)

    Pellegrino, Andrew; Timlin, John

    2018-01-01

    We present a new code which quickly computes the two-point correlation function for large sets of astronomical data. This code combines the ease of use of Python with the speed of parallel shared libraries written in C. We include the capability to compute the auto- and cross-correlation statistics, and allow the user to calculate the three-dimensional and angular correlation functions. Additionally, the code automatically divides the user-provided sky masks into contiguous subsamples of similar size, using the HEALPix pixelization scheme, for the purpose of resampling. Errors are computed using jackknife and bootstrap resampling in a way that adds negligible extra runtime, even with many subsamples. We demonstrate comparable speed with other clustering codes, and code accuracy compared to known and analytic results.
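
At the core of any such clustering code is pair counting: histogramming the separations of all point pairs (the DD counts that enter estimators such as Landy-Szalay). A minimal pure-Python sketch with made-up data, far slower than a tree- or grid-accelerated production code but showing the basic operation:

```python
import math, random

def pair_counts(points, edges):
    """Histogram of pairwise separations -- the DD counts at the heart
    of a two-point correlation function estimator."""
    counts = [0] * (len(edges) - 1)
    n = len(points)
    for i in range(n):
        for j in range(i + 1, n):
            r = math.dist(points[i], points[j])
            for b in range(len(edges) - 1):
                if edges[b] <= r < edges[b + 1]:
                    counts[b] += 1
                    break
    return counts

random.seed(42)
pts = [(random.random(), random.random(), random.random()) for _ in range(200)]
edges = [0.0, 0.25, 0.5, 0.75, 1.0, 2.0]   # separation bins (unit box)
dd = pair_counts(pts, edges)
print(dd, sum(dd))  # every pair lands in exactly one bin: sum == 200*199/2
```

An estimator such as ξ = DD/RR − 1 then compares these counts with the counts RR from a matching random catalogue, and jackknife or bootstrap errors come from repeating the estimate on sky subsamples.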

  11. A new algorithm to compute conjectured supply function equilibrium in electricity markets

    International Nuclear Information System (INIS)

    Diaz, Cristian A.; Villar, Jose; Campos, Fco Alberto; Rodriguez, M. Angel

    2011-01-01

    Several types of market equilibria approaches, such as Cournot, Conjectural Variation (CVE), Supply Function (SFE) or Conjectured Supply Function (CSFE) have been used to model electricity markets for the medium and long term. Among them, CSFE has been proposed as a generalization of the classic Cournot. It computes the equilibrium considering the reaction of the competitors against changes in their strategy, combining several characteristics of both CVE and SFE. Unlike linear SFE approaches, strategies are linearized only at the equilibrium point, using their first-order Taylor approximation. But to solve CSFE, the slope or the intercept of the linear approximations must be given, which has been proved to be very restrictive. This paper proposes a new algorithm to compute CSFE. Unlike previous approaches, the main contribution is that the competitors' strategies for each generator are initially unknown (both slope and intercept) and endogenously computed by this new iterative algorithm. To show the applicability of the proposed approach, it has been applied to several case examples where its qualitative behavior has been analyzed in detail. (author)
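
As a toy version of such an iterative equilibrium computation, the sketch below runs best-response (fixed-point) iteration for the classic Cournot duopoly, which the abstract describes as a special case of CSFE (zero conjectured supply-function slopes). The linear demand and cost parameters are made up, and this is not the paper's algorithm, only an illustration of the fixed-point flavor.

```python
def cournot_equilibrium(a, b, costs, iters=200):
    """Simultaneous best-response iteration for a Cournot oligopoly with
    inverse demand p = a - b*sum(q) and constant marginal costs.
    Firm i's best response: q_i = (a - c_i - b*sum_{j != i} q_j) / (2b)."""
    q = [0.0] * len(costs)
    for _ in range(iters):
        q = [max(0.0, (a - c - b * sum(q[j] for j in range(len(q)) if j != i))
                 / (2.0 * b))
             for i, c in enumerate(costs)]   # all updates use the old q
    return q

# Duopoly example: demand p = 100 - (q1 + q2), marginal costs 10 and 20.
# Analytic Nash equilibrium: q_i* = (a - 2*c_i + c_j) / (3*b).
q = cournot_equilibrium(100.0, 1.0, [10.0, 20.0])
print(q)  # ≈ [33.33, 23.33]
```

For a linear duopoly, the best-response map contracts (eigenvalues ±1/2), so the iteration converges geometrically; the CSFE algorithm of the paper additionally updates the conjectured slopes and intercepts endogenously at each step.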

  12. Using computational models to relate structural and functional brain connectivity

    Czech Academy of Sciences Publication Activity Database

    Hlinka, Jaroslav; Coombes, S.

    2012-01-01

    Roč. 36, č. 2 (2012), s. 2137-2145 ISSN 0953-816X R&D Projects: GA MŠk 7E08027 EU Projects: European Commission(XE) 200728 - BRAINSYNC Institutional research plan: CEZ:AV0Z10300504 Keywords : brain disease * computational modelling * functional connectivity * graph theory * structural connectivity Subject RIV: FH - Neurology Impact factor: 3.753, year: 2012

  13. Computation of Value Functions in Nonlinear Differential Games with State Constraints

    KAUST Repository

    Botkin, Nikolai

    2013-01-01

    Finite-difference schemes for the computation of value functions of nonlinear differential games with non-terminal payoff functional and state constraints are proposed. The solution method is based on the fact that the value function is a generalized viscosity solution of the corresponding Hamilton-Jacobi-Bellman-Isaacs equation. Such a viscosity solution is defined as a function satisfying differential inequalities introduced by M. G. Crandall and P. L. Lions. The difference from the classical case is that these inequalities hold on a subset of the state space that is not known in advance. The convergence rate of the numerical schemes is given. A numerical solution to a non-trivial three-dimensional example is presented. © 2013 IFIP International Federation for Information Processing.
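
A minimal illustration of a finite-difference scheme for an Isaacs-type equation, not the authors' method: a 1D game with dynamics ẋ = a + b, |a| ≤ 1 (minimizer), |b| ≤ 1/2 (maximizer), and terminal payoff |x|, solved with an explicit Lax-Friedrichs scheme. This toy problem has a closed-form value, which makes the scheme easy to check.

```python
import numpy as np

# In time-to-go tau = T - t the Isaacs equation for this game is
#     V_tau + |V_x| / 2 = 0,   V(tau=0, x) = |x|,
# with closed-form value V(tau, x) = max(|x| - tau/2, 0): the stronger
# minimizer shrinks the payoff at net rate 1/2.
G = lambda p: 0.5 * np.abs(p)      # Hamiltonian in time-to-go form
c = 0.5                            # Lax-Friedrichs dissipation >= max |G'|

x = np.linspace(-2.0, 2.0, 401)
dx = x[1] - x[0]
T = 1.0
dt = 0.5 * dx                      # CFL: c * dt / dx = 0.25
steps = int(round(T / dt))
dt = T / steps

V = np.abs(x)                      # payoff at tau = 0
for _ in range(steps):
    pm = np.empty_like(V); pp = np.empty_like(V)
    pm[1:] = np.diff(V) / dx; pm[0] = pm[1]       # backward differences
    pp[:-1] = np.diff(V) / dx; pp[-1] = pp[-2]    # forward differences
    # Lax-Friedrichs numerical Hamiltonian: central value plus dissipation
    V = V - dt * (G(0.5 * (pm + pp)) - 0.5 * c * (pp - pm))

exact = np.maximum(np.abs(x) - 0.5 * T, 0.0)
```

The dissipation term smears the kinks of the exact value slightly, the usual price of a first-order monotone scheme; the convergence-rate analysis in the paper quantifies exactly this kind of error.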

  14. Renal parenchyma thickness: a rapid estimation of renal function on computed tomography

    International Nuclear Information System (INIS)

    Kaplon, Daniel M.; Lasser, Michael S.; Sigman, Mark; Haleblian, George E.; Pareek, Gyan

    2009-01-01

    Purpose: To define the relationship between renal parenchyma thickness (RPT) on computed tomography and renal function on nuclear renography in chronically obstructed renal units (ORUs) and to define a minimal thickness ratio associated with adequate function. Materials and Methods: Twenty-eight consecutive patients undergoing both nuclear renography and CT during a six-month period between 2004 and 2006 were included. All patients with a diagnosis of unilateral obstruction were included for analysis. RPT was measured in the following manner: the parenchyma thickness at three discrete levels of each kidney was measured using calipers on a CT workstation. The mean of these three measurements was defined as RPT. The renal parenchyma thickness ratio of the ORUs and non-obstructed renal units (NORUs) was calculated and compared to the observed function on MAG-3 Lasix renogram. Results: A total of 28 patients were evaluated. Mean parenchyma thickness was 1.82 cm and 2.25 cm in the ORUs and NORUs, respectively. The mean relative renal function of ORUs was 39%. Linear regression analysis comparing renogram function to RPT ratio revealed a correlation coefficient of 0.48. A thickness ratio of 0.68 correlated with 20% renal function. Conclusion: RPT on computed tomography appears to be a powerful predictor of relative renal function in ORUs. Assessment of RPT is a useful and readily available clinical tool for surgical decision making (renal salvage therapy versus nephrectomy) in patients with ORUs. (author)
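
The measurement protocol above reduces to simple arithmetic; the caliper readings below are made-up values, not data from the study.

```python
def rpt(readings_cm):
    """Renal parenchyma thickness: mean of three caliper measurements (cm)."""
    return sum(readings_cm) / len(readings_cm)

# hypothetical caliper readings at three discrete levels of each kidney
oru = rpt([1.70, 1.90, 1.85])     # obstructed renal unit
noru = rpt([2.20, 2.30, 2.25])    # non-obstructed renal unit
ratio = oru / noru                # the RPT ratio compared against renogram function
```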

  15. EDF: Computing electron number probability distribution functions in real space from molecular wave functions

    Science.gov (United States)

    Francisco, E.; Pendás, A. Martín; Blanco, M. A.

    2008-04-01

    Given an N-electron molecule and an exhaustive partition of the real space (R) into m arbitrary regions Ω_1, Ω_2, …, Ω_m (⋃_{i=1}^{m} Ω_i = R), the edf program computes all the probabilities P(n_1, n_2, …, n_m) of having exactly n_1 electrons in Ω_1, n_2 electrons in Ω_2, …, and n_m electrons (n_1 + n_2 + ⋯ + n_m = N) in Ω_m. Each Ω_i may correspond to a single basin (atomic domain) or several such basins (functional group). In the latter case, each atomic domain must belong to a single Ω_i. The program can manage both single- and multi-determinant wave functions which are read in from an aimpac-like wave function description (.wfn) file (T.A. Keith et al., The AIMPAC95 programs, http://www.chemistry.mcmaster.ca/aimpac, 1995). For multi-determinantal wave functions a generalization of the original .wfn file has been introduced. The new format is completely backwards compatible, adding to the previous structure a description of the configuration interaction (CI) coefficients and the determinants of correlated wave functions. Besides the .wfn file, edf only needs the overlap integrals over all the atomic domains between the molecular orbitals (MO). After the P(n_1, n_2, …, n_m) probabilities are computed, edf obtains from them several magnitudes relevant to chemical bonding theory, such as average electronic populations and localization/delocalization indices. Regarding spin, edf may be used in two ways: with or without a splitting of the P(n_1, n_2, …, n_m) probabilities into α and β spin components. Program summary. Program title: edf. Catalogue identifier: AEAJ_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEAJ_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 5387. No. of bytes in distributed program, including test data, etc.: 52 381. Distribution format: tar.gz. Programming language: Fortran 77. Computer
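
The actual edf program needs MO overlap integrals over the domains; under the much cruder illustrative assumption of N independent electrons, each found in region Ω_i with a fixed probability p_i, the distribution P(n_1, …, n_m) reduces to a multinomial, which this sketch enumerates.

```python
from math import factorial

def compositions(n, m):
    """All ways to place n electrons into m regions (ordered occupation numbers)."""
    if m == 1:
        yield (n,)
        return
    for k in range(n + 1):
        for rest in compositions(n - k, m - 1):
            yield (k,) + rest

def edf_independent(N, p):
    """P(n1,...,nm) for N independent electrons with region probabilities p."""
    table = {}
    for ns in compositions(N, len(p)):
        coef = factorial(N)
        for n_i in ns:
            coef //= factorial(n_i)          # multinomial coefficient
        prob = float(coef)
        for n_i, p_i in zip(ns, p):
            prob *= p_i ** n_i
        table[ns] = prob
    return table

# toy case: 2 electrons, two equal regions -> P(1,1) = 1/2
probs = edf_independent(2, [0.5, 0.5])
```

In the real program the probabilities are not multinomial, because electrons are correlated and antisymmetrized; that is precisely what the domain overlap integrals encode.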

  16. Structure, function, and behaviour of computational models in systems biology.

    Science.gov (United States)

    Knüpfer, Christian; Beckstein, Clemens; Dittrich, Peter; Le Novère, Nicolas

    2013-05-31

    Systems Biology develops computational models in order to understand biological phenomena. The increasing number and complexity of such "bio-models" necessitate computer support for the overall modelling task. Computer-aided modelling has to be based on a formal semantic description of bio-models. But, even if computational bio-models themselves are represented precisely in terms of mathematical expressions their full meaning is not yet formally specified and only described in natural language. We present a conceptual framework - the meaning facets - which can be used to rigorously specify the semantics of bio-models. A bio-model has a dual interpretation: On the one hand it is a mathematical expression which can be used in computational simulations (intrinsic meaning). On the other hand the model is related to the biological reality (extrinsic meaning). We show that in both cases this interpretation should be performed from three perspectives: the meaning of the model's components (structure), the meaning of the model's intended use (function), and the meaning of the model's dynamics (behaviour). In order to demonstrate the strengths of the meaning facets framework we apply it to two semantically related models of the cell cycle. Thereby, we make use of existing approaches for computer representation of bio-models as much as possible and sketch the missing pieces. The meaning facets framework provides a systematic in-depth approach to the semantics of bio-models. It can serve two important purposes: First, it specifies and structures the information which biologists have to take into account if they build, use and exchange models. Secondly, because it can be formalised, the framework is a solid foundation for any sort of computer support in bio-modelling. The proposed conceptual framework establishes a new methodology for modelling in Systems Biology and constitutes a basis for computer-aided collaborative research.

  17. Onset of magnetic interface exchange interactions in epitaxially grown Mn-Co(001)

    NARCIS (Netherlands)

    Kohlhepp, J.T.; Wieldraaijer, H.; Jonge, de W.J.M.

    2007-01-01

    Manganese (Mn) grows in a metastable expanded (c/a > 1) face-centered-tetragonal (fct) phase on thin fct-Co(001) template films. A layer-by-layer growth mode is observed for small Mn thicknesses. Antiferromagnetism (AFM) of fct-Mn is evidenced by the observation of shifted magnetization loops

  18. On algorithmic equivalence of instruction sequences for computing bit string functions

    NARCIS (Netherlands)

    Bergstra, J.A.; Middelburg, C.A.

    2015-01-01

    Every partial function from bit strings of a given length to bit strings of a possibly different given length can be computed by a finite instruction sequence that contains only instructions to set and get the content of Boolean registers, forward jump instructions, and a termination instruction. We

  19. On algorithmic equivalence of instruction sequences for computing bit string functions

    NARCIS (Netherlands)

    Bergstra, J.A.; Middelburg, C.A.

    2014-01-01

    Every partial function from bit strings of a given length to bit strings of a possibly different given length can be computed by a finite instruction sequence that contains only instructions to set and get the content of Boolean registers, forward jump instructions, and a termination instruction. We

  20. Recent progress in orbital-free density functional theory (recent advances in computational chemistry)

    CERN Document Server

    Wesolowski, Tomasz A

    2013-01-01

    This is a comprehensive overview of state-of-the-art computational methods based on orbital-free formulation of density functional theory completed by the most recent developments concerning the exact properties, approximations, and interpretations of the relevant quantities in density functional theory. The book is a compilation of contributions stemming from a series of workshops which had been taking place since 2002. It not only chronicles many of the latest developments but also summarises some of the more significant ones. The chapters are mainly reviews of sub-domains but also include original research. Readership: Graduate students, academics and researchers in computational chemistry. Atomic & molecular physicists, theoretical physicists, theoretical chemists, physical chemists and chemical physicists.

  1. International assessment of functional computer abilities

    NARCIS (Netherlands)

    Anderson, Ronald E.; Collis, Betty

    1993-01-01

    After delineating the major rationale for computer education, data are presented from Stage 1 of the IEA Computers in Education Study showing international comparisons that may reflect differential priorities. Rapid technological change and the lack of consensus on goals of computer education

  2. Functioning strategy study on control systems of large physical installations used with a digital computer

    International Nuclear Information System (INIS)

    Bel'man, L.B.; Lavrikov, S.A.; Lenskij, O.D.

    1975-01-01

    A criterion is proposed for evaluating the efficiency of control-system functioning for large physical installations operated with a control computer. The criteria are the object utilization factor and the computer load factor. Different strategies of control-system functioning are described, and their comparative analysis is made. A choice of such important parameters as sampling time and parameter correction time is made. A single factor to evaluate system functioning efficiency is introduced, and its dependence on the sampling-interval value is given. Using the attached diagrams, it is easy to find the optimum value of the sampling interval and the corresponding maximum value of the proposed single efficiency factor

  3. Conical : An extended module for computing a numerically satisfactory pair of solutions of the differential equation for conical functions

    NARCIS (Netherlands)

    T.M. Dunster (Mark); A. Gil (Amparo); J. Segura (Javier); N.M. Temme (Nico)

    2017-01-01

    Conical functions appear in a large number of applications in physics and engineering. In this paper we describe an extension of our module Conical (Gil et al., 2012) for the computation of conical functions. Specifically, the module now includes a routine for computing the function

  4. Performance of a computer-based assessment of cognitive function measures in two cohorts of seniors

    Science.gov (United States)

    Computer-administered assessment of cognitive function is being increasingly incorporated in clinical trials, however its performance in these settings has not been systematically evaluated. The Seniors Health and Activity Research Program (SHARP) pilot trial (N=73) developed a computer-based tool f...

  5. Encoding neural and synaptic functionalities in electron spin: A pathway to efficient neuromorphic computing

    Science.gov (United States)

    Sengupta, Abhronil; Roy, Kaushik

    2017-12-01

    Present day computers expend orders of magnitude more computational resources to perform various cognitive and perception related tasks that humans routinely perform every day. This has recently resulted in a seismic shift in the field of computation where research efforts are being directed to develop a neurocomputer that attempts to mimic the human brain by nanoelectronic components and thereby harness its efficiency in recognition problems. Bridging the gap between neuroscience and nanoelectronics, this paper attempts to provide a review of the recent developments in the field of spintronic device based neuromorphic computing. Description of various spin-transfer torque mechanisms that can be potentially utilized for realizing device structures mimicking neural and synaptic functionalities is provided. A cross-layer perspective extending from the device to the circuit and system level is presented to envision the design of an All-Spin neuromorphic processor enabled with on-chip learning functionalities. Device-circuit-algorithm co-simulation framework calibrated to experimental results suggest that such All-Spin neuromorphic systems can potentially achieve almost two orders of magnitude energy improvement in comparison to state-of-the-art CMOS implementations.

  6. On the Computation and Applications of Bessel Functions with Pure Imaginary Indices

    OpenAIRE

    Matyshev, A. A.; Fohtung, E.

    2009-01-01

    Bessel functions with pure imaginary index (order) play an important role in corpuscular optics where they govern the dynamics of charged particles in isotrajectory quadrupoles. Recently they were found to be of great importance in semiconductor material characterization as they are manifested in the strain state of crystalline material. A new algorithm which can be used for the computation of the normal and modified Bessel functions with pure imaginary index is proposed. The developed algorit...

  7. Functional computed tomography imaging of tumor-induced angiogenesis. Preliminary results of new tracer kinetic modeling using a computer discretization approach

    International Nuclear Information System (INIS)

    Kaneoya, Katsuhiko; Ueda, Takuya; Suito, Hiroshi

    2008-01-01

    The aim of this study was to establish functional computed tomography (CT) imaging as a method for assessing tumor-induced angiogenesis. Functional CT imaging was mathematically analyzed for 14 renal cell carcinomas by means of two-compartment modeling using a computer-discretization approach. The model incorporated diffusible kinetics of contrast medium including leakage from the capillary to the extravascular compartment and back-flux to the capillary compartment. The correlations between functional CT parameters [relative blood volume (rbv), permeability 1 (Pm1), and permeability 2 (Pm2)] and histopathological markers of angiogenesis [microvessel density (MVD) and vascular endothelial growth factor (VEGF)] were statistically analyzed. The modeling was successfully performed, showing similarity between the mathematically simulated curve and the measured time-density curve. There were significant linear correlations between MVD grade and Pm1 (r=0.841, P=0.001) and between VEGF grade and Pm2 (r=0.804, P=0.005) by Pearson's correlation coefficient. This method may be a useful tool for the assessment of tumor-induced angiogenesis. (author)

  8. Evaluation of left ventricular function and volume with multidetector-row computed tomography. Comparison with electrocardiogram-gated single photon emission computed tomography

    International Nuclear Information System (INIS)

    Suzuki, Takeya; Yamashina, Shohei; Nanjou, Shuji; Yamazaki, Junichi

    2007-01-01

    This study compared left ventricular systolic function and volume determined by multidetector-row computed tomography (MDCT) and electrocardiogram-gated single photon emission computed tomography (G-SPECT). Thirty-seven patients with coronary artery disease and non-cardiovascular disease underwent MDCT. In this study, left ventricular ejection fraction (EF), left ventricular end-diastolic volume (EDV) and left ventricular end-systolic volume (ESV) were calculated using only two-phase imaging with MDCT. Left ventricular function and volume were compared using measurements from G-SPECT. We conducted MDCT and G-SPECT virtually simultaneously. Both the EF and ESV evaluated by MDCT closely correlated with G-SPECT (r=0.763 for EF). A high heart rate (>65 bpm) during MDCT significantly influenced the difference in EF calculated from MDCT and G-SPECT (P<0.05). Left ventricular function can be measured with MDCT as well as G-SPECT. However, a heart rate over 65 bpm during MDCT negatively affects the EF correlation between MDCT and G-SPECT. (author)
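
The volumetric quantities compared above are related by the standard definition EF = (EDV − ESV)/EDV; a one-line sketch with hypothetical volumes:

```python
def ejection_fraction(edv_ml, esv_ml):
    """Left ventricular ejection fraction (%) from end-diastolic and
    end-systolic volumes, EF = 100 * (EDV - ESV) / EDV."""
    return 100.0 * (edv_ml - esv_ml) / edv_ml

# hypothetical two-phase volumes (mL), not values from the study
ef = ejection_fraction(120.0, 48.0)
```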

  9. Computing single step operators of logic programming in radial basis function neural networks

    Science.gov (United States)

    Hamadneh, Nawaf; Sathasivam, Saratha; Choon, Ong Hong

    2014-07-01

    Logic programming is the process that leads from an original formulation of a computing problem to executable programs. A normal logic program consists of a finite set of clauses. A valuation I of logic programming is a mapping from ground atoms to false or true. The single step operator of any logic program is defined as a function (T_P: I → I). Logic programming is well-suited to building artificial intelligence systems. In this study, we established a new technique to compute the single step operators of logic programming in the radial basis function neural networks. To do that, we proposed a new technique to generate the training data sets of single step operators. The training data sets are used to build the neural networks. We used the recurrent radial basis function neural networks to reach the steady state (the fixed point of the operators). To improve the performance of the neural networks, we used the particle swarm optimization algorithm to train the networks.
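
Independent of the neural encoding, the single-step operator itself is easy to state. A sketch for a small propositional definite program follows; the example program is made up, and the RBF-network construction and PSO training from the paper are not reproduced here.

```python
def tp(program, interp):
    """Single-step operator T_P: heads of clauses whose bodies hold in interp."""
    return {head for head, body in program if body <= interp}

# program: q.    r :- q.    s :- q, r.   (clause = (head, set of body atoms))
program = [("q", frozenset()),
           ("r", frozenset({"q"})),
           ("s", frozenset({"q", "r"}))]

# iterate T_P from the empty interpretation up to its least fixed point
I = set()
while True:
    J = tp(program, I)
    if J == I:          # fixed point reached
        break
    I = J
```

The recurrent RBF network in the paper plays the role of this iteration: its steady state corresponds to the fixed point of T_P.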

  10. Computing single step operators of logic programming in radial basis function neural networks

    Energy Technology Data Exchange (ETDEWEB)

    Hamadneh, Nawaf; Sathasivam, Saratha; Choon, Ong Hong [School of Mathematical Sciences, Universiti Sains Malaysia, 11800 USM, Penang (Malaysia)

    2014-07-10

    Logic programming is the process that leads from an original formulation of a computing problem to executable programs. A normal logic program consists of a finite set of clauses. A valuation I of logic programming is a mapping from ground atoms to false or true. The single step operator of any logic program is defined as a function (T_P: I → I). Logic programming is well-suited to building artificial intelligence systems. In this study, we established a new technique to compute the single step operators of logic programming in the radial basis function neural networks. To do that, we proposed a new technique to generate the training data sets of single step operators. The training data sets are used to build the neural networks. We used the recurrent radial basis function neural networks to reach the steady state (the fixed point of the operators). To improve the performance of the neural networks, we used the particle swarm optimization algorithm to train the networks.

  11. Computing single step operators of logic programming in radial basis function neural networks

    International Nuclear Information System (INIS)

    Hamadneh, Nawaf; Sathasivam, Saratha; Choon, Ong Hong

    2014-01-01

    Logic programming is the process that leads from an original formulation of a computing problem to executable programs. A normal logic program consists of a finite set of clauses. A valuation I of logic programming is a mapping from ground atoms to false or true. The single step operator of any logic program is defined as a function (T_P: I → I). Logic programming is well-suited to building artificial intelligence systems. In this study, we established a new technique to compute the single step operators of logic programming in the radial basis function neural networks. To do that, we proposed a new technique to generate the training data sets of single step operators. The training data sets are used to build the neural networks. We used the recurrent radial basis function neural networks to reach the steady state (the fixed point of the operators). To improve the performance of the neural networks, we used the particle swarm optimization algorithm to train the networks

  12. Facilitating tolerance of delayed reinforcement during functional communication training.

    Science.gov (United States)

    Fisher, W W; Thompson, R H; Hagopian, L P; Bowman, L G; Krug, A

    2000-01-01

    Few clinical investigations have addressed the problem of delayed reinforcement. In this investigation, three individuals whose destructive behavior was maintained by positive reinforcement were treated using functional communication training (FCT) with extinction (EXT). Next, procedures used in the basic literature on delayed reinforcement and self-control (reinforcer delay fading, punishment of impulsive responding, and provision of an alternative activity during reinforcer delay) were used to teach participants to tolerate delayed reinforcement. With the first case, reinforcer delay fading alone was effective at maintaining low rates of destructive behavior while introducing delayed reinforcement. In the second case, the addition of a punishment component reduced destructive behavior to near-zero levels and facilitated reinforcer delay fading. With the third case, reinforcer delay fading was associated with increases in masturbation and head rolling, but prompting and praising the individual for completing work during the delay interval reduced all problem behaviors and facilitated reinforcer delay fading.

  13. Computational Fluid Dynamics Simulation of Combustion Instability in Solid Rocket Motor : Implementation of Pressure Coupled Response Function

    OpenAIRE

    S. Saha; D. Chakraborty

    2016-01-01

    Combustion instability in solid propellant rocket motor is numerically simulated by implementing propellant response function with quasi steady homogeneous one dimensional formulation. The convolution integral of propellant response with pressure history is implemented through a user defined function in commercial computational fluid dynamics software. The methodology is validated against literature reported motor test and other simulation results. Computed amplitude of pressure fluctuations ...

  14. Stochastic methods for uncertainty treatment of functional variables in computer codes: application to safety studies

    International Nuclear Information System (INIS)

    Nanty, Simon

    2015-01-01

    This work relates to the framework of uncertainty quantification for numerical simulators, and more precisely studies two industrial applications linked to the safety studies of nuclear plants. These two applications have several common features. The first one is that the computer code inputs are functional and scalar variables, the functional ones being dependent. The second feature is that the probability distribution of the functional variables is known only through a sample of their realizations. The third feature, relative to only one of the two applications, is the high computational cost of the code, which limits the number of possible simulations. The main objective of this work was to propose a complete methodology for the uncertainty analysis of numerical simulators for the two considered cases. First, we have proposed a methodology to quantify the uncertainties of dependent functional random variables from a sample of their realizations. This methodology makes it possible both to model the dependency between variables and to model their link to another variable, called a co-variate, which could be, for instance, the output of the considered code. Then, we have developed an adaptation of a visualization tool for functional data, which makes it possible to visualize simultaneously the uncertainties and features of dependent functional variables. Second, a method to perform the global sensitivity analysis of the codes used in the two studied cases has been proposed. In the case of a computationally demanding code, the direct use of quantitative global sensitivity analysis methods is intractable. To overcome this issue, the retained solution consists in building a surrogate model or metamodel, a fast-running model approximating the computationally expensive code. An optimized uniform sampling strategy for scalar and functional variables has been developed to build a learning basis for the metamodel. Finally, a new approximation approach for expensive codes with functional outputs has been

  15. Quantum computation and analysis of Wigner and Husimi functions: toward a quantum image treatment.

    Science.gov (United States)

    Terraneo, M; Georgeot, B; Shepelyansky, D L

    2005-06-01

    We study the efficiency of quantum algorithms which aim at obtaining phase-space distribution functions of quantum systems. Wigner and Husimi functions are considered. Different quantum algorithms are envisioned to build these functions, and compared with the classical computation. Different procedures to extract more efficiently information from the final wave function of these algorithms are studied, including coarse-grained measurements, amplitude amplification, and measure of wavelet-transformed wave function. The algorithms are analyzed and numerically tested on a complex quantum system showing different behavior depending on parameters: namely, the kicked rotator. The results for the Wigner function show in particular that the use of the quantum wavelet transform gives a polynomial gain over classical computation. For the Husimi distribution, the gain is much larger than for the Wigner function and is larger with the help of amplitude amplification and wavelet transforms. We discuss the generalization of these results to the simulation of other quantum systems. We also apply the same set of techniques to the analysis of real images. The results show that the use of the quantum wavelet transform allows one to lower dramatically the number of measurements needed, but at the cost of a large loss of information.

  16. Computing the effective action with the functional renormalization group

    Energy Technology Data Exchange (ETDEWEB)

    Codello, Alessandro [CP3-Origins and the Danish IAS University of Southern Denmark, Odense (Denmark); Percacci, Roberto [SISSA, Trieste (Italy); INFN, Sezione di Trieste, Trieste (Italy); Rachwal, Leslaw [Fudan University, Department of Physics, Center for Field Theory and Particle Physics, Shanghai (China); Tonero, Alberto [ICTP-SAIFR and IFT, Sao Paulo (Brazil)

    2016-04-15

    The "exact" or "functional" renormalization group equation describes the renormalization group flow of the effective average action Γ_k. The ordinary effective action Γ_0 can be obtained by integrating the flow equation from an ultraviolet scale k = Λ down to k = 0. We give several examples of such calculations at one loop, both in renormalizable and in effective field theories. We reproduce the four-point scattering amplitude in the case of a real scalar field theory with quartic potential and in the case of the pion chiral Lagrangian. In the case of gauge theories, we reproduce the vacuum polarization of QED and of Yang-Mills theory. We also compute the two-point functions for scalars and gravitons in the effective field theory of scalar fields minimally coupled to gravity. (orig.)
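
The flow equation referred to here is the standard Wetterich equation for the effective average action,

```latex
k\,\partial_k \Gamma_k \;=\; \frac{1}{2}\,
\operatorname{Tr}\!\Big[\big(\Gamma_k^{(2)} + R_k\big)^{-1}\, k\,\partial_k R_k\Big],
\qquad
\Gamma_0 \;=\; \lim_{k \to 0} \Gamma_k ,
```

where R_k is the infrared regulator and Γ_k^{(2)} the second functional derivative of Γ_k. At one loop, the order quoted in the abstract, the right-hand side is evaluated with the bare two-point function in place of Γ_k^{(2)}, and integrating the flow from k = Λ to k = 0 reproduces the familiar one-loop effective action.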

  17. Computational prediction of drug-drug interactions based on drugs functional similarities.

    Science.gov (United States)

    Ferdousi, Reza; Safdari, Reza; Omidi, Yadollah

    2017-06-01

    Therapeutic activities of drugs are often influenced by co-administration of drugs that may cause inevitable drug-drug interactions (DDIs) and inadvertent side effects. Prediction and identification of DDIs are extremely vital for patient safety and the success of treatment modalities. A number of computational methods have been employed for the prediction of DDIs based on drug structures and/or functions. Here, we report on a computational method for DDI prediction based on the functional similarity of drugs. The model was set up based on key biological elements including carriers, transporters, enzymes and targets (CTET). The model was applied to 2189 approved drugs. For each drug, all the associated CTETs were collected, and the corresponding binary vectors were constructed to determine the DDIs. Various similarity measures were conducted to detect DDIs. Of the examined similarity methods, the inner product-based similarity measures (IPSMs) were found to provide improved prediction values. Altogether, 2,394,766 potential drug-pair interactions were studied. The model was able to predict over 250,000 unknown potential DDIs. Based on our findings, we propose the current method as a robust, yet simple and fast, universal in silico approach for the identification of DDIs. We envision that this proposed method can be used as a practical technique for the detection of possible DDIs based on the functional similarities of drugs. Copyright © 2017. Published by Elsevier Inc.
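
The binary-vector similarity step described above can be sketched as follows; the CTET feature names and annotation vectors are invented for illustration and are not the paper's data.

```python
import numpy as np

# Hypothetical CTET (carrier/transporter/enzyme/target) indicator vectors:
# one bit per biological element a drug is annotated with.
features = ["CYP3A4", "CYP2D6", "P-gp", "OATP1B1", "5-HT2A"]
drug_a = np.array([1, 0, 1, 0, 0])   # illustrative, not real annotations
drug_b = np.array([1, 1, 1, 0, 0])

def inner_product(u, v):
    """Raw count of shared CTET elements (an inner product-based measure)."""
    return int(u @ v)

def cosine(u, v):
    """Length-normalized inner product in [0, 1] for binary vectors."""
    return float(u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))
```

Drug pairs whose similarity exceeds a calibrated threshold would then be flagged as candidate interactions.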

  18. Distribution of computer functionality for accelerator control at the Brookhaven AGS

    International Nuclear Information System (INIS)

    Stevens, A.; Clifford, T.; Frankel, R.

    1985-01-01

    A set of physical and functional system components and their interconnection protocols have been established for all controls work at the AGS. Portions of these designs were tested as part of enhanced operation of the AGS as a source of polarized protons and additional segments will be implemented during the continuing construction efforts which are adding heavy ion capability to our facility. Included in our efforts are the following computer and control system elements: a broad band local area network, which embodies MODEMS; transmission systems and branch interface units; a hierarchical layer, which performs certain data base and watchdog/alarm functions; a group of work station processors (Apollo's) which perform the function of traditional minicomputer host(s) and a layer, which provides both real time control and standardization functions for accelerator devices and instrumentation. Data base and other accelerator functionality is assigned to the most correct level within our network for both real time performance, long-term utility, and orderly growth

  19. Computing the Kummer function $U(a,b,z)$ for small values of the arguments

    NARCIS (Netherlands)

    A. Gil (Amparo); J. Segura (Javier); N.M. Temme (Nico)

    2015-01-01

    We describe methods for computing the Kummer function $U(a,b,z)$ for small values of $z$, with special attention to small values of $b$. For these values of $b$ the connection formula that represents $U(a,b,z)$ as a linear combination of two ${}_1F_1$-functions needs a limiting
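
The connection formula mentioned above is DLMF 13.2.42. The sketch below evaluates it with a truncated ₁F₁ series and cross-checks against the standard integral representation of U(a,b,z); the parameter values are arbitrary, and the formula degrades as b approaches an integer, which is precisely the case the authors' limiting procedures address.

```python
from math import exp, gamma

def hyp1f1(a, b, z, terms=60):
    """Truncated Kummer series M(a, b, z); adequate for small |z|."""
    s = t = 1.0
    for n in range(terms):
        t *= (a + n) * z / ((b + n) * (n + 1))
        s += t
    return s

def hyperu_connection(a, b, z):
    """U(a,b,z) via the 1F1 connection formula (DLMF 13.2.42);
    numerically unstable as b approaches an integer."""
    return (gamma(1.0 - b) / gamma(a - b + 1.0) * hyp1f1(a, b, z)
            + gamma(b - 1.0) / gamma(a) * z ** (1.0 - b)
              * hyp1f1(a - b + 1.0, 2.0 - b, z))

def hyperu_integral(a, b, z, n=200_000, tmax=60.0):
    """U(a,b,z) = (1/Gamma(a)) * int_0^inf e^(-zt) t^(a-1) (1+t)^(b-a-1) dt,
    valid for a > 0, z > 0 (midpoint rule on a truncated range)."""
    h = tmax / n
    s = 0.0
    for i in range(n):
        t = (i + 0.5) * h
        s += exp(-z * t) * t ** (a - 1.0) * (1.0 + t) ** (b - a - 1.0)
    return s * h / gamma(a)
```

The two routes agree for non-integer b; near integer b the two gamma prefactors blow up with cancelling divergences, which is why a dedicated limiting form is needed there.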

  20. Gaussian Radial Basis Function for Efficient Computation of Forest Indirect Illumination

    Science.gov (United States)

    Abbas, Fayçal; Babahenini, Mohamed Chaouki

    2018-06-01

    Global illumination of natural scenes, such as forests, in real time is one of the most complex problems to solve because of the multiple inter-reflections between the light and the materials of the objects composing the scene. The major difficulty that arises is the computation of visibility. Visibility must be computed for the entire set of leaves visible from the center of a given leaf and, given the enormous number of leaves present in a tree, this computation is performed for each leaf of the tree, which reduces performance. We describe a new approach that approximates visibility queries and proceeds in two steps. The first step generates a point cloud representing the foliage; we assume that the point cloud is composed of two classes (visible, not visible) that are not linearly separable. The second step performs a point-cloud classification by applying the Gaussian radial basis function, which measures similarity in terms of distance between each leaf and a landmark leaf. This approximates the visibility queries and extracts the leaves used to calculate the amount of indirect illumination exchanged between neighboring leaves. Our approach efficiently treats the light exchanges in a forest scene, allows fast computation, and produces images of good visual quality, all while taking advantage of the immense computational power of the GPU.
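    The similarity step described above can be sketched with a Gaussian radial basis function scoring each leaf against a landmark leaf. The point-cloud layout, the width parameter gamma, and the threshold are illustrative assumptions, not the paper's values.

```python
# Gaussian RBF similarity between leaf positions and a landmark leaf.
import numpy as np

def gaussian_rbf(points, landmark, gamma=0.5):
    """phi(x) = exp(-gamma * ||x - landmark||^2): 1 at the landmark, -> 0 far away."""
    d2 = np.sum((points - landmark) ** 2, axis=1)
    return np.exp(-gamma * d2)

rng = np.random.default_rng(0)
leaves = rng.uniform(0.0, 10.0, size=(1000, 3))  # point cloud for the foliage
landmark = np.array([5.0, 5.0, 5.0])             # leaf receiving indirect light

scores = gaussian_rbf(leaves, landmark)
visible = leaves[scores > 0.1]                   # keep leaves deemed "visible"
# Only nearby leaves pass the threshold, pruning the visibility queries.
print(len(visible) < len(leaves))
```

    The retained subset is then the only set of leaves for which indirect-illumination exchange needs to be evaluated.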

  1. MRIVIEW: An interactive computational tool for investigation of brain structure and function

    International Nuclear Information System (INIS)

    Ranken, D.; George, J.

    1993-01-01

    MRIVIEW is a software system which uses image processing and visualization to provide neuroscience researchers with an integrated environment for combining functional and anatomical information. Key features of the software include semi-automated segmentation of volumetric head data and an interactive coordinate reconciliation method which utilizes surface visualization. The current system is a precursor to a computational brain atlas. We describe features this atlas will incorporate, including methods under development for visualizing brain functional data obtained from several different research modalities

  2. Functional magnetic resonance maps obtained by personal computer

    International Nuclear Information System (INIS)

    Gomez, F. j.; Manjon, J. V.; Robles, M.; Marti-Bonmati, L.; Dosda, R.; Molla, E.

    2001-01-01

    Functional magnetic resonance (fMR) is of special relevance in the analysis of certain types of brain activation. The present report describes the development of a simple software program for personal computers (PCs) that analyzes these images and provides functional activation maps. Activation maps are based on the temporal differences in oxyhemoglobin in tomographic images. To detect these differences, intensities registered repeatedly during brain rest and activation are compared. The experiments were performed with a 1.5-Tesla MR unit. To verify the reliability of the program, fMR studies were carried out in 4 healthy individuals (12 contiguous slices, 80 images per slice, one every 3.1 seconds, for a total of 960 images). All the images were transferred to a PC and processed pixel by pixel within each sequence to obtain an intensity/time curve. The statistical study of the results (Student's t-test and cross-correlation analysis) made it possible to establish the activation of each pixel. The images were prepared using spatial filtering, temporal filtering, baseline correction, normalization, and segmentation of the parenchyma. Postprocessing of the results involved the elimination of isolated pixels, superposition of an anatomical image of greater spatial resolution, and anti-aliasing. The application (Xfun 1.0, Valencia, Spain) was developed in Microsoft Visual C++ 5.0 Developer Studio for Windows NT Workstation. As a representative example, the program took 8.2 seconds to calculate and present the results of an entire study (12 functional maps). In the motor and visual activation experiments, activation was observed in regions proximal to the central sulcus of the hemisphere contralateral to the moving hand and in the occipital cortex. 
While programs that calculate activation maps are available, the development of software for PCs running Microsoft Windows ensures several key features for its use on a daily basis: it is easy
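    The per-pixel analysis described above can be sketched as follows: each pixel's intensity/time curve is correlated with an on/off task paradigm, and highly correlated pixels are marked active. The dimensions, paradigm, and threshold are illustrative assumptions, not those of the Xfun program.

```python
# Pixel-wise cross-correlation of a toy fMR time series with a block paradigm.
import numpy as np

rng = np.random.default_rng(1)
n_images, h, w = 80, 8, 8                       # 80 images per slice, toy 8x8 slice
paradigm = np.tile([0.0] * 10 + [1.0] * 10, 4)  # alternating rest/activation blocks

series = rng.normal(0.0, 1.0, size=(n_images, h, w))
series[:, 2, 3] += 3.0 * paradigm               # implant one "activated" pixel

# Pearson correlation of every pixel's time course with the paradigm
flat = series.reshape(n_images, -1)
flat_c = flat - flat.mean(axis=0)
par_c = paradigm - paradigm.mean()
r = (par_c @ flat_c) / (np.linalg.norm(par_c) * np.linalg.norm(flat_c, axis=0))
activation_map = (r > 0.5).reshape(h, w)        # threshold -> activation map

print(activation_map[2, 3])  # implanted pixel detected
```

    A real implementation would add the spatial/temporal filtering, baseline correction, and segmentation steps mentioned in the abstract before thresholding.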

  3. Computability and unsolvability

    CERN Document Server

    Davis, Martin

    1985-01-01

    ""A clearly written, well-presented survey of an intriguing subject."" - Scientific American. Classic text considers general theory of computability, computable functions, operations on computable functions, Turing machines self-applied, unsolvable decision problems, applications of general theory, mathematical logic, Kleene hierarchy, computable functionals, classification of unsolvable decision problems and more.

  4. On The Computation Of The Best-fit Okada-type Tsunami Source

    Science.gov (United States)

    Miranda, J. M. A.; Luis, J. M. F.; Baptista, M. A.

    2017-12-01

    The forward simulation of earthquake-induced tsunamis usually assumes that the initial sea surface elevation mimics the co-seismic deformation of the ocean bottom described by a simple "Okada-type" source (rectangular fault with constant slip in a homogeneous elastic half space). This approach is highly effective, in particular in far-field conditions. With this assumption, and given a set of tsunami waveforms recorded by deep-sea pressure sensors and/or coastal tide stations, it is possible to deduce the set of parameters of the Okada-type solution that best fits the sea level observations. To do this, we build a "space of possible tsunami sources" (solution space). Each solution consists of a combination of parameters: earthquake magnitude, length, width, slip, depth and angles - strike, rake, and dip. To constrain the number of possible solutions we use the earthquake parameters defined by seismology and establish a range of possible values for each parameter. We select the "best Okada source" by comparison of the results of direct tsunami modeling using the solution space of tsunami sources. However, direct tsunami modeling is a time-consuming process for the whole solution space. To overcome this problem, we use a precomputed database of Empirical Green Functions to compute the tsunami waveforms resulting from unit water sources and search for the one that best matches the observations. In this study, we use as a test case the Solomon Islands tsunami of 6 February 2013, caused by a magnitude 8.0 earthquake. The "best Okada" source is the solution that best matches the tsunami recorded at six DART stations in the area. We discuss the differences between the initial seismic solution and the final one obtained from tsunami data. This publication received funding from the FCT project UID/GEO/50019/2013 - Instituto Dom Luiz.
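    The search strategy described above can be sketched as follows: candidate waveforms are assembled from precomputed unit-source Green functions, and the candidate minimizing the misfit to the station records is kept. All arrays below are synthetic stand-ins, not real Solomon Islands data.

```python
# Grid search over candidate sources using precomputed unit-source responses.
import numpy as np

rng = np.random.default_rng(2)
n_sources, n_stations, n_times = 20, 6, 200

# greens[k] = waveform at the 6 stations from unit slip on elementary source k
greens = rng.normal(size=(n_sources, n_stations, n_times))

# Candidate solutions expressed as slip weights over the elementary sources
candidates = rng.uniform(0.0, 1.0, size=(50, n_sources))
true_weights = candidates[17]                    # pretend candidate 17 is the truth
observed = np.tensordot(true_weights, greens, axes=1)
observed += rng.normal(scale=0.01, size=observed.shape)  # measurement noise

def misfit(weights):
    """Least-squares misfit between modeled and observed waveforms."""
    model = np.tensordot(weights, greens, axes=1)
    return float(np.sum((model - observed) ** 2))

best = min(range(len(candidates)), key=lambda i: misfit(candidates[i]))
print(best)  # the candidate that best matches the six records
```

    The linearity of the Green-function superposition is what makes this search cheap compared with running a direct tsunami simulation per candidate.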

  5. Computing the functional proteome

    DEFF Research Database (Denmark)

    O'Brien, Edward J.; Palsson, Bernhard

    2015-01-01

    Constraint-based models enable the computation of feasible, optimal, and realized biological phenotypes from reaction network reconstructions and constraints on their operation. To date, stoichiometric reconstructions have largely focused on metabolism, resulting in genome-scale metabolic models (M...

  6. SAMARU JOURNAL

    African Journals Online (AJOL)

    USER

    teacher librarian's performance in FCT secondary schools should be maintained since it was found to increase their level of job ... could also be as a result of dissatisfaction among the staff. Job ... job satisfaction will affect the functioning and.

  7. Development and functional demonstration of a wireless intraoral inductive tongue computer interface for severely disabled persons.

    Science.gov (United States)

    N S Andreasen Struijk, Lotte; Lontis, Eugen R; Gaihede, Michael; Caltenco, Hector A; Lund, Morten Enemark; Schioeler, Henrik; Bentsen, Bo

    2017-08-01

    Individuals with tetraplegia depend on alternative interfaces in order to control computers and other electronic equipment. Current interfaces are often limited in the number of available control commands, and may compromise the social identity of an individual due to their undesirable appearance. The purpose of this study was to implement an alternative computer interface, which was fully embedded into the oral cavity and which provided multiple control commands. The development of a wireless, intraoral, inductive tongue computer is described. The interface encompassed a 10-key keypad area and a mouse pad area, and the system was embedded wirelessly into the oral cavity of the user. The functionality of the system was demonstrated in two tetraplegic individuals and two able-bodied individuals. Results: The system was invisible during use and allowed the user to type on a computer using either the keypad area or the mouse pad. The maximal typing rate was 1.8 s for repetitively typing a correct character with the keypad area and 1.4 s with the mouse pad area. The results suggest that this inductive tongue computer interface provides an esthetically acceptable and functionally efficient environmental control for a severely disabled user. Implications for rehabilitation: new design, implementation, and detection methods for intraoral assistive devices; demonstration of wireless powering and encapsulation techniques suitable for intraoral embedment of assistive devices; and demonstration of the functionality of a rechargeable and fully embedded intraoral tongue-controlled computer input device.

  8. Brookhaven Reactor Experiment Control Facility, a distributed function computer network

    International Nuclear Information System (INIS)

    Dimmler, D.G.; Greenlaw, N.; Kelley, M.A.; Potter, D.W.; Rankowitz, S.; Stubblefield, F.W.

    1975-11-01

    A computer network for real-time data acquisition, monitoring and control of a series of experiments at the Brookhaven High Flux Beam Reactor has been developed and has been set into routine operation. This reactor experiment control facility presently services nine neutron spectrometers and one x-ray diffractometer. Several additional experiment connections are in progress. The architecture of the facility is based on a distributed function network concept. A statement of implementation and results is presented

  9. Cognitive assessment of executive functions using brain computer interface and eye-tracking

    Directory of Open Access Journals (Sweden)

    P. Cipresso

    2013-03-01

    Full Text Available New technologies to enable augmentative and alternative communication in Amyotrophic Lateral Sclerosis (ALS) have recently been used in several studies. However, a comprehensive battery for cognitive assessment has not yet been implemented. Brain-computer interfaces are innovative systems able to generate a control signal from brain responses, conveying messages directly to a computer. Another available technology for communication purposes is the eye-tracker system, which conveys messages from eye movements to a computer. In this study we explored the use of these two technologies for the cognitive assessment of executive functions in a healthy population and in an ALS patient, also verifying usability, pleasantness, fatigue, and emotional aspects related to the setting. Our preliminary results may have interesting implications for both clinical practice (the availability of an effective tool for neuropsychological evaluation of ALS patients) and ethical issues.

  10. A Computational Model Quantifies the Effect of Anatomical Variability on Velopharyngeal Function

    Science.gov (United States)

    Inouye, Joshua M.; Perry, Jamie L.; Lin, Kant Y.; Blemker, Silvia S.

    2015-01-01

    Purpose: This study predicted the effects of velopharyngeal (VP) anatomical parameters on VP function to provide a greater understanding of speech mechanics and aid in the treatment of speech disorders. Method: We created a computational model of the VP mechanism using dimensions obtained from magnetic resonance imaging measurements of 10 healthy…

  11. Metaanalysis of Diagnostic Performance of Computed Coronary Tomography Angiography, Computed Tomography Perfusion and Computed Tomography-Fractional Flow Reserve in Functional Myocardial Ischemia Assessment versus Invasive Fractional Flow Reserve

    Science.gov (United States)

    Gonzalez, Jorge A.; Lipinski, Michael J.; Flors, Lucia F.; Shaw, Peter; Kramer, Christopher M.; Salerno, Michael

    2015-01-01

    We sought to compare the diagnostic performance of computed coronary tomography angiography (CCTA), computed tomography perfusion (CTP) and computed tomography fractional flow reserve (CT-FFR) for assessing the functional significance of coronary stenosis as defined by invasive fractional flow reserve (FFR), in patients with known or suspected coronary artery disease. CCTA has proven clinically useful for excluding obstructive CAD due to its high sensitivity and negative predictive value (NPV); however, the ability of CCTA to identify functionally significant CAD has remained challenging. We searched PubMed/Medline for studies evaluating CCTA, CTP or CT-FFR for the non-invasive detection of obstructive CAD as compared to catheter-derived FFR as the reference standard. Pooled sensitivity, specificity, PPV, NPV, likelihood ratios (LR) and odds ratios (OR) of all diagnostic tests were assessed. Eighteen studies involving a total of 1535 patients were included. CCTA demonstrated a pooled sensitivity of 0.92, specificity of 0.43, PPV of 0.56 and NPV of 0.87 on a per-patient level. CT-FFR and CTP increased the specificity to 0.72 and 0.77 respectively (P=0.004 and P=0.0009), resulting in higher point estimates for PPV of 0.70 and 0.83 respectively. There was no improvement in sensitivity. The CTP protocol involved more radiation (3.5 mSv for CCTA vs 9.6 mSv for CTP) and a higher volume of iodinated contrast (145 mL). In conclusion, CTP and CT-FFR improve the specificity of CCTA for detecting functionally significant stenosis as defined by invasive FFR on a per-patient level; both techniques could advance the ability to non-invasively detect the functional significance of coronary lesions. PMID:26347004
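    The per-patient metrics pooled in the meta-analysis above all derive from a standard 2x2 table against the FFR reference. The counts below are invented for demonstration (chosen so the CCTA-like sensitivity/specificity emerge), not the study data.

```python
# Sensitivity, specificity, PPV and NPV from a 2x2 diagnostic table.
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard accuracy measures against a reference standard (here, invasive FFR)."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    return sensitivity, specificity, ppv, npv

# Example: 100 FFR-positive and 100 FFR-negative patients assessed by a CCTA-like test
sens, spec, ppv, npv = diagnostic_metrics(tp=92, fp=57, fn=8, tn=43)
print(round(sens, 2), round(spec, 2))  # high sensitivity, low specificity
```

    This also makes the abstract's pattern concrete: raising specificity (fewer false positives) lifts PPV while sensitivity can stay unchanged.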

  12. Computer Modeling of the Earliest Cellular Structures and Functions

    Science.gov (United States)

    Pohorille, Andrew; Chipot, Christophe; Schweighofer, Karl

    2000-01-01

    In the absence of an extinct or extant record of protocells (the earliest ancestors of contemporary cells), the most direct way to test our understanding of the origin of cellular life is to construct laboratory models of protocells. Such efforts are currently underway in the NASA Astrobiology Program. They are accompanied by computational studies aimed at explaining the self-organization of simple molecules into ordered structures and at developing designs for molecules that perform proto-cellular functions. Many of these functions, such as the import of nutrients, the capture and storage of energy, and the response to changes in the environment, are carried out by proteins bound to membranes. We have investigated (a) how peptides organize at water-membrane interfaces and insert into membranes, (b) how these peptides aggregate to form membrane-spanning structures (e.g., channels), and (c) by what mechanisms such aggregates perform essential proto-cellular functions, such as the transport of protons across cell walls, a key step in cellular bioenergetics. The simulations were performed using the molecular dynamics method, in which Newton's equations of motion for each atom in the system are solved iteratively. The problems of interest required simulations on multi-nanosecond time scales, which corresponded to 10^6-10^8 time steps.
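    The iterative integration of Newton's equations mentioned above can be illustrated in miniature. This is a toy velocity Verlet integrator for a single particle in a harmonic well, a stand-in for the force fields and many-atom systems of real molecular dynamics.

```python
# Velocity Verlet: the standard time-stepping scheme of molecular dynamics.
import numpy as np

def velocity_verlet(x, v, force, mass, dt, n_steps):
    """Iteratively advance positions and velocities under Newton's equations."""
    a = force(x) / mass
    for _ in range(n_steps):
        x = x + v * dt + 0.5 * a * dt**2   # position update
        a_new = force(x) / mass            # force at the new position
        v = v + 0.5 * (a + a_new) * dt     # velocity update with averaged acceleration
        a = a_new
    return x, v

k, m = 1.0, 1.0
force = lambda x: -k * x                   # harmonic restoring force
x, v = velocity_verlet(np.array([1.0]), np.array([0.0]), force, m, 1e-3, 1000)
# After t = 1 the analytic solution is x = cos(1)
print(abs(x[0] - np.cos(1.0)) < 1e-4)
```

    A production MD code replaces the harmonic force with a molecular force field and repeats this loop for the 10^6-10^8 steps cited in the abstract.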

  13. Exact fast computation of band depth for large functional datasets: How quickly can one million curves be ranked?

    KAUST Repository

    Sun, Ying

    2012-10-01

    © 2012 John Wiley & Sons, Ltd. Band depth is an important nonparametric measure that generalizes order statistics and makes univariate methods based on order statistics possible for functional data. However, the computational burden of band depth limits its applicability when large functional or image datasets are considered. This paper proposes an exact fast method to speed up the band depth computation when bands are defined by two curves. Remarkable computational gains are demonstrated through simulation studies comparing our proposal with the original computation and one existing approximate method. For example, we report an experiment where our method can rank one million curves, evaluated at fifty time points each, in 12.4 seconds with Matlab.
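    For reference, here is a straightforward implementation of the band-depth definition with bands formed by two curves (the paper's contribution is a fast exact method; this brute-force O(n^2 p) version only illustrates what is being computed). The test curves are illustrative.

```python
# Brute-force j=2 band depth: the proportion of curve pairs whose pointwise
# envelope contains a given curve at every time point.
import numpy as np
from itertools import combinations

def band_depth_2(curves):
    """curves: (n_curves, n_times) array. Returns the band depth of each curve."""
    n = len(curves)
    counts = np.zeros(n)
    for i, j in combinations(range(n), 2):
        lo = np.minimum(curves[i], curves[j])   # lower envelope of the band
        hi = np.maximum(curves[i], curves[j])   # upper envelope of the band
        inside = np.all((curves >= lo) & (curves <= hi), axis=1)
        counts += inside
    return counts / (n * (n - 1) / 2)

t = np.linspace(0.0, 1.0, 50)
curves = np.array([t + c for c in (-1.0, -0.5, 0.0, 0.5, 1.0)])  # nested shifts
depth = band_depth_2(curves)
print(depth.argmax())  # the central curve is deepest
```

    Ranking one million curves this way would be hopeless, which is precisely the motivation for the exact fast method of the paper.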

  14. Computer-Based Techniques for Collection of Pulmonary Function Variables during Rest and Exercise.

    Science.gov (United States)

    1991-03-01

    routinely included in experimental protocols involving hyper- and hypobaric excursions. Unfortunately, the full potential of those tests is often not...for a pulmonary function data acquisition system that has proven useful in the hyperbaric research laboratory. It illustrates how computers can

  15. ABINIT: Plane-Wave-Based Density-Functional Theory on High Performance Computers

    Science.gov (United States)

    Torrent, Marc

    2014-03-01

    For several years, a continuous effort has been made to adapt electronic structure codes based on Density-Functional Theory to future computing architectures. Among these codes, ABINIT is based on a plane-wave description of the wave functions which allows it to treat systems of any kind. Porting such a code to petascale architectures poses difficulties related to the many-body nature of the DFT equations. To improve the performances of ABINIT - especially for what concerns standard LDA/GGA ground-state and response-function calculations - several strategies have been followed: A full multi-level MPI parallelisation scheme has been implemented, exploiting all possible levels and distributing both computation and memory. It allows the number of distributed processes to be increased and could not have been achieved without a strong restructuring of the code. The core algorithm used to solve the eigenproblem (``Locally Optimal Blocked Conjugate Gradient''), a Blocked-Davidson-like algorithm, is based on a distribution of processes combining plane-waves and bands. In addition to the distributed-memory parallelization, a full hybrid scheme has been implemented, using standard shared-memory directives (OpenMP/OpenACC) or porting some time-consuming code sections to Graphics Processing Units (GPUs). As no simple performance model exists, the complexity of use has increased; the code efficiency strongly depends on the distribution of processes among the numerous levels. ABINIT is able to predict the performances of several process distributions and automatically choose the most favourable one. On the other hand, a big effort has been carried out to analyse the performances of the code on petascale architectures, showing which sections of the code have to be improved; they are all related to matrix algebra (diagonalisation, orthogonalisation). The different strategies employed to improve the code scalability will be described. They are based on an exploration of new diagonalization

  16. Functional imaging using computer methods to compare the effect of salbutamol and ipratropium bromide in patient-specific airway models of COPD

    Directory of Open Access Journals (Sweden)

    De Backer LA

    2011-11-01

    Full Text Available LA De Backer1, WG Vos2, R Salgado3, JW De Backer2, A Devolder1, SL Verhulst1, R Claes1, PR Germonpré1, WA De Backer1; 1Department of Respiratory Medicine, 2FluidDA, 3Department of Radiology, Antwerp University Hospital, Antwerp, Belgium. Background: Salbutamol and ipratropium bromide improve lung function in patients with chronic obstructive pulmonary disease (COPD). However, their bronchodilating effect has not yet been compared in the central and distal airways. Functional imaging using computational fluid dynamics offers the possibility of making such a comparison. The objective of this study was to assess the effects of salbutamol and ipratropium bromide on the geometry and computational fluid dynamics-based resistance of the central and distal airways. Methods: Five patients with Global Initiative for Chronic Obstructive Lung Disease Stage III COPD were randomized to a single dose of salbutamol or ipratropium bromide in a crossover manner with a 1-week interval between treatments. Patients underwent lung function testing and a multislice computed tomography scan of the thorax that was used for functional imaging. Two hours after dosing, the patients again underwent lung function tests and repeat computed tomography. Results: Lung function parameters, including forced expiratory volume in 1 second, vital capacity, overall airway resistance, and specific airway resistance, changed significantly after administration of each product. On functional imaging, the bronchodilating effect was greater in the distal airways, with a corresponding drop in airway resistance, compared with the central airways. Salbutamol and ipratropium bromide were equally effective at first glance when looking at lung function tests, but when viewed in more detail with functional imaging, hyporesponsiveness could be shown for salbutamol in one patient. Salbutamol was more effective in the other patients. Conclusion: This pilot study gives an innovative insight into the modes of

  17. Effects of Functional Communication Training (FCT) on the Communicative, Self-Initiated Toileting Behavior for Students with Developmental Disabilities in a School Setting

    Science.gov (United States)

    Kim, Jinnie

    2012-01-01

    Far less is known about the effects of functional communication-based toileting interventions for students with developmental disabilities in a school setting. Furthermore, the currently available toileting interventions for students with disabilities include some undesirable procedures such as the use of punishment, unnatural clinic/university…

  18. Functions and Requirements and Specifications for Replacement of the Computer Automated Surveillance System (CASS)

    International Nuclear Information System (INIS)

    SCAIEF, C.C.

    1999-01-01

    This functions, requirements and specifications document defines the baseline requirements and criteria for the design, purchase, fabrication, construction, installation, and operation of the system to replace the Computer Automated Surveillance System (CASS) alarm monitoring

  19. Assessing the implementation of the family care team in the district health system of health region 2, Thailand

    Directory of Open Access Journals (Sweden)

    Nithra Kitreerawutiwong

    2018-02-01

    Full Text Available Background: The family care team (FCT) was established to improve the quality of care. This study aimed to explore perceptions of FCT implementation and describe the challenges inherent in implementing the FCT. Methods: Forty in-depth interviews were conducted. The interviewees consisted of five primary care managers in the provincial medical health office, five directors of community hospitals, five administrators in district health offices, ten subdistrict health-promoting hospital directors, representatives from ten local organizations, and five heads of village health volunteers. Data were collected in accordance with semistructured interview guidelines and analyzed by thematic analysis. Results: Participants expressed their opinions through five themes: (1) the role and scope of practice, (2) communication and collaboration within the FCT, (3) the management of the FCT, (4) the impact of the FCT on the team members' feelings and primary care performance, and (5) the main challenges, including an insufficient teamwork culture and a biomedical approach. Conclusion: The findings suggest the importance of issues such as the clarification of team members' and managers' roles, communication within and across FCTs, and the preparation and training of interprofessionals to enhance collaborative management to achieve optimal care for people in the district health system.

  20. Computer-Based Cognitive Training for Executive Functions after Stroke: A Systematic Review

    Science.gov (United States)

    van de Ven, Renate M.; Murre, Jaap M. J.; Veltman, Dick J.; Schmand, Ben A.

    2016-01-01

    Background: Stroke commonly results in cognitive impairments in working memory, attention, and executive function, which may be restored with appropriate training programs. Our aim was to systematically review the evidence for computer-based cognitive training of executive dysfunctions. Methods: Studies were included if they concerned adults who had suffered stroke or other types of acquired brain injury, if the intervention was computer training of executive functions, and if the outcome was related to executive functioning. We searched in MEDLINE, PsycINFO, Web of Science, and The Cochrane Library. Study quality was evaluated based on the CONSORT Statement. Treatment effect was evaluated based on differences compared to pre-treatment and/or to a control group. Results: Twenty studies were included. Two were randomized controlled trials that used an active control group. The other studies included multiple baselines, a passive control group, or were uncontrolled. Improvements were observed in tasks similar to the training (near transfer) and in tasks dissimilar to the training (far transfer). However, these effects were not larger in trained than in active control groups. Two studies evaluated neural effects and found changes in both functional and structural connectivity. Most studies suffered from methodological limitations (e.g., lack of an active control group and no adjustment for multiple testing) hampering differentiation of training effects from spontaneous recovery, retest effects, and placebo effects. Conclusions: The positive findings of most studies, including neural changes, warrant continuation of research in this field, but only if its methodological limitations are addressed. PMID:27148007

  1. Computations of zeros of special functions and eigenvalues of differential equations by matrix method

    OpenAIRE

    Miyazaki, Yoshinori

    2000-01-01

    This paper is strongly based on two powerful general theorems proved by Ikebe et al. in 1993 [15] and 1996 [13], which will be referred to as Theorem A and Theorem B in this paper. They were recently published and justify the approximate computation of simple eigenvalues of infinite matrices of certain types by truncation, giving extremely accurate error estimates. So far, they have been applied to some important problems in engineering, such as computing the zeros of some special functions, an...

  2. A Compute Environment of ABC95 Array Computer Based on Multi-FPGA Chip

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    The ABC95 array computer is a multi-function-network computer based on FPGA technology. The multi-function network supports conflict-free access by processors to data in memory and direct processor-to-processor data exchange, based on an enhanced MESH network. The ABC95 instruction system includes control instructions, scalar instructions, and vector instructions; the network instructions in particular are introduced. A programming environment for ABC95 array computer assembly language is designed, and a VC++-based programming environment for the ABC95 array computer is presented. It includes functions to load ABC95 array computer programs and data, store results, run programs, and so on. In particular, the data type for conflict-free access on the ABC95 array computer is defined. The results show that these technologies support effective programming of the ABC95 array computer.

  3. Further computer appreciation

    CERN Document Server

    Fry, T F

    2014-01-01

    Further Computer Appreciation is a comprehensive survey of the principles and aspects of computer appreciation. The book starts by describing the development of computers from the first to the third computer generations, the development of processors and storage systems, and the present position of computers and future trends. The text tackles the basic elements, concepts and functions of digital computers, computer arithmetic, input media and devices, and computer output. It also covers the basic central processor functions, data storage and the organization of data by classification of computer files,

  4. Intersections between the Autism Spectrum and the Internet: Perceived Benefits and Preferred Functions of Computer-Mediated Communication

    Science.gov (United States)

    Gillespie-Lynch, Kristen; Kapp, Steven K.; Shane-Simpson, Christina; Smith, David Shane; Hutman, Ted

    2014-01-01

    An online survey compared the perceived benefits and preferred functions of computer-mediated communication of participants with (N = 291) and without ASD (N = 311). Participants with autism spectrum disorder (ASD) perceived benefits of computer-mediated communication in terms of increased comprehension and control over communication, access to…

  5. Implementing Prevention Interventions for Non-Communicable ...

    African Journals Online (AJOL)

    UNIBEN

    2Research Fellow, National Primary Health Care Development Agency, Abuja, .... facilities in 2011 (Source: Adapted from FCT Statistical Health Bulletin 2011) ... 2011-2015, FCT Strategic Health Development ..... They also emphasised travel.

  6. Pascal-SC a computer language for scientific computation

    CERN Document Server

    Bohlender, Gerd; von Gudenberg, Jürgen Wolff; Rheinboldt, Werner; Siewiorek, Daniel

    1987-01-01

    Perspectives in Computing, Vol. 17: Pascal-SC: A Computer Language for Scientific Computation focuses on the application of Pascal-SC, a programming language developed as an extension of standard Pascal, in scientific computation. The publication first elaborates on the introduction to Pascal-SC, a review of standard Pascal, and real floating-point arithmetic. Discussions focus on optimal scalar product, standard functions, real expressions, program structure, simple extensions, real floating-point arithmetic, vector and matrix arithmetic, and dynamic arrays. The text then examines functions a

  7. Computer-mediated communication in adults with high-functioning autism spectrum disorders and controls

    NARCIS (Netherlands)

    van der Aa, Christine; Pollmann, Monique; Plaat, Aske; van der Gaag, Rutger Jan

    2016-01-01

    It has been suggested that people with Autism Spectrum Disorders (ASD) are attracted to computer-mediated communication (CMC). In this study, we compare CMC use in adults with high-functioning ASD (N = 113) and a control group (N = 72). We find that people with ASD spend more time on CMC than

  8. Maple (Computer Algebra System) in Teaching Pre-Calculus: Example of Absolute Value Function

    Science.gov (United States)

    Tuluk, Güler

    2014-01-01

    Modules in Computer Algebra Systems (CAS) make Mathematics interesting and easy to understand. The present study focused on the implementation of the algebraic, tabular (numerical), and graphical approaches used for the construction of the concept of absolute value function in teaching mathematical content knowledge along with Maple 9. The study…

  9. Computing wave functions in multichannel collisions with non-local potentials using the R-matrix method

    Science.gov (United States)

    Bonitati, Joey; Slimmer, Ben; Li, Weichuan; Potel, Gregory; Nunes, Filomena

    2017-09-01

    The calculable form of the R-matrix method has previously been shown to be a useful tool in approximately solving the Schrödinger equation in nuclear scattering problems. We use this technique combined with Gauss quadrature for the Lagrange-mesh method to efficiently solve for the wave functions of projectile nuclei in low-energy collisions (1-100 MeV) involving an arbitrary number of channels. We include the local Woods-Saxon potential, the non-local potential of Perey and Buck, a Coulomb potential, and a coupling potential to computationally solve for the wave function of two nuclei at short distances. Object-oriented programming is used to increase modularity, and parallel programming techniques are introduced to reduce computation time. We conclude that the R-matrix method is an effective method to predict the wave functions of nuclei in scattering problems involving both multiple channels and non-local potentials. Michigan State University iCER ACRES REU.
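    The Gauss quadrature underlying the Lagrange-mesh step mentioned above can be illustrated in isolation: Gauss-Legendre nodes and weights integrate smooth radial factors with very few points. This is generic quadrature for demonstration, not the authors' code; the integrand and interval are illustrative.

```python
# Gauss-Legendre quadrature of an exponentially decaying radial factor.
import numpy as np

nodes, weights = np.polynomial.legendre.leggauss(40)  # 40-point rule on [-1, 1]

def integrate(f, a, b):
    """Map the Gauss-Legendre rule from [-1, 1] to [a, b]."""
    x = 0.5 * (b - a) * nodes + 0.5 * (b + a)
    return 0.5 * (b - a) * np.sum(weights * f(x))

# Overlap-type integral of a decaying "wave function" factor: int r^2 e^-r dr
val = integrate(lambda r: np.exp(-r) * r**2, 0.0, 30.0)
print(abs(val - 2.0) < 1e-8)  # analytic value: Gamma(3) = 2
```

    In the Lagrange-mesh method the same nodes double as the collocation points of the basis, which is what makes the matrix elements of the potentials cheap to evaluate.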

  10. Implementation of the Two-Point Angular Correlation Function on a High-Performance Reconfigurable Computer

    Directory of Open Access Journals (Sweden)

    Volodymyr V. Kindratenko

    2009-01-01

    Full Text Available We present a parallel implementation of an algorithm for calculating the two-point angular correlation function as applied in the field of computational cosmology. The algorithm has been specifically developed for a reconfigurable computer. Our implementation utilizes a microprocessor and two reconfigurable processors on a dual-MAP SRC-6 system. The two reconfigurable processors are used as two application-specific co-processors. Two independent computational kernels are simultaneously executed on the reconfigurable processors while data pre-fetching from disk and initial data pre-processing are executed on the microprocessor. The overall end-to-end algorithm execution speedup achieved by this implementation is over 90× as compared to a sequential implementation of the algorithm executed on a single 2.8 GHz Intel Xeon microprocessor.
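The computational kernel that the reconfigurable processors accelerate is, at its core, a brute-force pair count binned by angular separation. A plain-Python sketch of that kernel (the function names and O(n²) loop are illustrative; the SRC-6 implementation is hardware-specific):

```python
import math

def angular_separation(ra1, dec1, ra2, dec2):
    # All angles in radians; spherical law of cosines, clamped for safety.
    c = (math.sin(dec1) * math.sin(dec2)
         + math.cos(dec1) * math.cos(dec2) * math.cos(ra1 - ra2))
    return math.acos(max(-1.0, min(1.0, c)))

def dd_counts(points, bin_edges):
    """Brute-force O(n^2) pair counts: the kernel a TPACF code offloads."""
    counts = [0] * (len(bin_edges) - 1)
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            th = angular_separation(*points[i], *points[j])
            for k in range(len(counts)):
                if bin_edges[k] <= th < bin_edges[k + 1]:
                    counts[k] += 1
                    break
    return counts
```

The estimator of the two-point function is then assembled from such data-data, data-random, and random-random counts; the quadratic pair loop is what makes hardware acceleration worthwhile.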

  11. Management of Liver Cancer Argon-helium Knife Therapy with Functional Computer Tomography Perfusion Imaging.

    Science.gov (United States)

    Wang, Hongbo; Shu, Shengjie; Li, Jinping; Jiang, Huijie

    2016-02-01

    The objective of this study was to observe changes in blood perfusion of liver cancer following argon-helium knife treatment with functional computer tomography perfusion imaging. Twenty-seven patients with primary liver cancer treated with argon-helium knife were included in this study. Plain computer tomography (CT) and computer tomography perfusion (CTP) imaging were conducted in all patients before and after treatment. Perfusion parameters including blood flow, blood volume, hepatic artery perfusion fraction, hepatic artery perfusion, and hepatic portal venous perfusion were used for evaluating therapeutic effect. All parameters in liver cancer were significantly decreased after argon-helium knife treatment (p < 0.05). Therefore, CTP imaging would play an important role in liver cancer management following argon-helium knife therapy. © The Author(s) 2014.

  12. FUNCTIONALITY OF STUDENTS WITH PHYSICAL DEFICIENCY IN WRITING AND COMPUTER USE ACTIVITIES

    Directory of Open Access Journals (Sweden)

    Fernanda Matrigani Mercado Gutierres de Queiroz

    2017-08-01

    Full Text Available Educational inclusion focuses on the learning of all students who confront barriers to effective participation in school life. From the inclusive-education perspective, students with disabilities should preferably be served in regular education, with special education offering specialized educational services to complement their educational needs. In this context, the objective of the research is to describe the functionality of students with physical disabilities in Multifunctional Resource Rooms during writing and computer-use activities, according to the perception of their teachers. The participants were Specialized Educational Service teachers who serve students with disabilities. The School Function Assessment instrument was used for data collection. The data were organized into a single document and presented in two categories: (1) written work; (2) use of the computer and equipment. The conclusion was that students with physical disabilities, especially those with impaired upper-limb functionality, can find it difficult to write using conventional materials, and therefore need Assistive Technology to develop their writing skills. It is thus important to refine the profile analysis of each student in order to choose the most appropriate resource, and to improve the materials of the Multifunctional Resource Rooms to meet the diversity of students with physical disabilities, since the available furniture, didactic-pedagogical materials, and equipment do not favor use by students with serious motor disabilities.

  13. The role of dual-energy computed tomography in the assessment of pulmonary function

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, Hye Jeon [Department of Radiology, Hallym University College of Medicine, Hallym University Sacred Heart Hospital, 22, Gwanpyeong-ro 170beon-gil, Dongan-gu, Anyang-si, Gyeonggi-do 431-796 (Korea, Republic of); Hoffman, Eric A. [Departments of Radiology, Medicine, and Biomedical Engineering, University of Iowa, 200 Hawkins Dr, CC 701 GH, Iowa City, IA 52241 (United States); Lee, Chang Hyun; Goo, Jin Mo [Department of Radiology, Seoul National University College of Medicine, 103 Daehak-ro, Jongno-gu, Seoul 110-799 (Korea, Republic of); Levin, David L. [Department of Radiology, Mayo Clinic College of Medicine, 200 First Street, SW, Rochester, MN 55905 (United States); Kauczor, Hans-Ulrich [Diagnostic and Interventional Radiology, University Hospital Heidelberg, Im Neuenheimer Feld 400, 69120 Heidelberg (Germany); Translational Lung Research Center Heidelberg (TLRC), Member of the German Center for Lung Research (DZL), Im Neuenheimer Feld 400, 69120 Heidelberg (Germany); Seo, Joon Beom, E-mail: seojb@amc.seoul.kr [Department of Radiology and Research Institute of Radiology, Asan Medical Center, University of Ulsan College of Medicine, 388-1, Pungnap 2-dong, Songpa-ku, Seoul, 05505 (Korea, Republic of)

    2017-01-15

    Highlights: • The dual-energy CT technique enables the differentiation of contrast materials with a material decomposition algorithm. • Pulmonary functional information can be evaluated using dual-energy CT together with anatomic CT information, simultaneously. • Pulmonary functional information from dual-energy CT can improve diagnosis and severity assessment of diseases. - Abstract: The assessment of pulmonary function, including ventilation and perfusion status, is important in addition to the evaluation of structural changes of the lung parenchyma in various pulmonary diseases. The dual-energy computed tomography (DECT) technique can provide pulmonary functional information and high-resolution anatomic information simultaneously. The application of DECT for the evaluation of pulmonary function has been investigated in various pulmonary diseases, such as pulmonary embolism, asthma, and chronic obstructive lung disease. In this review article, we present the principles and technical aspects of DECT, along with clinical applications for the assessment of pulmonary function in various lung diseases.

  14. Air trapping in sarcoidosis on computed tomography: Correlation with lung function

    International Nuclear Information System (INIS)

    Davies, C.W.H.; Tasker, A.D.; Padley, S.P.G.; Davies, R.J.O.; Gleeson, F.V.

    2000-01-01

    AIMS: To document the presence and extent of air trapping on high resolution computed tomography (HRCT) in patients with pulmonary sarcoidosis and correlate HRCT features with pulmonary function tests. METHODS: Twenty-one patients with pulmonary sarcoidosis underwent HRCT and pulmonary function assessment at presentation. Inspiratory and expiratory HRCT were assessed for the presence and extent of air trapping, ground-glass opacification, nodularity, septal thickening, bronchiectasis and parenchymal distortion. HRCT features were correlated with pulmonary function tests. RESULTS: Air trapping on expiratory HRCT was present in 20/21 (95%) patients. The extent of air trapping correlated with percentage predicted residual volume (RV)/total lung capacity (TLC) (r = 0.499; P < 0.05) and percentage predicted maximal mid-expiratory flow rate between 25 and 75% of the vital capacity (r = -0.54; P < 0.05). Ground-glass opacification was present in 4/21 (19%), nodularity in 18/21 (86%), septal thickening in 18/21 (86%), traction bronchiectasis in 14/21 (67%) and distortion in 12/21 (57%) of patients; there were no significant relationships between these CT features and pulmonary function results. CONCLUSION: Air trapping is a common feature in sarcoidosis and correlates with evidence of small airways disease on pulmonary function testing. Davies, C.W.H. (2000). Clinical Radiology 55, 217-221

  15. Structure of tetragonal martensite in the In95.42Cd4.58 cast alloy

    Science.gov (United States)

    Khlebnikova, Yu. V.; Egorova, L. Yu.; Rodionov, D. P.; Kazantsev, V. A.

    2017-11-01

    The structure of martensite in the In95.42Cd4.58 alloy has been studied by metallography, X-ray diffraction, dilatometry, and transmission electron microscopy. It has been shown that a massive structure built of colonies of tetragonal lamellar plates divided by a twin boundary {101}FCT is formed in the alloy on cooling below the martensitic FCC → FCT transition temperature. After a cycle of FCT → FCC → FCT transitions, the alloy recrystallizes with a several-fold decrease in grain size compared with the initial structure, in such a fashion that the size of the massifs and of individual martensite lamellae within a massif correlates with the change in the grain size of the alloy. Using thermal cycling, it has been revealed that the alloy tends to stabilize the high-temperature phase.

  16. Parameter study of high-β tokamak reactors with circular and strongly elongated cross section

    International Nuclear Information System (INIS)

    Herold, H.

    1977-05-01

    A simplified reactor model is used to study the influence of critical β-values on economy parameters and dimensions of possible long-time pulsed tokamak reactors. Various betas deduced from MHD stability and equilibrium theory are introduced and put into the scaling in the context of technological constraints, such as maximum B-field, core constraint, and maximum wall loading, among others. The plasma physical concepts treated comprise circular and strongly elongated cross sections and approximated FCT equilibria. The computational results are presented as plots of possible economy parameter ranges (magnet energy, wall loading, volumes, investment costs per unit power) as functions of β for suitably chosen hierarchies of the constraints. A burn-time reduction by the build-up of α-pressure may be possible for the pressure-profile-sensitive high-β equilibria (FCT). Burn times in the 10 sec range, resulting from simple estimates, would about cancel the economic advantages of reactors with high-β equilibria compared to a β = 5% standard reactor (UWMAK I). (orig.) [de

  17. Studies on the Zeroes of Bessel Functions and Methods for Their Computation: IV. Inequalities, Estimates, Expansions, etc., for Zeros of Bessel Functions

    Science.gov (United States)

    Kerimov, M. K.

    2018-01-01

    This paper is the fourth in a series of survey articles concerning zeros of Bessel functions and methods for their computation. Various inequalities, estimates, expansions, etc. for positive zeros are analyzed, and some results are described in detail with proofs.
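One of the best-known expansions of the kind this survey analyzes is McMahon's asymptotic expansion for the positive zeros of J_ν (a standard result, e.g. Abramowitz & Stegun 9.5.12; this sketch is illustrative and not taken from the paper itself):

```python
import math

def bessel_zero_mcmahon(nu, k):
    """Approximate the k-th positive zero of J_nu using the first three
    terms of McMahon's asymptotic expansion (accurate for larger k)."""
    mu = 4.0 * nu * nu
    beta = (k + 0.5 * nu - 0.25) * math.pi
    b8 = 8.0 * beta
    return (beta
            - (mu - 1.0) / b8
            - 4.0 * (mu - 1.0) * (7.0 * mu - 31.0) / (3.0 * b8**3))
```

For example, the third zero of J_0 is 8.653728…, and the three-term expansion already agrees to about four decimal places; such estimates are typically used as starting points for Newton refinement.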

  18. Evaluating the Appropriateness of a New Computer-Administered Measure of Adaptive Function for Children and Youth with Autism Spectrum Disorders

    Science.gov (United States)

    Coster, Wendy J.; Kramer, Jessica M.; Tian, Feng; Dooley, Meghan; Liljenquist, Kendra; Kao, Ying-Chia; Ni, Pengsheng

    2016-01-01

    The Pediatric Evaluation of Disability Inventory-Computer Adaptive Test is an alternative method for describing the adaptive function of children and youth with disabilities using a computer-administered assessment. This study evaluated the performance of the Pediatric Evaluation of Disability Inventory-Computer Adaptive Test with a national…

  19. Storing files in a parallel computing system based on user-specified parser function

    Science.gov (United States)

    Faibish, Sorin; Bent, John M; Tzelnic, Percy; Grider, Gary; Manzanares, Adam; Torres, Aaron

    2014-10-21

    Techniques are provided for storing files in a parallel computing system based on a user-specified parser function. A plurality of files generated by a distributed application in a parallel computing system are stored by obtaining a parser from the distributed application for processing the plurality of files prior to storage; and storing one or more of the plurality of files in one or more storage nodes of the parallel computing system based on the processing by the parser. The plurality of files comprise one or more of a plurality of complete files and a plurality of sub-files. The parser can optionally store only those files that satisfy one or more semantic requirements of the parser. The parser can also extract metadata from one or more of the files and the extracted metadata can be stored with one or more of the plurality of files and used for searching for files.
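The mechanism described above — a user-supplied parser that filters files by semantic requirements and extracts metadata before placement on storage nodes — can be sketched as follows. All names here (`store_files`, `CsvParser`, the hash-based placement) are hypothetical illustrations, not the patented implementation:

```python
# Hypothetical sketch of parser-driven file placement; names and the
# placement policy are illustrative, not from the patent.
def store_files(files, parser, nodes):
    """Route each file that satisfies the parser's semantic check to a
    storage node, attaching any metadata the parser extracts."""
    stored = {n: [] for n in range(nodes)}
    for name, payload in files.items():
        if not parser.accepts(name, payload):    # semantic filter
            continue                             # non-matching files are skipped
        meta = parser.extract_metadata(name, payload)
        node = hash(name) % nodes                # simple placement policy
        stored[node].append((name, payload, meta))
    return stored

class CsvParser:
    """Toy parser: accept only '.csv' files and record their line count."""
    def accepts(self, name, payload):
        return name.endswith(".csv")
    def extract_metadata(self, name, payload):
        return {"rows": payload.count("\n")}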

  20. The Effect of Neurocognitive Function on Math Computation in Pediatric ADHD: Moderating Influences of Anxious Perfectionism and Gender.

    Science.gov (United States)

    Sturm, Alexandra; Rozenman, Michelle; Piacentini, John C; McGough, James J; Loo, Sandra K; McCracken, James T

    2018-03-20

    Predictors of math achievement in attention-deficit/hyperactivity disorder (ADHD) are not well-known. To address this gap in the literature, we examined individual differences in neurocognitive functioning domains on math computation in a cross-sectional sample of youth with ADHD. Gender and anxiety symptoms were explored as potential moderators. The sample consisted of 281 youth (aged 8-15 years) diagnosed with ADHD. Neurocognitive tasks assessed auditory-verbal working memory, visuospatial working memory, and processing speed. Auditory-verbal working memory speed significantly predicted math computation. A three-way interaction revealed that at low levels of anxious perfectionism, slower processing speed predicted poorer math computation for boys compared to girls. These findings indicate the uniquely predictive values of auditory-verbal working memory and processing speed on math computation, and their differential moderation. These findings provide preliminary support that gender and anxious perfectionism may influence the relationship between neurocognitive functioning and academic achievement.

  1. Computer functions in overall plant control of candu generating stations

    International Nuclear Information System (INIS)

    Chou, Q.B.; Stokes, H.W.

    1976-01-01

    System Planning Specifications form the basic requirements for the performance of the plant, including its response to abnormal situations. The rules for the computer control programs are devised from these, taking into account limitations imposed by the reactor, heat transport and turbine-generator systems. The paper outlines these specifications and the limitations imposed by the major items of plant equipment. It describes the functions of each of the main programs, their interactions, and the control modes used in Ontario Hydro's existing nuclear stations or proposed for future stations. Some simulation results showing the performance of the overall unit control system and plans for future studies are discussed. (orig.) [de

  2. Nutraceutic effect of free condensed tannins of Lysiloma acapulcensis (Kunth) benth on parasite infection and performance of Pelibuey sheep.

    Science.gov (United States)

    García-Hernández, Cesar; Arece-García, Javier; Rojo-Rubio, Rolando; Mendoza-Martínez, German David; Albarrán-Portillo, Benito; Vázquez-Armijo, José Fernando; Avendaño-Reyes, Leonel; Olmedo-Juárez, Agustín; Marie-Magdeleine, Carine; López-Leyva, Yoel

    2017-01-01

    Forty-five Pelibuey sheep were experimentally infested with nematodes to evaluate the effect of three free condensed tannin (FCT) levels of Lysiloma acapulcensis on fecal egg counts (FEC), packed cell volumes (PCV), ocular mucosa colors (OMC), average daily gain (ADG), and adult nematode count. Five treatments were used: 12.5, 25.0, and 37.5 mg of FCT kg⁻¹ of body weight (BW); sterile water (control); and ivermectine (0.22 mg kg⁻¹ of BW) as the chemical group. The data were processed through repeated-measures analysis. Even though all three FCT doses decreased (P < 0.05) the FEC, the highest reduction was obtained with 37.5 mg kg⁻¹ of BW. No differences were observed in PCV and OMC. Higher ADG (P < 0.05) was observed with 37.5 mg kg⁻¹ of BW of FCT. The count of adult nematodes (females and males) at the highest FCT dose was similar to the chemical treatment. The dose of 37.5 mg kg⁻¹ of BW decreased the parasite infection and improved lamb performance. Therefore, this dose could be used as a nutraceutic product in sheep production.

  3. Imaging local brain function with emission computed tomography

    International Nuclear Information System (INIS)

    Kuhl, D.E.

    1984-01-01

    Positron emission tomography (PET) using 18 F-fluorodeoxyglucose (FDG) was used to map local cerebral glucose utilization in the study of local cerebral function. This information differs fundamentally from structural assessment by means of computed tomography (CT). In normal human volunteers, the FDG scan was used to determine the cerebral metabolic response to controlled sensory stimulation and the effects of aging. Cerebral metabolic patterns are distinctive among depressed and demented elderly patients. The FDG scan appears normal in the depressed patient, studded with multiple metabolic defects in patients with multiple infarct dementia, and in patients with Alzheimer disease, metabolism is particularly reduced in the parietal cortex, but only slightly reduced in the caudate and thalamus. The interictal FDG scan effectively detects hypometabolic brain zones that are sites of onset for seizures in patients with partial epilepsy, even though these zones usually appear normal on CT scans. The future prospects of PET are discussed.

  4. An evolutionary computation approach to examine functional brain plasticity

    Directory of Open Access Journals (Sweden)

    Arnab eRoy

    2016-04-01

    Full Text Available One common research goal in systems neuroscience is to understand how the functional relationship between a pair of regions of interest (ROIs) evolves over time. Examining neural connectivity in this way is well-suited to the study of developmental processes, learning, and even recovery or treatment designs in response to injury. For most fMRI-based studies, the strength of the functional relationship between two ROIs is defined as the correlation between the average signals representing each region. The drawback of this approach is that much information is lost by averaging heterogeneous voxels, and therefore a functional relationship between an ROI-pair that evolves at a spatial scale much finer than the ROIs remains undetected. To address this shortcoming, we introduce a novel evolutionary computation (EC) based voxel-level procedure to examine functional plasticity between an investigator-defined ROI-pair by simultaneously using subject-specific BOLD-fMRI data collected from two sessions separated by a finite duration of time. This data-driven procedure detects a sub-region composed of spatially connected voxels from each ROI (a so-called sub-regional-pair) such that the pair shows a significant gain/loss of functional relationship strength across the two time points. The procedure is recursive and iteratively finds all statistically significant sub-regional-pairs within the ROIs. Using this approach, we examine functional plasticity between the default mode network (DMN) and the executive control network (ECN) during recovery from traumatic brain injury (TBI); the study includes 14 TBI and 12 healthy control subjects. We demonstrate that the EC-based procedure is able to detect functional plasticity where a traditional averaging-based approach fails. The subject-specific plasticity estimates obtained using the EC procedure are highly consistent across multiple runs. Group-level analyses using these plasticity estimates showed an increase in…

  5. Planning Computer-Aided Distance Learning

    Directory of Open Access Journals (Sweden)

    Nadja Dobnik

    1996-12-01

    Full Text Available The didactics of autonomous learning change under the influence of new technologies. Computer technology can cover all the functions that a teacher performs in personal contact with the learner. People organizing distance learning must be aware of all the possibilities offered by computers. Computers can take over and also combine the functions of many tools and systems, e.g. the typewriter, video, and telephone. Thus the contents can be offered in the form of classic media by means of text, speech, picture, etc. Computers take over data processing and function as study materials. A computer included in a computer network can also function as a medium for interactive communication.

  6. Stream function method for computing steady rotational transonic flows with application to solar wind-type problems

    International Nuclear Information System (INIS)

    Kopriva, D.A.

    1982-01-01

    A numerical scheme has been developed to solve the quasilinear form of the transonic stream function equation. The method is applied to compute steady two-dimensional axisymmetric solar wind-type problems. A single, perfect, non-dissipative, homentropic and polytropic gas is assumed. The four equations governing mass and momentum conservation are reduced to a single nonlinear second-order partial differential equation for the stream function. Bernoulli's equation is used to obtain a nonlinear algebraic relation for the density in terms of stream function derivatives. The vorticity includes the effects of azimuthal rotation and Bernoulli's function and is determined from quantities specified on boundaries. The approach is efficient: the number of equations and independent variables has been reduced, and a rapid relaxation technique developed for the transonic full potential equation is used. Second-order accurate central differences are used in elliptic regions. In hyperbolic regions a dissipation term motivated by the rotated differencing scheme of Jameson is added for stability. A successive-line-overrelaxation technique, also introduced by Jameson, is used to solve the equations. The nonlinear equation for the density is a double-valued function of the stream function derivatives. The velocities are extrapolated from upwind points to determine the proper branch, and Newton's method is used to iteratively compute the density. This allows accurate solutions with few grid points.
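The double-valued density relation and its Newton solution can be illustrated with a minimal sketch. Assuming a homentropic polytropic gas with p = ρ^γ (so Bernoulli's relation reads m²/(2ρ²) + [γ/(γ−1)]ρ^(γ−1) = H₀, where m = ρq is the mass flux obtained from stream-function derivatives), the subsonic or supersonic branch is selected by the initial guess, here taken from the previous iterate as the abstract describes:

```python
import math

# Sketch only: normalization and branch handling in the actual solver
# may differ. The relation below has a subsonic and a supersonic root;
# Newton converges to the branch nearest the starting guess rho0.
def density_from_bernoulli(m, H0, gamma=1.4, rho0=1.2, tol=1e-12):
    rho = rho0
    for _ in range(50):
        f = m * m / (2.0 * rho * rho) \
            + gamma / (gamma - 1.0) * rho**(gamma - 1.0) - H0
        df = -m * m / rho**3 + gamma * rho**(gamma - 2.0)
        step = f / df
        rho -= step
        if abs(step) < tol:
            break
    return rho
```

With γ = 1.4, ρ = 1 and q = 0.5 (subsonic, since q² < γρ^(γ−1)), one has m = 0.5 and H₀ = 3.625, and Newton started from ρ₀ = 1.2 recovers the subsonic root ρ = 1.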

  7. Computational medical imaging and hemodynamics framework for functional analysis and assessment of cardiovascular structures.

    Science.gov (United States)

    Wong, Kelvin K L; Wang, Defeng; Ko, Jacky K L; Mazumdar, Jagannath; Le, Thu-Thao; Ghista, Dhanjoo

    2017-03-21

    Cardiac dysfunction constitutes a common cardiovascular health issue in society, and has been an investigation topic of strong focus by researchers in the medical imaging community. Diagnostic modalities based on echocardiography, magnetic resonance imaging, chest radiography and computed tomography are common techniques that provide cardiovascular structural information to diagnose heart defects. However, functional information of cardiovascular flow, which can in fact be used to support the diagnosis of many cardiovascular diseases with a myriad of hemodynamics performance indicators, remains unexplored to its full potential. Some of these indicators constitute important cardiac functional parameters affecting cardiovascular abnormalities. With the advancement of computer technology that facilitates high-speed computational fluid dynamics, the realization of a supporting diagnostic platform for hemodynamics quantification and analysis can be achieved. This article reviews the state-of-the-art medical imaging and high-fidelity multi-physics computational analyses that together enable reconstruction of cardiovascular structures and the hemodynamic flow patterns within them, such as the left ventricle (LV) and carotid bifurcations. The combined medical imaging and hemodynamic analysis enables us to study the mechanisms of cardiovascular disease-causing dysfunctions, such as how (1) cardiomyopathy causes left ventricular remodeling and loss of contractility leading to heart failure, and (2) modeling of LV construction and simulation of intra-LV hemodynamics can enable us to determine the optimum procedure of surgical ventriculation to restore its contractility and health. This combined medical imaging and hemodynamics framework can potentially extend medical knowledge of cardiovascular defects and associated hemodynamic behavior and their surgical restoration, by means of an integrated medical image diagnostics and hemodynamic performance analysis framework.

  8. Modeling of edge effect in subaperture tool influence functions of computer controlled optical surfacing.

    Science.gov (United States)

    Wan, Songlin; Zhang, Xiangchao; He, Xiaoying; Xu, Min

    2016-12-20

    Computer controlled optical surfacing requires an accurate tool influence function (TIF) for reliable path planning and deterministic fabrication. Near the edge of the workpieces, the TIF has a nonlinear removal behavior, which will cause a severe edge-roll phenomenon. In the present paper, a new edge pressure model is developed based on the finite element analysis results. The model is represented as the product of a basic pressure function and a correcting function. The basic pressure distribution is calculated according to the surface shape of the polishing pad, and the correcting function is used to compensate the errors caused by the edge effect. Practical experimental results demonstrate that the new model can accurately predict the edge TIFs with different overhang ratios. The relative error of the new edge model can be reduced to 15%.

  9. Multiple exciton generation in chiral carbon nanotubes: Density functional theory based computation

    Science.gov (United States)

    Kryjevski, Andrei; Mihaylov, Deyan; Kilina, Svetlana; Kilin, Dmitri

    2017-10-01

    We use a Boltzmann transport equation (BE) to study time evolution of a photo-excited state in a nanoparticle including phonon-mediated exciton relaxation and the multiple exciton generation (MEG) processes, such as exciton-to-biexciton multiplication and biexciton-to-exciton recombination. BE collision integrals are computed using Kadanoff-Baym-Keldysh many-body perturbation theory based on density functional theory simulations, including exciton effects. We compute internal quantum efficiency (QE), which is the number of excitons generated from an absorbed photon in the course of the relaxation. We apply this approach to chiral single-wall carbon nanotubes (SWCNTs), such as (6,2) and (6,5). We predict efficient MEG in the (6,2) and (6,5) SWCNTs within the solar spectrum range starting at the 2Eg energy threshold and with QE reaching ˜1.6 at about 3Eg, where Eg is the electronic gap.

  10. ERF/ERFC, Calculation of Error Function, Complementary Error Function, Probability Integrals

    International Nuclear Information System (INIS)

    Vogel, J.E.

    1983-01-01

    1 - Description of problem or function: ERF and ERFC are used to compute values of the error function and complementary error function for any real number. They may be used to compute other related functions such as the normal probability integrals. 4. Method of solution: The error function and complementary error function are approximated by rational functions. Three such rational approximations are used, depending on the range of x (the last region being x .GE. 4.0). In the first region the error function is computed directly and the complementary error function is computed via the identity erfc(x) = 1.0 - erf(x). In the other two regions the complementary error function is computed directly and the error function is computed from the identity erf(x) = 1.0 - erfc(x). The error function and complementary error function are real-valued functions of any real argument. The range of the error function is (-1,1). The range of the complementary error function is (0,2). 5. Restrictions on the complexity of the problem: The user is cautioned against using ERF to compute the complementary error function via the identity erfc(x) = 1.0 - erf(x); this subtraction may cause partial or total loss of significance for certain values of x.
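A classic rational approximation of the kind this record describes is Abramowitz & Stegun 7.1.26 (|error| ≤ 1.5×10⁻⁷ for x ≥ 0). This sketch is not the ERF/ERFC code itself, but it shows both the rational-function evaluation and the loss-of-significance caveat about forming erfc from the identity:

```python
import math

def erf_rational(x):
    """erf(x) via the Abramowitz-Stegun 7.1.26 rational approximation."""
    sign = -1.0 if x < 0.0 else 1.0   # erf is an odd function
    x = abs(x)
    t = 1.0 / (1.0 + 0.3275911 * x)
    poly = t * (0.254829592
           + t * (-0.284496736
           + t * (1.421413741
           + t * (-1.453152027
           + t * 1.061405429))))
    return sign * (1.0 - poly * math.exp(-x * x))

def erfc_rational(x):
    # Fine for moderate x; for large x this subtraction suffers exactly
    # the loss of significance the record warns about, so a direct erfc
    # approximation should be used there instead.
    return 1.0 - erf_rational(x)
```

Comparing against `math.erf` confirms agreement to within the stated 1.5×10⁻⁷ bound.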

  11. Blind Quantum Computation

    DEFF Research Database (Denmark)

    Salvail, Louis; Arrighi, Pablo

    2006-01-01

    We investigate the possibility of "having someone carry out the work of executing a function for you, but without letting him learn anything about your input". Say Alice wants Bob to compute some known function f upon her input x, but wants to prevent Bob from learning anything about x. The situation arises for instance if client Alice has limited computational resources in comparison with mistrusted server Bob, or if x is an inherently mobile piece of data. Could there be a protocol whereby Bob is forced to compute f(x) "blindly", i.e. without observing x? We provide such a blind computation protocol for the class of functions which admit an efficient procedure to generate random input-output pairs, e.g. factorization. The cheat-sensitive security achieved relies only upon quantum theory being true. The security analysis carried out assumes the eavesdropper performs individual attacks.

  12. Application of FCT to Incompressible Flows

    National Research Council Canada - National Science Library

    Liu, Junhui; Kaplan, Carolyn R; Mott, David R; Oran, Elaine S

    2006-01-01

    Since an odd-even decoupling instability arises in standard algorithms that update a pressure correction in the Poisson equation, we have avoided this instability by using an intermediate velocity…

  13. Computational Modeling and Theoretical Calculations on the Interactions between Spermidine and Functional Monomer (Methacrylic Acid in a Molecularly Imprinted Polymer

    Directory of Open Access Journals (Sweden)

    Yujie Huang

    2015-01-01

    Full Text Available This paper theoretically investigates interactions between a template and a functional monomer required for synthesizing an efficient molecularly imprinted polymer (MIP). We employed density functional theory (DFT) to compute the geometry, single-point energy, and binding energy (ΔE) of an MIP system, where spermidine (SPD) and methacrylic acid (MAA) were selected as template and functional monomer, respectively. The geometry was calculated by using the B3LYP method with the 6-31+G(d) basis set. Furthermore, the 6-311++G(d,p) basis set was used to compute the single-point energy of the above geometry. The optimized geometries at different template-to-functional-monomer molar ratios, the mode of bonding between template and functional monomer, changes in natural bond orbital (NBO) charges, and the binding energy were analyzed. The simulation results show that SPD and MAA form a stable complex via hydrogen bonding. At a 1:5 SPD-to-MAA ratio, the binding energy is minimal, while the amount of charge transferred between the molecules is maximal; SPD and MAA form a stable complex at a 1:5 molar ratio through six hydrogen bonds. Optimizing the structure of the template-functional monomer complex through computational modeling prior to synthesis significantly contributes towards choosing a suitable template-functional monomer pair that yields an efficient MIP with high specificity and selectivity.
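The binding-energy bookkeeping used to rank template-to-monomer ratios is simply ΔE = E(complex) − [E(template) + n·E(monomer)]. A minimal sketch of that arithmetic; the numeric energies below are hypothetical placeholders, not DFT results from the study:

```python
# Minimal sketch of the ranking criterion; energies are hypothetical
# placeholders in arbitrary units, not values from the paper.
def binding_energy(e_complex, e_template, e_monomer, n_monomers):
    """Delta E = E(complex) - [E(template) + n * E(monomer)].

    More negative values indicate a more stable complex."""
    return e_complex - (e_template + n_monomers * e_monomer)

dE = binding_energy(-100.0, -60.0, -7.0, 5)   # -100 - (-60 - 35) = -5.0
```

In practice such raw ΔE values would also be corrected for basis set superposition error before ratios are compared.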

  14. VAT: a computational framework to functionally annotate variants in personal genomes within a cloud-computing environment.

    Science.gov (United States)

    Habegger, Lukas; Balasubramanian, Suganthi; Chen, David Z; Khurana, Ekta; Sboner, Andrea; Harmanci, Arif; Rozowsky, Joel; Clarke, Declan; Snyder, Michael; Gerstein, Mark

    2012-09-01

    The functional annotation of variants obtained through sequencing projects is generally assumed to be a simple intersection of genomic coordinates with genomic features. However, complexities arise for several reasons, including the differential effects of a variant on alternatively spliced transcripts, as well as the difficulty in assessing the impact of small insertions/deletions and large structural variants. Taking these factors into consideration, we developed the Variant Annotation Tool (VAT) to functionally annotate variants from multiple personal genomes at the transcript level as well as obtain summary statistics across genes and individuals. VAT also allows visualization of the effects of different variants, integrates allele frequencies and genotype data from the underlying individuals and facilitates comparative analysis between different groups of individuals. VAT can either be run through a command-line interface or as a web application. Finally, in order to enable on-demand access and to minimize unnecessary transfers of large data files, VAT can be run as a virtual machine in a cloud-computing environment. VAT is implemented in C and PHP. The VAT web service, Amazon Machine Image, source code and detailed documentation are available at vat.gersteinlab.org.
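As a toy illustration of the coordinate-intersection step the abstract describes as the naive starting point, the sketch below annotates variant positions against overlapping transcript features; the feature tuples and names are invented, not VAT's actual data model:

```python
# Minimal sketch of coordinate-based variant annotation: intersect variant
# positions with genomic features. Feature names are illustrative only.

def annotate(variants, features):
    """Return, for each variant position, the names of overlapping features."""
    out = {}
    for pos in variants:
        out[pos] = [name for (start, end, name) in features if start <= pos <= end]
    return out

features = [(100, 200, "exon1_txA"), (150, 400, "exon1_txB"), (500, 600, "exon2_txA")]
print(annotate([120, 160, 450], features))
```

Note how position 160 hits two alternatively spliced transcripts at once, the very complexity the abstract points out that transcript-level annotation must resolve.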

  15. Maximize Minimum Utility Function of Fractional Cloud Computing System Based on Search Algorithm Utilizing the Mittag-Leffler Sum

    Directory of Open Access Journals (Sweden)

    Rabha W. Ibrahim

    2018-01-01

    Full Text Available The maximum min utility function (MMUF) problem is an important representative of a large class of cloud computing systems (CCS), with numerous applications in practice, especially in economics and industry. This paper introduces an effective solution-based search (SBS) algorithm for solving the MMUF problem. First, we suggest a new formula for the utility function in terms of the capacity of the cloud. We formulate the capacity in CCS by using a fractional diffeo-integral equation; this equation usually describes the flow of CCS. The new formula of the utility function modifies recent active utility functions. The suggested technique first creates a high-quality initial solution by eliminating the less promising components, and then improves the quality of the achieved solution by the summation search solution (SSS). This method employs the Mittag-Leffler sum as a hash function to determine the position of the agent. Experimental results on benchmarks commonly utilized in the literature demonstrate that the proposed algorithm competes favorably with the state-of-the-art algorithms in terms of both solution quality and computational efficiency.
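The Mittag-Leffler sum named above can be illustrated with a direct truncated series; the term count and its use as a standalone function below are illustrative choices, not the paper's hashing scheme:

```python
import math

# Truncated Mittag-Leffler sum: E_alpha(z) ~= sum_{k=0}^{N-1} z^k / Gamma(alpha*k + 1).
# Shown only to illustrate the sum the abstract names; how the paper maps it
# to hash positions of search agents is not reproduced here.

def mittag_leffler(z, alpha, n_terms=60):
    return sum(z**k / math.gamma(alpha * k + 1.0) for k in range(n_terms))

# Sanity check: for alpha = 1 the series reduces to exp(z)
print(mittag_leffler(1.0, 1.0))  # close to e ~= 2.71828
```

For 0 < alpha < 1 the function interpolates between exponential and power-law behavior, which is what makes it natural for fractional (diffeo-integral) models of flow.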

  16. Report on evaluation of research and development of superhigh-function electronic computers; Chokoseino denshi keisanki no kenkyu kaihatsu ni kansuru hyoka hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1973-02-20

    Described herein is the development of superhigh-function electronic computers. This project was implemented as a 6-year joint effort of government, industry and academia, beginning in FY 1966, with the objective of developing standard large-size computers comparable with the world's most capable machines by the beginning of the 1970s. The computers developed by this project met almost all of the specifications of the world's representative large-size commercial computers, partly surpassing them. In particular, the integration of virtual memory, buffer memory and multi-processor functions, which were considered the central technical features of the next generation of computers, into one system was a uniquely Japanese concept, not seen in other countries. Other developments considered to have great ripple effects are the LSIs and the techniques for utilizing and mounting them and for improving their reliability. The development of magnetic discs is another notable result among the peripheral devices. Development of the input/output devices was undertaken to support inputting, outputting and reading Chinese characters, a requirement characteristic of Japan. The software developed has sufficient functions for common use and is considered a world-leading large-size operating system, although its full evaluation awaits results from actual applications. (NEDO)

  17. A BASIC program for an IBM PC compatible computer for drawing the weak phase object contrast transfer function

    International Nuclear Information System (INIS)

    Olsen, A.; Skjerpe, P.

    1989-01-01

    This report describes a computer program which is useful in high resolution microscopy. The program is written in BASIC and calculates the weak phase object contrast transfer function as a function of instrumental and imaging parameters. The function is plotted on the PC graphics screen, and by a Print Screen command the function can be copied to the printer. The program runs on both the Hercules graphics card and the IBM CGA card. 2 figs
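A minimal modern re-sketch of such a calculation is given below; the sign convention, the 200 kV default, and all parameter values in the demo loop are assumptions for illustration, not taken from the report:

```python
import math

# Weak-phase-object contrast transfer function, CTF(k) = -sin(chi(k)), under
# one common sign convention; parameter values here are illustrative only.

def wavelength_nm(voltage_kv):
    """Relativistic electron wavelength in nm for a given accelerating voltage."""
    v = voltage_kv * 1e3
    return 1.22639 / math.sqrt(v * (1.0 + 0.97845e-6 * v))

def ctf(k, defocus_nm, cs_mm, voltage_kv=200.0):
    """k is spatial frequency in 1/nm; defocus is negative for underfocus."""
    lam = wavelength_nm(voltage_kv)
    chi = math.pi * lam * defocus_nm * k**2 \
        + 0.5 * math.pi * cs_mm * 1e6 * lam**3 * k**4
    return -math.sin(chi)

for k in (0.0, 1.0, 2.0, 3.0):
    print(f"k = {k:.1f} 1/nm  CTF = {ctf(k, defocus_nm=-61.0, cs_mm=1.0):.3f}")
```

Plotting CTF(k) over a range of k reproduces the familiar oscillating transfer curve whose first zero crossing limits directly interpretable resolution.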

  18. RNAdualPF: software to compute the dual partition function with sample applications in molecular evolution theory.

    Science.gov (United States)

    Garcia-Martin, Juan Antonio; Bayegan, Amir H; Dotu, Ivan; Clote, Peter

    2016-10-19

    RNA inverse folding is the problem of finding one or more sequences that fold into a user-specified target structure s0, i.e. whose minimum free energy secondary structure is identical to the target s0. Here we consider the ensemble of all RNA sequences that have low free energy with respect to a given target s0. We introduce the program RNAdualPF, which computes the dual partition function Z*, defined as the sum of Boltzmann factors exp(-E(a,s0)/RT) of all RNA nucleotide sequences a compatible with target structure s0. Using RNAdualPF, we efficiently sample RNA sequences that approximately fold into s0, where additionally the user can specify IUPAC sequence constraints at certain positions, and whether to include dangles (energy terms for stacked, single-stranded nucleotides). Moreover, since we also compute the dual partition function Z*(k) over all sequences having GC-content k, the user can require that all sampled sequences have a precise, specified GC-content. Using Z*, we compute the dual expected energy ⟨E*⟩, and use it to show that natural RNAs from the Rfam 12.0 database have higher minimum free energy than expected, thus suggesting that functional RNAs are under evolutionary pressure to be only marginally thermodynamically stable. We show that C. elegans precursor microRNA (pre-miRNA) is significantly non-robust with respect to mutations, by comparing the robustness of each wild type pre-miRNA sequence with 2000 [resp. 500] sequences of the same GC-content generated by RNAdualPF, which approximately [resp. exactly] fold into the wild type target structure. We confirm and strengthen earlier findings that precursor microRNAs and bacterial small noncoding RNAs display plasticity, a measure of structural diversity. We describe RNAdualPF, which rapidly computes the dual partition function Z* and samples sequences having low energy with respect to a target structure, allowing sequence constraints and specified GC
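A brute-force toy version of the dual partition function clarifies the definition Z* = sum over compatible sequences of exp(-E(a,s0)/RT); the pair-energy table and the tiny structure below are invented and bear no resemblance to the Turner energy model RNAdualPF actually uses:

```python
import math
from itertools import product

# Toy dual partition function over all sequences compatible with a target
# structure. The pair energies (kcal/mol) are invented for illustration.

PAIR_E = {("G", "C"): -3.0, ("C", "G"): -3.0, ("A", "U"): -2.0,
          ("U", "A"): -2.0, ("G", "U"): -1.0, ("U", "G"): -1.0}
RT = 0.616  # kcal/mol near 37 C

def dual_Z(pairs, n):
    """Sum Boltzmann factors over all length-n sequences whose paired
    positions form one of the allowed base pairs."""
    z = 0.0
    for seq in product("ACGU", repeat=n):
        e, ok = 0.0, True
        for i, j in pairs:
            if (seq[i], seq[j]) not in PAIR_E:
                ok = False
                break
            e += PAIR_E[(seq[i], seq[j])]
        if ok:
            z += math.exp(-e / RT)
    return z

# Target: a 4-nt stem with pairs (0,3) and (1,2)
print(dual_Z(pairs=[(0, 3), (1, 2)], n=4))
```

Because the toy energy is a sum of independent pair terms, Z* factorizes into a per-pair Boltzmann sum raised to the number of pairs, a useful sanity check; RNAdualPF instead handles the non-independent loop-based energy model by dynamic programming.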

  19. Automatic quantitative analysis of liver functions by a computer system

    International Nuclear Information System (INIS)

    Shinpo, Takako

    1984-01-01

    In the previous paper, we confirmed the clinical usefulness of hepatic clearance (hepatic blood flow), i.e. the hepatic uptake and blood disappearance rate coefficients. These were obtained from the initial slope index of each minute during a period of five frames of a hepatogram after injecting 37 MBq of 99mTc-Sn-colloid. To obtain this information simply, rapidly and accurately, we developed an automatic quantitative analysis system for liver function. Information was obtained every quarter minute during a period of 60 frames of the sequential image. The sequential counts were measured for the heart, the whole liver, and the left and right lobes using a computer connected to a scintillation camera. We measured the effective hepatic blood flow as the disappearance rate multiplied by the percentage of hepatic uptake, defined as (liver counts)/(total counts of the field). Our method automatically records the graphs of the disappearance curve and uptake curve based on the heart and whole-liver counts, respectively, with computations written in BASIC. This method makes it possible to obtain an image of the initial uptake of 99mTc-Sn-colloid into the liver with a small dose. (author)
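The two derived quantities, percentage hepatic uptake and the initial-slope disappearance rate, can be sketched as follows; the time-activity counts are fabricated and the mono-exponential log-linear fit is an assumption:

```python
import math

# Sketch of the quantities described: percentage hepatic uptake and an
# initial-slope disappearance rate from time-activity curves (invented data).

def hepatic_uptake_percent(liver_counts, total_field_counts):
    return 100.0 * liver_counts / total_field_counts

def initial_slope_rate(times_min, heart_counts):
    """Disappearance rate k from a log-linear least-squares fit over the
    early frames, assuming simple mono-exponential blood clearance."""
    n = len(times_min)
    xs, ys = times_min, [math.log(c) for c in heart_counts]
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return -slope  # positive rate constant (1/min)

k = initial_slope_rate([0.25, 0.5, 0.75, 1.0, 1.25], [1000, 905, 819, 741, 670])
uptake = hepatic_uptake_percent(liver_counts=5200, total_field_counts=8000)
print(f"k = {k:.3f} /min, uptake = {uptake:.0f}%, EHBF index = {k * uptake:.1f}")
```

The product of the disappearance rate and the uptake percentage mirrors the effective hepatic blood flow index described in the abstract.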

  20. Brain-Computer Interface Controlled Functional Electrical Stimulation System for Ankle Movement

    Directory of Open Access Journals (Sweden)

    King Christine E

    2011-08-01

    Full Text Available Abstract Background Many neurological conditions, such as stroke, spinal cord injury, and traumatic brain injury, can cause chronic gait function impairment due to foot-drop. Current physiotherapy techniques provide only a limited degree of motor function recovery in these individuals, and therefore novel therapies are needed. Brain-computer interface (BCI) is a relatively novel technology with a potential to restore, substitute, or augment lost motor behaviors in patients with neurological injuries. Here, we describe the first successful integration of a noninvasive electroencephalogram (EEG)-based BCI with a noninvasive functional electrical stimulation (FES) system that enables the direct brain control of foot dorsiflexion in able-bodied individuals. Methods A noninvasive EEG-based BCI system was integrated with a noninvasive FES system for foot dorsiflexion. Subjects underwent computer-cued epochs of repetitive foot dorsiflexion and idling while their EEG signals were recorded and stored for offline analysis. The analysis generated a prediction model that allowed EEG data to be analyzed and classified in real time during online BCI operation. The real-time online performance of the integrated BCI-FES system was tested in a group of five able-bodied subjects who used repetitive foot dorsiflexion to elicit BCI-FES mediated dorsiflexion of the contralateral foot. Results Five able-bodied subjects performed 10 alternations of idling and repetitive foot dorsiflexion to trigger BCI-FES mediated dorsiflexion of the contralateral foot. The epochs of BCI-FES mediated foot dorsiflexion were highly correlated with the epochs of voluntary foot dorsiflexion (correlation coefficient ranged between 0.59 and 0.77) with latencies ranging from 1.4 sec to 3.1 sec. In addition, all subjects achieved a 100% BCI-FES response (no omissions), and one subject had a single false alarm. Conclusions This study suggests that the integration of a noninvasive BCI with a lower

  1. Brain-computer interface controlled functional electrical stimulation system for ankle movement.

    Science.gov (United States)

    Do, An H; Wang, Po T; King, Christine E; Abiri, Ahmad; Nenadic, Zoran

    2011-08-26

    Many neurological conditions, such as stroke, spinal cord injury, and traumatic brain injury, can cause chronic gait function impairment due to foot-drop. Current physiotherapy techniques provide only a limited degree of motor function recovery in these individuals, and therefore novel therapies are needed. Brain-computer interface (BCI) is a relatively novel technology with a potential to restore, substitute, or augment lost motor behaviors in patients with neurological injuries. Here, we describe the first successful integration of a noninvasive electroencephalogram (EEG)-based BCI with a noninvasive functional electrical stimulation (FES) system that enables the direct brain control of foot dorsiflexion in able-bodied individuals. A noninvasive EEG-based BCI system was integrated with a noninvasive FES system for foot dorsiflexion. Subjects underwent computer-cued epochs of repetitive foot dorsiflexion and idling while their EEG signals were recorded and stored for offline analysis. The analysis generated a prediction model that allowed EEG data to be analyzed and classified in real time during online BCI operation. The real-time online performance of the integrated BCI-FES system was tested in a group of five able-bodied subjects who used repetitive foot dorsiflexion to elicit BCI-FES mediated dorsiflexion of the contralateral foot. Five able-bodied subjects performed 10 alternations of idling and repetitive foot dorsiflexion to trigger BCI-FES mediated dorsiflexion of the contralateral foot. The epochs of BCI-FES mediated foot dorsiflexion were highly correlated with the epochs of voluntary foot dorsiflexion (correlation coefficient ranged between 0.59 and 0.77) with latencies ranging from 1.4 sec to 3.1 sec. In addition, all subjects achieved a 100% BCI-FES response (no omissions), and one subject had a single false alarm. This study suggests that the integration of a noninvasive BCI with a lower-extremity FES system is feasible. 
With additional modifications
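The reported correlation-and-latency analysis can be approximated by scanning lags of a binary activity trace, as sketched below on synthetic epoch signals; the lag-scan method is an illustrative stand-in, not necessarily the authors' exact procedure:

```python
# Pearson correlation and latency between voluntary and BCI-FES-mediated
# dorsiflexion epochs, estimated by scanning lags. Signals are synthetic.

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

def best_lag(ref, resp, max_lag):
    """Lag (in samples) at which resp best matches ref."""
    scores = {}
    for lag in range(max_lag + 1):
        shifted = resp[lag:] + [0] * lag  # shift resp earlier by `lag`
        scores[lag] = pearson(ref, shifted)
    return max(scores, key=scores.get), scores

# 1 Hz samples: voluntary epochs, and a response delayed by 2 s
voluntary = [0]*5 + [1]*10 + [0]*10 + [1]*10 + [0]*5
response  = [0]*7 + [1]*10 + [0]*10 + [1]*10 + [0]*3
lag, scores = best_lag(voluntary, response, max_lag=4)
print(f"estimated latency: {lag} s, correlation at best lag: {scores[lag]:.2f}")
```

On real EEG-triggered FES data, the same scan applied to binarized stimulation and cue traces would yield the per-subject latencies the abstract reports.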

  2. Effects of a Computer-Based Intervention Program on the Communicative Functions of Children with Autism

    Science.gov (United States)

    Hetzroni, Orit E.; Tannous, Juman

    2004-01-01

    This study investigated the use of computer-based intervention for enhancing communication functions of children with autism. The software program was developed based on daily life activities in the areas of play, food, and hygiene. The following variables were investigated: delayed echolalia, immediate echolalia, irrelevant speech, relevant…

  3. Exchange anisotropy as a probe of antiferromagnetism in expanded face-centered-tetragonal Mn(001) layers

    NARCIS (Netherlands)

    Kohlhepp, J.T.; Wieldraaijer, H.; Jonge, de W.J.M.

    2006-01-01

    Manganese (Mn) grows coherently and with an expanded metastable face-centered-tetragonal (e-fct) structure on ultrathin fct Co(001)/Cu(001) template layers. From the temperature dependence of the observed unidirectional Mn/Co interface exchange anisotropy, an antiferromagnetic state with a blocking temperature

  4. On the predictability of high water level along the US East Coast: can the Florida Current measurement be an indicator for flooding caused by remote forcing?

    Science.gov (United States)

    Ezer, Tal; Atkinson, Larry P.

    2017-06-01

    Recent studies show that in addition to wind and air pressure effects, a significant portion of the variability of coastal sea level (CSL) along the US East Coast can be attributed to non-local factors such as variations in the Gulf Stream and the North Atlantic circulation; these variations can cause unpredictable coastal flooding. The Florida Current transport (FCT) measurement across the Florida Straits monitors those variations, and thus, the study evaluated the potential of using the FCT as an indicator for anomalously high water level along the coast. Hourly water level data from 12 tide gauge stations over 12 years are used to construct records of maximum daily water levels (MDWL) that are compared with the daily FCT data. An empirical mode decomposition (EMD) approach is used to divide the data into high-frequency modes (periods T ...) and low-frequency modes; the FCT is anti-correlated with MDWL in high-frequency modes but positively correlated with MDWL in low-frequency modes. FCC on the other hand is always anti-correlated with MDWL for all frequency bands, and the high water signal lags behind FCC for almost all stations, thus providing a potential predictive skill (i.e., whenever a weakening trend is detected in the FCT, anomalously high water is expected along the coast over the next few days). The MDWL-FCT correlation in the high-frequency modes is maximum in the lower Mid-Atlantic Bight, suggesting influence from the meandering Gulf Stream after it separates from the coast. However, the correlation in low-frequency modes is maximum in the South Atlantic Bight, suggesting impact from variations in the wind pattern over subtropical regions. The middle-frequency and low-frequency modes of the FCT seem to provide the best predictor for medium to large flooding events; it is estimated that ~10-25% of the sea level variability in those modes can be attributed to variations in the FCT. 
An example from Hurricane Joaquin (September-October, 2015) demonstrates how an offshore storm that never made
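Empirical mode decomposition itself is too involved for a short sketch, but the band-splitting idea, separating a low-frequency component and correlating it with water level, can be illustrated with a crude moving-average filter on synthetic data; all series and coefficients below are invented:

```python
import math

# Crude stand-in for EMD: split a synthetic "FCT" series into low- and
# high-frequency parts with a moving average, then correlate the slow part
# with a synthetic water-level series. All numbers are invented.

def moving_average(x, w):
    half = w // 2
    return [sum(x[max(0, i - half):i + half + 1]) /
            len(x[max(0, i - half):i + half + 1]) for i in range(len(x))]

def corr(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x) ** 0.5
    vy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (vx * vy)

days = range(120)
slow = [3 * math.sin(2 * math.pi * d / 60) for d in days]   # 60-day mode
fast = [2 * math.sin(2 * math.pi * d / 5) for d in days]    # 5-day mode
fct = [30 + s + f for s, f in zip(slow, fast)]              # transport (Sv)
mdwl = [1.0 - 0.1 * s for s in slow]  # high water anti-correlated with slow mode

fct_low = moving_average(fct, 15)
print(f"corr(FCT, MDWL)      = {corr(fct, mdwl):.2f}")
print(f"corr(FCT_low, MDWL)  = {corr(fct_low, mdwl):.2f}")
```

Filtering out the fast mode strengthens the (negative) correlation, mimicking how band-separated FCT modes reveal relationships that the raw series obscures.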

  5. Logical and physical resource management in the common node of a distributed function laboratory computer network

    International Nuclear Information System (INIS)

    Stubblefield, F.W.

    1976-01-01

    A scheme for managing resources required for transaction processing in the common node of a distributed function computer system has been given. The scheme has been found to be satisfactory for all common node services provided so far.

  6. Application of computational thermodynamics to the determination of thermophysical properties as a function of temperature for multicomponent Al-based alloys

    International Nuclear Information System (INIS)

    Nascimento, Fabiana C.; Paresque, Mara C.C.; Castro, José A. de; Jácome, Paulo A.D.; Garcia, Amauri; Ferreira, Ivaldo L.

    2015-01-01

    Highlights: • A model coupled to a computational thermodynamics software is proposed to compute thermophysical properties. • The model applies to multicomponent alloys and has been validated against experimental results. • Density and specific heat as a function of temperature are computed for Al–Si–Cu alloys. - Abstract: Despite the technological importance of Al–Si–Cu alloys in manufacturing processes involving heat transfer, such as welding, casting and heat treatment, thermophysical properties of this system of alloys are very scarce in the literature. In this paper, a model connected to a computational thermodynamics software is proposed permitting density and specific heats as a function of temperature and enthalpy of transformations to be numerically determined. The model is pre-validated against experimental density as a function of temperature for liquid and solid phases of A319 and 7075 alloys found in the literature and validated against experimental density values for the solid phase of an Al-6 wt%Cu-1 wt%Si alloy determined in the present study. In both cases the numerical predictions are in good agreement with the experimental results. Specific heat and temperatures and heats of transformation are also numerically determined for this ternary Al-based alloy.
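Once a thermodynamic calculation supplies phase fractions and phase densities along the temperature path, the alloy density at each temperature follows from a mixture rule; the sketch below assumes volume fractions and uses invented placeholder values, not data from the paper:

```python
# Hedged sketch of the mixture-rule step: alloy density from phase volume
# fractions and phase densities at one temperature. Values are invented.

def alloy_density(phase_fractions, phase_densities):
    """rho = sum_i f_i * rho_i, with volume fractions f_i summing to 1."""
    assert abs(sum(phase_fractions) - 1.0) < 1e-9
    return sum(f * r for f, r in zip(phase_fractions, phase_densities))

# Example: 80% Al-rich FCC matrix + 20% Al2Cu-type phase (placeholder values)
rho = alloy_density([0.8, 0.2], [2650.0, 4350.0])  # kg/m3
print(f"{rho:.0f} kg/m3")
```

Repeating this at each temperature step, with fractions and densities supplied by the computational-thermodynamics software, yields the density-versus-temperature curves the paper validates against experiment.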

  7. Application of computational thermodynamics to the determination of thermophysical properties as a function of temperature for multicomponent Al-based alloys

    Energy Technology Data Exchange (ETDEWEB)

    Nascimento, Fabiana C. [Fluminense Federal University, Graduate Program in Metallurgical Engineering, Av. dos Trabalhadores, 420-27255-125 Volta Redonda, RJ (Brazil); Paresque, Mara C.C. [Fluminense Federal University, Graduate Program in Mechanical Engineering, Av. dos Trabalhadores, 420-27255-125 Volta Redonda, RJ (Brazil); Castro, José A. de [Fluminense Federal University, Graduate Program in Metallurgical Engineering, Av. dos Trabalhadores, 420-27255-125 Volta Redonda, RJ (Brazil); Jácome, Paulo A.D. [Fluminense Federal University, Graduate Program in Mechanical Engineering, Av. dos Trabalhadores, 420-27255-125 Volta Redonda, RJ (Brazil); Garcia, Amauri, E-mail: amaurig@fem.unicamp.br [University of Campinas – UNICAMP, Department of Manufacturing and Materials Engineering, 13083-860 Campinas, SP (Brazil); Ferreira, Ivaldo L. [Fluminense Federal University, Graduate Program in Mechanical Engineering, Av. dos Trabalhadores, 420-27255-125 Volta Redonda, RJ (Brazil)

    2015-11-10

    Highlights: • A model coupled to a computational thermodynamics software is proposed to compute thermophysical properties. • The model applies to multicomponent alloys and has been validated against experimental results. • Density and specific heat as a function of temperature are computed for Al–Si–Cu alloys. - Abstract: Despite the technological importance of Al–Si–Cu alloys in manufacturing processes involving heat transfer, such as welding, casting and heat treatment, thermophysical properties of this system of alloys are very scarce in the literature. In this paper, a model connected to a computational thermodynamics software is proposed permitting density and specific heats as a function of temperature and enthalpy of transformations to be numerically determined. The model is pre-validated against experimental density as a function of temperature for liquid and solid phases of A319 and 7075 alloys found in the literature and validated against experimental density values for the solid phase of an Al-6 wt%Cu-1 wt%Si alloy determined in the present study. In both cases the numerical predictions are in good agreement with the experimental results. Specific heat and temperatures and heats of transformation are also numerically determined for this ternary Al-based alloy.

  8. Structure, dynamics, and function of the monooxygenase P450 BM-3: insights from computer simulations studies

    International Nuclear Information System (INIS)

    Roccatano, Danilo

    2015-01-01

    The monooxygenase P450 BM-3 is a NADPH-dependent fatty acid hydroxylase enzyme isolated from the soil bacterium Bacillus megaterium. As a pivotal member of the cytochrome P450 superfamily, it has been intensively studied to elucidate structure–dynamics–function relationships in this class of enzymes. In addition, due to its peculiar properties, it is also a promising enzyme for biochemical and biomedical applications. However, despite these efforts, a full understanding of the enzyme's structure and dynamics has not yet been achieved. Computational studies, particularly molecular dynamics (MD) simulations, have contributed significantly to this endeavor by providing new insights at an atomic level regarding the correlations between structure, dynamics, and function of the protein. This topical review summarizes computational studies based on MD simulations of the cytochrome P450 BM-3 and gives an outlook on future directions. (topical review)

  9. A Dynamic Connectome Supports the Emergence of Stable Computational Function of Neural Circuits through Reward-Based Learning.

    Science.gov (United States)

    Kappel, David; Legenstein, Robert; Habenschuss, Stefan; Hsieh, Michael; Maass, Wolfgang

    2018-01-01

    Synaptic connections between neurons in the brain are dynamic because of continuously ongoing spine dynamics, axonal sprouting, and other processes. In fact, it was recently shown that the spontaneous synapse-autonomous component of spine dynamics is at least as large as the component that depends on the history of pre- and postsynaptic neural activity. These data are inconsistent with common models for network plasticity and raise the following questions: how can neural circuits maintain a stable computational function in spite of these continuously ongoing processes, and what could be the functional uses of these ongoing processes? Here, we present a rigorous theoretical framework for these seemingly stochastic spine dynamics and rewiring processes in the context of reward-based learning tasks. We show that spontaneous synapse-autonomous processes, in combination with reward signals such as dopamine, can explain the capability of networks of neurons in the brain to configure themselves for specific computational tasks, and to compensate automatically for later changes in the network or task. Furthermore, we show theoretically and through computer simulations that stable computational performance is compatible with continuously ongoing synapse-autonomous changes. After good computational performance is reached, these ongoing processes cause primarily a slow drift of network architecture and dynamics in task-irrelevant dimensions, as observed for neural activity in motor cortex and other areas. On the more abstract level of reinforcement learning, the resulting model gives rise to an understanding of reward-driven network plasticity as continuous sampling of network configurations.

  10. Probing the mutational interplay between primary and promiscuous protein functions: a computational-experimental approach.

    Science.gov (United States)

    Garcia-Seisdedos, Hector; Ibarra-Molero, Beatriz; Sanchez-Ruiz, Jose M

    2012-01-01

    Protein promiscuity is of considerable interest due to its role in adaptive metabolic plasticity, its fundamental connection with molecular evolution and also because of its biotechnological applications. Current views on the relation between primary and promiscuous protein activities stem largely from laboratory evolution experiments aimed at increasing promiscuous activity levels. Here, on the other hand, we attempt to assess the main features of the simultaneous modulation of the primary and promiscuous functions during the course of natural evolution. The computational/experimental approach we propose for this task involves the following steps: a function-targeted, statistical coupling analysis of evolutionary data is used to determine a set of positions likely linked to the recruitment of a promiscuous activity for a new function; a combinatorial library of mutations on this set of positions is prepared and screened for both the primary and the promiscuous activities; a partial-least-squares reconstruction of the full combinatorial space is carried out; finally, an approximation to the Pareto set of variants with optimal primary/promiscuous activities is derived. Application of the approach to the emergence of folding catalysis in thioredoxin scaffolds reveals an unanticipated scenario: diverse patterns of primary/promiscuous activity modulation are possible, including a moderate (but likely significant in a biological context) simultaneous enhancement of both activities. We show that this scenario can be most simply explained on the basis of the conformational diversity hypothesis, although alternative interpretations cannot be ruled out. Overall, the results reported may help clarify the mechanisms of the evolution of new functions. From a different viewpoint, the partial-least-squares-reconstruction/Pareto-set-prediction approach we have introduced provides the computational basis for an efficient directed-evolution protocol aimed at the simultaneous

  11. Cranial computed tomography associated with development of functional dependence in a community-based elderly population

    International Nuclear Information System (INIS)

    Tsukishima, Eri; Shido, Koichi

    2002-01-01

    The purpose of this study was to investigate whether changes seen at computed tomography (CT) imaging of the ageing brain are associated with future risk of functional dependence. One hundred sixty residents aged 69 years and older, who were living independently in a rural community in Hokkaido, Japan, underwent cranial CT. Cranial CT was performed between 1991 and 1993 and graded for ventricular enlargement, sulcal enlargement, white matter change, and small infarction. Functional status was reassessed in 1998 in each participant. Multiple logistic regression analysis was performed to estimate the association of CT changes in the ageing brain with the development of functional dependence over six years. Functional dependence was found in 19 residents at the second survey. After adjusting for age, sex, medical conditions, and cognitive functioning, small infarction and ventricular enlargement were significantly associated with the development of functional dependence (adjusted odds ratios = 9.27 and 4.62). After controlling for age, the age-related changes on cranial CT had a significant association with the development of functional dependence. (author)
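For orientation, an unadjusted odds ratio from a 2x2 table can be computed as below; the study's reported values came from multiple logistic regression with covariate adjustment, which this simple calculation does not reproduce, and the counts are invented:

```python
import math

# Unadjusted odds ratio and a Woolf-type 95% CI for a 2x2 table with
# invented counts (exposure = CT finding, outcome = functional dependence).

def odds_ratio(a, b, c, d):
    """OR = (a*d)/(b*c) for the table
    [[exposed with outcome a, exposed without outcome b],
     [unexposed with outcome c, unexposed without outcome d]]."""
    return (a * d) / (b * c)

def ci95_log(a, b, c, d):
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    log_or = math.log(odds_ratio(a, b, c, d))
    return (math.exp(log_or - 1.96 * se), math.exp(log_or + 1.96 * se))

orr = odds_ratio(8, 22, 11, 119)
lo, hi = ci95_log(8, 22, 11, 119)
print(f"OR = {orr:.2f}, 95% CI = ({lo:.2f}, {hi:.2f})")
```

A logistic regression with age, sex, and the other covariates in the model would shrink or inflate this crude estimate toward the adjusted values reported.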

  12. A MATLAB-based graphical user interface program for computing functionals of the geopotential up to ultra-high degrees and orders

    Science.gov (United States)

    Bucha, Blažej; Janák, Juraj

    2013-07-01

    We present a novel graphical user interface program GrafLab (GRAvity Field LABoratory) for spherical harmonic synthesis (SHS) created in MATLAB®. This program allows the user to comfortably compute 38 various functionals of the geopotential up to ultra-high degrees and orders of spherical harmonic expansion. For the most difficult part of the SHS, namely the evaluation of the fully normalized associated Legendre functions (fnALFs), we used three different approaches according to the required maximum degree: (i) the standard forward column method (up to maximum degree 1800, in some cases up to degree 2190); (ii) the modified forward column method combined with Horner's scheme (up to maximum degree 2700); (iii) the extended-range arithmetic (up to an arbitrary maximum degree). For the maximum degree 2190, the SHS with fnALFs evaluated using the extended-range arithmetic approach takes only approximately 2-3 times longer than its standard arithmetic counterpart, i.e. the standard forward column method. In GrafLab, the functionals of the geopotential can be evaluated on a regular grid or point-wise, while the input coordinates can either be read from a data file or entered manually. For the computation on a regular grid we decided to apply the lumped coefficients approach due to the significant time-efficiency of this method. Furthermore, if a full variance-covariance matrix of spherical harmonic coefficients is available, it is possible to compute the commission errors of the functionals. When computing on a regular grid, the output functionals or their commission errors may be depicted on a map using an automatically selected cartographic projection.
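The standard forward column method mentioned as approach (i) can be sketched directly; this is a plain double-precision version (hence limited to moderate degrees), using recursion coefficients as commonly published for 4π-normalized functions:

```python
import math

# Forward-column recursion for fully normalized (4-pi) associated Legendre
# functions: seed the sectorials, then recurse down each column in n.
# Plain floats, without Horner's scheme or extended-range arithmetic.

def fnalf(nmax, theta):
    """Return P[n][m] for 0 <= m <= n <= nmax at colatitude theta (radians)."""
    t, u = math.cos(theta), math.sin(theta)
    P = [[0.0] * (n + 1) for n in range(nmax + 1)]
    P[0][0] = 1.0
    if nmax >= 1:
        P[1][0] = math.sqrt(3.0) * t
        P[1][1] = math.sqrt(3.0) * u
    for n in range(2, nmax + 1):
        # sectorial seed, then the forward column for m < n
        P[n][n] = u * math.sqrt((2 * n + 1) / (2.0 * n)) * P[n - 1][n - 1]
        for m in range(n - 1, -1, -1):
            a = math.sqrt((2 * n - 1) * (2 * n + 1) / ((n - m) * (n + m)))
            pnm = a * t * P[n - 1][m]
            if m < n - 1:
                b = math.sqrt((2 * n + 1) * (n + m - 1) * (n - m - 1)
                              / ((n - m) * (n + m) * (2 * n - 3)))
                pnm -= b * P[n - 2][m]
            P[n][m] = pnm
    return P

# The closure identity sum_m P[n][m]^2 = 2n + 1 is a handy correctness check
P = fnalf(5, 0.7)
print(sum(p * p for p in P[5]))  # ~ 11
```

At very high degrees the sectorial seeds underflow near the poles, which is exactly why the extended-range arithmetic of approach (iii) becomes necessary.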

  13. Principals' Personal Variables and Information and Communication Technology Utilization in Federal Capital Territory Senior Secondary Schools, Abuja, Nigeria

    Science.gov (United States)

    Ogunshola, Roseline Folashade; Adeniyi, Abiodun

    2017-01-01

    The study investigated principals' personal variables and information and communication technology utilization in Federal Capital Territory (FCT) senior secondary schools, Abuja, Nigeria. The study adopted the correlational research design. The study used a sample of 94 senior secondary schools (including public and private) in FCT. Stratified…

  14. An Algorithm Computing the Local $b$ Function by an Approximate Division Algorithm in $\\hat{\\mathcal{D}}$

    OpenAIRE

    Nakayama, Hiromasa

    2006-01-01

    We give an algorithm to compute the local $b$ function. In this algorithm, we use the Mora division algorithm in the ring of differential operators and an approximate division algorithm in the ring of differential operators with power series coefficients.

  15. Morphological and Functional Evaluation of Quadricuspid Aortic Valves Using Cardiac Computed Tomography

    Energy Technology Data Exchange (ETDEWEB)

    Song, Inyoung; Park, Jung Ah; Choi, Bo Hwa; Ko, Sung Min [Department of Radiology, Konkuk University Medical Center, Konkuk University School of Medicine, Seoul 05030 (Korea, Republic of); Shin, Je Kyoun; Chee, Hyun Keun; Kim, Jun Seok [Department of Thoracic Surgery, Konkuk University Medical Center, Konkuk University School of Medicine, Seoul 05030 (Korea, Republic of)

    2016-11-01

    The aim of this study was to identify the morphological and functional characteristics of quadricuspid aortic valves (QAV) on cardiac computed tomography (CCT). We retrospectively enrolled 11 patients with QAV. All patients underwent CCT and transthoracic echocardiography (TTE), and 7 patients underwent cardiovascular magnetic resonance (CMR). The presence and classification of QAV assessed by CCT was compared with that of TTE and intraoperative findings. The regurgitant orifice area (ROA) measured by CCT was compared with severity of aortic regurgitation (AR) by TTE and the regurgitant fraction (RF) by CMR. All of the patients had AR; 9 had pure AR, 1 had combined aortic stenosis and regurgitation, and 1 had combined subaortic stenosis and regurgitation. Two patients had a subaortic fibrotic membrane and 1 of them showed a subaortic stenosis. One QAV was misdiagnosed as tricuspid aortic valve on TTE. In accordance with the Hurwitz and Roberts classification, consensus was reached on the QAV classification between the CCT and TTE findings in 7 of 10 patients. The patients were classified as type A (n = 1), type B (n = 3), type C (n = 1), type D (n = 4), and type F (n = 2) on CCT. A very high correlation existed between ROA by CCT and RF by CMR (r = 0.99) but a good correlation existed between ROA by CCT and regurgitant severity by TTE (r = 0.62). Cardiac computed tomography provides comprehensive anatomical and functional information about the QAV.

  17. Gravity-supported exercise with computer gaming improves arm function in chronic stroke.

    Science.gov (United States)

    Jordan, Kimberlee; Sampson, Michael; King, Marcus

    2014-08-01

    To investigate the effect of 4 to 6 weeks of exergaming with a computer mouse embedded within an arm skate on upper limb function in survivors of chronic stroke. Intervention study with a 4-week postintervention follow-up. In home. Survivors (N=13) of chronic (≥6 mo) stroke with hemiparesis of the upper limb with stable baseline Fugl-Meyer assessment scores received the intervention. One participant withdrew, and 2 participants were not reassessed at the 4-week follow-up. No participants withdrew as a result of adverse effects. Four to 6 weeks of exergaming using the arm skate where participants received either 9 (n=5) or 16 (n=7) hours of game play. Upper limb component of the Fugl-Meyer assessment. There was an average increase in the Fugl-Meyer upper limb assessment score from the beginning to end of the intervention of 4.9 points. At the end of the 4-week period after the intervention, the increase was 4.4 points. A 4- to 6-week intervention using the arm skate significantly improved arm function in survivors of chronic stroke by an average of 4.9 Fugl-Meyer upper limb assessment points. This research shows that a larger-scale randomized trial of this device is warranted and highlights the potential value of using virtual reality technology (eg, computer games) in a rehabilitation setting. Copyright © 2014 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.

  18. Assessment of an extended Nijboer-Zernike approach for the computation of optical point-spread functions.

    Science.gov (United States)

    Braat, Joseph; Dirksen, Peter; Janssen, Augustus J E M

    2002-05-01

    We assess the validity of an extended Nijboer-Zernike approach [J. Opt. Soc. Am. A 19, 849 (2002)], based on recently found Bessel-series representations of diffraction integrals comprising an arbitrary aberration and a defocus part, for the computation of optical point-spread functions of circular, aberrated optical systems. These new series representations yield a flexible means to compute optical point-spread functions, both accurately and efficiently, under defocus and aberration conditions that seem to cover almost all cases of practical interest. Because of the analytical nature of the formulas, there are no discretization effects limiting the accuracy, as opposed to the more commonly used numerical packages based on strictly numerical integration methods. Instead, we have an easily managed criterion, expressed in the number of terms to be included in the Bessel-series representations, guaranteeing the desired accuracy. For this reason, the analytical method can also serve as a calibration tool for the numerically based methods. The analysis is not limited to pointlike objects but can also be used for extended objects under various illumination conditions. The calculation schemes are simple and permit one to trace the relative strength of the various interfering complex-amplitude terms that contribute to the final image intensity function.
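
    The "number of terms" accuracy criterion described here can be illustrated with an ordinary Bessel-function power series, truncated once the next term drops below a tolerance (a generic sketch, not the paper's Nijboer-Zernike expansion):

```python
import math

def bessel_j(n, x, tol=1e-12, max_terms=100):
    """Bessel function J_n(x) from its power series
        J_n(x) = sum_k (-1)^k / (k! (k+n)!) * (x/2)^(2k+n),
    truncated once a term falls below `tol`.
    Returns (value, number_of_terms_used)."""
    total = 0.0
    k = 0
    while k < max_terms:
        term = ((-1) ** k / (math.factorial(k) * math.factorial(k + n))) * (x / 2) ** (2 * k + n)
        total += term
        if abs(term) < tol:
            break
        k += 1
    return total, k + 1

val, terms = bessel_j(0, 2.0)  # J_0(2) with an explicit term-count criterion
```

    The term count plays the role of the paper's accuracy criterion: the truncation error is controlled directly by how many series terms are kept.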

  19. Fast and accurate three-dimensional point spread function computation for fluorescence microscopy.

    Science.gov (United States)

    Li, Jizhou; Xue, Feng; Blu, Thierry

    2017-06-01

    The point spread function (PSF) plays a fundamental role in fluorescence microscopy. A realistic and accurately calculated PSF model can significantly improve the performance in 3D deconvolution microscopy and also the localization accuracy in single-molecule microscopy. In this work, we propose a fast and accurate approximation of the Gibson-Lanni model, which has been shown to represent the PSF suitably under a variety of imaging conditions. We express Kirchhoff's integral in this model as a linear combination of rescaled Bessel functions, thus providing an integral-free way for the calculation. The explicit approximation error in terms of parameters is given numerically. Experiments demonstrate that the proposed approach results in a significantly smaller computational time compared with current state-of-the-art techniques to achieve the same accuracy. This approach can also be extended to other microscopy PSF models.

  20. A computer vision based candidate for functional balance test.

    Science.gov (United States)

    Nalci, Alican; Khodamoradi, Alireza; Balkan, Ozgur; Nahab, Fatta; Garudadri, Harinath

    2015-08-01

    Balance in humans is a motor skill based on complex multimodal sensing, processing and control. The ability to maintain balance in activities of daily living (ADL) is compromised by aging, diseases, injuries and environmental factors. The Centers for Disease Control and Prevention (CDC) estimated the cost of falls among older adults at $34 billion in 2013, a figure expected to reach $54.9 billion in 2020. In this paper, we present a brief review of balance impairments followed by subjective and objective tools currently used in clinical settings for human balance assessment. We propose a novel computer vision (CV) based approach as a candidate for a functional balance test. The test takes less than a minute to administer and is expected to be objective, repeatable and highly discriminative in quantifying the ability to maintain posture and balance. We present an informal study with preliminary data from 10 healthy volunteers, and compare performance with a balance assessment system called the BTrackS Balance Assessment Board. Our results show a high degree of correlation with BTrackS. The proposed system promises to be a good candidate for objective functional balance tests and warrants further investigation to assess validity in clinical settings, including acute care, long term care and assisted living care facilities. Our long term goals include non-intrusive approaches to assess balance competence during ADL in independent living environments.
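
    Force-plate balance systems such as BTrackS typically summarize postural sway from the centre-of-pressure (COP) trajectory; one common summary is the total sway path length. A minimal sketch with made-up samples (not the study's data):

```python
import math

def sway_path_length(cop_xy):
    """Total path length of a centre-of-pressure (COP) trajectory,
    a common postural-sway summary in force-plate balance testing:
    the sum of Euclidean distances between consecutive samples."""
    return sum(
        math.hypot(x1 - x0, y1 - y0)
        for (x0, y0), (x1, y1) in zip(cop_xy, cop_xy[1:])
    )

# Hypothetical 4-sample COP trace in centimetres:
trace = [(0.0, 0.0), (0.3, 0.4), (0.3, 1.4), (1.1, 2.0)]
length = sway_path_length(trace)  # larger values indicate more sway
```

    A CV-based candidate test would compute an analogous sway statistic from tracked body or head positions instead of force-plate readings.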

  1. Neuromorphological and wiring pattern alterations effects on brain function: a mixed experimental and computational approach.

    Directory of Open Access Journals (Sweden)

    Linus Manubens-Gil

    2015-04-01

    In addition, the study of fixed intact brains (by means of the state-of-the-art CLARITY technique) brings us closer to biologically and medically relevant situations, allowing us not only to confirm whether the functional links seen in neuronal cultures are also present in vivo, but also to introduce functional information (like behavioral studies and functional imaging) and another layer of structural alterations such as brain region morphology, neuronal density, and long-range connectivity. Taking together the experimental information from these systems, we want to feed self-developed computational models that allow us to understand the fundamental characteristics of the observed connectivity patterns and the impact of each of the alterations on neuronal network function. These models will also provide a framework able to account for the emergent properties that bridge the gap between spontaneous electrical activity arousal/transmission and higher-order information processing and memory storage capacities in the brain. As an additional part of the project, we are now working on the application of the clearing, labeling and imaging protocols to human biopsy samples. Our aim is to obtain neuronal architecture and connectivity information from focal cortical dysplasia microcircuits, using samples from intractable temporal lobe epilepsy patients who undergo deep-brain electrode recording diagnosis and posterior surgical extraction of the tissue. Our computational models can allow us to discern the contributions of the observed abnormalities to neuronal hyperactivity and epileptic seizure generation.

  2. Modeling of the Kinetics of Supercritical Fluid Extraction of Lipids from Microalgae with Emphasis on Extract Desorption.

    Czech Academy of Sciences Publication Activity Database

    Sovová, Helena; Nobre, B.P.; Palavra, A.

    2016-01-01

    Roč. 9, č. 6 (2016), s. 423-441 ISSN 1996-1944 Grant - others:FCT(PT) UID/QUI/00100/2013; FCT(PT) SFRH/BPD/100283/2014 Institutional support: RVO:67985858 Keywords : microalgae * supercritical extraction * kinetics Subject RIV: CI - Industrial Chemistry, Chemical Engineering Impact factor: 2.654, year: 2016

  3. Assessment of teacher librarian job satisfaction in the Federal ...

    African Journals Online (AJOL)

    This study assessed job satisfaction of teacher librarians in the Federal Capital Territory (FCT). The entire population of 164 teacher librarians from all secondary schools within the FCT was used. One objective and a hypothesis were formulated to guide this study. They were analysed using percentages represented on ...

  4. Computational modeling of heterogeneity and function of CD4+ T cells

    Directory of Open Access Journals (Sweden)

    Adria eCarbo

    2014-07-01

    Full Text Available The immune system is composed of many different cell types and hundreds of intersecting molecular pathways and signals. This large biological complexity requires coordination between distinct pro-inflammatory and regulatory cell subsets to respond to infection while maintaining tissue homeostasis. CD4+ T cells play a central role in orchestrating immune responses and in maintaining a balance between pro- and anti-inflammatory responses. This tight balance between regulatory and effector reactions depends on the ability of CD4+ T cells to modulate distinct pathways within large molecular networks, since dysregulated CD4+ T cell responses may result in chronic inflammatory and autoimmune diseases. The CD4+ T cell differentiation process comprises an intricate interplay between cytokines, their receptors, adaptor molecules, signaling cascades and transcription factors that help delineate cell fate and function. Computational modeling can help to describe, simulate, analyze, and predict some of the behaviors in this complicated differentiation network. This review provides a comprehensive overview of existing computational immunology methods as well as novel strategies used to model immune responses with a particular focus on CD4+ T cell differentiation.
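
    As a toy illustration of the kind of differentiation network such models capture, the sketch below integrates a hypothetical mutual-inhibition system of two lineage-specifying factors with explicit Euler steps; the equations and parameter values are illustrative, not a published CD4+ T cell model:

```python
def simulate_differentiation(x0, y0, steps=20000, dt=0.01,
                             a=1.0, b=1.0, k=0.4, decay=1.0):
    """Toy mutual-inhibition model of two lineage-specifying transcription
    factors x and y (illustrative only, not a published CD4+ T cell model):
        dx/dt = a*k^2/(k^2 + y^2) - decay*x
        dy/dt = b*k^2/(k^2 + x^2) - decay*y
    Integrated with explicit Euler steps. Mutual repression makes the
    system bistable, so a small initial bias decides the committed state."""
    x, y = x0, y0
    for _ in range(steps):
        dx = a * k * k / (k * k + y * y) - decay * x
        dy = b * k * k / (k * k + x * x) - decay * y
        x += dt * dx
        y += dt * dy
    return x, y

# A small initial bias toward x is amplified into a committed x-high state:
x_hi, y_lo = simulate_differentiation(0.5, 0.1)
```

    Published models replace these toy terms with cytokine, receptor and transcription-factor kinetics, but the bistable switch structure is the same basic motif.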

  5. Use of 4-Dimensional Computed Tomography-Based Ventilation Imaging to Correlate Lung Dose and Function With Clinical Outcomes

    International Nuclear Information System (INIS)

    Vinogradskiy, Yevgeniy; Castillo, Richard; Castillo, Edward; Tucker, Susan L.; Liao, Zhongxing; Guerrero, Thomas; Martel, Mary K.

    2013-01-01

    Purpose: Four-dimensional computed tomography (4DCT)-based ventilation is an emerging imaging modality that can be used in the thoracic treatment planning process. The clinical benefit of using ventilation images in radiation treatment plans remains to be tested. The purpose of the current work was to test the potential benefit of using ventilation in treatment planning by evaluating whether dose to highly ventilated regions of the lung resulted in increased incidence of clinical toxicity. Methods and Materials: Pretreatment 4DCT data were used to compute pretreatment ventilation images for 96 lung cancer patients. Ventilation images were calculated using 4DCT data, deformable image registration, and a density-change based algorithm. Dose–volume and ventilation-based dose–function metrics were computed for each patient. The ability of the dose–volume and ventilation-based dose–function metrics to predict for severe (grade 3+) radiation pneumonitis was assessed using logistic regression analysis, area under the curve (AUC) metrics, and bootstrap methods. Results: A specific patient example is presented that demonstrates how incorporating ventilation-based functional information can help separate patients with and without toxicity. The logistic regression significance values were all lower for the dose–function metrics (range P=.093-.250) than for their dose–volume equivalents (range, P=.331-.580). The AUC values were all greater for the dose–function metrics (range, 0.569-0.620) than for their dose–volume equivalents (range, 0.500-0.544). Bootstrap results revealed an improvement in model fit using dose–function metrics compared to dose–volume metrics that approached significance (range, P=.118-.155). Conclusions: To our knowledge, this is the first study that attempts to correlate lung dose and 4DCT ventilation-based function to thoracic toxicity after radiation therapy. Although the results were not significant at the .05 level, our data suggests
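
    AUC values like those reported above can be computed without a statistics package via the rank (Mann-Whitney) formulation; a sketch with hypothetical dose-metric values and toxicity labels, not the study's data:

```python
def roc_auc(scores, labels):
    """Area under the ROC curve via the Mann-Whitney formulation:
    the probability that a randomly chosen positive case scores higher
    than a randomly chosen negative case (ties count one half)."""
    pos = [s for s, l in zip(scores, labels) if l == 1]
    neg = [s for s, l in zip(scores, labels) if l == 0]
    wins = 0.0
    for p in pos:
        for n in neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos) * len(neg))

# Hypothetical dose-function metric values and outcomes (1 = grade 3+ pneumonitis):
metric = [10.2, 14.1, 8.7, 20.3, 12.5, 17.8]
tox    = [0,    1,    0,   1,    0,    1]
auc = roc_auc(metric, tox)
```

    An AUC of 0.5 means the metric carries no discriminative information; the paper's dose-function metrics sit modestly above that baseline (0.569-0.620).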

  6. Symbolic computation of exact solutions expressible in rational formal hyperbolic and elliptic functions for nonlinear partial differential equations

    International Nuclear Information System (INIS)

    Wang Qi; Chen Yong

    2007-01-01

    With the aid of symbolic computation, some algorithms are presented for the rational expansion methods, which lead to closed-form solutions of nonlinear partial differential equations (PDEs). The new algorithms are given to find exact rational formal polynomial solutions of PDEs in terms of Jacobi elliptic functions, solutions of the Riccati equation and solutions of the generalized Riccati equation. They can be implemented in the symbolic computation system Maple. As applications of the methods, we choose some nonlinear PDEs to illustrate the methods. As a result, we not only can successfully obtain the solutions found by most existing Jacobi elliptic function methods and Tanh-methods, but also find other new and more general solutions at the same time.
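
    The tanh-form solutions such methods produce can be checked numerically. Below, a standard tanh-form traveling-wave solution of Burgers' equation (a classic test case, not one of the paper's examples) is verified by evaluating the PDE residual with finite differences:

```python
import math

nu = 0.5  # viscosity

def u(x, t):
    """Tanh-form traveling kink solution of Burgers' equation
    u_t + u*u_x = nu*u_xx  (left state 2, right state 0, wave speed 1):
        u(x, t) = 1 - tanh((x - t) / (2*nu))."""
    return 1.0 - math.tanh((x - t) / (2.0 * nu))

def residual(x, t, h=1e-4):
    """PDE residual u_t + u*u_x - nu*u_xx via central finite differences;
    it should vanish (up to discretization error) for an exact solution."""
    u_t = (u(x, t + h) - u(x, t - h)) / (2 * h)
    u_x = (u(x + h, t) - u(x - h, t)) / (2 * h)
    u_xx = (u(x + h, t) - 2 * u(x, t) + u(x - h, t)) / (h * h)
    return u_t + u(x, t) * u_x - nu * u_xx

worst = max(abs(residual(x, t)) for x, t in [(-1.0, 0.0), (0.0, 0.2), (0.7, 0.5)])
```

    The same residual check applies to the Jacobi-elliptic-function solutions the paper derives, with `u` replaced accordingly.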

  7. An Improved Fast Compressive Tracking Algorithm Based on Online Random Forest Classifier

    Directory of Open Access Journals (Sweden)

    Xiong Jintao

    2016-01-01

    Full Text Available The fast compressive tracking (FCT) algorithm is a simple and efficient algorithm proposed in recent years, but it has difficulty coping with factors such as occlusion, appearance changes, and pose variation. There are two reasons: first, although the naive Bayes classifier is fast to train, it is not robust to noise; second, its parameters must be tuned to each particular environment for accurate tracking. In this paper, we propose an improved fast compressive tracking algorithm based on an online random forest (FCT-ORF) for robust visual tracking. First, we combine ideas from adaptive compressive sensing theory regarding the weighted random projection to exploit both local and discriminative information of the object. Second, we use an online random forest classifier for online tracking, which is shown to be more robust to noise and computationally efficient. The experimental results show that the proposed algorithm performs better under occlusion, appearance changes, and pose variation than the fast compressive tracking algorithm.
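
    The random projection at the heart of compressive tracking can be sketched with a very sparse Achlioptas-style matrix; the entries and probabilities below are a generic illustration, not the exact FCT measurement matrix:

```python
import random

def sparse_random_matrix(m, n, seed=0):
    """Very sparse random projection matrix with entries {+1, -1, 0}
    drawn with probabilities 1/6, 1/6, 2/3 (Achlioptas-style), of the
    kind used to compress high-dimensional image features in compressive
    tracking. Generic sketch: the FCT matrix differs in detail."""
    rng = random.Random(seed)
    return [[(1.0 if u < 1/6 else -1.0 if u < 2/6 else 0.0)
             for u in (rng.random() for _ in range(n))]
            for _ in range(m)]

def project(R, x):
    """Compressed measurement vector y = R x."""
    return [sum(r_ij * x_j for r_ij, x_j in zip(row, x)) for row in R]

R = sparse_random_matrix(10, 100)
x = [float(i % 7) for i in range(100)]   # stand-in 100-D feature vector
y = project(R, x)                        # compressed to 10 measurements
```

    The sparsity means most multiplications are skipped, which is what makes the measurement step cheap enough for real-time tracking.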

  8. Effects of concentrate proportion in the diet with or without Fusarium toxin-contaminated triticale on ruminal fermentation and the structural diversity of rumen microbial communities in vitro.

    Science.gov (United States)

    Boguhn, Jeannette; Neumann, Dominik; Helm, André; Strobel, Egbert; Tebbe, Christoph C; Dänicke, Sven; Rodehutscord, Markus

    2010-12-01

    The objective of this study was to investigate the effects of the concentrate proportion and Fusarium toxin-contaminated triticale (FCT) in the diet on nutrient degradation, microbial protein synthesis and the structure of the microbial community, utilising a rumen simulation technique and single-strand conformation polymorphism (SSCP) profiles based on PCR-amplified small subunit ribosomal RNA genes. Four diets containing 60% or 30% concentrates on a dry matter basis, with or without FCT, were incubated. The fermentation of nutrients and microbial protein synthesis were measured. On the last day of incubation, microbial mass was obtained from the vessel liquid, DNA was extracted, and PCR primers targeting archaea, fibrobacter, clostridia, bifidobacteria, bacilli, fungi, and bacteria were applied to study the individual taxonomic groups separately with SSCP. The concentrate proportion affected the fermentation and the microbial community, but not the efficiency of microbial protein synthesis. Neither the fermentation of organic matter nor the synthesis and composition of microbial protein was affected by FCT. The fermentation of detergent fibre fractions was lower in diets containing FCT compared to diets with uncontaminated triticale. Except for the clostridia group, none of the microbial groups were affected by the presence of FCT. In conclusion, our results give no indication that the supplementation of FCT up to a deoxynivalenol concentration in the diet of 5 mg per kg dry matter affects the fermentation of organic matter and microbial protein synthesis. These findings are independent of the concentrate level in the diets. A change in the composition of the clostridia community may be the reason for the reduction in cellulolytic activity.

  9. Efficient Server-Aided Secure Two-Party Function Evaluation with Applications to Genomic Computation

    Directory of Open Access Journals (Sweden)

    Blanton Marina

    2016-10-01

    Full Text Available Computation based on genomic data is becoming increasingly popular today, be it for medical or other purposes. Non-medical uses of genomic data in a computation often take place in a server-mediated setting where the server offers the ability for joint genomic testing between the users. Undeniably, genomic data is highly sensitive, which in contrast to other biometry types, discloses a plethora of information not only about the data owner, but also about his or her relatives. Thus, there is an urgent need to protect genomic data. This is particularly true when the data is used in computation for what we call recreational non-health-related purposes. Towards this goal, in this work we put forward a framework for server-aided secure two-party computation with the security model motivated by genomic applications. One particular security setting that we treat in this work provides stronger security guarantees with respect to malicious users than the traditional malicious model. In particular, we incorporate certified inputs into secure computation based on garbled circuit evaluation to guarantee that a malicious user is unable to modify her inputs in order to learn unauthorized information about the other user’s data. Our solutions are general in the sense that they can be used to securely evaluate arbitrary functions and offer attractive performance compared to the state of the art. We apply the general constructions to three specific types of genomic tests: paternity, genetic compatibility, and ancestry testing and implement the constructions. The results show that all such private tests can be executed within a matter of seconds or less despite the large size of one’s genomic data.
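
    As a much simpler cousin of the garbled-circuit protocols used in the paper, the sketch below shows the server-aided flavor of secure computation with additive secret sharing: a helper can aggregate shares without learning either party's inputs. Illustrative only; the names and values are hypothetical:

```python
import random

P = 2**61 - 1  # modulus for additive secret sharing

def share(value, rng):
    """Split an integer into two additive shares mod P; each share alone
    is uniformly random and reveals nothing about the value."""
    r = rng.randrange(P)
    return r, (value - r) % P

# Toy server-aided aggregation (the paper's actual protocols use garbled
# circuits): two parties secret-share hypothetical SNP match indicators;
# the servers sum each share column without seeing the underlying data.
rng = random.Random(42)
a = [1, 0, 1, 1, 0]  # party A's indicators
b = [1, 1, 1, 0, 0]  # party B's indicators

shares_a = [share(v, rng) for v in a]
shares_b = [share(v, rng) for v in b]

s0 = sum(sa[0] + sb[0] for sa, sb in zip(shares_a, shares_b)) % P
s1 = sum(sa[1] + sb[1] for sa, sb in zip(shares_a, shares_b)) % P
total = (s0 + s1) % P  # reconstructs sum(a) + sum(b) = 6
```

    Real genomic tests additionally need comparisons and thresholds, which is why the paper relies on garbled circuit evaluation rather than plain additive sharing.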

  10. Development and evaluation of a head-controlled human-computer interface with mouse-like functions for physically disabled users

    Directory of Open Access Journals (Sweden)

    César Augusto Martins Pereira

    2009-01-01

    Full Text Available OBJECTIVES: The objectives of this study were to develop a pointing device controlled by head movement that had the same functions as a conventional mouse and to evaluate the performance of the proposed device when operated by quadriplegic users. METHODS: Ten individuals with cervical spinal cord injury participated in functional evaluations of the developed pointing device. The device consisted of a video camera, computer software, and a target attached to the front part of a cap, which was placed on the user's head. The software captured images of the target coming from the video camera and processed them with the aim of determining the displacement from the center of the target and correlating this with the movement of the computer cursor. Evaluation of the interaction between each user and the proposed device was carried out using 24 multidirectional tests with two degrees of difficulty. RESULTS: According to the parameters of mean throughput and movement time, no statistically significant differences were observed between the repetitions of the tests for either of the studied levels of difficulty. CONCLUSIONS: The developed pointing device adequately emulates the movement functions of the computer cursor. It is easy to use and can be learned quickly when operated by quadriplegic individuals.
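
    A head-tracking pointer of this kind ultimately maps the target's displacement from the image centre to cursor motion; a minimal sketch of one such mapping, with a tremor deadzone and a speed clamp (the gain and threshold values are illustrative, not the paper's calibration):

```python
def cursor_velocity(dx, dy, gain=8.0, deadzone=3.0, v_max=40.0):
    """Map the tracked target's displacement (pixels) from the image
    centre to a cursor velocity: a deadzone suppresses small tremor,
    a linear gain follows, and the speed is clamped. All parameter
    values are illustrative, not the paper's calibration."""
    def axis(d):
        if abs(d) <= deadzone:
            return 0.0
        v = min(gain * (abs(d) - deadzone), v_max)
        return v if d > 0 else -v
    return axis(dx), axis(dy)

vx, vy = cursor_velocity(10.0, -2.0)  # x is clamped, y falls in the deadzone
```

    Tuning the gain and deadzone trades pointing speed against precision, which is what the multidirectional throughput tests in the study quantify.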

  11. Effects of iPod Touch™ Technology as Communication Devices on Peer Social Interactions across Environments

    Science.gov (United States)

    Mancil, G. Richmond; Lorah, Elizabeth R.; Whitby, Peggy Schaefer

    2016-01-01

    The purpose of the study was to evaluate the use of the iPod Touch™ as a Speech Generated Device (SGD) for Functional Communication Training (FCT). The evaluation of the effects on problem behavior, the effects on generalization and maintenance of the acquired communication repertoire, and the social initiations of peers between the new SGD (iPod…

  12. Computer aided design of solenoid magnets

    Energy Technology Data Exchange (ETDEWEB)

    DeOlivares, J.M.

    1978-06-01

    Computer programs utilizing Legendre functions and elliptic integral functions have been written to aid in the design of solenoid magnets. The field inside an axisymmetric magnet can be expanded in a converging power series of Legendre functions. The Legendre function approach is very useful for designing solenoid magnets with a high degree of field uniformity. This approach has been programmed on the LBL CDC 7600 computer so that one can design an axisymmetric magnet which meets any desired field structure. Two examples of computer designed solenoids are presented. A computer program utilizing elliptic integral functions was also written for the LBL CDC 7600 computer. This method was used in a computer program to verify the results obtained from the Legendre approach and for field calculations within the conductor. The elliptic integral field calculations within the conductor showed that thin solenoids produce field peaking at the ends of the magnet. Computer data are generated for various magnet geometries and compared with theoretical predictions. Computer results and theoretical predictions both show that field peaking is reduced for longer coils, increased for thinner coils, and is a logarithmic function of length, thickness and radius.
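
    The uniformity trend reported for longer coils can be checked against the textbook on-axis field of a finite solenoid; this closed form is a standard result, not the report's Legendre-series code:

```python
import math

MU0 = 4e-7 * math.pi  # vacuum permeability, T*m/A

def solenoid_axial_field(z, length, radius, turns_per_m, current):
    """Textbook on-axis field of a finite solenoid centred at z = 0:
        B(z) = (mu0*n*I/2) * (cos(theta1) + cos(theta2)),
    where theta1 and theta2 subtend the two coil ends as seen from z."""
    half = length / 2.0
    c1 = (half - z) / math.hypot(radius, half - z)
    c2 = (half + z) / math.hypot(radius, half + z)
    return 0.5 * MU0 * turns_per_m * current * (c1 + c2)

# A long, thin coil: the central field approaches the ideal mu0*n*I and
# falls off toward the ends, consistent with the uniformity trends above.
B_centre = solenoid_axial_field(0.0, length=2.0, radius=0.05,
                                turns_per_m=1000, current=10.0)
B_near_end = solenoid_axial_field(0.9, length=2.0, radius=0.05,
                                  turns_per_m=1000, current=10.0)
ideal = MU0 * 1000 * 10.0
```

    The report's Legendre and elliptic-integral programs generalize this to off-axis points and to the field inside the conductor, which the closed form above cannot reach.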

  13. Stress-intensity factors for surface cracks in pipes: a computer code for evaluation by use of influence functions. Final report

    International Nuclear Information System (INIS)

    Dedhia, D.D.; Harris, D.O.

    1982-06-01

    A user-oriented computer program for the evaluation of stress intensity factors for cracks in pipes is presented. Stress intensity factors for semi-elliptical, complete circumferential and long longitudinal cracks can be obtained using this computer program. The code is based on the method of influence functions, which makes it possible to treat arbitrary stresses on the plane of the crack. The stresses on the crack plane can be entered as a mathematical or tabulated function. A user's manual is included in this report. Background information is also included.
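
    The influence-function idea, integrating an arbitrary crack-plane stress against a weight function, can be sketched for the textbook case of a center crack in an infinite plate (a standard result, not the report's pipe geometries):

```python
import math

def sif_center_crack(sigma, a, n=2000):
    """Mode-I stress intensity factor for a through crack of half-length a
    in an infinite plate, with an arbitrary symmetric stress sigma(x) on
    the crack plane (influence/weight-function form):
        K = 2*sqrt(a/pi) * Integral_0^a sigma(x)/sqrt(a^2 - x^2) dx.
    The substitution x = a*sin(t) removes the endpoint singularity
    before midpoint-rule quadrature."""
    h = (math.pi / 2) / n
    s = sum(sigma(a * math.sin((i + 0.5) * h)) for i in range(n)) * h
    return 2.0 * math.sqrt(a / math.pi) * s

# Uniform tension recovers the classical K = sigma*sqrt(pi*a):
K = sif_center_crack(lambda x: 100.0, a=0.01)
K_exact = 100.0 * math.sqrt(math.pi * 0.01)
```

    Because `sigma` is an arbitrary callable, a tabulated stress profile can be handled by interpolating it, which is exactly the flexibility the influence-function method provides.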

  14. Performance of first-trimester combined test for Down syndrome in different maternal age groups: reason for adjustments in screening policy?

    NARCIS (Netherlands)

    Engels, Melanie A. J.; Heijboer, A. C.; Blankenstein, Marinus A.; van Vugt, John M. G.

    2011-01-01

    To evaluate the performance of the first-trimester combined test (FCT) in different maternal age groups and to discuss whether adjustments in screening policies should be made. In this retrospective study data (n = 26 274) from a fetal medicine center on FCT (maternal age, fetal NT, free β-human

  15. Spatial Computation

    Science.gov (United States)

    2003-12-01

    Computation and today’s microprocessors with the approach to operating system architecture, and the controversy between microkernels and monolithic kernels...Both Spatial Computation and microkernels break away a relatively monolithic architecture into in- dividual lightweight pieces, well specialized...for their particular functionality. Spatial Computation removes global signals and control, in the same way microkernels remove the global address

  16. Fast Computation of Solvation Free Energies with Molecular Density Functional Theory: Thermodynamic-Ensemble Partial Molar Volume Corrections.

    Science.gov (United States)

    Sergiievskyi, Volodymyr P; Jeanmairet, Guillaume; Levesque, Maximilien; Borgis, Daniel

    2014-06-05

    Molecular density functional theory (MDFT) offers an efficient implicit-solvent method to estimate molecular solvation free energies while conserving a fully molecular representation of the solvent. Even within a second-order approximation for the free-energy functional, the so-called homogeneous reference fluid approximation, we show that the hydration free energies computed for a data set of 500 organic compounds are of similar quality to those obtained from molecular dynamics free-energy perturbation simulations, at a computational cost reduced by 2-3 orders of magnitude. This requires introducing the proper partial volume correction to transform the results from the grand canonical to the isobaric-isothermal ensemble that is pertinent to experiments. We show that this correction can be extended to 3D-RISM calculations, giving a sound theoretical justification to empirical partial molar volume corrections that have been proposed recently.

  17. Bessel function expansion to reduce the calculation time and memory usage for cylindrical computer-generated holograms.

    Science.gov (United States)

    Sando, Yusuke; Barada, Daisuke; Jackin, Boaz Jessie; Yatagai, Toyohiko

    2017-07-10

    This study proposes a method to reduce the calculation time and memory usage required for calculating cylindrical computer-generated holograms. The wavefront on the cylindrical observation surface is represented as a convolution integral in the 3D Fourier domain. The Fourier transformation of the kernel function involving this convolution integral is analytically performed using a Bessel function expansion. The analytical solution can drastically reduce the calculation time and the memory usage without any cost, compared with the numerical method using fast Fourier transform to Fourier transform the kernel function. In this study, we present the analytical derivation, the efficient calculation of Bessel function series, and a numerical simulation. Furthermore, we demonstrate the effectiveness of the analytical solution through comparisons of calculation time and memory usage.
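
    The saving comes from the convolution theorem: transforming the kernel analytically avoids one numerical Fourier transform entirely. A toy demonstration that pointwise multiplication in the Fourier domain reproduces circular convolution (naive O(N²) DFT for clarity, not the paper's cylindrical geometry):

```python
import cmath

def dft(x, inverse=False):
    """Naive discrete Fourier transform (O(N^2); fine for a demo)."""
    n = len(x)
    sign = 1j if inverse else -1j
    out = [sum(x[k] * cmath.exp(sign * 2 * cmath.pi * j * k / n) for k in range(n))
           for j in range(n)]
    return [v / n for v in out] if inverse else out

def circular_convolution(f, g):
    """Direct circular convolution (f * g)[j] = sum_k f[k] g[(j-k) mod n]."""
    n = len(f)
    return [sum(f[k] * g[(j - k) % n] for k in range(n)) for j in range(n)]

# Convolution theorem: DFT(f * g) = DFT(f) . DFT(g). If the kernel's
# transform is known analytically, its numerical transform can be skipped,
# which is the source of the time and memory savings described above.
f = [1.0, 2.0, 0.0, -1.0]
g = [0.5, 0.0, 1.0, 0.0]
direct = circular_convolution(f, g)
via_dft = [v.real for v in dft([F * G for F, G in zip(dft(f), dft(g))],
                               inverse=True)]
```

    In the paper the analytic transform additionally comes from a Bessel-function expansion, which is what makes the kernel's spectrum available in closed form.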

  18. Parallel computations

    CERN Document Server

    1982-01-01

    Parallel Computations focuses on parallel computation, with emphasis on algorithms used in a variety of numerical and physical applications and for many different types of parallel computers. Topics covered range from vectorization of fast Fourier transforms (FFTs) and of the incomplete Cholesky conjugate gradient (ICCG) algorithm on the Cray-1 to calculation of table lookups and piecewise functions. Single tridiagonal linear systems and vectorized computation of reactive flow are also discussed.Comprised of 13 chapters, this volume begins by classifying parallel computers and describing techn

  19. Density functional theory based screening of ternary alkali-transition metal borohydrides: A computational material design project

    DEFF Research Database (Denmark)

    Hummelshøj, Jens Strabo; Landis, David; Voss, Johannes

    2009-01-01

    We present a computational screening study of ternary metal borohydrides for reversible hydrogen storage based on density functional theory. We investigate the stability and decomposition of alloys containing 1 alkali metal atom, Li, Na, or K (M1); and 1 alkali, alkaline earth or 3d/4d transition...

  20. A Practical Computational Method for the Anisotropic Redshift-Space 3-Point Correlation Function

    Science.gov (United States)

    Slepian, Zachary; Eisenstein, Daniel J.

    2018-04-01

    We present an algorithm enabling computation of the anisotropic redshift-space galaxy 3-point correlation function (3PCF) scaling as N², with N the number of galaxies. Our previous work showed how to compute the isotropic 3PCF with this scaling by expanding the radially-binned density field around each galaxy in the survey into spherical harmonics and combining these coefficients to form multipole moments. The N² scaling occurred because this approach never explicitly required the relative angle between a galaxy pair about the primary galaxy. Here we generalize this work, demonstrating that in the presence of azimuthally-symmetric anisotropy produced by redshift-space distortions (RSD) the 3PCF can be described by two triangle side lengths, two independent total angular momenta, and a spin. This basis for the anisotropic 3PCF allows its computation with negligible additional work over the isotropic 3PCF. We also present the covariance matrix of the anisotropic 3PCF measured in this basis. Our algorithm tracks the full 5-D redshift-space 3PCF, uses an accurate line of sight to each triplet, is exact in angle, and easily handles edge correction. It will enable use of the anisotropic large-scale 3PCF as a probe of RSD in current and upcoming large-scale redshift surveys.
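
    The multipole decomposition underlying such methods can be illustrated by projecting an angular function onto Legendre polynomials; the sketch below recovers the monopole and quadrupole of f(mu) = 1 + mu², and is a generic illustration, not the paper's spherical-harmonic algorithm:

```python
def legendre(l, mu):
    """Legendre polynomial P_l(mu) via the Bonnet recurrence."""
    if l == 0:
        return 1.0
    p_prev, p = 1.0, mu
    for k in range(1, l):
        p_prev, p = p, ((2 * k + 1) * mu * p - k * p_prev) / (k + 1)
    return p

def multipole(f, l, n=20000):
    """Multipole coefficient (midpoint-rule quadrature):
        a_l = (2l+1)/2 * Integral_{-1}^{1} f(mu) P_l(mu) dmu."""
    h = 2.0 / n
    s = sum(f(-1.0 + (i + 0.5) * h) * legendre(l, -1.0 + (i + 0.5) * h)
            for i in range(n))
    return (2 * l + 1) / 2.0 * s * h

# f(mu) = 1 + mu^2 decomposes exactly as (4/3) P_0 + (2/3) P_2:
f = lambda mu: 1.0 + mu * mu
a0, a1, a2 = multipole(f, 0), multipole(f, 1), multipole(f, 2)
```

    RSD anisotropy is azimuthally symmetric about the line of sight, which is why a small number of such multipoles captures the full angular dependence.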

  1. Heuristic lipophilicity potential for computer-aided rational drug design: Optimizations of screening functions and parameters

    Science.gov (United States)

    Du, Qishi; Mezey, Paul G.

    1998-09-01

    In this research we test and compare three possible atom-based screening functions used in the heuristic molecular lipophilicity potential (HMLP). Screening function 1 is a power distance-dependent function, b_i/|R_i - r|^γ; screening function 2 is an exponential distance-dependent function, b_i exp(-|R_i - r|/d_0); and screening function 3 is a weighted distance-dependent function, sign(b_i) exp(ξ|R_i - r|/|b_i|). For every screening function, the parameters (γ, d_0, and ξ) are optimized using 41 common organic molecules of 4 types of compounds: aliphatic alcohols, aliphatic carboxylic acids, aliphatic amines, and aliphatic alkanes. The results of the calculations show that screening function 3 cannot give chemically reasonable results; however, both the power screening function and the exponential screening function give chemically satisfactory results. There are two notable differences between screening functions 1 and 2. First, the exponential screening function has larger values at short distance than the power screening function, therefore more influence from the nearest neighbors is involved using screening function 2 than screening function 1. Second, the power screening function has larger values at long distance than the exponential screening function, therefore screening function 1 is affected by atoms at long distance more than screening function 2. For screening function 2, the suitable range of the parameter d_0 is 1.5 < d_0 < 3.0, and d_0 = 2.0 is recommended. The HMLP developed in this research provides a potential tool for computer-aided three-dimensional drug design.
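
    A direct transcription of the three screening functions makes the short- versus long-range behaviour easy to probe; the parameter values below are illustrative, and the sign convention for function 3 follows the abstract:

```python
import math

def screen_power(b_i, r, gamma=2.0):
    """Screening function 1: b_i / |R_i - r|**gamma."""
    return b_i / r**gamma

def screen_exp(b_i, r, d0=2.0):
    """Screening function 2: b_i * exp(-|R_i - r| / d0)."""
    return b_i * math.exp(-r / d0)

def screen_signed_exp(b_i, r, xi=1.0):
    """Screening function 3: sign(b_i) * exp(xi * |R_i - r| / |b_i|)."""
    return math.copysign(1.0, b_i) * math.exp(xi * r / abs(b_i))

# The power form decays polynomially, the exponential form exponentially,
# so their relative weighting of near versus far neighbours differs --
# the distinction discussed in the abstract.
near_ratio = screen_power(1.0, 0.5) / screen_exp(1.0, 0.5)
far_ratio = screen_power(1.0, 20.0) / screen_exp(1.0, 20.0)
```

    At large separations the polynomial tail dominates any decaying exponential, so function 1 always ends up weighting distant atoms more heavily than function 2, whatever the chosen γ and d_0.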

  2. Growth and development, nicotine concentrations and sources of nicotine-n in flue-cured tobacco plants influenced by basal n fertilization time and n fertilizer (15N)

    International Nuclear Information System (INIS)

    Xie Zhijian; Tu Shuxin; Li Jinping; Xu Rubing; Chen Zhenguo; Cao Shiming; Wang Xuelong

    2010-01-01

    A field experiment with 15N isotope tracing micro-plots was carried out to study the effects of basal N fertilizer application time (15 d or 30 d before transplanting) of flue-cured tobacco (FCT) seedlings and nitrogen fertilization (with N and without N) on growth and development, nicotine concentrations and sources of nicotine N of FCT in Laowan (N 31°27', E 111°14', 1,130 m above sea level), a main tobacco production area of Xiangfan city, Hubei province. The results showed that both dry matter accumulation and nicotine concentrations of different parts of FCT increased as the plants grew. The concentrations of nicotine decreased with ascending leaf position before the topping period, but the opposite held after the removal of the apex. The proportion of nicotine N from fertilizer relative to total nicotine N decreased with plant growth and with rising leaf position. Applying N fertilizer significantly increased dry matter accumulation of the shoot and the nicotine concentrations of tobacco leaves at different stalk positions, by 2.1-2.7 fold and 0.1-0.7 fold respectively. Compared with basal fertilization 15 d before transplanting, applying basal fertilizer 30 d before transplanting increased the dry matter accumulation and nicotine concentrations of flue-cured tobacco by 2.2%-8.0% and 6.3%-18.5% respectively. There were no significant effects of basal N fertilization time on the proportion of nicotine N from fertilizer in organs of FCT plants at the mature stage. These results suggested that appropriately advancing the basal N fertilization time before transplanting helps decrease nicotine concentrations and improve the quality of FCT leaves, and thus their industrial utility. (authors)

  3. Phase transformation and magnetic anisotropy of an iron-palladium ferromagnetic shape-memory alloy

    International Nuclear Information System (INIS)

    Cui, J.; Shield, T.W.; James, R.D.

    2004-01-01

    Martensitic phase transformations in an Fe₇Pd₃ alloy were studied using various experimental techniques: visual observation, differential scanning calorimeter (DSC) measurements and X-ray diffraction. Magnetic measurements on this alloy were made using a vibrating sample magnetometer (VSM) and a Susceptibility Kappa bridge. The VSM measurements were made with the sample in a compression fixture to bias the martensite phase to a single variant. Both X-ray and DSC measurements show that the FCC-FCT transformation is a weak first-order thermoelastic transition. The average lattice parameters are a = 3.822 ± 0.001 Å and c = 3.630 ± 0.001 Å for the FCT martensite, and a₀ = 3.756 ± 0.001 Å for the FCC austenite. The latent heat of the FCC-FCT transformation is 10.79 ± 0.01 J/cm³. A Susceptibility Kappa bridge measurement determined the Curie temperature to be 450 °C. The saturation magnetization from VSM data is mₛ = 1220 ± 10 emu/cm³ at −20 °C for the martensite and mₛ = 1080 ± 10 emu/cm³ at 60 °C for the austenite. The easy axes of a single variant of FCT martensite are the [1 0 0] and [0 1 0] directions (the a-axes of the FCT lattice) and the [0 0 1] direction (FCT c-axis) is the hard direction. The cubic magnetic anisotropy constant K₁ is (−5 ± 2) × 10³ erg/cm³ for the austenite at 60 °C, and the tetragonal anisotropy constant K₁ + K₂ is (3.41 ± 0.02) × 10⁵ erg/cm³ for the martensite at a temperature of −20 °C and under 8 MPa of compressive stress in the [0 0 1] direction.

  4. Computer Simulations Reveal Multiple Functions for Aromatic Residues in Cellulase Enzymes (Fact Sheet)

    Energy Technology Data Exchange (ETDEWEB)

    2012-07-01

    NREL researchers use high-performance computing to demonstrate fundamental roles of aromatic residues in cellulase enzyme tunnels. National Renewable Energy Laboratory (NREL) computer simulations of a key industrial enzyme, the Trichoderma reesei Family 6 cellulase (Cel6A), predict that aromatic residues near the enzyme's active site and at the tunnel entrance and exit perform different functions in substrate binding and catalysis, depending on their location in the enzyme. These results suggest that nature employs aromatic-carbohydrate interactions with a wide variety of binding affinities for diverse functions. Outcomes also suggest that protein engineering strategies in which mutations are made around the binding sites may require tailoring specific to the enzyme family. Cellulase enzymes ubiquitously exhibit tunnels or clefts lined with aromatic residues for processing carbohydrate polymers to monomers, but the molecular-level role of these aromatic residues remains unknown. In silico mutation of the aromatic residues near the catalytic site of Cel6A has little impact on the binding affinity, but simulation suggests that these residues play a major role in the glucopyranose ring distortion necessary for cleaving glycosidic bonds to produce fermentable sugars. Removal of aromatic residues at the entrance and exit of the cellulase tunnel, however, dramatically impacts the binding affinity. This suggests that these residues play a role in acquiring cellulose chains from the cellulose crystal and stabilizing the reaction product, respectively. These results illustrate that the role of aromatic-carbohydrate interactions varies dramatically depending on the position in the enzyme tunnel. As aromatic-carbohydrate interactions are present in all carbohydrate-active enzymes, the results have implications for understanding protein structure-function relationships in carbohydrate metabolism and recognition, carbon turnover in nature, and protein engineering.

  5. Quantum computation

    International Nuclear Information System (INIS)

    Deutsch, D.

    1992-01-01

    As computers become ever more complex, they inevitably become smaller. This leads to a need for components which are fabricated and operate on increasingly small size scales. Quantum theory is already taken into account in microelectronics design. This article explores how quantum theory will need to be incorporated into the design of future computers in order for their components to function. Computation tasks which depend on quantum effects will become possible. Physicists may have to reconsider their perspective on computation in the light of understanding developed in connection with universal quantum computers. (UK)

  6. Pathways to Commercial Success: Technologies and Products Supported by the Fuel Cell Technologies Program

    Energy Technology Data Exchange (ETDEWEB)

    Weakley, Steven A.; Brown, Scott A.

    2011-09-29

    The purpose of the project described in this report is to identify and document the commercial and emerging (projected to be commercialized within the next 3 years) hydrogen and fuel cell technologies and products that resulted from Department of Energy support through the Fuel Cell Technologies (FCT) Program in the Office of Energy Efficiency and Renewable Energy (EERE). To accomplish this, Pacific Northwest National Laboratory (PNNL) undertook two simultaneous efforts. The first was a patent search and analysis to identify hydrogen- and fuel-cell-related patents that are associated with FCT-funded projects (or projects conducted by DOE-EERE predecessor programs) and to ascertain each patent's current status, as well as any commercial products that may have used the technology documented in the patent. The second was a series of interviews with current and past FCT personnel, a review of relevant program annual reports, and an examination of hydrogen- and fuel-cell-related grants made under the Small Business Innovation Research and Small Business Technology Transfer Programs and within the FCT portfolio.

  7. Computing the Partition Function for Kinetically Trapped RNA Secondary Structures

    Science.gov (United States)

    Lorenz, William A.; Clote, Peter

    2011-01-01

    An RNA secondary structure is locally optimal if there is no lower energy structure that can be obtained by the addition or removal of a single base pair, where energy is defined according to the widely accepted Turner nearest neighbor model. Locally optimal structures form kinetic traps, since any evolution away from a locally optimal structure must involve energetically unfavorable folding steps. Here, we present a novel, efficient algorithm to compute the partition function over all locally optimal secondary structures of a given RNA sequence. Our software, RNAlocopt, runs in O(n³) time and O(n²) space. Additionally, RNAlocopt samples a user-specified number of structures from the Boltzmann subensemble of all locally optimal structures. We apply RNAlocopt to show that (1) the number of locally optimal structures is far fewer than the total number of structures; indeed, the number of locally optimal structures is approximately equal to the square root of the number of all structures, (2) the structural diversity of this subensemble may be either similar to or quite different from the structural diversity of the entire Boltzmann ensemble, a situation that depends on the type of input RNA, (3) the (modified) maximum expected accuracy structure, computed by taking into account base pairing frequencies of locally optimal structures, is a more accurate prediction of the native structure than other current thermodynamics-based methods. The software RNAlocopt constitutes a technical breakthrough in our study of the folding landscape for RNA secondary structures. For the first time, locally optimal structures (kinetic traps in the Turner energy model) can be rapidly generated for long RNA sequences, previously impossible with methods that involved exhaustive enumeration. Use of locally optimal structure leads to state-of-the-art secondary structure prediction, as benchmarked against methods involving the computation of minimum free energy and of maximum expected accuracy. Web server
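The flavor of such partition-function recursions can be seen in a toy dynamic program that counts all pseudoknot-free secondary structures of a sequence, i.e. a partition function in which every structure gets Boltzmann weight 1. This sketch uses Watson-Crick and wobble pairs and a minimum hairpin loop of 3 unpaired bases; the Turner energy model and the locally-optimal restriction that define RNAlocopt are well beyond it.

```python
from functools import lru_cache

PAIRS = {("A", "U"), ("U", "A"), ("G", "C"), ("C", "G"), ("G", "U"), ("U", "G")}
THETA = 3  # minimum number of unpaired bases enclosed by a hairpin

def count_structures(seq):
    # Zero-energy partition function: Z(i, j) = number of secondary
    # structures on seq[i..j] (no pseudoknots), via the classic recursion:
    # base i is either unpaired, or paired with some k >= i + THETA + 1.
    @lru_cache(maxsize=None)
    def z(i, j):
        if j - i < THETA + 1:
            return 1  # only the empty structure fits
        total = z(i + 1, j)  # base i left unpaired
        for k in range(i + THETA + 1, j + 1):  # base i paired with base k
            if (seq[i], seq[k]) in PAIRS:
                total += z(i + 1, k - 1) * z(k + 1, j)
        return total
    return z(0, len(seq) - 1)

assert count_structures("AAAA") == 1      # no pairs possible: empty only
assert count_structures("GAAAC") == 2     # empty structure + the pair (0, 4)
assert count_structures("GGAAACC") == 6   # hand-enumerable: 1 empty + 5 paired
```

The O(n³) time and O(n²) space of this counting recursion match the complexity reported for RNAlocopt, which restricts the same style of recursion to the locally optimal subensemble.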

  8. Computing the partition function for kinetically trapped RNA secondary structures.

    Directory of Open Access Journals (Sweden)

    William A Lorenz

    An RNA secondary structure is locally optimal if there is no lower energy structure that can be obtained by the addition or removal of a single base pair, where energy is defined according to the widely accepted Turner nearest neighbor model. Locally optimal structures form kinetic traps, since any evolution away from a locally optimal structure must involve energetically unfavorable folding steps. Here, we present a novel, efficient algorithm to compute the partition function over all locally optimal secondary structures of a given RNA sequence. Our software, RNAlocopt, runs in O(n³) time and O(n²) space. Additionally, RNAlocopt samples a user-specified number of structures from the Boltzmann subensemble of all locally optimal structures. We apply RNAlocopt to show that (1) the number of locally optimal structures is far fewer than the total number of structures; indeed, the number of locally optimal structures is approximately equal to the square root of the number of all structures, (2) the structural diversity of this subensemble may be either similar to or quite different from the structural diversity of the entire Boltzmann ensemble, a situation that depends on the type of input RNA, (3) the (modified) maximum expected accuracy structure, computed by taking into account base pairing frequencies of locally optimal structures, is a more accurate prediction of the native structure than other current thermodynamics-based methods. The software RNAlocopt constitutes a technical breakthrough in our study of the folding landscape for RNA secondary structures. For the first time, locally optimal structures (kinetic traps in the Turner energy model) can be rapidly generated for long RNA sequences, previously impossible with methods that involved exhaustive enumeration. Use of locally optimal structure leads to state-of-the-art secondary structure prediction, as benchmarked against methods involving the computation of minimum free energy and of maximum expected

  9. The Impact of Computer Use on Learning of Quadratic Functions

    Science.gov (United States)

    Pihlap, Sirje

    2017-01-01

    Studies of the impact of various types of computer use on the results of learning and student motivation have indicated that the use of computers can increase learning motivation, and that computers can have a positive effect, a negative effect, or no effect at all on learning outcomes. Some results indicate that it is not computer use itself that…

  10. Contributing to Functionality

    DEFF Research Database (Denmark)

    Törpel, Bettina

    2006-01-01

    The objective of this paper is the design of computer supported joint action spaces. It is argued against a view of functionality as residing in computer applications. In such a view the creation of functionality is equivalent to the creation of computer applications. Functionality, in the view advocated in this paper, emerges in the specific dynamic interplay of actors, objectives, structures, practices and means. In this view, functionality is the result of creating, harnessing and inhabiting computer supported joint action spaces. The successful creation and further development of a computer supported joint action space comprises a whole range of appropriate design contributions. The approach is illustrated by the example of the creation of the computer supported joint action space "exchange network of voluntary union educators". As part of the effort a group of participants created…

  11. BrEPS: a flexible and automatic protocol to compute enzyme-specific sequence profiles for functional annotation

    Directory of Open Access Journals (Sweden)

    Schomburg D

    2010-12-01

    Background: Models for the simulation of metabolic networks require the accurate prediction of enzyme function. Based on a genomic sequence, enzymatic functions of gene products are today mainly predicted by sequence database searching and operon analysis. Other methods can support these techniques: we have developed an automatic method, "BrEPS", that creates highly specific sequence patterns for the functional annotation of enzymes. Results: The enzymes in the UniProtKB are identified and their sequences compared against each other with BLAST. The enzymes are then clustered into a number of trees, where each tree node is associated with a set of EC numbers. The enzyme sequences in the tree nodes are aligned with ClustalW. The conserved columns of the resulting multiple alignments are used to construct sequence patterns. In the last step, we verify the quality of the patterns by computing their specificity. Patterns with low specificity are omitted and recomputed further down in the tree. The final high-quality patterns can be used for functional annotation. We ran our protocol on a recent Swiss-Prot release and show statistics, as well as a comparison to PRIAM, a probabilistic method that is also specialized in the functional annotation of enzymes. We determine the number of true positive annotations for five common microorganisms, with data from BRENDA and AMENDA serving as the standard of truth. BrEPS is almost on par with PRIAM, a fact which we discuss in the context of five manually investigated cases. Conclusions: Our protocol computes highly specific sequence patterns that can be used to support the functional annotation of enzymes. The main advantages of our method are that it is automatic and unsupervised, and quite fast once the patterns are evaluated. The results show that BrEPS can be a valuable addition to the reconstruction of metabolic networks.
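The pattern-construction step, turning conserved columns of a multiple alignment into a sequence pattern, can be sketched in miniature. The PROSITE-like syntax, the `max_variants` threshold, and the toy alignment below are all illustrative assumptions, not details of BrEPS itself.

```python
import re

def alignment_to_pattern(alignment, max_variants=2):
    # Build a PROSITE-like pattern from alignment columns:
    # a fully conserved column -> the residue itself,
    # a small residue set -> [XY], anything more variable -> wildcard 'x'.
    pattern = []
    for column in zip(*alignment):
        residues = sorted(set(column))
        if len(residues) == 1:
            pattern.append(residues[0])
        elif len(residues) <= max_variants:
            pattern.append("[" + "".join(residues) + "]")
        else:
            pattern.append("x")
    return "-".join(pattern)

def pattern_to_regex(pattern):
    # PROSITE-ish to regex: 'x' matches any residue; '-' separators dropped.
    return "".join("." if p == "x" else p for p in pattern.split("-"))

aln = ["GDSAGA", "GDSAGC", "GESAGT"]   # toy alignment (made up)
pat = alignment_to_pattern(aln)
assert pat == "G-[DE]-S-A-G-x"
assert re.fullmatch(pattern_to_regex(pat), "GDSAGW")        # matches
assert not re.fullmatch(pattern_to_regex(pat), "GKSAGW")    # K not in [DE]
```

BrEPS additionally scores each pattern's specificity against the full database and recomputes low-specificity patterns deeper in the tree; that validation loop is the part this sketch omits.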

  12. Computation of a numerically satisfactory pair of solutions of the differential equation for conical functions of non-negative integer orders

    NARCIS (Netherlands)

    T.M. Dunster (Mark); A. Gil (Amparo); J. Segura (Javier); N.M. Temme (Nico)

    2014-01-01

    We consider the problem of computing a numerically satisfactory pair of solutions of the differential equation for Legendre functions of non-negative integer order $\mu$ and degree $-\frac12+i\tau$, where $\tau$ is a non-negative real parameter. Solutions of this equation are the conical functions

  13. A new Fortran 90 program to compute regular and irregular associated Legendre functions (new version announcement)

    Science.gov (United States)

    Schneider, Barry I.; Segura, Javier; Gil, Amparo; Guan, Xiaoxu; Bartschat, Klaus

    2018-04-01

    This is a revised and updated version of a modern Fortran 90 code to compute the regular P_l^m(x) and irregular Q_l^m(x) associated Legendre functions for all x ∈ (−1, +1) (on the cut) and |x| > 1, and integer degree (l) and order (m). The necessity to revise the code comes as a consequence of comments by Prof. James Bremer of the UC Davis Mathematics Department, who discovered that there were errors in the code for large integer degree and order for the normalized regular Legendre functions on the cut.
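For modest degree and order, the regular functions on the cut follow from the standard three-term recurrences. A pure-Python sketch of the unnormalized recursion (seed P_m^m, step to P_{m+1}^m, then recurse upward in l); note this naive form overflows for the large degree/order regime that motivated the Fortran code's normalized treatment.

```python
import math

def assoc_legendre_p(l, m, x):
    # Unnormalized P_l^m(x) on the cut (-1 < x < 1), via the standard
    # recurrences. Fine for modest l and m only; large degree/order
    # requires normalized recurrences to avoid overflow.
    if m > l:
        return 0.0
    pmm = 1.0
    somx2 = math.sqrt((1.0 - x) * (1.0 + x))
    fact = 1.0
    for _ in range(m):
        pmm *= -fact * somx2   # P_m^m = (-1)^m (2m-1)!! (1-x^2)^{m/2}
        fact += 2.0
    if l == m:
        return pmm
    pmmp1 = x * (2.0 * m + 1.0) * pmm   # P_{m+1}^m
    if l == m + 1:
        return pmmp1
    for ll in range(m + 2, l + 1):
        # (ll - m) P_ll^m = x (2 ll - 1) P_{ll-1}^m - (ll + m - 1) P_{ll-2}^m
        pmm, pmmp1 = pmmp1, (x * (2.0 * ll - 1.0) * pmmp1
                             - (ll + m - 1.0) * pmm) / (ll - m)
    return pmmp1

# Spot checks against closed forms at x = 0.5:
assert abs(assoc_legendre_p(2, 0, 0.5) - (-0.125)) < 1e-12            # P_2 = (3x^2-1)/2
assert abs(assoc_legendre_p(1, 1, 0.5) + math.sqrt(0.75)) < 1e-12     # P_1^1 = -sqrt(1-x^2)
assert abs(assoc_legendre_p(2, 1, 0.5) + 1.5 * math.sqrt(0.75)) < 1e-12  # P_2^1 = -3x sqrt(1-x^2)
```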

  14. Deformable image registration for geometrical evaluation of DIBH radiotherapy treatment of lung cancer patients

    DEFF Research Database (Denmark)

    Ottosson, Wiviann; Lykkegaard Andersen, J. A.; Borrisova, S.

    2014-01-01

    locally advanced non-small cell lung cancer patients were included, each with a planning-, midterm- and final CT (pCT, mCT, fCT) and 7 CBCTs acquired weekly and on the same day as the mCT and fCT. All imaging were performed in both FB and DIBH, using Varian RPM system for respiratory tracking...

  15. Nuclear Fuel Cycle Technologies: Current Challenges and Future Plans - 12558

    Energy Technology Data Exchange (ETDEWEB)

    Griffith, Andrew [U.S. Department of Energy, Washington, DC (United States)

    2012-07-01

    The mission of the Office of Nuclear Energy's Fuel Cycle Technologies office (FCT program) is to provide options for possible future changes in national nuclear energy programs. While the recent draft report of the Blue Ribbon Commission on America's Nuclear Future stressed the need for organizational changes, interim waste storage and the establishment of a permanent repository for nuclear waste management, it also recognized the potential value of alternate fuel cycles and recommended continued research and development in that area. With constrained budgets and great expectations, the current challenges are significant. The FCT program now performs R and D covering the entire fuel cycle. This broad R and D scope is a result of the assignment of new research and development (R and D) responsibilities to the Office of Nuclear Energy (NE), as well as reorganization within NE. This scope includes uranium extraction from seawater and uranium enrichment R and D, used nuclear fuel recycling technology, advanced fuel development, and a fresh look at a range of disposal geologies. Additionally, the FCT program performs the necessary systems analysis and screening of fuel cycle alternatives that will identify the most promising approaches and areas of technology gaps. Finally, the FCT program is responsible for a focused effort to consider features of fuel cycle technology in a way that promotes nonproliferation and security, such as Safeguards and Security by Design, and advanced monitoring and predictive modeling capabilities. This paper and presentation will provide an overview of the FCT program R and D scope and discuss plans to analyze fuel cycle options and support identified R and D priorities into the future. The FCT program is making progress in implementing a science-based, engineering-driven research and development program that is evaluating options for a sustainable fuel cycle in the U.S. Responding to the BRC recommendations, any resulting legislative

  16. Automated Quantitative Computed Tomography Versus Visual Computed Tomography Scoring in Idiopathic Pulmonary Fibrosis: Validation Against Pulmonary Function.

    Science.gov (United States)

    Jacob, Joseph; Bartholmai, Brian J; Rajagopalan, Srinivasan; Kokosi, Maria; Nair, Arjun; Karwoski, Ronald; Raghunath, Sushravya M; Walsh, Simon L F; Wells, Athol U; Hansell, David M

    2016-09-01

    The aim of the study was to determine whether a novel computed tomography (CT) postprocessing software technique (CALIPER) is superior to visual CT scoring as judged by functional correlations in idiopathic pulmonary fibrosis (IPF). A total of 283 consecutive patients with IPF had CT parenchymal patterns evaluated quantitatively with CALIPER and by visual scoring. These 2 techniques were evaluated against: forced expiratory volume in 1 second (FEV1), forced vital capacity (FVC), diffusing capacity for carbon monoxide (DLco), carbon monoxide transfer coefficient (Kco), and a composite physiological index (CPI), with regard to extent of interstitial lung disease (ILD), extent of emphysema, and pulmonary vascular abnormalities. CALIPER-derived estimates of ILD extent demonstrated stronger univariate correlations than visual scores for most pulmonary function tests (PFTs): (FEV1: CALIPER R=0.29, visual R=0.18; FVC: CALIPER R=0.41, visual R=0.27; DLco: CALIPER R=0.31, visual R=0.35; CPI: CALIPER R=0.48, visual R=0.44). Correlations between CT measures of emphysema extent and PFTs were weak and did not differ significantly between CALIPER and visual scoring. Intriguingly, the pulmonary vessel volume provided similar correlations to total ILD extent scored by CALIPER for FVC, DLco, and CPI (FVC: R=0.45; DLco: R=0.34; CPI: R=0.53). CALIPER was superior to visual scoring as validated by functional correlations with PFTs. The pulmonary vessel volume, a novel CALIPER CT parameter with no visual scoring equivalent, has the potential to be a CT feature in the assessment of patients with IPF and requires further exploration.

  17. A Karaoke System with Real-Time Media Merging and Sharing Functions for a Cloud-Computing-Integrated Mobile Device

    Directory of Open Access Journals (Sweden)

    Her-Tyan Yeh

    2013-01-01

    Mobile devices such as personal digital assistants (PDAs), smartphones, and tablets have increased in popularity and are extremely efficient for work-related, social, and entertainment uses. Popular entertainment services have also attracted substantial attention. Thus, relevant industries have exerted considerable effort in establishing methods by which mobile devices can deliver excellent and convenient entertainment services. Because cloud-computing technology is mature and possesses a strong computing capacity, integrating this technology into the entertainment functions of mobile devices can reduce the data load on a system and maintain device performance. This study combines cloud computing with a mobile device to design a karaoke system that contains real-time media merging and sharing functions. This system enables users to download music videos (MVs) to their mobile device and to sing and record their singing using the device. They can upload the recorded song to the cloud server, where it is merged with the media in real time. Subsequently, by employing a media streaming technology, users can store their personal MVs on their mobile device or computer and instantaneously share these videos with others on the Internet. Through this process, people can instantly watch shared videos, enjoy the leisure and entertainment functions of mobile devices, and satisfy their desire to sing.

  18. Programming in biomolecular computation

    DEFF Research Database (Denmark)

    Hartmann, Lars Røeboe; Jones, Neil; Simonsen, Jakob Grue

    2011-01-01

    Our goal is to provide a top-down approach to biomolecular computation. In spite of widespread discussion about connections between biology and computation, one question seems notable by its absence: Where are the programs? We identify a number of common features in programming that seem conspicuously absent from the literature on biomolecular computing; to partially redress this absence, we introduce a model of computation that is evidently programmable, by programs reminiscent of low-level computer machine code; and at the same time biologically plausible: its functioning is defined by a single and relatively small set of chemical-like reaction rules. Further properties: the model is stored-program: programs are the same as data, so programs are not only executable, but are also compilable and interpretable. It is universal: all computable functions can be computed (in natural ways…

  19. Exact fast computation of band depth for large functional datasets: How quickly can one million curves be ranked?

    KAUST Repository

    Sun, Ying; Genton, Marc G.; Nychka, Douglas W.

    2012-01-01

    © 2012 John Wiley & Sons, Ltd. Band depth is an important nonparametric measure that generalizes order statistics and makes univariate methods based on order statistics possible for functional data. However, the computational burden of band depth
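The band depth that this work accelerates can be stated in its naive quadratic form for reference. A small sketch of the two-curve band depth BD⁽²⁾, assuming the common convention that all pairs from the sample are counted and band boundaries are inclusive; the fast exact algorithm of the paper avoids exactly this O(n²) pair enumeration.

```python
from itertools import combinations

def band_depth(curves):
    # Naive BD^(2): for each curve, the fraction of curve pairs whose
    # pointwise [min, max] band contains it at every time point.
    # O(n^2 * T) -- the cost that fast exact algorithms are built to beat.
    n = len(curves)
    n_pairs = n * (n - 1) / 2
    depths = []
    for c in curves:
        inside = sum(
            all(min(a, b) <= y <= max(a, b) for y, a, b in zip(c, ci, cj))
            for ci, cj in combinations(curves, 2)
        )
        depths.append(inside / n_pairs)
    return depths

# Three constant curves: the middle one is deepest (the functional median).
depths = band_depth([[1, 1, 1], [2, 2, 2], [3, 3, 3]])
assert depths[1] == 1.0                    # contained in every pair's band
assert abs(depths[0] - 2 / 3) < 1e-12      # outer curves share the lower depth
assert abs(depths[2] - 2 / 3) < 1e-12
```

Ranking one million curves with this naive version would require on the order of 5 × 10¹¹ pair checks per curve evaluation, which is why the exact fast reformulation matters.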

  20. The Use of Computer-Assisted Home Exercises to Preserve Physical Function after a Vestibular Rehabilitation Program: A Randomized Controlled Study

    Directory of Open Access Journals (Sweden)

    Michael Smaerup

    2016-01-01

    Objective. The purpose of this study was to evaluate whether elderly patients with vestibular dysfunction are able to preserve their physical functional level, reduction in dizziness, and quality of life when assistive computer technology is used in comparison with printed instructions. Materials and Methods. Single-blind, randomized, controlled follow-up study. Fifty-seven elderly patients with chronic dizziness were randomly assigned to a computer-assisted home exercise program or to home exercises as described in printed instructions and followed for three months after discharge from an outpatient clinic. Results. Both groups had maintained their high functional levels three months after finishing the outpatient rehabilitation. No statistically significant difference was found in outcome scores between the two groups. In spite of moderate compliance levels, the patients maintained their high functional level, indicating that the elderly do not necessarily need to exercise for the first three months after termination of the training in the outpatient clinic. Conclusion. Elderly patients with vestibular dysfunction exercising at home seem to maintain their functional level, level of dizziness, and quality of life three months following discharge from hospital. In this specific setup, no greater effect was found from introducing a computer-assisted training program compared to standard home training guided by printed instructions. This trial is registered with NCT01344408.

  1. Computer network for electric power control systems [Chubu Electric Power Co. computer network for electric power system control]

    Energy Technology Data Exchange (ETDEWEB)

    Tsuneizumi, T. (Chubu Electric Power Co. Inc., Nagoya (Japan)); Shimomura, S.; Miyamura, N. (Fuji Electric Co. Ltd., Tokyo (Japan))

    1992-06-03

    A computer network for electric power control systems was developed that applies the Open Systems Interconnection (OSI) international standard for communications protocols. In structuring the OSI network, the session layer is accessed directly from the operation functions when high-speed, small-capacity information is transmitted. File transfer, access and control, which provides a function for collectively transferring large-capacity data, is applied when low-speed, large-capacity information is transmitted. A verification test of the real-time computer network (RCN) implementation specification was conducted on a verification model using a minicomputer, and the results satisfied practical performance requirements. For the application interface, kernel, health-check and two-route transmission functions were provided as connection-control functions, as were a transmission-verification function and a late-arrival discard function. The system is implemented with a duplexed communication server (CS) configuration. The hardware configuration may either house the CS function in a host computer or install it as a separate system. 5 figs., 6 tabs.

  2. Computational principles of syntax in the regions specialized for language: integrating theoretical linguistics and functional neuroimaging.

    Science.gov (United States)

    Ohta, Shinri; Fukui, Naoki; Sakai, Kuniyoshi L

    2013-01-01

    The nature of computational principles of syntax remains to be elucidated. One promising approach to this problem would be to construct formal and abstract linguistic models that parametrically predict the activation modulations in the regions specialized for linguistic processes. In this article, we review recent advances in theoretical linguistics and functional neuroimaging in the following respects. First, we introduce the two fundamental linguistic operations: Merge (which combines two words or phrases to form a larger structure) and Search (which searches and establishes a syntactic relation of two words or phrases). We also illustrate certain universal properties of human language, and present hypotheses regarding how sentence structures are processed in the brain. Hypothesis I is that the Degree of Merger (DoM), i.e., the maximum depth of merged subtrees within a given domain, is a key computational concept to properly measure the complexity of tree structures. Hypothesis II is that the basic frame of the syntactic structure of a given linguistic expression is determined essentially by functional elements, which trigger Merge and Search. We then present our recent functional magnetic resonance imaging experiment, demonstrating that the DoM is indeed a key syntactic factor that accounts for syntax-selective activations in the left inferior frontal gyrus and supramarginal gyrus. Hypothesis III is that the DoM domain changes dynamically in accordance with iterative Merge applications, the Search distances, and/or task requirements. We confirm that the DoM accounts for activations in various sentence types. Hypothesis III successfully explains activation differences between object- and subject-relative clauses, as well as activations during explicit syntactic judgment tasks. Future research on the computational principles of syntax will further deepen our understanding of uniquely human mental faculties.
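The Degree of Merger defined above can be made concrete with a toy representation in which Merge builds nested pairs and the DoM is the maximum nesting depth. The phrase structures below are illustrative inventions, not the stimuli of the fMRI experiment.

```python
def merge(alpha, beta):
    # Merge: combine two syntactic objects into a larger structure,
    # represented here simply as a nested pair.
    return (alpha, beta)

def degree_of_merger(node):
    # DoM: the maximum depth of merged subtrees -- 0 for a bare lexical
    # item, otherwise 1 + the deeper of the two daughters.
    if not isinstance(node, tuple):
        return 0
    return 1 + max(degree_of_merger(child) for child in node)

# A shallow phrase vs. a more deeply embedded clause (toy examples):
dp = merge("the", "dog")                      # DoM 1
vp = merge("chased", merge("the", "cat"))     # DoM 2
s = merge(dp, vp)                             # DoM 3
assert degree_of_merger(dp) == 1
assert degree_of_merger(vp) == 2
assert degree_of_merger(s) == 3
```

Under Hypothesis I, it is this scalar, not sentence length, that is predicted to modulate activation in the syntax-selective regions.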

  3. Riemannian computing in computer vision

    CERN Document Server

    Srivastava, Anuj

    2016-01-01

    This book presents a comprehensive treatise on Riemannian geometric computations and related statistical inferences in several computer vision problems. This edited volume includes chapter contributions from leading figures in the field of computer vision who are applying Riemannian geometric approaches in problems such as face recognition, activity recognition, object detection, biomedical image analysis, and structure-from-motion. Some of the mathematical entities that necessitate a geometric analysis include rotation matrices (e.g. in modeling camera motion), stick figures (e.g. for activity recognition), subspace comparisons (e.g. in face recognition), symmetric positive-definite matrices (e.g. in diffusion tensor imaging), and function-spaces (e.g. in studying shapes of closed contours). Key features: illustrates Riemannian computing theory through applications in computer vision, machine learning, and robotics; emphasizes algorithmic advances that will allow re-application in other…

  4. When can Empirical Green Functions be computed from Noise Cross-Correlations? Hints from different Geographical and Tectonic environments

    Science.gov (United States)

    Matos, Catarina; Silveira, Graça; Custódio, Susana; Domingues, Ana; Dias, Nuno; Fonseca, João F. B.; Matias, Luís; Krueger, Frank; Carrilho, Fernando

    2014-05-01

    Noise cross-correlations are now widely used to extract Green functions between station pairs. But do all the cross-correlations routinely computed produce successful Green Functions? What is the relationship between noise recorded at a pair of stations and the cross-correlation between them? During the last decade, we have been involved in the deployment of several temporary dense broadband (BB) networks within the scope of both national projects and international collaborations. From 2000 to 2002, a pool of 8 BB stations continuously operated in the Azores in the scope of the Memorandum of Understanding COSEA (COordinated Seismic Experiment in the Azores). Thanks to the Project WILAS (West Iberia Lithosphere and Astenosphere Structure, PTDC/CTE-GIX/097946/2008) we temporarily increased the number of BB stations deployed in mainland Portugal to more than 50 (permanent + temporary) during the period 2010 - 2012. In 2011/12 a temporary pool of 12 seismometers continuously recorded BB data in the Madeira archipelago, as part of the DOCTAR (Deep Ocean Test Array Experiment) project. Project CV-PLUME (Investigation on the geometry and deep signature of the Cape Verde mantle plume, PTDC/CTE-GIN/64330/2006) covered the archipelago of Cape Verde, North Atlantic, with 40 temporary BB stations in 2007/08. Project MOZART (Mozambique African Rift Tomography, PTDC/CTE-GIX/103249/2008), covered Mozambique, East Africa, with 30 temporary BB stations in the period 2011 - 2013. These networks, located in very distinct geographical and tectonic environments, offer an interesting opportunity to study seasonal and spatial variations of noise sources and their impact on Empirical Green functions computed from noise cross-correlation. Seismic noise recorded at different seismic stations is evaluated by computation of the probability density functions of power spectral density (PSD) of continuous data. 
To assess seasonal variations of ambient noise sources in frequency content, time-series of
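The core operation named in this record, extracting an empirical Green function from the cross-correlation of ambient noise at two stations, can be sketched on synthetic data (hypothetical data, not the projects' processing chain): the stacked correlation peaks at the inter-station travel time.

```python
# Synthetic sketch: two stations record the same ambient-noise wavefield,
# station B with a fixed delay. Stacking cross-correlations over many
# windows makes the peak at the inter-station travel time emerge.
import numpy as np

rng = np.random.default_rng(0)
n, lag = 1024, 37                 # samples per window, true travel time
stacked = np.zeros(2 * n - 1)
for _ in range(30):               # stacking many windows raises the SNR
    wave = rng.standard_normal(n + lag)   # common ambient-noise wavefield
    rec_a = wave[lag:]                    # station A's record
    rec_b = wave[:n]                      # station B: same wave, `lag` later
    stacked += np.correlate(rec_a, rec_b, mode="full")

peak_offset = np.argmax(stacked) - (n - 1)
print(abs(peak_offset))  # 37 -- the travel time; the sign gives direction
```

In real processing the causal and acausal sides of the correlation correspond to energy travelling in opposite directions between the stations, which is why the sign of the peak offset matters.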

  5. Computer based training for NPP personnel (interactive communication systems and functional trainers)

    International Nuclear Information System (INIS)

    Martin, H.D.

    1987-01-01

    KWU as a manufacturer of thermal and nuclear power plants has extensive customer training obligations within its power plant contracts. In this respect KWU has gained large experience in training of personnel, in the production of training material including video tapes and in the design of simulators. KWU developed interactive communication systems (ICS) for training and retraining purposes with a personal computer operating a video disc player on which video instruction is stored. The training program is edited with the help of a self-developed editing system which enables the author to easily enter his instructions into the computer. ICS enables the plant management to better monitor the performance of its personnel through computerized training results and helps to save training manpower. German NPPs differ very much from other designs with respect to a more complex and integrated reactor control system and an additional reactor limitation system. Simulators for such plants therefore have also to simulate these systems. KWU developed a Functional Trainer (FT) which is a replica of the primary system, the auxiliary systems linked to it and the associated control, limitation and protection systems including the influences of the turbine operation and control.

  6. Functional Automata - Formal Languages for Computer Science Students

    Directory of Open Access Journals (Sweden)

    Marco T. Morazán

    2014-12-01

    Full Text Available An introductory formal languages course exposes advanced undergraduate and early graduate students to automata theory, grammars, constructive proofs, computability, and decidability. Programming students find these topics to be challenging or, in many cases, overwhelming and on the fringe of Computer Science. The existence of this perception is not completely absurd since students are asked to design and prove correct machines and grammars without being able to experiment or get immediate feedback, which is essential in a learning context. This article puts forth the thesis that the theory of computation ought to be taught using tools for actually building computations. It describes the implementation and the classroom use of a library, FSM, designed to provide students with the opportunity to experiment and test their designs using state machines, grammars, and regular expressions. Students are able to perform random testing before proceeding with a formal proof of correctness. That is, students can test their designs much like they do in a programming course. In addition, the library easily allows students to implement the algorithms they develop as part of the constructive proofs they write. Providing students with this ability ought to be a new trend in the formal languages classroom.
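The workflow the abstract advocates, defining a machine as data, running it, and random-testing it against a specification before attempting a proof, can be illustrated outside FSM itself (FSM is a Racket library; this is a hypothetical Python analogue, not its API):

```python
# Hypothetical analogue of the FSM workflow: a DFA as plain data, an
# interpreter, and random testing against a reference predicate.
import random

def run_dfa(dfa, word):
    """Return True iff the DFA accepts the word."""
    state = dfa["start"]
    for symbol in word:
        state = dfa["delta"][(state, symbol)]
    return state in dfa["accept"]

# DFA for: binary strings containing an even number of 1s
even_ones = {
    "start": "e",
    "accept": {"e"},
    "delta": {("e", "0"): "e", ("e", "1"): "o",
              ("o", "0"): "o", ("o", "1"): "e"},
}

random.seed(0)
for _ in range(1000):
    w = "".join(random.choice("01") for _ in range(random.randrange(12)))
    assert run_dfa(even_ones, w) == (w.count("1") % 2 == 0)
print("1000 random tests passed")
```

Passing such random tests does not replace the constructive proof, but it catches broken designs immediately, which is the feedback loop the article argues for.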

  7. A brain-computer interface to support functional recovery

    DEFF Research Database (Denmark)

    Kjaer, Troels W; Sørensen, Helge Bjarup Dissing

    2013-01-01

    Brain-computer interfaces (BCI) register changes in brain activity and utilize this to control computers. The most widely used method is based on registration of electrical signals from the cerebral cortex using extracranially placed electrodes, also called electroencephalography (EEG). The features extracted from the EEG may, besides controlling the computer, also be fed back to the patient, for instance as visual input. This facilitates a learning process. BCI allow us to utilize brain activity in the rehabilitation of patients after stroke. The activity of the cerebral cortex varies with the type of movement we imagine, and by letting the patient know the type of brain activity best associated with the intended movement the rehabilitation process may be faster and more efficient. The focus of BCI utilization in medicine has changed in recent years. While we previously focused on devices facilitating...

  8. Assessment of left ventricular function by electrocardiogram-gated myocardial single photon emission computed tomography using quantitative gated single photon emission computed tomography software

    International Nuclear Information System (INIS)

    Morita, Koichi; Adachi, Itaru; Konno, Masanori

    1999-01-01

    Electrocardiogram (ECG)-gated myocardial single photon emission computed tomography (SPECT) can assess left ventricular (LV) perfusion and function easily using quantitative gated SPECT (QGS) software. ECG-gated SPECT was performed in 44 patients with coronary artery disease under post-stress and resting conditions to assess LV functional parameters, in comparison with the LV ejection fraction derived from gated blood pool scans and with myocardial characteristics. A good correlation was obtained between ejection fraction using QGS and that using cardiac blood pool scan (r=0.812). Some patients with myocardial ischemia had lower ejection fraction under post-stress compared to resting conditions, indicating post-stress LV dysfunction. LV wall motion and wall thickening were significantly impaired in ischemic and infarcted myocardium, and the degree of abnormality in the infarcted areas was greater than in the ischemic areas. LV functional parameters derived using QGS were useful to assess post-stress LV dysfunction and myocardial viability. In conclusion, ECG-gated myocardial SPECT permits simultaneous quantitative assessment of myocardial perfusion and function. (author)
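The central functional parameter in this record is the LV ejection fraction, which QGS derives from the end-diastolic and end-systolic volumes measured on the gated images. The arithmetic is standard (the volumes below are hypothetical, not from the study):

```python
# Illustrative arithmetic only: LVEF from end-diastolic and end-systolic
# volumes, as derived from gated SPECT volume measurements.

def ejection_fraction(edv_ml, esv_ml):
    """LVEF (%) = (EDV - ESV) / EDV * 100."""
    return (edv_ml - esv_ml) / edv_ml * 100.0

# e.g. EDV 120 ml, ESV 60 ml -> LVEF 50%
print(ejection_fraction(120.0, 60.0))  # 50.0
```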

  9. A review of Green's function methods in computational fluid mechanics: Background, recent developments and future directions

    International Nuclear Information System (INIS)

    Dorning, J.

    1981-01-01

    The research and development over the past eight years on local Green's function methods for the high-accuracy, high-efficiency numerical solution of nuclear engineering problems is reviewed. The basic concepts and key ideas are presented by starting with an expository review of the original fully two-dimensional local Green's function methods developed for neutron diffusion and heat conduction, and continuing through the progressively more complicated and more efficient nodal Green's function methods for neutron diffusion, heat conduction and neutron transport to establish the background for the recent development of Green's function methods in computational fluid mechanics. Some of the impressive numerical results obtained via these classes of methods for nuclear engineering problems are briefly summarized. Finally, speculations are proffered on future directions in which the development of these types of methods in fluid mechanics and other areas might lead. (orig.) [de

  10. High beam current shut-off systems in the APS linac and low energy transfer line

    International Nuclear Information System (INIS)

    Wang, X.; Knott, M.; Lumpkin, A.

    1994-01-01

    Two independent beam shut-off current monitor (BESOCM) systems have been installed in the APS linac and the low energy transport line to provide personnel safety protection in the event of acceleration of excessive beam currents. Beam current is monitored by a fast current transformer (FCT) and fully redundant supervisory circuits connected to the Access Control Interlock System (ACIS) for beam-intensity-related shutdowns of the linac. One FCT is located at the end of the positron linac and the other in the low energy transport line, which directs beam to the positron accumulator ring (PAR). To ensure a high degree of reliability, both systems employ a continuous self-checking function, which injects a test pulse into a single-turn test winding after each ''real'' beam pulse to verify that the system is fully functional. The system is designed to be fail-safe for all possible system faults, such as loss of power, open or shorted signal or test cables, loss of external trigger, malfunction of the gated integrator, etc. The system has been successfully commissioned and is now a reliable part of the total ACIS.

  11. Methodical Approaches to Teaching of Computer Modeling in Computer Science Course

    Science.gov (United States)

    Rakhimzhanova, B. Lyazzat; Issabayeva, N. Darazha; Khakimova, Tiyshtik; Bolyskhanova, J. Madina

    2015-01-01

    The purpose of this study was to justify a technique for introducing modeling methodology in computer science lessons. The necessity of studying computer modeling lies in the fact that current trends toward strengthening the general educational and worldview functions of computer science call for additional research of the…

  12. Can Expanded Bacteriochlorins Act as Photosensitizers in Photodynamic Therapy? Good News from Density Functional Theory Computations

    Directory of Open Access Journals (Sweden)

    Gloria Mazzone

    2016-02-01

    Full Text Available The main photophysical properties of a series of expanded bacteriochlorins, recently synthesized, have been investigated by means of DFT and TD-DFT methods. Absorption spectra computed with different exchange-correlation functionals, B3LYP, M06 and ωB97XD, have been compared with the experimental ones. In good agreement, all the considered systems show a maximum absorption wavelength that falls in the therapeutic window (600–800 nm. The obtained singlet-triplet energy gaps are large enough to ensure the production of cytotoxic singlet molecular oxygen. The computed spin-orbit matrix elements suggest a good probability of intersystem spin-crossing between singlet and triplet excited states, since they are higher than those computed for 5,10,15,20-tetrakis(m-hydroxyphenyl)chlorin (Foscan©, which is already used in the photodynamic therapy (PDT) protocol. On the basis of the investigated properties, these expanded bacteriochlorins can be proposed as PDT agents.

  13. Using computer graphics to preserve function in resection of malignant melanoma of the foot.

    Science.gov (United States)

    Kaufman, M; Vantuyl, A; Japour, C; Ghosh, B C

    2001-08-01

    The increasing incidence of malignant melanoma challenges physicians to find innovative ways to preserve function and appearance in affected areas that require partial resection. We carefully planned the resection of a malignant lesion between the third and fourth toes of a 77-year-old man with the aid of computer technology. The subsequent excision of the third, fourth, and fifth digits was executed such that the new metatarsal arc formed would approximate the dimensions of the optimal hyperbola, thereby minimizing gait disturbance.

  14. A partition function approximation using elementary symmetric functions.

    Directory of Open Access Journals (Sweden)

    Ramu Anandakrishnan

    Full Text Available In statistical mechanics, the canonical partition function [Formula: see text] can be used to compute equilibrium properties of a physical system. Calculating [Formula: see text] however, is in general computationally intractable, since the computation scales exponentially with the number of particles [Formula: see text] in the system. A commonly used method for approximating equilibrium properties is the Monte Carlo (MC) method. For some problems the MC method converges slowly, requiring a very large number of MC steps. For such problems the computational cost of the Monte Carlo method can be prohibitive. Presented here is a deterministic algorithm - the direct interaction algorithm (DIA) - for approximating the canonical partition function [Formula: see text] in [Formula: see text] operations. The DIA approximates the partition function as a combinatorial sum of products known as elementary symmetric functions (ESFs), which can be computed in [Formula: see text] operations. The DIA was used to compute equilibrium properties for the isotropic 2D Ising model, and the accuracy of the DIA was compared to that of the basic Metropolis Monte Carlo method. Our results show that the DIA may be a practical alternative for some problems where the Monte Carlo method converges slowly and computational speed is a critical constraint, such as for very large systems or web-based applications.
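The building block named in the abstract, computing all elementary symmetric functions of N values, admits a short O(N²) dynamic-programming sketch (shown below for illustration; the DIA itself then combines the ESFs into the partition-function approximation, which is not reproduced here):

```python
# All elementary symmetric functions e_0..e_N of N values in O(N^2)
# operations, via the in-place recurrence e_k <- e_k + x * e_{k-1}.

def elementary_symmetric(xs):
    e = [1.0] + [0.0] * len(xs)          # e[0] = 1 by convention
    for x in xs:
        for k in range(len(xs), 0, -1):  # descend so e[k-1] is still "old"
            e[k] += x * e[k - 1]
    return e

# For (a, b, c): e1 = a+b+c, e2 = ab+ac+bc, e3 = abc
print(elementary_symmetric([1.0, 2.0, 3.0]))  # [1.0, 6.0, 11.0, 6.0]
```

Updating `e` from high index to low is what lets a single array serve both the old and new rows of the recurrence.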

  15. Computer Modelling of Functional Aspects of Noise in Endogenously Oscillating Neurons

    Science.gov (United States)

    Huber, M. T.; Dewald, M.; Voigt, K.; Braun, H. A.; Moss, F.

    1998-03-01

    Membrane potential oscillations are a widespread feature of neuronal activity. When such oscillations operate close to the spike-triggering threshold, noise can become an essential property of spike-generation. According to that, we developed a minimal Hodgkin-Huxley-type computer model which includes a noise term. This model accounts for experimental data from quite different cells ranging from mammalian cortical neurons to fish electroreceptors. With slight modifications of the parameters, the model's behavior can be tuned to bursting activity, which additionally allows it to mimic temperature encoding in peripheral cold receptors including transitions to apparently chaotic dynamics as indicated by methods for the detection of unstable periodic orbits. Under all conditions, cooperative effects between noise and nonlinear dynamics can be shown which, beyond stochastic resonance, might be of functional significance for stimulus encoding and neuromodulation.
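The key mechanism of the abstract, a subthreshold oscillation made spike-capable by noise, can be caricatured in a few lines (a toy model with made-up parameters, not the authors' Hodgkin-Huxley-type model):

```python
# Toy caricature: a subthreshold membrane oscillation plus Gaussian noise.
# Without noise the oscillation never reaches threshold; with noise,
# threshold crossings ("spikes") cluster near the oscillation crests.
import math, random

def count_spikes(noise_sd, steps=20000, dt=0.001, seed=1):
    random.seed(seed)
    theta, spikes = 1.0, 0                               # spike threshold
    for i in range(steps):
        v = 0.8 * math.sin(2 * math.pi * 5.0 * i * dt)   # 5 Hz, subthreshold
        if v + random.gauss(0.0, noise_sd) > theta:
            spikes += 1
    return spikes

print(count_spikes(0.0))   # 0: deterministically never crosses threshold
print(count_spikes(0.3))   # noise pushes the oscillation crests over threshold
```

Because crossings occur preferentially near the crests, the noise does not merely add events: it makes the spike train carry the phase of the underlying oscillation, the cooperative effect the abstract describes.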

  16. The Impact of Xanthohumol on a Brewing Yeast’s Viability, Vitality and Metabolite Formation

    Czech Academy of Sciences Publication Activity Database

    Magalhaes, P.H.; Carvalho, A.B.; Goncalves, L.M.; Pacheco, J.G.; Guido, L.F.; Brányik, T.; Rodrigues, P.G.; Kuncová, Gabriela; Dostálek, P.; Barros, A.A.

    2011-01-01

    Vol. 117, No. 3 (2011), pp. 368-376 ISSN 0046-9750 Grant - others:FCT(PT) PJM:SFRH/BD/27834/2006; FCT(PT) LMG:SFRH/BD/36791/2007; FCT(PT) JGP:SFRH/BD/30279/2006 Institutional research plan: CEZ:AV0Z40720504 Keywords: beer * xanthohumol * yeast Subject RIV: GM - Food Processing Impact factor: 0.660, year: 2011 http://www.scopus.com/record/display.url?eid=2-s2.0-80755180700&origin=resultslist&sort=plf-f&src=s&st1=kuncova%2cg&sid=nWEtd4PqxEYtj7bOsbEcl0M%3a60&sot=b&sdt=b&sl=22&s=AUTHOR-NAME%28kuncova%2cg%29&relpos=0&relpos=0&searchTerm=AUTHOR-NAME(kuncova,g)

  17. Failure detection in high-performance clusters and computers using chaotic map computations

    Science.gov (United States)

    Rao, Nageswara S.

    2015-09-01

    A programmable media includes a processing unit capable of independent operation in a machine that is capable of executing 10^18 floating point operations per second. The processing unit is in communication with a memory element and an interconnect that couples computing nodes. The programmable media includes a logical unit configured to execute arithmetic functions, comparative functions, and/or logical functions. The processing unit is configured to detect computing component failures, memory element failures and/or interconnect failures by executing programming threads that generate one or more chaotic map trajectories. The central processing unit or graphical processing unit is configured to detect a computing component failure, memory element failure and/or an interconnect failure through an automated comparison of signal trajectories generated by the chaotic maps.
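The principle behind trajectory comparison is sensitive dependence: identical healthy nodes iterating the same chaotic map from the same seed stay bit-identical, while any computational fault diverges exponentially. A sketch of the idea (hypothetical parameters, not the patented design):

```python
# Replicated nodes iterate the same chaotic map from the same seed.
# A tiny fault injected into one replica is amplified exponentially,
# so comparing trajectories exposes the failure.

def chaotic_trajectory(x0, steps, fault_at=None):
    x, traj = x0, []
    for i in range(steps):
        x = 4.0 * x * (1.0 - x)   # logistic map in its chaotic regime
        if i == fault_at:
            x += 1e-12            # model a one-off computation fault
        traj.append(x)
    return traj

healthy = chaotic_trajectory(0.3, 100)
faulty = chaotic_trajectory(0.3, 100, fault_at=50)
divergence = max(abs(a - b) for a, b in zip(healthy[-20:], faulty[-20:]))
print(healthy == chaotic_trajectory(0.3, 100))  # True: healthy replicas agree
print(divergence > 1e-3)                        # True: the fault is amplified
```

A perturbation of 1e-12 grows by roughly a factor of two per iteration, so after 50 iterations the faulty trajectory is macroscopically different even though the initial error was near machine precision.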

  18. Complete Fairness in Secure Two-Party Computation

    DEFF Research Database (Denmark)

    Gordon, S. Dov; Hazay, Carmit; Katz, Jonathan

    2011-01-01

    In the setting of secure two-party computation, two mutually distrusting parties wish to compute some function of their inputs while preserving, to the extent possible, various security properties such as privacy, correctness, and more. One desirable property is fairness which guarantees......-party setting. We demonstrate that this folklore belief is false by showing completely fair protocols for various nontrivial functions in the two-party setting based on standard cryptographic assumptions. We first show feasibility of obtaining complete fairness when computing any function over polynomial...... for such functions must have round complexity super-logarithmic in the security parameter. Our results demonstrate that the question of completely fair secure computation without an honest majority is far from closed.

  19. Neural computation and the computational theory of cognition.

    Science.gov (United States)

    Piccinini, Gualtiero; Bahar, Sonya

    2013-04-01

    We begin by distinguishing computationalism from a number of other theses that are sometimes conflated with it. We also distinguish between several important kinds of computation: computation in a generic sense, digital computation, and analog computation. Then, we defend a weak version of computationalism: neural processes are computations in the generic sense. After that, we reject on empirical grounds the common assimilation of neural computation to either analog or digital computation, concluding that neural computation is sui generis. Analog computation requires continuous signals; digital computation requires strings of digits. But current neuroscientific evidence indicates that typical neural signals, such as spike trains, are graded like continuous signals but are constituted by discrete functional elements (spikes); thus, typical neural signals are neither continuous signals nor strings of digits. It follows that neural computation is sui generis. Finally, we highlight three important consequences of a proper understanding of neural computation for the theory of cognition. First, understanding neural computation requires a specially designed mathematical theory (or theories) rather than the mathematical theories of analog or digital computation. Second, several popular views about neural computation turn out to be incorrect. Third, computational theories of cognition that rely on non-neural notions of computation ought to be replaced or reinterpreted in terms of neural computation. Copyright © 2012 Cognitive Science Society, Inc.

  20. FIT: Computer Program that Interactively Determines Polynomial Equations for Data which are a Function of Two Independent Variables

    Science.gov (United States)

    Arbuckle, P. D.; Sliwa, S. M.; Roy, M. L.; Tiffany, S. H.

    1985-01-01

    A computer program for interactively developing least-squares polynomial equations to fit user-supplied data is described. The program is characterized by the ability to compute the polynomial equations of a surface fit through data that are a function of two independent variables. The program utilizes the Langley Research Center graphics packages to display polynomial equation curves and data points, facilitating a qualitative evaluation of the effectiveness of the fit. An explanation of the fundamental principles and features of the program, as well as sample input and corresponding output, is included.
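The core fitting step the record describes, a least-squares polynomial surface z = f(x, y) in two independent variables, can be sketched with a design matrix of monomial terms (an illustrative reconstruction on synthetic data, not the FIT program's code or its interactive graphics):

```python
# Least-squares polynomial surface fit for z = f(x, y): build a design
# matrix of all monomials x^i * y^j with i + j <= degree, then solve the
# normal-equations problem with lstsq.
import numpy as np

def fit_surface(x, y, z, degree):
    """Least-squares coefficients for all monomials x^i * y^j, i+j <= degree."""
    terms = [(i, j) for i in range(degree + 1)
                    for j in range(degree + 1 - i)]
    A = np.column_stack([x**i * y**j for i, j in terms])
    coeffs, *_ = np.linalg.lstsq(A, z, rcond=None)
    return dict(zip(terms, coeffs))

rng = np.random.default_rng(2)
x, y = rng.uniform(-1, 1, 200), rng.uniform(-1, 1, 200)
z = 1.0 + 2.0 * x - 3.0 * x * y          # an exactly quadratic surface
c = fit_surface(x, y, z, degree=2)
print(round(c[(1, 1)], 6))  # -3.0  (recovered coefficient of the x*y term)
```

Because the synthetic surface is exactly representable at degree 2, the fit recovers the generating coefficients to numerical precision; with noisy data the same call returns the least-squares estimate.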

  1. In situ x-ray reflectivity and grazing incidence x-ray diffraction study of L1₀ ordering in ⁵⁷Fe/Pt multilayers

    Energy Technology Data Exchange (ETDEWEB)

    Raghavendra Reddy, V; Gupta, Ajay; Gome, Anil [UGC-DAE Consortium for Scientific Research, University Campus, Khandwa Road, Indore-452 017 (India); Leitenberger, Wolfram [Institute of Physics, University of Potsdam, 14469 Potsdam (Germany); Pietsch, U [Physics Department, University of Siegen, D-57068 Siegen (Germany)], E-mail: vrreddy@csr.ernet.in, E-mail: varimalla@yahoo.com

    2009-05-06

    In situ high temperature x-ray reflectivity and grazing incidence x-ray diffraction measurements in the energy dispersive mode are used to study the ordered face-centered tetragonal (fct) L1₀ phase formation in [Fe(19 Å)/Pt(25 Å)]×10 multilayers prepared by ion beam sputtering. With the in situ x-ray measurements it is observed that (i) the multilayer structure first transforms to a disordered FePt and subsequently to an ordered fct L1₀ phase, (ii) the ordered fct L1₀ FePt peaks start to appear at 320 deg. C annealing, (iii) the activation energy of the interdiffusion is 0.8 eV and (iv) ordered fct FePt grains have preferential out-of-plane texture. The magneto-optical Kerr effect and conversion electron Moessbauer spectroscopies are used to study the magnetic properties of the as-deposited and 400 deg. C annealed multilayers. The magnetic data for the 400 deg. C annealed sample indicate that the magnetization is at an angle of ~50 deg. from the plane of the film.
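An activation energy such as the 0.8 eV quoted above follows from an Arrhenius analysis of thermally activated rates. The sketch below is illustrative only (the rates are generated, not the paper's data): rates at two temperatures determine Ea via Ea = k_B·ln(r1/r2)/(1/T2 − 1/T1).

```python
# Arrhenius extraction of an activation energy from rates at two
# temperatures: r = A * exp(-Ea / (k_B * T)).
import math

K_B = 8.617e-5  # Boltzmann constant in eV/K

def activation_energy(r1, t1, r2, t2):
    """Ea (eV) from rates r1, r2 measured at temperatures t1, t2 (K)."""
    return K_B * math.log(r1 / r2) / (1.0 / t2 - 1.0 / t1)

# Rates generated from Ea = 0.8 eV are recovered exactly:
ea = 0.8
t1, t2 = 573.0, 673.0                      # 300 and 400 deg. C in kelvin
r1 = math.exp(-ea / (K_B * t1))
r2 = math.exp(-ea / (K_B * t2))
print(round(activation_energy(r1, t1, r2, t2), 6))  # 0.8
```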

  2. Synthetic Computation: Chaos Computing, Logical Stochastic Resonance, and Adaptive Computing

    Science.gov (United States)

    Kia, Behnam; Murali, K.; Jahed Motlagh, Mohammad-Reza; Sinha, Sudeshna; Ditto, William L.

    Nonlinearity and chaos can exhibit numerous behaviors and patterns, and one can select different patterns from this rich library of patterns. In this paper we focus on synthetic computing, a field that engineers and synthesizes nonlinear systems to obtain computation. We explain the importance of nonlinearity, and describe how nonlinear systems can be engineered to perform computation. More specifically, we provide an overview of chaos computing, a field that manually programs chaotic systems to build different types of digital functions. We also briefly describe logical stochastic resonance (LSR), and then extend the approach of LSR to realize combinational digital logic systems via suitable concatenation of existing logical stochastic resonance blocks. Finally we demonstrate how a chaotic system can be engineered and mated with different machine learning techniques, such as artificial neural networks, random searching, and genetic algorithms, to design different autonomous systems that can adapt and respond to environmental conditions.
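The flavor of "programming" a nonlinear system can be conveyed with a toy sketch (hypothetical encoding and parameters, not a published chaos-computing scheme): inputs are added to a base state of the logistic map, the map is iterated, and a threshold reads out the output bit; a brute-force search then selects parameters that realize a chosen gate.

```python
# Toy chaos-computing sketch: encode two input bits into the initial state
# of the logistic map, iterate once, and threshold the result. Searching
# over (base, delta, threshold) "programs" the map to act as an AND gate.

def gate(x1, x2, base, delta, thresh):
    x = base + (x1 + x2) * delta       # encode the two input bits
    x = 4.0 * x * (1.0 - x)            # one iteration of the chaotic map
    return 1 if x > thresh else 0

AND = {(0, 0): 0, (0, 1): 0, (1, 0): 0, (1, 1): 1}
params = next(
    (b / 100, d / 100, t / 100)
    for b in range(1, 40) for d in range(1, 20) for t in range(1, 100)
    if all(gate(i, j, b / 100, d / 100, t / 100) == out
           for (i, j), out in AND.items())
)
print(params)  # a (base, delta, threshold) triple implementing AND
```

The same search with a different truth table yields a different gate from the same dynamical system, which is the "selecting patterns from a rich library" idea in miniature.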

  3. A comparative approach for the investigation of biological information processing: An examination of the structure and function of computer hard drives and DNA

    Science.gov (United States)

    2010-01-01

    Background The robust storage, updating and utilization of information are necessary for the maintenance and perpetuation of dynamic systems. These systems can exist as constructs of metal-oxide semiconductors and silicon, as in a digital computer, or in the "wetware" of organic compounds, proteins and nucleic acids that make up biological organisms. We propose that there are essential functional properties of centralized information-processing systems; for digital computers these properties reside in the computer's hard drive, and for eukaryotic cells they are manifest in the DNA and associated structures. Methods Presented herein is a descriptive framework that compares DNA and its associated proteins and sub-nuclear structure with the structure and function of the computer hard drive. We identify four essential properties of information for a centralized storage and processing system: (1) orthogonal uniqueness, (2) low level formatting, (3) high level formatting and (4) translation of stored to usable form. The corresponding aspects of the DNA complex and a computer hard drive are categorized using this classification. This is intended to demonstrate a functional equivalence between the components of the two systems, and thus the systems themselves. Results Both the DNA complex and the computer hard drive contain components that fulfill the essential properties of a centralized information storage and processing system. The functional equivalence of these components provides insight into both the design process of engineered systems and the evolved solutions addressing similar system requirements. However, there are points where the comparison breaks down, particularly when there are externally imposed information-organizing structures on the computer hard drive. A specific example of this is the imposition of the File Allocation Table (FAT) during high level formatting of the computer hard drive and the subsequent loading of an operating system (OS). 
Biological

  4. A comparative approach for the investigation of biological information processing: an examination of the structure and function of computer hard drives and DNA.

    Science.gov (United States)

    D'Onofrio, David J; An, Gary

    2010-01-21

    The robust storage, updating and utilization of information are necessary for the maintenance and perpetuation of dynamic systems. These systems can exist as constructs of metal-oxide semiconductors and silicon, as in a digital computer, or in the "wetware" of organic compounds, proteins and nucleic acids that make up biological organisms. We propose that there are essential functional properties of centralized information-processing systems; for digital computers these properties reside in the computer's hard drive, and for eukaryotic cells they are manifest in the DNA and associated structures. Presented herein is a descriptive framework that compares DNA and its associated proteins and sub-nuclear structure with the structure and function of the computer hard drive. We identify four essential properties of information for a centralized storage and processing system: (1) orthogonal uniqueness, (2) low level formatting, (3) high level formatting and (4) translation of stored to usable form. The corresponding aspects of the DNA complex and a computer hard drive are categorized using this classification. This is intended to demonstrate a functional equivalence between the components of the two systems, and thus the systems themselves. Both the DNA complex and the computer hard drive contain components that fulfill the essential properties of a centralized information storage and processing system. The functional equivalence of these components provides insight into both the design process of engineered systems and the evolved solutions addressing similar system requirements. However, there are points where the comparison breaks down, particularly when there are externally imposed information-organizing structures on the computer hard drive. A specific example of this is the imposition of the File Allocation Table (FAT) during high level formatting of the computer hard drive and the subsequent loading of an operating system (OS). Biological systems do not have an

  5. A comparative approach for the investigation of biological information processing: An examination of the structure and function of computer hard drives and DNA

    Directory of Open Access Journals (Sweden)

    D'Onofrio David J

    2010-01-01

    Full Text Available Abstract Background The robust storage, updating and utilization of information are necessary for the maintenance and perpetuation of dynamic systems. These systems can exist as constructs of metal-oxide semiconductors and silicon, as in a digital computer, or in the "wetware" of organic compounds, proteins and nucleic acids that make up biological organisms. We propose that there are essential functional properties of centralized information-processing systems; for digital computers these properties reside in the computer's hard drive, and for eukaryotic cells they are manifest in the DNA and associated structures. Methods Presented herein is a descriptive framework that compares DNA and its associated proteins and sub-nuclear structure with the structure and function of the computer hard drive. We identify four essential properties of information for a centralized storage and processing system: (1) orthogonal uniqueness, (2) low level formatting, (3) high level formatting and (4) translation of stored to usable form. The corresponding aspects of the DNA complex and a computer hard drive are categorized using this classification. This is intended to demonstrate a functional equivalence between the components of the two systems, and thus the systems themselves. Results Both the DNA complex and the computer hard drive contain components that fulfill the essential properties of a centralized information storage and processing system. The functional equivalence of these components provides insight into both the design process of engineered systems and the evolved solutions addressing similar system requirements. However, there are points where the comparison breaks down, particularly when there are externally imposed information-organizing structures on the computer hard drive. 
A specific example of this is the imposition of the File Allocation Table (FAT) during high level formatting of the computer hard drive and the subsequent loading of an operating

  6. An atomic orbital based real-time time-dependent density functional theory for computing electronic circular dichroism band spectra

    Energy Technology Data Exchange (ETDEWEB)

    Goings, Joshua J.; Li, Xiaosong, E-mail: xsli@uw.edu [Department of Chemistry, University of Washington, Seattle, Washington 98195 (United States)

    2016-06-21

    One of the challenges of interpreting electronic circular dichroism (ECD) band spectra is that different states may have different rotatory strength signs, determined by their absolute configuration. If the states are closely spaced and opposite in sign, observed transitions may be washed out by nearby states, unlike absorption spectra where transitions are always positive and additive. To accurately compute ECD bands, it is necessary to compute a large number of excited states, which may be prohibitively costly if one uses the linear-response time-dependent density functional theory (TDDFT) framework. Here we implement a real-time, atomic-orbital based TDDFT method for computing the entire ECD spectrum simultaneously. The method is advantageous for large systems with a high density of states. In contrast to previous implementations based on real-space grids, the method is variational, independent of nuclear orientation, and does not rely on pseudopotential approximations, making it suitable for computation of chiroptical properties well into the X-ray regime.

  7. Computational-based structural, functional and phylogenetic analysis of Enterobacter phytases.

    Science.gov (United States)

    Pramanik, Krishnendu; Kundu, Shreyasi; Banerjee, Sandipan; Ghosh, Pallab Kumar; Maiti, Tushar Kanti

    2018-06-01

    Myo-inositol hexakisphosphate phosphohydrolases (i.e., phytases) are known to be very important enzymes responsible for the solubilization of insoluble phosphates. In the present study, Enterobacter phytases have been characterized by different phylogenetic, structural and functional parameters using standard bio-computational tools. Results showed that the majority of Enterobacter phytases are acidic in nature, as most of the isoelectric points were below 7.0. The aliphatic indices predicted for the selected proteins were below 40, indicating their thermostable nature. The average molecular weight of the proteins was 48 kDa. The low GRAVY values of the said proteins implied that they have better interactions with water. Secondary structure prediction revealed that the alpha-helical content was the highest among the other forms, such as sheets and coils. Moreover, the predicted 3D structure of Enterobacter phytases divulged that the proteins consist of four monomeric polypeptide chains, i.e., they are tetrameric proteins. The predicted tertiary model of E. aerogenes (A0A0M3HCJ2) was deposited in the Protein Model Database (Acc. No.: PM0080561) for further utilization after a thorough quality check with the QMEAN and SAVES servers. Functional analysis supported their classification as histidine acid phosphatases. Besides, multiple sequence alignment revealed that "DG-DP-LG" were the most highly conserved residues within the Enterobacter phytases. Thus, the present study will be useful in selecting a suitable phytase-producing microbe exclusively for use in the animal food industry as a food additive.
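
    Two of the sequence-derived quantities mentioned in the abstract, GRAVY and the aliphatic index, follow standard formulas: GRAVY is the mean Kyte-Doolittle hydropathy, and the aliphatic index uses Ikai's coefficients on the mole percents of Ala, Val, Ile and Leu. A minimal sketch; the peptide `toy` is a made-up example, not an Enterobacter phytase sequence.

    ```python
    # Kyte-Doolittle hydropathy scale (standard values)
    KD = {'A': 1.8, 'R': -4.5, 'N': -3.5, 'D': -3.5, 'C': 2.5,
          'Q': -3.5, 'E': -3.5, 'G': -0.4, 'H': -3.2, 'I': 4.5,
          'L': 3.8, 'K': -3.9, 'M': 1.9, 'F': 2.8, 'P': -1.6,
          'S': -0.8, 'T': -0.7, 'W': -0.9, 'Y': -1.3, 'V': 4.2}

    def gravy(seq):
        """Grand average of hydropathy (GRAVY): mean Kyte-Doolittle value.
        Lower (more negative) values indicate better interaction with water."""
        return sum(KD[a] for a in seq) / len(seq)

    def aliphatic_index(seq):
        """Ikai's aliphatic index from mole percents of Ala, Val, Ile, Leu."""
        x = lambda a: 100.0 * seq.count(a) / len(seq)
        return x('A') + 2.9 * x('V') + 3.9 * (x('I') + x('L'))

    toy = "MKVLAILDE"   # hypothetical peptide, for illustration only
    ```
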

  8. Novel prediction model of renal function after nephrectomy from automated renal volumetry with preoperative multidetector computed tomography (MDCT).

    Science.gov (United States)

    Isotani, Shuji; Shimoyama, Hirofumi; Yokota, Isao; Noma, Yasuhiro; Kitamura, Kousuke; China, Toshiyuki; Saito, Keisuke; Hisasue, Shin-ichi; Ide, Hisamitsu; Muto, Satoru; Yamaguchi, Raizo; Ukimura, Osamu; Gill, Inderbir S; Horie, Shigeo

    2015-10-01

    The predictive model of postoperative renal function may impact planning of nephrectomy. To develop a novel predictive model using a combination of clinical indices with computer volumetry to measure the preserved renal cortex volume (RCV) using multidetector computed tomography (MDCT), and to prospectively validate the performance of the model. A total of 60 patients undergoing radical nephrectomy from 2011 to 2013 participated, including a development cohort of 39 patients and an external validation cohort of 21 patients. RCV was calculated by voxel count using software (Vincent, FUJIFILM). Renal function before and after radical nephrectomy was assessed via the estimated glomerular filtration rate (eGFR). Factors affecting postoperative eGFR were examined by regression analysis to develop the novel model for predicting postoperative eGFR with a backward elimination method. The predictive model was externally validated and its performance was compared with that of previously reported models. The postoperative eGFR value was associated with age, preoperative eGFR, preserved renal parenchymal volume (RPV), preserved RCV, % of RPV alteration, and % of RCV alteration. The novel model combining computer volumetry and clinical indices might yield an important tool for predicting postoperative renal function.
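
    The abstract's model is a multivariable regression fit with backward elimination; a much-reduced sketch of the idea is an ordinary least-squares fit of postoperative eGFR on a single composite predictor. The predictor (preoperative eGFR times preserved-cortex fraction), the 0.9 coefficient, and all numbers below are hypothetical, synthetic stand-ins, not the study's data or model.

    ```python
    import random

    def simple_linreg(x, y):
        """Ordinary least-squares fit y ~ a + b*x; returns (a, b)."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
        return my - b * mx, b

    # Hypothetical relation: postoperative eGFR roughly tracks
    # (preoperative eGFR) x (fraction of renal cortex preserved).
    random.seed(0)
    pre = [random.uniform(60, 100) for _ in range(39)]     # preop eGFR
    frac = [random.uniform(0.5, 0.8) for _ in range(39)]   # preserved fraction
    predictor = [p * f for p, f in zip(pre, frac)]
    post = [0.9 * v + random.gauss(0, 2) for v in predictor]

    a, b = simple_linreg(predictor, post)   # recovers a slope near 0.9
    ```
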

  9. The EORTC computer-adaptive tests measuring physical functioning and fatigue exhibited high levels of measurement precision and efficiency

    DEFF Research Database (Denmark)

    Petersen, Morten Aa; Aaronson, Neil K; Arraras, Juan I

    2013-01-01

    The European Organisation for Research and Treatment of Cancer (EORTC) Quality of Life Group is developing a computer-adaptive test (CAT) version of the EORTC Quality of Life Questionnaire (QLQ-C30). We evaluated the measurement properties of the CAT versions of physical functioning (PF...

  10. Recent computational chemistry

    Energy Technology Data Exchange (ETDEWEB)

    Onishi, Taku [Department of Chemistry for Materials, and The Center of Ultimate Technology on nano-Electronics, Mie University (Japan); Center for Theoretical and Computational Chemistry, Department of Chemistry, University of Oslo (Norway)

    2015-12-31

    Thanks to earlier developments in quantum theory and calculation methods, we can now investigate quantum phenomena in real materials and molecules, and design functional materials by computation. As limits and problems still exist in theory, cooperation between theory and computation is becoming more important for clarifying unknown quantum mechanisms and discovering more efficient functional materials. It is likely to become the next-generation standard. Finally, our theoretical methodology for boundary solids is introduced.

  11. Recent computational chemistry

    International Nuclear Information System (INIS)

    Onishi, Taku

    2015-01-01

    Thanks to earlier developments in quantum theory and calculation methods, we can now investigate quantum phenomena in real materials and molecules, and design functional materials by computation. As limits and problems still exist in theory, cooperation between theory and computation is becoming more important for clarifying unknown quantum mechanisms and discovering more efficient functional materials. It is likely to become the next-generation standard. Finally, our theoretical methodology for boundary solids is introduced

  12. Computing Nash equilibria through computational intelligence methods

    Science.gov (United States)

    Pavlidis, N. G.; Parsopoulos, K. E.; Vrahatis, M. N.

    2005-03-01

    Nash equilibrium constitutes a central solution concept in game theory. The task of detecting the Nash equilibria of a finite strategic game remains a challenging problem to date. This paper investigates the effectiveness of three computational intelligence techniques, namely covariance matrix adaptation evolution strategies, particle swarm optimization, and differential evolution, to compute Nash equilibria of finite strategic games, as global minima of a real-valued, nonnegative function. An issue of particular interest is to detect more than one Nash equilibrium of a game. The performance of the considered computational intelligence methods on this problem is investigated using multistart and deflection.
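
    The reformulation the paper relies on casts Nash equilibria as global minima of a nonnegative function built from deviation regrets: at an equilibrium no player gains by switching to any pure strategy, so the function is zero exactly there. A sketch for 2x2 matching pennies, with plain multistart random search standing in for the evolutionary and swarm optimizers studied in the paper.

    ```python
    import random

    def regret(p, q):
        """Nonnegative function whose zeros are the Nash equilibria of
        2x2 matching pennies (p, q = probabilities of playing 'heads')."""
        e1 = (2 * p - 1) * (2 * q - 1)          # player 1 expected payoff
        e2 = -e1                                # zero-sum game
        u1 = [2 * q - 1, -(2 * q - 1)]          # player 1 pure-strategy payoffs
        u2 = [-(2 * p - 1), 2 * p - 1]          # player 2 pure-strategy payoffs
        f = sum(max(0.0, u - e1) ** 2 for u in u1)   # squared positive regrets
        f += sum(max(0.0, u - e2) ** 2 for u in u2)
        return f

    # Multistart random search (a stand-in for PSO / DE / CMA-ES):
    random.seed(1)
    samples = [(random.random(), random.random()) for _ in range(20000)]
    best_f, best_p, best_q = min((regret(p, q), p, q) for p, q in samples)
    # The unique equilibrium of matching pennies is p = q = 1/2.
    ```

    Deflection, mentioned in the abstract, modifies the objective around an already-found minimum so that repeated runs locate further equilibria.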

  13. Research on OpenStack of open source cloud computing in colleges and universities’ computer room

    Science.gov (United States)

    Wang, Lei; Zhang, Dandan

    2017-06-01

    In recent years, cloud computing technology has developed rapidly, especially open source cloud computing. Open source cloud computing has attracted a large number of user groups through the advantages of open source and low cost, and has now reached large-scale promotion and application. In this paper, we first briefly introduce the main functions and architecture of the open source cloud computing tool OpenStack, and then discuss the core problems of computer labs in colleges and universities in depth. Building on this research, we describe the specific application and deployment of OpenStack in university computer rooms. The experimental results show that OpenStack can efficiently and conveniently deploy a cloud for a university computer room, with stable performance and good functional value.

  14. Reliable computation from contextual correlations

    Science.gov (United States)

    Oestereich, André L.; Galvão, Ernesto F.

    2017-12-01

    An operational approach to the study of computation based on correlations considers black boxes with one-bit inputs and outputs, controlled by a limited classical computer capable only of performing sums modulo two. In this setting, it was shown that noncontextual correlations do not provide any extra computational power, while contextual correlations were found to be necessary for the deterministic evaluation of nonlinear Boolean functions. Here we investigate the requirements for reliable computation in this setting; that is, the evaluation of any Boolean function with success probability bounded away from 1/2. We show that bipartite CHSH quantum correlations suffice for reliable computation. We also prove that an arbitrarily small violation of a multipartite Greenberger-Horne-Zeilinger noncontextuality inequality also suffices for reliable computation.
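
    A toy Monte Carlo of the kind of resource involved: a no-signalling box operating at the quantum (Tsirelson) level lets XOR-only postprocessing evaluate the nonlinear AND function with success probability cos²(π/8) ≈ 0.854, bounded away from 1/2. The box below is sampled classically by construction; it illustrates the achievable success probability, not a simulation of quantum mechanics, and the numbers are generic, not taken from the paper.

    ```python
    import math
    import random

    P_WIN = math.cos(math.pi / 8) ** 2   # Tsirelson success probability ~0.8536

    def chsh_box(x, y):
        """Toy no-signalling box at the quantum level: outputs a, b with
        a XOR b = x AND y holding with probability cos^2(pi/8)."""
        a = random.randint(0, 1)
        b = a ^ (x & y) if random.random() < P_WIN else a ^ (x & y) ^ 1
        return a, b

    def and_via_box(x, y):
        """XOR of the two outputs -- the only postprocessing the limited
        (parity-only) control computer can perform."""
        a, b = chsh_box(x, y)
        return a ^ b

    random.seed(0)
    trials = 20000
    hits = sum(
        and_via_box(x, y) == (x & y)
        for _ in range(trials)
        for x, y in [(random.randint(0, 1), random.randint(0, 1))]
    )
    success = hits / trials   # ~0.854, bounded away from 1/2
    ```

    Since AND plus XOR is functionally complete, a bounded-error AND is enough to lift the parity-limited computer to reliable evaluation of arbitrary Boolean functions.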

  15. Computational screening of functionalized zinc porphyrins for dye sensitized solar cells

    DEFF Research Database (Denmark)

    Ørnsø, Kristian Baruël; García Lastra, Juan Maria; Thygesen, Kristian Sommer

    2013-01-01

    An efficient dye sensitized solar cell (DSSC) is one possible solution to meet the world's rapidly increasing energy demands and associated climate challenges. This requires inexpensive and stable dyes with well-positioned frontier energy levels for maximal solar absorption, efficient charge separation, and high output voltage. Here we demonstrate an extensive computational screening of zinc porphyrins functionalized with electron donating side groups and electron accepting anchoring groups. The trends in frontier energy levels versus side groups are analyzed and a no-loss DSSC level alignment quality is estimated. Out of the initial 1029 molecules, we find around 50 candidates with level alignment qualities within 5% of the optimal limit. We show that the level alignment of five zinc porphyrin dyes which were recently used in DSSCs with high efficiencies can be further improved by simple side...

  16. Emission computed tomography

    International Nuclear Information System (INIS)

    Ott, R.J.

    1986-01-01

    Emission Computed Tomography is a technique used for producing single or multiple cross-sectional images of the distribution of radionuclide labelled agents in vivo. The techniques of Single Photon Emission Computed Tomography (SPECT) and Positron Emission Tomography (PET) are described with particular regard to the function of the detectors used to produce images and the computer techniques used to build up images. (UK)

  17. Computing environment logbook

    Science.gov (United States)

    Osbourn, Gordon C; Bouchard, Ann M

    2012-09-18

    A computing environment logbook logs events occurring within a computing environment. The events are displayed as a history of past events within the logbook of the computing environment. The logbook provides search functionality to search through the history of past events to find one or more selected past events, and further, enables an undo of the one or more selected past events.
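
    A minimal sketch of the logged-event pattern described above: events are recorded with searchable descriptions and optional undo callables. Class and method names are illustrative, not the interface of the patented system.

    ```python
    class Logbook:
        """Minimal event logbook: logs events, searches the history,
        and undoes selected past events via per-event undo callables."""

        def __init__(self):
            self.history = []

        def log(self, description, undo=None):
            """Record an event; `undo` is an optional callable reversing it."""
            self.history.append({"description": description, "undo": undo})

        def search(self, term):
            """Return all past events whose description contains `term`."""
            return [e for e in self.history if term in e["description"]]

        def undo(self, event):
            """Reverse a selected past event and drop it from the history."""
            if event["undo"]:
                event["undo"]()
            self.history.remove(event)

    # Usage: log two events, search for one, and undo it.
    state = {"x": 0}
    book = Logbook()
    state["x"] = 1
    book.log("set x to 1", undo=lambda: state.update(x=0))
    book.log("opened editor")
    match = book.search("x to 1")[0]
    book.undo(match)    # state["x"] is restored to 0
    ```
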

  18. Man and computer

    International Nuclear Information System (INIS)

    Fischbach, K.F.

    1981-01-01

    The discussion of cultural and sociological consequences of computer evolution is hindered by human prejudice. For example, the sentence 'a computer is at best as intelligent as its programmer' veils actual developments. Theoretical limits of computer intelligence are the limits of intelligence in general. Modern computer systems replace not only human labour, but also human decision making and thereby human responsibility. The historical situation is unique: human head-work is being automated and man is losing function. (orig.) [de]

  19. Pervasive Computing and Prosopopoietic Modelling

    DEFF Research Database (Denmark)

    Michelsen, Anders Ib

    2011-01-01

    This article treats the philosophical underpinnings of the notions of ubiquity and pervasive computing from a historical perspective. The current focus on these notions reflects the ever increasing impact of new media and the underlying complexity of computed function in the broad sense of ICT that have spread vertiginously since Mark Weiser coined the term 'pervasive', e.g., digitalised sensoring, monitoring, effectuation, intelligence, and display. Whereas Weiser's original perspective may seem fulfilled since computing is everywhere, in his and Seely Brown's (1997) terms, 'invisible'... The article traces to the mid-20th century a paradoxical distinction/complicity between the technical organisation of computed function and the human Being, in the sense of creative action upon such function. This paradoxical distinction/complicity promotes a chiastic (Merleau-Ponty) relationship of extension of one...

  20. The EORTC computer-adaptive tests measuring physical functioning and fatigue exhibited high levels of measurement precision and efficiency

    NARCIS (Netherlands)

    Petersen, M.A.; Aaronson, N.K.; Arraras, J.I.; Chie, W.C.; Conroy, T.; Constantini, A.; Giesinger, J.M.; Holzner, B.; King, M.T.; Singer, S.; Velikova, G.; Verdonck-de Leeuw, I.M.; Young, T.; Groenvold, M.

    2013-01-01

    Objectives: The European Organisation for Research and Treatment of Cancer (EORTC) Quality of Life Group is developing a computer-adaptive test (CAT) version of the EORTC Quality of Life Questionnaire (QLQ-C30). We evaluated the measurement properties of the CAT versions of physical functioning (PF)

  1. The EORTC computer-adaptive tests measuring physical functioning and fatigue exhibited high levels of measurement precision and efficiency

    NARCIS (Netherlands)

    Petersen, M.A.; Aaronson, N.K.; Arraras, J.I.; Chie, W.C.; Conroy, T.; Costantini, A.; Giesinger, J.M.; Holzner, B.; King, M.T.; Singer, S.; Velikova, G.; de Leeuw, I.M.; Young, T.; Groenvold, M.

    2013-01-01

    Objectives: The European Organisation for Research and Treatment of Cancer (EORTC) Quality of Life Group is developing a computer-adaptive test (CAT) version of the EORTC Quality of Life Questionnaire (QLQ-C30). We evaluated the measurement properties of the CAT versions of physical functioning (PF)

  2. Cat Swarm Optimization Based Functional Link Artificial Neural Network Filter for Gaussian Noise Removal from Computed Tomography Images

    Directory of Open Access Journals (Sweden)

    M. Kumar

    2016-01-01

    Full Text Available Gaussian noise is one of the dominant noises, which degrades the quality of acquired Computed Tomography (CT) image data. It creates difficulties in pathological identification or diagnosis of any disease. Gaussian noise elimination is desirable to improve the clarity of a CT image for clinical, diagnostic, and postprocessing applications. This paper proposes an evolutionary nonlinear adaptive filter approach, using a Cat Swarm Functional Link Artificial Neural Network (CS-FLANN), to remove the unwanted noise. The structure of the proposed filter is based on the Functional Link Artificial Neural Network (FLANN), and Cat Swarm Optimization (CSO) is utilized for the selection of the optimum weights of the neural network filter. The applied filter has been compared with existing linear filters, like the mean filter and the adaptive Wiener filter. Performance indices, such as the peak signal to noise ratio (PSNR), have been computed for the quantitative analysis of the proposed filter. The experimental evaluation established the superiority of the proposed filtering technique over existing methods.
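
    A stripped-down FLANN sketch: each noisy sample is expanded through a small trigonometric basis and a linear output layer maps it toward the clean value. The paper selects the weights with Cat Swarm Optimization on 2-D images; here plain least squares on a synthetic 1-D signal stands in, just to show the functional-link structure. All signal parameters are arbitrary.

    ```python
    import math
    import random

    def flann_features(x):
        """Trigonometric functional-link expansion of a scalar input."""
        return [1.0, x, math.sin(math.pi * x), math.cos(math.pi * x)]

    def solve(A, b):
        """Solve A w = b by Gaussian elimination with partial pivoting."""
        n = len(A)
        M = [row[:] + [b[i]] for i, row in enumerate(A)]
        for col in range(n):
            piv = max(range(col, n), key=lambda r: abs(M[r][col]))
            M[col], M[piv] = M[piv], M[col]
            for r in range(col + 1, n):
                f = M[r][col] / M[col][col]
                for c in range(col, n + 1):
                    M[r][c] -= f * M[col][c]
        w = [0.0] * n
        for i in range(n - 1, -1, -1):
            w[i] = (M[i][n] - sum(M[i][j] * w[j] for j in range(i + 1, n))) / M[i][i]
        return w

    def train_flann(noisy, clean):
        """Least-squares choice of FLANN output weights (the paper uses
        Cat Swarm Optimization for this step; least squares stands in)."""
        n = 4
        A = [[0.0] * n for _ in range(n)]
        b = [0.0] * n
        for x, t in zip(noisy, clean):
            phi = flann_features(x)
            for i in range(n):
                b[i] += phi[i] * t
                for j in range(n):
                    A[i][j] += phi[i] * phi[j]
        return solve(A, b)

    # Synthetic 1-D test signal with additive Gaussian noise:
    random.seed(3)
    clean = [math.sin(2 * math.pi * i / 50) for i in range(200)]
    noisy = [c + random.gauss(0, 0.3) for c in clean]

    w = train_flann(noisy, clean)
    denoised = [sum(wi * p for wi, p in zip(w, flann_features(x))) for x in noisy]
    mse_in = sum((x - c) ** 2 for x, c in zip(noisy, clean)) / len(clean)
    mse_out = sum((d - c) ** 2 for d, c in zip(denoised, clean)) / len(clean)
    ```
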

  3. Accelerating Computation of DCM for ERP in MATLAB by External Function Calls to the GPU

    Science.gov (United States)

    Wang, Wei-Jen; Hsieh, I-Fan; Chen, Chun-Chuan

    2013-01-01

    This study aims to improve the performance of Dynamic Causal Modelling for Event Related Potentials (DCM for ERP) in MATLAB by using external function calls to a graphics processing unit (GPU). DCM for ERP is an advanced method for studying neuronal effective connectivity. DCM utilizes an iterative procedure, the expectation maximization (EM) algorithm, to find the optimal parameters given a set of observations and the underlying probability model. As the EM algorithm is computationally demanding and the analysis faces possible combinatorial explosion of models to be tested, we propose a parallel computing scheme using the GPU to achieve a fast estimation of DCM for ERP. The computation of DCM for ERP is dynamically partitioned and distributed to threads for parallel processing, according to the DCM model complexity and the hardware constraints. The performance efficiency of this hardware-dependent thread arrangement strategy was evaluated using the synthetic data. The experimental data were used to validate the accuracy of the proposed computing scheme and quantify the time saving in practice. The simulation results show that the proposed scheme can accelerate the computation by a factor of 155 for the parallel part. For experimental data, the speedup factor is about 7 per model on average, depending on the model complexity and the data. This GPU-based implementation of DCM for ERP gives qualitatively the same results as the original MATLAB implementation does at the group level analysis. In conclusion, we believe that the proposed GPU-based implementation is very useful for users as a fast screening tool to select the most likely model and may provide implementation guidance for possible future clinical applications such as online diagnosis. PMID:23840507
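
    The two reported numbers (155x on the parallelized part, roughly 7x overall) are mutually consistent under Amdahl's law if about 86% of the serial runtime lies in the accelerated part; that fraction is inferred here for illustration, not stated in the abstract.

    ```python
    def amdahl(parallel_fraction, parallel_speedup):
        """Overall speedup when only a fraction of the runtime is accelerated
        (Amdahl's law)."""
        return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / parallel_speedup)

    # If ~86% of the runtime is in the GPU-accelerated part and that part
    # runs 155x faster, the overall speedup comes out near 7x.
    overall = amdahl(0.86, 155.0)
    ```

    The same formula explains why per-model speedups vary with model complexity: more complex models spend a larger fraction of their time in the accelerated EM iterations.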

  4. Computer-aided proofs for multiparty computation with active security

    DEFF Research Database (Denmark)

    Haagh, Helene; Karbyshev, Aleksandr; Oechsner, Sabine

    2018-01-01

    Secure multi-party computation (MPC) is a general cryptographic technique that allows distrusting parties to compute a function of their individual inputs, while only revealing the output of the function. It has found applications in areas such as auctioning, email filtering, and secure teleconference. Given its importance, it is crucial that the protocols are specified and implemented correctly. In the programming language community it has become good practice to use computer proof assistants to verify correctness proofs. In the field of cryptography, EasyCrypt is the state of the art proof assistant... public-key encryption, signatures, garbled circuits and differential privacy. Here we show for the first time that it can also be used to prove security of MPC against a malicious adversary. We formalize additive and replicated secret sharing schemes and apply them to Maurer's MPC protocol for secure...

  5. Computing for calculus

    CERN Document Server

    Christensen, Mark J

    1981-01-01

    Computing for Calculus focuses on BASIC as the computer language used for solving calculus problems. This book discusses the input statement for numeric variables, advanced intrinsic functions, numerical estimation of limits, and linear approximations and tangents. The elementary estimation of areas, numerical and string arrays, line drawing algorithms, and the bisection and secant methods are also elaborated. This text likewise covers implicit functions and differentiation, upper and lower rectangular estimates, Simpson's rule and parabolic approximation, and interpolating polynomials. Other to
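
    Two of the numerical methods the book covers, bisection for root finding and Simpson's rule for integration, in compact form (Python here rather than the book's BASIC):

    ```python
    def bisect(f, a, b, tol=1e-10):
        """Bisection: f(a) and f(b) must have opposite signs; halves the
        bracketing interval until it is shorter than tol."""
        while b - a > tol:
            m = (a + b) / 2
            if f(a) * f(m) <= 0:
                b = m
            else:
                a = m
        return (a + b) / 2

    def simpson(f, a, b, n=100):
        """Composite Simpson's rule on n subintervals (n must be even)."""
        h = (b - a) / n
        s = f(a) + f(b)
        s += 4 * sum(f(a + i * h) for i in range(1, n, 2))
        s += 2 * sum(f(a + i * h) for i in range(2, n, 2))
        return s * h / 3

    root = bisect(lambda x: x * x - 2, 0, 2)   # approximates sqrt(2)
    area = simpson(lambda x: x * x, 0, 1)      # integral of x^2 on [0,1] = 1/3
    ```
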

  6. Computer naratology: narrative templates in computer games

    OpenAIRE

    Praks, Vítězslav

    2009-01-01

    Relations and interactions between literature and computer games are examined. The study contains a theoretical analysis of the game as an aesthetic artefact. To play a game means to leave the practical world for the sake of a fictional world. Artistic communication has more similarities with game communication than with normal, practical communication. Game studies can help us understand basic concepts of artistic communication (game rules - poetic rules, game world - fiction, function in game - meaning in art). Compute...

  7. Replacement of the JRR-3 computer system

    Energy Technology Data Exchange (ETDEWEB)

    Kato, Tomoaki; Kobayashi, Kenichi; Suwa, Masayuki; Mineshima, Hiromi; Sato, Mitsugu [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    2000-10-01

    The JRR-3 computer system has contributed to the stable operation of JRR-3 since 1990. However, about 10 years have passed since it was designed, and some problems have occurred. Under these circumstances, the old computer system should be replaced to ensure safe and stable operation. In this replacement, the system is improved with regard to the man-machine interface and maintenance efficiency. The new system consists of three functions: 'the function of management for operation information' (renewal function), 'the function of management for facility information' (new function) and 'the function of management for information publication' (new function). Through this replacement, the new JRR-3 computer system can contribute to safe and stable operation. (author)

  8. Replacement of the JRR-3 computer system

    International Nuclear Information System (INIS)

    Kato, Tomoaki; Kobayashi, Kenichi; Suwa, Masayuki; Mineshima, Hiromi; Sato, Mitsugu

    2000-01-01

    The JRR-3 computer system has contributed to the stable operation of JRR-3 since 1990. However, about 10 years have passed since it was designed, and some problems have occurred. Under these circumstances, the old computer system should be replaced to ensure safe and stable operation. In this replacement, the system is improved with regard to the man-machine interface and maintenance efficiency. The new system consists of three functions: 'the function of management for operation information' (renewal function), 'the function of management for facility information' (new function) and 'the function of management for information publication' (new function). Through this replacement, the new JRR-3 computer system can contribute to safe and stable operation. (author)

  9. Studies on the zeros of Bessel functions and methods for their computation: 2. Monotonicity, convexity, concavity, and other properties

    Science.gov (United States)

    Kerimov, M. K.

    2016-07-01

    This work continues the study of real zeros of first- and second-kind Bessel functions and general Bessel functions with real variables and orders begun in the first part of this paper (see M.K. Kerimov, Comput. Math. Math. Phys. 54 (9), 1337-1388 (2014)). Some new results concerning such zeros are described and analyzed. Special attention is given to the monotonicity, convexity, and concavity of zeros with respect to their ranks and other parameters.
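
    The first few positive zeros of J0 can be computed directly, for instance by scanning the power series for sign changes and refining each bracket by bisection. The growing gap between consecutive zeros, approaching π from below, is one of the monotonicity properties studied in work of this kind. A self-contained sketch (the series evaluation is adequate only for moderate arguments):

    ```python
    import math

    def j0(x):
        """Bessel J0 via its power series (adequate for moderate x)."""
        total, term = 1.0, 1.0
        for k in range(1, 60):
            term *= -(x * x) / (4.0 * k * k)
            total += term
        return total

    def bracket_zeros(f, a, b, step=0.1):
        """Scan [a, b] for sign changes and refine each by bisection."""
        zeros = []
        x = a
        while x < b:
            if f(x) * f(x + step) < 0:
                lo, hi = x, x + step
                for _ in range(60):
                    m = (lo + hi) / 2
                    if f(lo) * f(m) <= 0:
                        hi = m
                    else:
                        lo = m
                zeros.append((lo + hi) / 2)
            x += step
        return zeros

    # First zeros of J0: approximately 2.4048, 5.5201, 8.6537
    zeros = bracket_zeros(j0, 0.0, 9.0)
    ```
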

  10. A brain-computer interface to support functional recovery.

    Science.gov (United States)

    Kjaer, Troels W; Sørensen, Helge B

    2013-01-01

    Brain-computer interfaces (BCI) register changes in brain activity and utilize this to control computers. The most widely used method is based on registration of electrical signals from the cerebral cortex using extracranially placed electrodes also called electroencephalography (EEG). The features extracted from the EEG may, besides controlling the computer, also be fed back to the patient for instance as visual input. This facilitates a learning process. BCI allow us to utilize brain activity in the rehabilitation of patients after stroke. The activity of the cerebral cortex varies with the type of movement we imagine, and by letting the patient know the type of brain activity best associated with the intended movement the rehabilitation process may be faster and more efficient. The focus of BCI utilization in medicine has changed in recent years. While we previously focused on devices facilitating communication in the rather few patients with locked-in syndrome, much interest is now devoted to the therapeutic use of BCI in rehabilitation. For this latter group of patients, the device is not intended to be a lifelong assistive companion but rather a 'teacher' during the rehabilitation period. Copyright © 2013 S. Karger AG, Basel.

  11. Computer versus Compensatory Calendar Training in Individuals with Mild Cognitive Impairment: Functional Impact in a Pilot Study.

    Science.gov (United States)

    Chandler, Melanie J; Locke, Dona E C; Duncan, Noah L; Hanna, Sherrie M; Cuc, Andrea V; Fields, Julie A; Hoffman Snyder, Charlene R; Lunde, Angela M; Smith, Glenn E

    2017-09-06

    This pilot study examined the functional impact of computerized versus compensatory calendar training in cognitive rehabilitation participants with mild cognitive impairment (MCI). Fifty-seven participants with amnestic MCI completed randomly assigned calendar or computer training. A standard care control group was used for comparison. Measures of adherence, memory-based activities of daily living (mADLs), and self-efficacy were completed. The calendar training group demonstrated significant improvement in mADLs compared to controls, while the computer training group did not. Calendar training may be more effective in improving mADLs than computerized intervention. However, this study highlights how behavioral trials with fewer than 30-50 participants per arm are likely underpowered, resulting in seemingly null findings.
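
    The closing remark about underpowered arms can be made concrete with a standard normal-approximation power calculation: with roughly 28 participants per arm, a medium effect (Cohen's d = 0.5) is detected less than half the time, while about 64 per arm is needed for the conventional 80% power. This is a generic two-sample calculation, not a reanalysis of the study's data.

    ```python
    import math
    from statistics import NormalDist

    def power_two_sample(n_per_arm, effect_size, alpha=0.05):
        """Approximate power of a two-sample comparison of means
        (normal approximation to the two-sided t-test)."""
        z_crit = NormalDist().inv_cdf(1 - alpha / 2)
        return NormalDist().cdf(effect_size * math.sqrt(n_per_arm / 2.0) - z_crit)

    p_small = power_two_sample(28, 0.5)   # ~28 per arm: well under 0.80
    p_large = power_two_sample(64, 0.5)   # ~64 per arm: roughly 0.80
    ```
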

  12. Quantum Chemistry on Quantum Computers: A Polynomial-Time Quantum Algorithm for Constructing the Wave Functions of Open-Shell Molecules.

    Science.gov (United States)

    Sugisaki, Kenji; Yamamoto, Satoru; Nakazawa, Shigeaki; Toyota, Kazuo; Sato, Kazunobu; Shiomi, Daisuke; Takui, Takeji

    2016-08-18

    Quantum computers are capable of efficiently performing full configuration interaction (FCI) calculations of atoms and molecules by using the quantum phase estimation (QPE) algorithm. Because the success probability of the QPE depends on the overlap between approximate and exact wave functions, efficient methods to prepare initial guess wave functions accurate enough to have sufficiently large overlap with the exact ones are highly desired. Here, we propose a quantum algorithm to construct the wave function consisting of one configuration state function, which is suitable as the initial guess wave function in QPE-based FCI calculations of open-shell molecules, based on the addition theorem of angular momentum. The proposed quantum algorithm enables us to prepare a wave function consisting of an exponential number of Slater determinants using only a polynomial number of quantum operations.

  13. Proceedings of the 1993 Particle Accelerator Conference Held in Washington, DC on May 17-20, 1993. Volume 3

    Science.gov (United States)

    1993-05-20

    Figure 4. Transport line power supply control interface. ... Figure 8. Beam response of the FCT; the upper trace is the response of the FCT located upstream of the transport line.

  14. Comparison of measured and computed phase functions of individual tropospheric ice crystals

    International Nuclear Information System (INIS)

    Stegmann, Patrick G.; Tropea, Cameron; Järvinen, Emma; Schnaiter, Martin

    2016-01-01

    Airplanes passing the incuda (lat. anvils) regions of tropical cumulonimbi-clouds are at risk of suffering an engine power-loss event and engine damage due to ice ingestion (Mason et al., 2006 [1]). Research in this field relies on optical measurement methods to characterize ice crystals; however, the design and implementation of such methods presently suffer from the lack of reliable and efficient means of predicting the light scattering from ice crystals. The nascent discipline of direct measurement of phase functions of ice crystals, in conjunction with particle imaging and forward modelling through geometrical optics derivative- and Transition matrix-codes, for the first time allows us to obtain a deeper understanding of the optical properties of real tropospheric ice crystals. In this manuscript, a sample phase function obtained via the Particle Habit Imaging and Polar Scattering (PHIPS) probe during a measurement campaign in flight over Brazil is compared to three different light scattering codes. This includes a newly developed first order geometrical optics code taking into account the influence of the Gaussian beam illumination used in the PHIPS device, as well as the reference ray tracing code of Macke and the T-matrix code of Kahnert. - Highlights: • A GO code for shaped beams and non-spherical particles has been developed. • The code has been validated against exact Mie results. • Measured and computed phase functions for a single ice crystal have been compared. • The comparison highlights differences in the backscattering region.

  15. Specific features of vocal fold paralysis in functional computed tomography

    International Nuclear Information System (INIS)

    Laskowska, K.; Mackiewicz-Nartowicz, H.; Serafin, Z.; Nawrocka, E.

    2008-01-01

    Vocal fold paralysis is usually recognized in laryngological examination, and detailed vocal fold function may be established based on laryngovideostroboscopy. Additional imaging should exclude any morphological causes of the paresis, which should be treated pharmacologically or surgically. The aim of this paper was to analyze the computed tomography (CT) images of the larynx in patients with unilateral vocal fold paralysis. CT examinations of the larynx were performed in 10 patients with clinically defined unilateral vocal fold paralysis. The examinations consisted of an unenhanced acquisition and an enhanced 3-phase acquisition: during free breathing, the Valsalva maneuver, and phonation. The analysis included the following morphologic features of the paresis: the deepened epiglottic vallecula, the deepened piriform recess, the thickened and medially positioned aryepiglottic fold, the widened laryngeal pouch, the anteriorly positioned arytenoid cartilage, the thickened vocal fold, and the filled infraglottic space in frontal CT reconstruction. CT images were compared to laryngovideostroboscopy. The most common symptoms of vocal cord paralysis in CT were the deepened epiglottic vallecula and piriform recess, the widened laryngeal pouch with the filled infraglottic space, and the thickened aryepiglottic fold. Regarding the efficiency of paralysis determination, the three functional techniques of CT larynx imaging used did not differ significantly, and laryngovideostroboscopy demonstrated its advantage over CT. CT of the larynx is a supplementary examination in the diagnosis of vocal fold paralysis, which may enable topographic analysis of the fold dysfunction. The knowledge of morphological CT features of the paralysis may help to prevent false-positive diagnosis of laryngeal cancer. (author)

  16. Approximate Bayesian computation.

    Directory of Open Access Journals (Sweden)

    Mikael Sunnåker

    Full Text Available Approximate Bayesian computation (ABC) constitutes a class of computational methods rooted in Bayesian statistics. In all model-based statistical inference, the likelihood function is of central importance, since it expresses the probability of the observed data under a particular statistical model, and thus quantifies the support data lend to particular values of parameters and to choices among different models. For simple models, an analytical formula for the likelihood function can typically be derived. However, for more complex models, an analytical formula might be elusive or the likelihood function might be computationally very costly to evaluate. ABC methods bypass the evaluation of the likelihood function. In this way, ABC methods widen the realm of models for which statistical inference can be considered. ABC methods are mathematically well-founded, but they inevitably make assumptions and approximations whose impact needs to be carefully assessed. Furthermore, the wider application domain of ABC exacerbates the challenges of parameter estimation and model selection. ABC has rapidly gained popularity over the last years, in particular for the analysis of complex problems arising in the biological sciences (e.g., in population genetics, ecology, epidemiology, and systems biology).
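
    A minimal ABC rejection sampler illustrating the likelihood-free idea described above: draw parameters from the prior, simulate data, and keep only draws whose summary statistic lands within a tolerance of the observed summary. The likelihood is never evaluated. Model and numbers below are a toy Gaussian example.

    ```python
    import random

    def abc_rejection(observed, prior_sample, simulate, eps, n_draws):
        """ABC rejection: keep parameter draws whose simulated summary
        statistic falls within eps of the observed summary."""
        accepted = []
        for _ in range(n_draws):
            theta = prior_sample()
            if abs(simulate(theta) - observed) < eps:
                accepted.append(theta)
        return accepted

    # Toy problem: infer the mean mu of a unit-variance Gaussian.
    random.seed(7)
    n_obs = 50
    data_mean = sum(random.gauss(1.0, 1.0) for _ in range(n_obs)) / n_obs

    prior = lambda: random.uniform(-5.0, 5.0)   # flat prior on mu
    simulate = lambda mu: sum(random.gauss(mu, 1.0) for _ in range(n_obs)) / n_obs

    post = abc_rejection(data_mean, prior, simulate, eps=0.1, n_draws=5000)
    post_mean = sum(post) / len(post)   # approximates the posterior mean
    ```

    Shrinking `eps` makes the accepted sample a better approximation of the true posterior at the cost of a lower acceptance rate, the basic trade-off the abstract alludes to when it warns that ABC's approximations need careful assessment.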

  17. Programming in biomolecular computation

    DEFF Research Database (Denmark)

    Hartmann, Lars Røeboe; Jones, Neil; Simonsen, Jakob Grue

    2010-01-01

    Its functioning is defined by a single and relatively small set of chemical-like reaction rules, making it biologically plausible, while programs are reminiscent of low-level computer machine code. Further properties: the model is stored-program: programs are the same as data, so programs are not only executable, but are also compilable and interpretable. It is universal: all computable functions can be computed (in natural ways and without arcane encodings of data and algorithm); it is also uniform: new “hardware” is not needed to solve new problems; and (last but not least) it is Turing complete in a strong sense: a universal algorithm exists, that is able to execute any program, and is not asymptotically inefficient. A prototype model has been implemented (for now in silico on a conventional computer). This work opens new perspectives on just how computation may be specified at the biological level.

  18. Functional safeguards for computers for protection systems for Savannah River reactors

    International Nuclear Information System (INIS)

    Kritz, W.R.

    1977-06-01

    Reactors at the Savannah River Plant have recently been equipped with a "safety computer" system. This system utilizes dual digital computers in a primary protection system that monitors individual fuel assembly coolant flow and temperature. The design basis for the SRP safety computer systems allowed for eventual failure of any input sensor or any computer component. These systems are routinely used by reactor operators with a minimum of training in computer technology. The hardware configuration and software design therefore contain safeguards so that neither hardware nor human failures cause significant loss of reactor protection. The performance of the system to date is described.

  19. Development of utility generic functional requirements for electronic work packages and computer-based procedures

    Energy Technology Data Exchange (ETDEWEB)

    Oxstrand, Johanna [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2017-06-01

    The Nuclear Electronic Work Packages - Enterprise Requirements (NEWPER) initiative is a step toward a vision of implementing an eWP framework that includes many types of eWPs. This will enable immediate paper-related cost savings in work management and provide a path to future labor efficiency gains through enhanced integration and process improvement in support of the Nuclear Promise (Nuclear Energy Institute 2016). The NEWPER initiative was organized by the Nuclear Information Technology Strategic Leadership (NITSL) group, an organization that brings together leaders from the nuclear utility industry and regulatory agencies to address issues involved with information technology used in nuclear-power utilities. NITSL strives to maintain awareness of industry information technology-related initiatives and events and communicates those events to its membership. NITSL and LWRS Program researchers have been coordinating activities, including joint organization of NEWPER-related meetings and report development. The main goal of the NEWPER initiative was to develop a set of utility generic functional requirements for eWP systems. This set of requirements will support each utility in identifying plant-specific functional and non-functional requirements. The NEWPER initiative has 140 members, the largest group consisting of 19 commercial U.S. nuclear utilities and 11 of the most prominent vendors of eWP solutions. Through the NEWPER initiative, two sets of functional requirements were developed: functional requirements for electronic work packages and functional requirements for computer-based procedures. This paper describes the development process as well as a summary of the requirements.

  20. Generalised Computability and Applications to Hybrid Systems

    DEFF Research Database (Denmark)

    Korovina, Margarita V.; Kudinov, Oleg V.

    2001-01-01

    We investigate the concept of generalised computability of operators and functionals defined on the set of continuous functions, firstly introduced in [9]. By working in the reals, with equality and without equality, we study properties of generalised computable operators and functionals. We also propose an interesting application to the formalisation of hybrid systems. We obtain a class of hybrid systems whose trajectories are computable in the sense of computable analysis. This research was supported in part by the RFBR (grants N 99-01-00485, N 00-01-00810) and by the Siberian Branch of RAS (a grant for young researchers, 2000).

  1. Technical Report: Toward a Scalable Algorithm to Compute High-Dimensional Integrals of Arbitrary Functions

    International Nuclear Information System (INIS)

    Snyder, Abigail C.; Jiao, Yu

    2010-01-01

    Neutron experiments at the Spallation Neutron Source (SNS) at Oak Ridge National Laboratory (ORNL) frequently generate large amounts of data (on the order of 10^6 to 10^12 data points). Hence, traditional data analysis tools run on a single CPU take too long to be practical and scientists are unable to efficiently analyze all data generated by experiments. Our goal is to develop a scalable algorithm to efficiently compute high-dimensional integrals of arbitrary functions. This algorithm can then be used to evaluate the four-dimensional integrals that arise as part of modeling intensity from the experiments at the SNS. Here, three different one-dimensional numerical integration solvers from the GNU Scientific Library were modified and implemented to solve four-dimensional integrals. The results of these solvers on a final integrand provided by scientists at the SNS can be compared to the results of other methods, such as quasi-Monte Carlo methods, computing the same integral. A parallelized version of the most efficient method can allow scientists the opportunity to more effectively analyze all experimental data.
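The idea of building a multi-dimensional integral out of one-dimensional solvers can be illustrated with a minimal sketch (this is not the GNU Scientific Library code used in the report): a composite Simpson rule applied recursively, one level per dimension. The function names and the test integrand are invented for illustration; the cost growing as (points per dimension)^4 is exactly what motivates the scalable, parallelized algorithm.

```python
def simpson(f, a, b, n=8):
    """Composite Simpson's rule on [a, b] with an even number n of subintervals."""
    h = (b - a) / n
    total = f(a) + f(b)
    for i in range(1, n):
        total += f(a + i * h) * (4 if i % 2 else 2)
    return total * h / 3

def nested_integral(f, bounds):
    """Integrate f over a box by nesting 1-D rules, one level per dimension."""
    if not bounds:
        return f()                      # all variables bound: evaluate the integrand
    (a, b), rest = bounds[0], bounds[1:]
    return simpson(lambda x: nested_integral(lambda *xs: f(x, *xs), rest), a, b)

# 4-D check: the integral of x*y*z*w over the unit hypercube is (1/2)^4 = 0.0625.
val = nested_integral(lambda x, y, z, w: x * y * z * w, [(0.0, 1.0)] * 4)
```

Quasi-Monte Carlo methods, mentioned in the report as a comparison, avoid this exponential growth in the number of integrand evaluations.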

  2. Resummed coefficient function for the shape function

    OpenAIRE

    Aglietti, U.

    2001-01-01

    We present a leading evaluation of the resummed coefficient function for the shape function. It is also shown that the coefficient function is short-distance-dominated. Our results allow relating the shape function computed on the lattice to the physical QCD distributions.

  3. Computer software.

    Science.gov (United States)

    Rosenthal, L E

    1986-10-01

    Software is the component in a computer system that permits the hardware to perform the various functions that a computer system is capable of doing. The history of software and its development can be traced to the early nineteenth century. All computer systems are designed to utilize the "stored program concept" as first developed by Charles Babbage in the 1850s. The concept was lost until the mid-1940s, when modern computers made their appearance. Today, because of the complex and myriad tasks that a computer system can perform, there has been a differentiation of types of software. There is software designed to perform specific business applications. There is software that controls the overall operation of a computer system. And there is software that is designed to carry out specialized tasks. Regardless of type, software is the most critical component of any computer system. Without it, all one has is a collection of circuits, transistors, and silicon chips.

  4. Comparison of mineral weathering and biomass nutrient uptake in two small forested watersheds underlain by quartzite bedrock, Catoctin Mountain, Maryland, USA

    Science.gov (United States)

    Rice, Karen; Price, Jason R.

    2014-01-01

    To quantify chemical weathering and biological uptake, mass-balance calculations were performed on two small forested watersheds located in the Blue Ridge Physiographic Province in north-central Maryland, USA. Both watersheds, Bear Branch (BB) and Fishing Creek Tributary (FCT), are underlain by relatively unreactive quartzite bedrock. Such unreactive bedrock and associated low chemical-weathering rates offer the opportunity to quantify biological processes operating within the watershed. Hydrologic and stream-water chemistry data were collected from the two watersheds for the 9-year period from June 1, 1990 to May 31, 1999. Of the two watersheds, FCT exhibited both higher chemical-weathering rates and biomass nutrient uptake rates, suggesting that forest biomass aggradation was limited by the rate of chemical weathering of the bedrock. Although the chemical-weathering rate in the FCT watershed was low relative to the global average, it masked the influence of biomass base-cation uptake on stream-water chemistry. Any differences in bedrock mineralogy between the two watersheds did not exert a significant influence on the overall weathering stoichiometry. The difference in chemical-weathering rates between the two watersheds is best explained by a larger proportion of reactive phyllitic layers within the bedrock of the FCT watershed. Although the stream gradient of BB is about two-times greater than that of FCT, its influence on chemical weathering appears to be negligible. The findings of this study support the biomass nutrient uptake stoichiometry of K1.0Mg1.1Ca0.97 previously determined for the study site. Investigations of the chemical weathering of relatively unreactive quartzite bedrock may provide insight into critical zone processes.

  5. Computation of the modified Bessel function of the third kind of imaginary orders: uniform Airy-type asymptotic expansion

    NARCIS (Netherlands)

    A. Gil (Amparo); J. Segura (Javier); N.M. Temme (Nico)

    2002-01-01

    The use of a uniform Airy-type asymptotic expansion for the computation of the modified Bessel functions of the third kind of imaginary orders ($K_{ia}(x)$) near the transition point $x=a$ is discussed. In [2], an algorithm for the evaluation of $K_{ia}(x)$ was presented, which made use

  6. Computational Psychiatry

    Science.gov (United States)

    Wang, Xiao-Jing; Krystal, John H.

    2014-01-01

    Psychiatric disorders such as autism and schizophrenia arise from abnormalities in brain systems that underlie cognitive, emotional and social functions. The brain is enormously complex and its abundant feedback loops on multiple scales preclude intuitive explication of circuit functions. In close interplay with experiments, theory and computational modeling are essential for understanding how, precisely, neural circuits generate flexible behaviors and how their impairments give rise to psychiatric symptoms. This Perspective highlights recent progress in applying computational neuroscience to the study of mental disorders. We outline basic approaches, including identification of core deficits that cut across disease categories, biologically realistic modeling bridging cellular and synaptic mechanisms with behavior, and model-aided diagnosis. The need for new research strategies in psychiatry is urgent. Computational psychiatry potentially provides powerful tools for elucidating pathophysiology that may inform both diagnosis and treatment. To achieve this promise will require investment in cross-disciplinary training and research in this nascent field. PMID:25442941

  7. First results with twisted mass fermions towards the computation of parton distribution functions on the lattice

    International Nuclear Information System (INIS)

    Alexandrou, Constantia; Cyprus Institute, Nicosia; Deutsches Elektronen-Synchrotron; Cichy, Krzysztof; Poznan Univ.; Drach, Vincent; Garcia-Ramos, Elena; Humboldt-Universitaet, Berlin; Hadjiyiannakou, Kyriakos; Jansen, Karl; Steffens, Fernanda; Wiese, Christian

    2014-11-01

    We report on our exploratory study for the evaluation of the parton distribution functions from lattice QCD, based on a new method proposed in Ref. arXiv:1305.1539. Using the example of the nucleon, we compare two different methods to compute the matrix elements needed, and investigate the application of gauge link smearing. We also present first results from a large production ensemble and discuss the future challenges related to this method.

  8. Fuel Cycle Technologies 2014 Achievement Report

    Energy Technology Data Exchange (ETDEWEB)

    Hong, Bonnie C. [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-01-01

    The Fuel Cycle Technologies (FCT) program supports the Department of Energy’s (DOE’s) mission to: “Enhance U.S. security and economic growth through transformative science, technology innovation, and market solutions to meet our energy, nuclear security, and environmental challenges.” Goal 1 of DOE’s Strategic Plan is to innovate energy technologies that enhance U.S. economic growth and job creation, energy security, and environmental quality. FCT does this by investing in advanced technologies that could transform the nuclear fuel cycle in the decades to come. Goal 2 of DOE’s Strategic Plan is to strengthen national security by strengthening key science, technology, and engineering capabilities. FCT does this by working closely with the National Nuclear Security Administration and the U.S. Department of State to develop advanced technologies that support the Nation’s nuclear nonproliferation goals.

  9. Computer program for Bessel and Hankel functions

    Science.gov (United States)

    Kreider, Kevin L.; Saule, Arthur V.; Rice, Edward J.; Clark, Bruce J.

    1991-01-01

    A set of FORTRAN subroutines for calculating Bessel and Hankel functions is presented. The routines calculate Bessel and Hankel functions of the first and second kinds, as well as their derivatives, for wide ranges of integer order and real or complex argument in single or double precision. Depending on the order and argument, one of three evaluation methods is used: the power series definition, an Airy function expansion, or an asymptotic expansion. Routines to calculate Airy functions and their derivatives are also included.
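Of the three evaluation methods mentioned, the power-series definition is the simplest to sketch. The snippet below is an illustrative Python rendering of the series for J0 only (the report's routines are FORTRAN and handle general order, complex argument, and the Airy and asymptotic regimes); the function name and the term count are arbitrary choices.

```python
from math import factorial

def bessel_j0_series(x, terms=30):
    """Power-series definition of J0(x) = sum_k (-1)^k (x/2)^(2k) / (k!)^2.
    Adequate for moderate |x|; for large arguments production codes switch
    to Airy-type or asymptotic expansions, as the report describes."""
    return sum((-1) ** k * (x / 2) ** (2 * k) / factorial(k) ** 2
               for k in range(terms))
```

For example, the series reproduces J0(0) = 1 and vanishes at the first zero of J0 near x = 2.4048, but it loses accuracy as |x| grows, which is why method selection by order and argument matters.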

  10. Discrete computational structures

    CERN Document Server

    Korfhage, Robert R

    1974-01-01

    Discrete Computational Structures describes discrete mathematical concepts that are important to computing, covering necessary mathematical fundamentals, computer representation of sets, graph theory, storage minimization, and bandwidth. The book also explains conceptual frameworks (Gorn trees, searching, subroutines) and directed graphs (flowcharts, critical paths, information networks). The text discusses algebra, particularly as it applies to computing, and concentrates on semigroups, groups, lattices, and propositional calculus, including a new tabular method of Boolean function minimization. The text emphasizes ...

  11. Computer-assisted modeling: Contributions of computational approaches to elucidating macromolecular structure and function: Final report

    International Nuclear Information System (INIS)

    Walton, S.

    1987-01-01

    The Committee, asked to provide an assessment of computer-assisted modeling of molecular structure, has highlighted the signal successes and the significant limitations for a broad panoply of technologies and has projected plausible paths of development over the next decade. As with any assessment of such scope, differing opinions about present or future prospects were expressed. The conclusions and recommendations, however, represent a consensus of our views of the present status of computational efforts in this field

  12. Distributed computing system with dual independent communications paths between computers and employing split tokens

    Science.gov (United States)

    Rasmussen, Robert D. (Inventor); Manning, Robert M. (Inventor); Lewis, Blair F. (Inventor); Bolotin, Gary S. (Inventor); Ward, Richard S. (Inventor)

    1990-01-01

    This is a distributed computing system providing flexible fault tolerance; ease of software design and concurrency specification; and dynamic balance of the loads. The system comprises a plurality of computers, each having a first input/output interface and a second input/output interface for interfacing to communications networks; each second input/output interface includes a bypass for bypassing the associated computer. A global communications network interconnects the first input/output interfaces, providing each computer the ability to broadcast messages simultaneously to the remainder of the computers. A meshwork communications network interconnects the second input/output interfaces, providing each computer with the ability to establish a communications link with another of the computers, bypassing the remainder of the computers. Each computer is controlled by a resident copy of a common operating system. Communication between respective computers is by means of split tokens, each having a moving first portion which is sent from computer to computer and a resident second portion which is disposed in the memory of at least one of the computers; the location of the second portion is part of the first portion. The split tokens represent both functions to be executed by the computers and data to be employed in the execution of the functions. The first input/output interfaces each include logic for detecting a collision between messages and for terminating the broadcasting of a message, whereby collisions between messages are detected and avoided.
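The split-token idea, a moving half that circulates between computers and carries a pointer to a resident half holding the data, can be sketched as a data structure. This is a loose illustration of the patent's concept only; all names, fields, and the in-memory store are invented for the sketch.

```python
from dataclasses import dataclass

@dataclass
class MovingToken:
    """The circulating half of a split token, sent from computer to computer."""
    function_id: str    # the function the token asks the receiving computer to run
    resident_host: int  # which computer's memory holds the resident half
    resident_key: str   # lookup key for the resident half on that host

# Resident halves stay put in some computer's memory, indexed by key.
resident_store = {0: {"matrix-A": [[1, 2], [3, 4]]}}

def resolve(token):
    """Follow the moving half's pointer to fetch the resident data."""
    return resident_store[token.resident_host][token.resident_key]

token = MovingToken("invert", resident_host=0, resident_key="matrix-A")
data = resolve(token)
```

Only the small moving half travels over the network; the bulky data stays resident, which is what lets the system balance load without shipping data on every hop.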

  13. RXY/DRXY-a postprocessing graphical system for scientific computation

    International Nuclear Information System (INIS)

    Jin Qijie

    1990-01-01

    Scientific computation requires computer graphical functions for visualization of its results. The design objectives and functions of a postprocessing graphical system for scientific computation are described, together with a brief account of its implementation.

  14. Functional high-resolution computed tomography of pulmonary vascular and airway reactions

    International Nuclear Information System (INIS)

    Herold, C.J.; Johns Hopkins Medical Institutions, Baltimore, MD; Brown, R.H.; Johns Hopkins Medical Institutions, Baltimore, MD; Johns Hopkins Medical Institutions, Baltimore, MD; Wetzel, R.C.; Herold, S.M.; Zeerhouni, E.A.

    1993-01-01

    We describe the use of high-resolution computed tomography (HRCT) for assessment of the function of pulmonary vessels and airways. With its excellent spatial resolution, HRCT is able to demonstrate pulmonary structures as small as 300 μm and can be used to monitor changes following various stimuli. HRCT also provides information about structures smaller than 300 μm through measurement of parenchymal background density. To date, sequential, spiral and ultrafast HRCT techniques have been used in a variety of challenges to gather information about the anatomical correlates of traditional physiological measurements, thus making anatomical-physiological correlation possible. HRCT of bronchial reactivity can demonstrate the location and time course of aerosol-induced bronchoconstriction and may show changes not apparent on spirometry. HRCT of the pulmonary vascular system visualizes adaptations of vessels during hypoxia and intravascular volume loading and elucidates cardiorespiratory interactions. Experimental studies provide a basis for potential clinical applications of this method. (orig.)

  15. Contrast computation methods for interferometric measurement of sensor modulation transfer function

    Science.gov (United States)

    Battula, Tharun; Georgiev, Todor; Gille, Jennifer; Goma, Sergio

    2018-01-01

    Accurate measurement of image-sensor frequency response over a wide range of spatial frequencies is very important for analyzing pixel array characteristics, such as modulation transfer function (MTF), crosstalk, and active pixel shape. Such analysis is especially significant in computational photography for the purposes of deconvolution, multi-image superresolution, and improved light-field capture. We use a lensless interferometric setup that produces high-quality fringes for measuring MTF over a wide range of frequencies (here, 37 to 434 line pairs per mm). We discuss the theoretical framework, involving Michelson and Fourier contrast measurement of the MTF, addressing phase alignment problems using a moiré pattern. We solidify the definition of Fourier contrast mathematically and compare it to Michelson contrast. Our interferometric measurement method shows high detail in the MTF, especially at high frequencies (above Nyquist frequency). We are able to estimate active pixel size and pixel pitch from measurements. We compare both simulation and experimental MTF results to a lens-free slanted-edge implementation using commercial software.
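Michelson contrast, one of the two contrast measures the paper compares, is directly computable from a fringe profile as (Imax − Imin)/(Imax + Imin). The sketch below applies it to a synthetic sinusoidal fringe; the variable names and modulation depth are invented for illustration, and a real MTF measurement would also need the phase-alignment handling the authors describe.

```python
import math

def michelson_contrast(samples):
    """Michelson contrast of a fringe profile: (Imax - Imin) / (Imax + Imin)."""
    i_max, i_min = max(samples), min(samples)
    return (i_max - i_min) / (i_max + i_min)

# Synthetic sinusoidal fringe with mean intensity 1.0 and modulation depth 0.6;
# the recovered contrast should equal the modulation depth.
depth = 0.6
fringe = [1.0 + depth * math.cos(2 * math.pi * 5 * t / 1000) for t in range(1000)]
contrast = michelson_contrast(fringe)
```

Repeating this at many fringe frequencies, normalized to the low-frequency contrast, traces out an MTF curve; the Fourier-contrast alternative instead reads the modulation off the fringe's spectral peak.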

  16. Quantum Analog Computing

    Science.gov (United States)

    Zak, M.

    1998-01-01

    Quantum analog computing is based upon the similarity between the mathematical formalism of quantum mechanics and the phenomena to be computed. It exploits a dynamical convergence of several competing phenomena to an attractor which can represent an extremum of a function, an image, a solution to a system of ODEs, or a stochastic process.

  17. Functional needs which led to the use of digital computing devices in the protection system of 1300 MW units

    International Nuclear Information System (INIS)

    Dalle, H.

    1986-01-01

    After a review of the classical protection functions used in 900 MW power plants, it is concluded that, in order to gain operating margins, it is useful to calculate the controlled parameters more finely. These computing needs led to the use of digital computing devices. Taking advantage of the new possibilities, one can improve the general performance of the protection system with regard to availability, safety and maintenance. In the case of PALUEL, these options led to the realization of SPIN, described here.

  18. Advances in unconventional computing

    CERN Document Server

    2017-01-01

    Unconventional computing is a niche for interdisciplinary science, a cross-breeding of computer science, physics, mathematics, chemistry, electronic engineering, biology, material science and nanotechnology. The aims of this book are to uncover and exploit principles and mechanisms of information processing in, and functional properties of, physical, chemical and living systems, in order to develop efficient algorithms, design optimal architectures and manufacture working prototypes of future and emergent computing devices. This first volume presents theoretical foundations of the future and emergent computing paradigms and architectures. The topics covered are computability, (non-)universality and complexity of computation; physics of computation, analog and quantum computing; reversible and asynchronous devices; cellular automata and other mathematical machines; P-systems and cellular computing; infinity and spatial computation; chemical and reservoir computing. The book is the encyclopedia, the first ever complete autho...

  19. Brief: Managing computing technology

    International Nuclear Information System (INIS)

    Startzman, R.A.

    1994-01-01

    While computing is applied widely in the production segment of the petroleum industry, its effective application is the primary goal of computing management. Computing technology has changed significantly since the 1950's, when computers first began to influence petroleum technology. The ability to accomplish traditional tasks faster and more economically probably is the most important effect that computing has had on the industry. While speed and lower cost are important, are they enough? Can computing change the basic functions of the industry? When new computing technology is introduced improperly, it can clash with traditional petroleum technology. This paper examines the role of management in merging these technologies

  20. Efficient Multi-Party Computation over Rings

    DEFF Research Database (Denmark)

    Cramer, Ronald; Fehr, Serge; Ishai, Yuval

    2003-01-01

    Secure multi-party computation (MPC) is an active research area, and a wide range of literature can be found nowadays suggesting improvements and generalizations of existing protocols in various directions. However, all current techniques for secure MPC apply to functions that are represented by (boolean or arithmetic) circuits over finite fields. We are motivated by two limitations of these techniques: – Generality. Existing protocols do not apply to computation over more general algebraic structures (except via a brute-force simulation of computation in these structures). – Efficiency. The best... We demonstrate the usefulness of the above results by presenting a novel application of MPC over (non-field) rings to the round-efficient secure computation of the maximum function. Basic Research in Computer Science (www.brics.dk), funded by the Danish National Research Foundation.
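To make "computation over (non-field) rings" concrete, here is a minimal additive secret-sharing sketch over the ring Z_{2^32} — an illustration of the algebraic setting, not the authors' protocol (which additionally provides secure multiplication and the maximum function). All names are invented, and this toy version offers no security against active adversaries.

```python
import random

MOD = 2 ** 32  # the ring Z_{2^32}: natural machine-word arithmetic, not a field

def share(secret, n_parties=3):
    """Split a secret into n additive shares that sum to it modulo 2^32."""
    shares = [random.randrange(MOD) for _ in range(n_parties - 1)]
    shares.append((secret - sum(shares)) % MOD)
    return shares

def reconstruct(shares):
    return sum(shares) % MOD

random.seed(1)
a_shares, b_shares = share(1234), share(5678)
# Addition is non-interactive: each party simply adds its two shares locally.
sum_shares = [(x + y) % MOD for x, y in zip(a_shares, b_shares)]
```

The appeal of such rings is that 32- or 64-bit modular arithmetic is what CPUs do natively, whereas field-based protocols must emulate prime-field arithmetic.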

  1. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  2. The relationship between lung function impairment and quantitative computed tomography in chronic obstructive pulmonary disease

    International Nuclear Information System (INIS)

    Mets, O.M.; Murphy, K.; Zanen, P.; Lammers, J.W.; Gietema, H.A.; Jong, P.A. de; Ginneken, B. van; Prokop, M.

    2012-01-01

    To determine the relationship between lung function impairment and quantitative computed tomography (CT) measurements of air trapping and emphysema in a population of current and former heavy smokers with and without airflow limitation. In 248 subjects (50 normal smokers; 50 mild obstruction; 50 moderate obstruction; 50 severe obstruction; 48 very severe obstruction) CT emphysema and CT air trapping were quantified on paired inspiratory and end-expiratory CT examinations using several available quantification methods. CT measurements were related to lung function (FEV1, FEV1/FVC, RV/TLC, Kco) by univariate and multivariate linear regression analysis. Quantitative CT measurements of emphysema and air trapping were strongly correlated to airflow limitation (univariate r-squared up to 0.72, p < 0.001). In multivariate analysis, the combination of CT emphysema and CT air trapping explained 68-83% of the variability in airflow limitation in subjects covering the total range of airflow limitation (p < 0.001). The combination of quantitative CT air trapping and emphysema measurements is strongly associated with lung function impairment in current and former heavy smokers with a wide range of airflow limitation. (orig.)
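The univariate r-squared values quoted (up to 0.72) come from ordinary least-squares regression of a lung-function measure on a CT measure. As a hedged illustration of the statistic itself (not the study's data or software), a minimal computation is:

```python
def r_squared(xs, ys):
    """Coefficient of determination of a univariate least-squares fit:
    r^2 = S_xy^2 / (S_xx * S_yy), using centered sums of products."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy * sxy / (sxx * syy)

# A perfectly linear relation gives r^2 = 1; scatter drives it toward 0.
perfect = r_squared([1, 2, 3, 4], [2, 4, 6, 8])
noisy = r_squared([1, 2, 3, 4], [1, 2, 2, 3])
```

An r-squared of 0.72 means the CT measure alone accounts for 72% of the variance in airflow limitation, which is why combining emphysema and air-trapping measures raises the explained variance to 68-83% in the multivariate model.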

  3. Applications of X-ray Computed Tomography and Emission Computed Tomography

    International Nuclear Information System (INIS)

    Seletchi, Emilia Dana; Sutac, Victor

    2005-01-01

    Computed Tomography is a non-destructive imaging method that allows visualization of internal features within non-transparent objects such as sedimentary rocks. Filtering techniques have been applied to circumvent the artifacts and achieve high-quality images for quantitative analysis. High-resolution X-ray computed tomography (HRXCT) can be used to identify the position of the growth axis in speleothems by detecting subtle changes in calcite density between growth bands. HRXCT imagery reveals the three-dimensional variability of coral banding providing information on coral growth and climate over the past several centuries. The Nuclear Medicine imaging technique uses a radioactive tracer, several radiation detectors, and sophisticated computer technologies to understand the biochemical basis of normal and abnormal functions within the brain. The goal of Emission Computed Tomography (ECT) is to accurately determine the three-dimensional radioactivity distribution resulting from the radiopharmaceutical uptake inside the patient instead of the attenuation coefficient distribution from different tissues as obtained from X-ray Computer Tomography. ECT is a very useful tool for investigating the cognitive functions. Because of the low radiation doses associated with Positron Emission Tomography (PET), this technique has been applied in clinical research, allowing the direct study of human neurological diseases. (authors)

  4. Passive Stretch Induces Structural and Functional Maturation of Engineered Heart Muscle as Predicted by Computational Modeling.

    Science.gov (United States)

    Abilez, Oscar J; Tzatzalos, Evangeline; Yang, Huaxiao; Zhao, Ming-Tao; Jung, Gwanghyun; Zöllner, Alexander M; Tiburcy, Malte; Riegler, Johannes; Matsa, Elena; Shukla, Praveen; Zhuge, Yan; Chour, Tony; Chen, Vincent C; Burridge, Paul W; Karakikes, Ioannis; Kuhl, Ellen; Bernstein, Daniel; Couture, Larry A; Gold, Joseph D; Zimmermann, Wolfram H; Wu, Joseph C

    2018-02-01

    The ability to differentiate human pluripotent stem cells (hPSCs) into cardiomyocytes (CMs) makes them an attractive source for repairing injured myocardium, disease modeling, and drug testing. Although current differentiation protocols yield hPSC-CMs to >90% efficiency, hPSC-CMs exhibit immature characteristics. With the goal of overcoming this limitation, we tested the effects of varying passive stretch on engineered heart muscle (EHM) structural and functional maturation, guided by computational modeling. Human embryonic stem cells (hESCs, H7 line) or human induced pluripotent stem cells (IMR-90 line) were differentiated to hPSC-derived cardiomyocytes (hPSC-CMs) in vitro using a small molecule based protocol. hPSC-CMs were characterized by troponin + flow cytometry as well as electrophysiological measurements. Afterwards, 1.2 × 10 6 hPSC-CMs were mixed with 0.4 × 10 6 human fibroblasts (IMR-90 line) (3:1 ratio) and type-I collagen. The blend was cast into custom-made 12-mm long polydimethylsiloxane reservoirs to vary nominal passive stretch of EHMs to 5, 7, or 9 mm. EHM characteristics were monitored for up to 50 days, with EHMs having a passive stretch of 7 mm giving the most consistent formation. Based on our initial macroscopic observations of EHM formation, we created a computational model that predicts the stress distribution throughout EHMs, which is a function of cellular composition, cellular ratio, and geometry. Based on this predictive modeling, we show cell alignment by immunohistochemistry and coordinated calcium waves by calcium imaging. Furthermore, coordinated calcium waves and mechanical contractions were apparent throughout entire EHMs. The stiffness and active forces of hPSC-derived EHMs are comparable with rat neonatal cardiomyocyte-derived EHMs. Three-dimensional EHMs display increased expression of mature cardiomyocyte genes including sarcomeric protein troponin-T, calcium and potassium ion channels, β-adrenergic receptors, and t

  5. Workflow Support for Advanced Grid-Enabled Computing

    OpenAIRE

    Xu, Fenglian; Eres, M.H.; Tao, Feng; Cox, Simon J.

    2004-01-01

    The Geodise project brings computer scientists' and engineers' skills together to build a service-oriented computing environment for engineers to perform complicated computations in a distributed system. The workflow tool is a front-end GUI that provides a full life cycle of workflow functions for Grid-enabled computing. The full life cycle of workflow functions has been enhanced based on our initial research and development. The life cycle starts with the composition of a workflow, followed by an ins...

  6. Evaluation of the optimum region for mammographic system using computer simulation to study modulation transfer functions

    International Nuclear Information System (INIS)

    Oliveira, Isaura N. Sombra; Schiable, Homero; Porcel, Naider T.; Frere, Annie F.; Marques, Paulo M.A.

    1996-01-01

    An investigation of the 'optimum region' of the radiation field for mammographic systems is presented. Such a region was defined in previous works as the field range where the system has its best performance and sharpest images. This study is based on a correlation of two methods for evaluating radiologic imaging systems, both using computer simulation to determine modulation transfer functions (MTFs) due to the X-ray tube focal spot at several field orientations and locations

  7. National Computer Security Conference Proceedings (10th): Computer Security--From Principles to Practices, 21-24 September 1987

    Science.gov (United States)

    1987-09-24

    The conference theme -- Computer Security: From Principles to Practices -- reflects the growth of computer security awareness and a maturation of the... Current (North American) systems do not check whether declared functions are well-defined. An elementary example of an ill-defined function is

  8. Fast computation of the characteristics method on vector computers

    International Nuclear Information System (INIS)

    Kugo, Teruhiko

    2001-11-01

    Fast computation of the characteristics method to solve the neutron transport equation in a heterogeneous geometry has been studied. Two vector computation algorithms, an odd-even sweep (OES) method and an independent sequential sweep (ISS) method, have been developed and their efficiency for a typical fuel assembly calculation has been investigated. For both methods, a vector computation is 15 times faster than a scalar computation. From a comparison between the OES and ISS methods, the following was found: 1) there is only a small difference in computation speed, 2) the ISS method shows faster convergence, and 3) the ISS method saves about 80% of computer memory compared with the OES method. It is, therefore, concluded that the ISS method is superior to the OES method as a vectorization method. In the vector computation, a table-look-up method that reduces the computation time of the exponential function saves only 20% of the whole computation time. Both the coarse mesh rebalance method and the Aitken acceleration method are effective acceleration methods for the characteristics method; a combination of them saves 70-80% of the outer iterations compared with free iteration. (author)
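The table-look-up idea mentioned above can be sketched as follows. This is an illustrative reconstruction, not the paper's implementation: the table size, cutoff, and function names are our assumptions.

```python
import math

# Sketch: replace repeated calls to exp(-x) (the attenuation factors in the
# characteristics method) with linear interpolation in a precomputed table.
N, XMAX = 4096, 20.0           # table resolution and cutoff for the optical depth
STEP = XMAX / N
TABLE = [math.exp(-i * STEP) for i in range(N + 1)]

def exp_neg(x: float) -> float:
    """Approximate exp(-x) for x >= 0 by table lookup + linear interpolation."""
    if x >= XMAX:
        return 0.0                      # beyond the cutoff, treat as fully attenuated
    t = x / STEP
    i = int(t)
    frac = t - i
    return TABLE[i] * (1.0 - frac) + TABLE[i + 1] * frac

# The interpolation error stays below STEP**2 / 8 over the whole table range:
err = max(abs(exp_neg(x) - math.exp(-x)) for x in [0.01, 0.5, 1.7, 5.3, 12.9])
```

The trade-off matches the abstract's observation: the lookup is cheaper than `exp`, but since the exponential is only part of the sweep, the overall saving is modest.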

  9. Three pillars for achieving quantum mechanical molecular dynamics simulations of huge systems: Divide-and-conquer, density-functional tight-binding, and massively parallel computation.

    Science.gov (United States)

    Nishizawa, Hiroaki; Nishimura, Yoshifumi; Kobayashi, Masato; Irle, Stephan; Nakai, Hiromi

    2016-08-05

    The linear-scaling divide-and-conquer (DC) quantum chemical methodology is applied to the density-functional tight-binding (DFTB) theory to develop a massively parallel program that achieves on-the-fly molecular reaction dynamics simulations of huge systems from scratch. The functions to perform large-scale geometry optimization and molecular dynamics on the DC-DFTB potential energy surface are implemented in the program, called DC-DFTB-K. A novel interpolation-based algorithm is developed for parallelizing the determination of the Fermi level in the DC method. The performance of the DC-DFTB-K program is assessed using a laboratory computer and the K computer. Numerical tests show the high efficiency of the DC-DFTB-K program: a single-point energy gradient calculation of a one-million-atom system is completed within 60 s using 7290 nodes of the K computer. © 2016 Wiley Periodicals, Inc.

  10. Tracing monadic computations and representing effects

    Directory of Open Access Journals (Sweden)

    Maciej Piróg

    2012-02-01

    Full Text Available In functional programming, monads are supposed to encapsulate computations, effectfully producing the final result, but keeping to themselves the means of acquiring it. For various reasons, we sometimes want to reveal the internals of a computation. To make that possible, in this paper we introduce monad transformers that add the ability to automatically accumulate observations about the course of execution as an effect. We discover that if we treat the resulting trace as the actual result of the computation, we can find new functionality in existing monads, notably when working with non-terminating computations.
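The core idea — treating the accumulated trace of observations as part of the computation's result — can be sketched outside Haskell. The Python class below is our illustration (all names invented); it is not the paper's monad transformers, just the underlying writer-style pattern.

```python
# A "traced" computation carries its value together with an accumulated
# log of observations about the course of execution.
class Traced:
    def __init__(self, value, trace=()):
        self.value, self.trace = value, tuple(trace)

    def bind(self, f):
        """Sequence two traced computations, concatenating their traces."""
        nxt = f(self.value)
        return Traced(nxt.value, self.trace + nxt.trace)

def tell(msg):
    """Record an observation without changing the value."""
    return lambda v: Traced(v, (msg,))

result = (Traced(3)
          .bind(lambda x: Traced(x * 2, (f"doubled to {x * 2}",)))
          .bind(tell("checkpoint"))
          .bind(lambda x: Traced(x + 1, (f"incremented to {x + 1}",))))
# result.value == 7; result.trace records how the result was obtained
```

Inspecting `result.trace` rather than `result.value` is the shift of perspective the abstract describes: the trace itself becomes the interesting output.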

  11. Training Older Adults to Use Tablet Computers: Does It Enhance Cognitive Function?

    Science.gov (United States)

    Chan, Micaela Y; Haber, Sara; Drew, Linda M; Park, Denise C

    2016-06-01

    Recent evidence shows that engaging in learning new skills improves episodic memory in older adults. In this study, older adults who were computer novices were trained to use a tablet computer and associated software applications. We hypothesize that sustained engagement in this mentally challenging training would yield a dual benefit of improved cognition and enhancement of everyday function by introducing useful skills. A total of 54 older adults (age 60-90) committed 15 hr/week for 3 months. Eighteen participants received extensive iPad training, learning a broad range of practical applications. The iPad group was compared with 2 separate controls: a Placebo group that engaged in passive tasks requiring little new learning; and a Social group that had regular social interaction, but no active skill acquisition. All participants completed the same cognitive battery pre- and post-engagement. Compared with both controls, the iPad group showed greater improvements in episodic memory and processing speed but did not differ in mental control or visuospatial processing. iPad training improved cognition relative to engaging in social or nonchallenging activities. Mastering relevant technological devices has the added advantage of providing older adults with technological skills useful in facilitating everyday activities (e.g., banking). This work informs the selection of targeted activities for future interventions and community programs. © The Author 2014. Published by Oxford University Press on behalf of The Gerontological Society of America.

  12. Cardiovascular measurement and cardiac function analysis with electron beam computed tomography in health Chinese people (50 cases report)

    International Nuclear Information System (INIS)

    Lu Bin; Dai Ruping; Zhang Shaoxiong; Bai Hua; Jing Baolian; Cao Cheng; He Sha; Ren Li

    1998-01-01

    Purpose: To quantitatively measure cardiovascular diameters and function parameters using electron beam computed tomography (EBCT). Methods: Fifty healthy Chinese subjects (27 men, 23 women; average age 47.7 years) underwent EBCT common transverse and short-axis enhanced movie scans. The transverse scan was used to measure the diameters of the ascending aorta, descending aorta, pulmonary artery, and left atrium. The movie study was used to measure left ventricular myocardial thickness and to analyze global, sectional, and segmental function of the right and left ventricles. Results: The cardiovascular diameters and cardiac functional parameters were calculated. The diameters and most functional parameters (end-systolic volume, stroke volume, ejection fraction, cardiac output, cardiac index) of men were greater than those of women (P > 0.05). However, the differences in EDV (end-diastolic volume) and MyM (myocardial mass) of both ventricles were significant (P < 0.01). Conclusion: EBCT is a minimally invasive method for cardiovascular measurement and cardiac function evaluation
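The functional parameters named in the abstract follow standard definitions, which can be sketched directly. The input values below are invented illustrative numbers, not the study's measurements.

```python
# Standard cardiac function parameters from ventricular volumes.
def cardiac_function(edv_ml, esv_ml, heart_rate_bpm, bsa_m2):
    sv = edv_ml - esv_ml                  # stroke volume (mL)
    ef = sv / edv_ml                      # ejection fraction (fraction of EDV)
    co = sv * heart_rate_bpm / 1000.0     # cardiac output (L/min)
    ci = co / bsa_m2                      # cardiac index (L/min per m^2 body surface)
    return sv, ef, co, ci

# Hypothetical subject: EDV 120 mL, ESV 45 mL, HR 70 bpm, BSA 1.8 m^2
sv, ef, co, ci = cardiac_function(120, 45, 70, 1.8)
# sv = 75 mL, ef = 0.625, co = 5.25 L/min
```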

  13. Computational Protein Design

    DEFF Research Database (Denmark)

    Johansson, Kristoffer Enøe

    Proteins are the major functional group of molecules in biology. The impact of protein science on medicine and chemical production is rapidly increasing. However, the greatest potential remains to be realized. The field of protein design has advanced computational modeling from a tool of support... to a central method that enables new developments. For example, novel enzymes with functions not found in natural proteins have been de novo designed to give enough activity for experimental optimization. This thesis presents the current state-of-the-art within computational design methods together... with a novel method based on probability theory. With the aim of assembling a complete pipeline for protein design, this work touches upon several aspects of protein design. The presented work is the computational half of a design project where the other half is dedicated to the experimental part...

  14. BCS Glossary of Computing and ICT

    CERN Document Server

    Panel, BCS Education and Training Expert; Burkhardt, Diana; Cumming, Aline; Hunter, Alan; Hurvid, Frank; Jaworski, John; Ng, Thomas; Scheer, Marianne; Southall, John; Vella, Alfred

    2008-01-01

    A glossary of computing designed to support those taking computer courses or courses where computers are used, including GCSE, A-Level, ECDL and 14-19 Diplomas in Functional Skills, in schools and Further Education colleges. It helps the reader build up knowledge and understanding of computing.

  15. Computer science a concise introduction

    CERN Document Server

    Sinclair, Ian

    2014-01-01

    Computer Science: A Concise Introduction covers the fundamentals of computer science. The book describes micro-, mini-, and mainframe computers and their uses; the ranges and types of computers and peripherals currently available; applications to numerical computation; and commercial data processing and industrial control processes. The functions of data preparation, data control, computer operations, applications programming, systems analysis and design, database administration, and network control are also encompassed. The book then discusses batch, on-line, and real-time systems; the basic

  16. Disciplines, models, and computers: the path to computational quantum chemistry.

    Science.gov (United States)

    Lenhard, Johannes

    2014-12-01

    Many disciplines and scientific fields have undergone a computational turn in the past several decades. This paper analyzes this sort of turn by investigating the case of computational quantum chemistry. The main claim is that the transformation from quantum to computational quantum chemistry involved changes in three dimensions. First, on the side of instrumentation, small computers and a networked infrastructure took over the lead from centralized mainframe architecture. Second, a new conception of computational modeling became feasible and assumed a crucial role. And third, the field of computational quantum chemistry became organized in a market-like fashion and this market is much bigger than the number of quantum theory experts. These claims will be substantiated by an investigation of the so-called density functional theory (DFT), the arguably pivotal theory in the turn to computational quantum chemistry around 1990.

  17. Computational Modeling in Liver Surgery

    Directory of Open Access Journals (Sweden)

    Bruno Christ

    2017-11-01

    Full Text Available The need for extended liver resection is increasing due to the growing incidence of liver tumors in aging societies. Individualized surgical planning is the key for identifying the optimal resection strategy and to minimize the risk of postoperative liver failure and tumor recurrence. Current computational tools provide virtual planning of liver resection by taking into account the spatial relationship between the tumor and the hepatic vascular trees, as well as the size of the future liver remnant. However, size and function of the liver are not necessarily equivalent. Hence, determining the future liver volume might misestimate the future liver function, especially in cases of hepatic comorbidities such as hepatic steatosis. A systems medicine approach could be applied, including biological, medical, and surgical aspects, by integrating all available anatomical and functional information of the individual patient. Such an approach holds promise for better prediction of postoperative liver function and hence improved risk assessment. This review provides an overview of mathematical models related to the liver and its function and explores their potential relevance for computational liver surgery. We first summarize key facts of hepatic anatomy, physiology, and pathology relevant for hepatic surgery, followed by a description of the computational tools currently used in liver surgical planning. Then we present selected state-of-the-art computational liver models potentially useful to support liver surgery. Finally, we discuss the main challenges that will need to be addressed when developing advanced computational planning tools in the context of liver surgery.

  18. Computationally simple, analytic, closed form solution of the Coulomb self-interaction problem in Kohn Sham density functional theory

    International Nuclear Information System (INIS)

    Gonis, Antonios; Daene, Markus W.; Nicholson, Don M.; Stocks, George Malcolm

    2012-01-01

    We have developed and tested, in terms of atomic calculations, an exact, analytic, and computationally simple procedure for determining the functional derivative of the exchange energy with respect to the density in the implementation of the Kohn-Sham formulation of density functional theory (KS-DFT), providing an analytic, closed-form solution of the self-interaction problem in KS-DFT. We demonstrate the efficacy of our method through ground-state calculations of the exchange potential and energy for He and Be atoms, and comparisons with experiment and with the results obtained within the optimized effective potential (OEP) method.

  19. Computational Investigation of the Geometrical and Electronic Structures of VGen-/0 (n = 1-4) Clusters by Density Functional Theory and Multiconfigurational CASSCF/CASPT2 Method.

    Science.gov (United States)

    Tran, Van Tan; Nguyen, Minh Thao; Tran, Quoc Tri

    2017-10-12

    Density functional theory and the multiconfigurational CASSCF/CASPT2 method have been employed to study the low-lying states of VGen-/0 (n = 1-4) clusters. For VGe-/0 and VGe2-/0 clusters, the relative energies and geometrical structures of the low-lying states are reported at the CASSCF/CASPT2 level. For the VGe3-/0 and VGe4-/0 clusters, the computational results show that due to the large contribution of the Hartree-Fock exact exchange, the hybrid B3LYP, B3PW91, and PBE0 functionals overestimate the energies of the high-spin states as compared to the pure GGA BP86 and PBE functionals and the CASPT2 method. On the basis of the pure GGA BP86 and PBE functionals and the CASSCF/CASPT2 results, the ground states of the anionic and neutral clusters are defined, the relative energies of the excited states are computed, and the electron detachment energies of the anionic clusters are evaluated. The computational results are employed to give new assignments for all features in the photoelectron spectra of VGe3- and VGe4- clusters.

  20. Videoprocessing with the MSX-computer

    International Nuclear Information System (INIS)

    Vliet, G.J. van.

    1988-01-01

    This report deals with the processing of video images with a Philips MSX-2 computer and focuses specifically on the processing of the video signals from the beam viewers. The final purpose is to create an extra control function which may be used for tuning the beam. This control function is established by mixing the video signals with a reference image from the computer. 7 figs

  1. Computational modeling to predict mechanical function of joints: application to the lower leg with simulation of two cadaver studies.

    Science.gov (United States)

    Liacouras, Peter C; Wayne, Jennifer S

    2007-12-01

    Computational models of musculoskeletal joints and limbs can provide useful information about joint mechanics. Validated models can be used as predictive devices for understanding joint function and serve as clinical tools for predicting the outcome of surgical procedures. A new computational modeling approach was developed for simulating joint kinematics that are dictated by bone/joint anatomy, ligamentous constraints, and applied loading. Three-dimensional computational models of the lower leg were created to illustrate the application of this new approach. Model development began with generating three-dimensional surfaces of each bone from CT images, which were then imported into the three-dimensional solid modeling software SOLIDWORKS and the motion simulation package COSMOSMOTION. Through SOLIDWORKS and COSMOSMOTION, each bone surface file was filled to create a solid object and positioned, necessary components were added, and simulations were executed. Three-dimensional contacts were added to inhibit intersection of the bones during motion. Ligaments were represented as linear springs. Model predictions were then validated by comparison to two different cadaver studies, syndesmotic injury and repair and ankle inversion following ligament transection. The syndesmotic injury model was able to predict tibial rotation, fibular rotation, and anterior/posterior displacement. In the inversion simulation, calcaneofibular ligament extension and angles of inversion compared well. Some experimental data proved harder to simulate accurately, due to certain software limitations and lack of complete experimental data. Other parameters that could not be easily obtained experimentally can be predicted and analyzed by the computational simulations. In the syndesmotic injury study, the force generated in the tibionavicular and calcaneofibular ligaments reduced with the insertion of the staple, indicating how this repair technique changes joint function. After transection of the calcaneofibular
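The "ligaments represented as linear springs" modeling choice can be sketched as a tension-only spring. The stiffness and rest length below are hypothetical illustration values, not parameters from the study.

```python
# Tension-only linear spring: a ligament resists stretch but goes slack
# under compression, so it contributes no compressive force.
def ligament_force(length_mm, rest_length_mm, stiffness_n_per_mm):
    stretch = length_mm - rest_length_mm
    return stiffness_n_per_mm * stretch if stretch > 0 else 0.0

taut  = ligament_force(22.0, 20.0, 15.0)   # 2 mm stretch at 15 N/mm -> 30 N
slack = ligament_force(18.0, 20.0, 15.0)   # shorter than rest length -> 0 N
```

The slack branch is the design point: a plain bidirectional spring would wrongly push the bones apart when the ligament shortens.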

  2. Computational neurogenetic modeling

    CERN Document Server

    Benuskova, Lubica

    2010-01-01

    Computational Neurogenetic Modeling is a student text, introducing the scope and problems of a new scientific discipline - Computational Neurogenetic Modeling (CNGM). CNGM is concerned with the study and development of dynamic neuronal models for modeling brain functions with respect to genes and dynamic interactions between genes. These include neural network models and their integration with gene network models. This new area brings together knowledge from various scientific disciplines, such as computer and information science, neuroscience and cognitive science, genetics and molecular biol

  3. Synthetic analog computation in living cells.

    Science.gov (United States)

    Daniel, Ramiz; Rubens, Jacob R; Sarpeshkar, Rahul; Lu, Timothy K

    2013-05-30

    A central goal of synthetic biology is to achieve multi-signal integration and processing in living cells for diagnostic, therapeutic and biotechnology applications. Digital logic has been used to build small-scale circuits, but other frameworks may be needed for efficient computation in the resource-limited environments of cells. Here we demonstrate that synthetic analog gene circuits can be engineered to execute sophisticated computational functions in living cells using just three transcription factors. Such synthetic analog gene circuits exploit feedback to implement logarithmically linear sensing, addition, ratiometric and power-law computations. The circuits exhibit Weber's law behaviour as in natural biological systems, operate over a wide dynamic range of up to four orders of magnitude and can be designed to have tunable transfer functions. Our circuits can be composed to implement higher-order functions that are well described by both intricate biochemical models and simple mathematical functions. By exploiting analog building-block functions that are already naturally present in cells, this approach efficiently implements arithmetic operations and complex functions in the logarithmic domain. Such circuits may lead to new applications for synthetic biology and biotechnology that require complex computations with limited parts, need wide-dynamic-range biosensing or would benefit from the fine control of gene expression.
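The log-domain arithmetic these circuits exploit can be illustrated numerically. This is plain mathematics only, not a model of the gene circuits themselves; the values are arbitrary.

```python
import math

# In the logarithmic domain, multiplication, ratiometric computation, and
# power laws become addition, subtraction, and scaling of log-signals.
def to_log(x):
    return math.log(x)

def from_log(l):
    return math.exp(l)

a, b = 40.0, 2.5
product = from_log(to_log(a) + to_log(b))   # multiplication -> addition of logs
ratio   = from_log(to_log(a) - to_log(b))   # ratio -> subtraction of logs
power   = from_log(1.5 * to_log(a))         # power law a**1.5 -> scaling of the log
```

This is why a log-linear sensor plus a summing node suffices for multiplication: the hard operation is absorbed into the representation.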

  4. Computational genomics of hyperthermophiles

    NARCIS (Netherlands)

    Werken, van de H.J.G.

    2008-01-01

    With the ever increasing number of completely sequenced prokaryotic genomes and the subsequent use of functional genomics tools, e.g. DNA microarray and proteomics, computational data analysis and the integration of microbial and molecular data is inevitable. This thesis describes the computational

  5. Exopolysaccharides enriched in rare sugars: bacterial sources, production, and applications

    OpenAIRE

    Roca, Christophe; Alves, Vitor D.; Freitas, Filomena; Reis, Maria A. M.

    2015-01-01

    The authors acknowledge Fundacao para a Ciencia e Tecnologia (FCT), Portugal, through projects PEst-C/EQB/LA0006/2013 and PTDC/AGR-ALI/114706/2009 - "New edible bioactive coatings for the improvement of food products quality." FF acknowledges FCT for Post-Doctoral fellowship SFRH/BPD/72280/2010. Microbial extracellular polysaccharides (EPS), produced by a wide range of bacteria, are high molecular weight biopolymers, presenting an extreme diversity in terms of chemical structure and com...

  6. Computer Applications in the Design Process.

    Science.gov (United States)

    Winchip, Susan

    Computer Assisted Design (CAD) and Computer Assisted Manufacturing (CAM) are emerging technologies now being used in home economics and interior design applications. A microcomputer in a computer network system is capable of executing computer graphic functions such as three-dimensional modeling, as well as utilizing office automation packages to…

  7. Emission computed tomography

    International Nuclear Information System (INIS)

    Budinger, T.F.; Gullberg, G.T.; Huesman, R.H.

    1979-01-01

    This chapter is devoted to the methods of computer assisted tomography for determination of the three-dimensional distribution of gamma-emitting radionuclides in the human body. The major applications of emission computed tomography are in biological research and medical diagnostic procedures. The objectives of these procedures are to make quantitative measurements of in vivo biochemical and hemodynamic functions

  8. Computer technique for evaluating collimator performance

    International Nuclear Information System (INIS)

    Rollo, F.D.

    1975-01-01

    A computer program has been developed to theoretically evaluate the overall performance of collimators used with radioisotope scanners and γ cameras. The first step of the program involves the determination of the line spread function (LSF) and geometrical efficiency from the fundamental parameters of the collimator being evaluated. The working equations can be applied to any plane of interest. The resulting LSF is applied to subroutine computer programs which compute the corresponding modulation transfer function and contrast efficiency functions. The latter function is then combined with appropriate geometrical efficiency data to determine the performance index function. The overall computer program allows one to predict, from the physical parameters of the collimator alone, how well the collimator will reproduce various-sized spherical voids of activity in the image plane. The collimator performance program can be used to compare the performance of various collimator types, to study the effects of source depth on collimator performance, and to assist in the design of collimators. The theory of the collimator performance equation is discussed, a comparison between the experimental and theoretical LSF values is made, and examples of the application of the technique are presented
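The LSF-to-MTF step described above can be sketched as follows. A Gaussian LSF is assumed purely for illustration; the actual program derives the LSF from the collimator parameters.

```python
import cmath
import math

# MTF as the normalized magnitude of the Fourier transform of the LSF.
def mtf_from_lsf(lsf):
    n = len(lsf)
    mag = []
    for k in range(n // 2):                 # one-sided spatial-frequency spectrum
        s = sum(lsf[j] * cmath.exp(-2j * math.pi * k * j / n) for j in range(n))
        mag.append(abs(s))
    return [m / mag[0] for m in mag]        # normalize so that MTF(0) = 1

# Assumed Gaussian LSF sampled on 64 points (illustrative width only)
sigma, n = 2.0, 64
lsf = [math.exp(-((i - n / 2) ** 2) / (2 * sigma ** 2)) for i in range(n)]
mtf = mtf_from_lsf(lsf)
# mtf[0] == 1.0, and the curve falls off with increasing spatial frequency
```

A broader LSF (worse resolution) gives a faster MTF roll-off, which is exactly the comparison the performance program automates across collimator designs.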

  9. Ergotic / epistemic / semiotic functions

    OpenAIRE

    Luciani , Annie

    2007-01-01

    International audience; Claude Cadoz has introduced a typology of the human-environment relation, identifying three functions. This typology allows characterizing univocally, i.e. in a non-redundant manner, the computer devices and interfaces that allow humans to interact with the environment through and by computers. These three functions are: the epistemic function, the semiotic function, and the ergotic function. Unlike the terms epistemic and semiotic, which are usual, the term ergotic has been s...

  10. Observation of high coercive fields in chemically synthesized coated Fe-Pt nanostructures

    Energy Technology Data Exchange (ETDEWEB)

    Dalavi, Shankar B.; Panda, Rabi N., E-mail: rnp@goa.bits-pilani.ac.in

    2017-04-15

    Nanocrystalline Fe-Pt alloys have been synthesized via a chemical reduction route using various capping agents: oleic acid/oleylamine (route-1) and oleic acid/CTAB (route-2). We were able to synthesize Fe50Pt and Fe54Pt alloys via routes 1 and 2, respectively. As-prepared Fe-Pt alloys crystallize in the disordered fcc phase with crystallite sizes of 2.3 nm and 6 nm for route-1 and route-2, respectively. The disordered Fe-Pt alloys were transformed to the ordered fct phase after annealing at 600 °C. SEM studies confirm the spherical morphology of the annealed Fe-Pt nanoparticles, with SEM particle sizes of 24.4 nm and 21.2 nm for route-1 and route-2, respectively. The TEM study confirms the presence of 4.6 nm particles for the annealed Fe50Pt alloys, with several agglomerating clusters of bigger size, and agrees well with the XRD study. Room temperature magnetization studies of the as-prepared (fcc) Fe-Pt alloys show ferromagnetism with negligible coercivities. Average magnetic moments per particle for the as-prepared Fe-Pt alloys were estimated to be 753 μB and 814 μB for routes 1 and 2, respectively. The ordered fct Fe-Pt alloys show high coercivities of 10,000 Oe and 10,792 Oe for route-1 and route-2, respectively. The observed magnetic properties of the fct Fe-Pt nanoparticles were interpreted on the basis of order parameter, size, surface, and composition effects. - Highlights: • Synthesis of capped nanocrystalline Fe-Pt alloys via chemical routes. • The ordered fct phase was obtained at 600 °C. • Microstructural studies were carried out using SEM and TEM. • Investigation of the evolution of magnetic properties from the fcc to the fct state. • Maximum coercivities of up to 10,792 Oe were observed.

  11. Effects of development on indigenous dietary pattern: A Nigerian case study.

    Science.gov (United States)

    Ezeomah, Bookie; Farag, Karim

    2016-12-01

    The traditional foods of indigenous people in Nigeria are known for their cultural symbolism and agricultural biodiversity, which contribute to a healthy and rich daily diet. In the early 90s, rapid development of the Federal Capital Territory (FCT) was noted and the resettlement of indigenes to other parts of the region was reported. These changes have facilitated the modification of indigenous diets, as indigenous groups rapidly embraced modern foods and also adopted the food culture of migrant ethnic groups. This has led to a gradual erosion of indigenous diets and traditional food systems in the FCT. This study explored the impact of development on traditional food systems and determined indigenes' perception of the modification of their food culture as a result of the development of their land within the FCT. A field survey was carried out in four indigenous communities in the FCT (30 indigenes from each of the four areas) using structured questionnaires, Focus Group Discussions (FGDs) and key informant interviews. Pearson chi-square analysis of the indigenes' socio-economic characteristics revealed significant relationships between gender and farm size, age and farm size, and educational level and farm/herd size. Qualitative analysis of the FGDs revealed the indigenes' opinions on the socio-cultural changes in behaviour and food systems as a result of development. The study also identified indigenous youths as being most influenced by development, especially through education, white-collar jobs and social interactions with migrant ethnic groups in the FCT. The study recommended that indigenes should be provided with more secure land tenure and that "back-to-farm" initiatives should be put in place by the Nigerian government to encourage indigenous youths to engage more in agriculture. Copyright © 2016 Elsevier Ltd. All rights reserved.

  12. Application of invariant plane strain (IPS) theory to γ hydride formation in dilute Zr-Nb alloys

    International Nuclear Information System (INIS)

    Srivastava, D.; Neogy, S.; Dey, G.K.; Banerjee, S.; Ranganathan, S.

    2005-01-01

    The crystallographic aspects associated with the formation of the γ hydride phase (fct) from the α (hcp) phase and the β (bcc) phase in Zr-Nb alloys have been studied in two distinct situations, viz., in the α matrix in pure Zr and Zr-2.5Nb and in the β matrix in β-stabilized Zr-20Nb alloy. The β-γ formation can be treated primarily as a simple shear on the basal plane involving a change in the stacking sequence. A possible mechanism for the α-γ transformation is presented in this paper. The β->γ transformation has been considered in terms of the invariant plane strain (IPS) theory in order to predict the crystallographic features of the γ hydride formed. The lattice invariant shear (LIS) (110)β [1-bar 10]β || (111)γ [12-bar 1]γ has been considered, and the crystallographic parameters associated with the bcc->fct transformation, such as the habit plane and the magnitudes of the LIS and the shape strain, have been computed. The predictions of the present analysis have been compared with experimentally observed habit planes. The α/γ and β/γ interfaces have been examined by the high-resolution transmission electron microscopy (HRTEM) technique for comparison with the interfaces observed in martensitic transformations

  13. Development of a mobile gammacamera computer system for non invasive ventricular function determination

    International Nuclear Information System (INIS)

    Knopp, R.; Reske, S.N.; Winkler, C.

    1983-03-01

    As a reliable non-invasive method, dynamic ventricular volume determination by means of gamma-camera computer scintigraphy is now generally accepted as most useful in clinical cardiology. However, since the required instrumentation is in general unwieldy and not mobile, sophisticated cardiac function studies could not be performed up to now in many intensive care units. In order to overcome this problem we developed a compact scintigraphic system consisting of a mobile gamma camera (Siemens Mobicon) with a built-in minicomputer (Siemens R 20: 16 bit, 128 kB). It renders possible a combined investigation of ventricular volume and pressure. The volume curve is acquired by sequential scintigraphy, whereas the pressure is simultaneously measured manometrically by means of a heart catheter. As a result of this comprehensive investigation, a pressure-volume loop is plotted, the enclosed area of which represents the cardiac work performance. Additionally, functional parameters such as compliance (dV/dp) or stiffness (dp/dV) can be derived from the loop diagram. Besides the mentioned procedures, the mobile system can also be used for detection of acute infarctions as well as for myocardial scintigraphy in general. (orig.) [de
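The statement that the area enclosed by the pressure-volume loop represents cardiac work can be illustrated with the shoelace formula on an idealized loop. The points below are invented, not patient data.

```python
# Enclosed area of a pressure-volume loop via the shoelace formula.
def loop_area(points):
    """Polygon area; points = [(volume_mL, pressure_mmHg), ...] around the loop."""
    n = len(points)
    s = 0.0
    for i in range(n):
        x0, y0 = points[i]
        x1, y1 = points[(i + 1) % n]
        s += x0 * y1 - x1 * y0
    return abs(s) / 2.0

# Idealized rectangular loop: volumes 50-120 mL, pressures 10-120 mmHg
pv_loop = [(120, 10), (120, 120), (50, 120), (50, 10)]
work = loop_area(pv_loop)   # 70 mL x 110 mmHg = 7700 mL*mmHg of work per cycle
```

With the volume curve from scintigraphy and the simultaneous catheter pressure, a discretized loop of measured (V, p) samples plugs directly into the same formula.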

  14. Studies on the zeros of Bessel functions and methods for their computation: 3. Some new works on monotonicity, convexity, and other properties

    Science.gov (United States)

    Kerimov, M. K.

    2016-12-01

    This paper continues the study of real zeros of Bessel functions begun in the previous parts of this work (see M. K. Kerimov, Comput. Math. Math. Phys. 54 (9), 1337-1388 (2014); 56 (7), 1175-1208 (2016)). Some new results regarding the monotonicity, convexity, concavity, and other properties of zeros are described. Additionally, the zeros of q-Bessel functions are investigated.

  15. Functional physiology of the human terminal antrum defined by high-resolution electrical mapping and computational modeling.

    Science.gov (United States)

    Berry, Rachel; Miyagawa, Taimei; Paskaranandavadivel, Niranchan; Du, Peng; Angeli, Timothy R; Trew, Mark L; Windsor, John A; Imai, Yohsuke; O'Grady, Gregory; Cheng, Leo K

    2016-11-01

    High-resolution (HR) mapping has been used to study gastric slow-wave activation; however, the specific characteristics of antral electrophysiology remain poorly defined. This study applied HR mapping and computational modeling to define functional human antral physiology. HR mapping was performed in 10 subjects using flexible electrode arrays (128-192 electrodes; 16-24 cm²) arranged from the pylorus to mid-corpus. Anatomical registration was by photographs and anatomical landmarks. Slow-wave parameters were computed, and the resultant data were incorporated into a computational fluid dynamics (CFD) model of gastric flow to calculate the impact on gastric mixing. In all subjects, extracellular mapping demonstrated normal aboral slow-wave propagation and a region of increased amplitude and velocity in the prepyloric antrum. On average, the high-velocity region commenced 28 mm proximal to the pylorus, and activation ceased 6 mm from the pylorus. Within this region, velocity increased 0.2 mm/s per mm of tissue, from the mean 3.3 ± 0.1 mm/s to 7.5 ± 0.6 mm/s (P …). … human terminal antral contraction is controlled by a short region of rapid high-amplitude slow-wave activity. Distal antral wave acceleration plays a major role in antral flow and mixing, increasing particle strain and trituration. Copyright © 2016 the American Physiological Society.

  16. The Use of Computer-Assisted Home Exercises to Preserve Physical Function after a Vestibular Rehabilitation Program: A Randomized Controlled Study

    DEFF Research Database (Denmark)

    Brandt, Michael Smærup; Læssøe, Uffe; Grönvall, Erik

    2016-01-01

    Objective. The purpose of this study was to evaluate whether elderly patients with vestibular dysfunction are able to preserve physical functional level, reduction in dizziness, and the patient's quality of life when assistive computer technology is used in comparison with printed instructions… Materials and Methods. Single-blind, randomized, controlled follow-up study. Fifty-seven elderly patients with chronic dizziness were randomly assigned to a computer-assisted home exercise program or to home exercises as described in printed instructions and followed for three months after discharge from… …and quality of life three months following discharge from hospital. In this specific setup, no greater effect was found by introducing a computer-assisted training program, when compared to standard home training guided by printed instructions. This trial is registered with NCT01344408.

  17. Computing in an academic radiation therapy department

    International Nuclear Information System (INIS)

    Gottlieb, C.F.; Houdek, P.V.; Fayos, J.V.

    1985-01-01

    The authors conceptualized the different computer functions in radiotherapy as follows: 1) treatment planning and dosimetry, 2) data and word processing, 3) radiotherapy information system (data bank), 4) statistical analysis, 5) data acquisition and equipment control, 6) telecommunication, and 7) financial management. They successfully implemented the concept of distributed computing using multiple mini and personal computers. The authors' computer practice supports data and word processing, graphics, communication, automated data acquisition and control, and portable computing. The computers are linked together into a local computer network which permits sharing of information, peripherals, and unique programs among the systems, while preserving the individual function and identity of each machine. Furthermore, the architecture of the network allows direct access to any other computer network, providing inexpensive use of the most modern and sophisticated software and hardware resources.

  18. Facile synthesis of flower like FePt@ZnO core–shell structure and its bifunctional properties

    Energy Technology Data Exchange (ETDEWEB)

    Majeed, Jerina [Chemistry Division, Bhabha Atomic Research Centre, Trombay, Mumbai 400085 (India); Jayakumar, O.D., E-mail: ddjaya@barc.gov.in [Chemistry Division, Bhabha Atomic Research Centre, Trombay, Mumbai 400085 (India); Mandal, B.P. [Chemistry Division, Bhabha Atomic Research Centre, Trombay, Mumbai 400085 (India); Salunke, H.G. [Technical Physics Division, Bhabha Atomic Research Centre, Trombay, Mumbai 400085 (India); Naik, R. [Department of Physics, Wayne State University, Detroit, MI 48202 (United States); Tyagi, A.K., E-mail: aktyagi@barc.gov.in [Chemistry Division, Bhabha Atomic Research Centre, Trombay, Mumbai 400085 (India)

    2014-06-01

    Graphical abstract: Flower-shaped FePt and ZnO-coated FePt core–shell nanostructures are synthesized by a facile solvothermal procedure. The shell thickness of ZnO over the FePt core was tuned by varying the FePt concentration with respect to ZnO. The hybrid structure with the lower FePt concentration exhibited bifunctionality, namely near-room-temperature ferromagnetism and photoluminescence. Pristine FePt crystallizes in the fct (L1{sub 0}) phase, whereas it converts into the fcc phase in the presence of ZnO. - Highlights: • FePt@ZnO hybrid core–shell particles with a unique flower-shaped morphology have been prepared by a solvothermal method. • A phase transition of fct-FePt to fcc-FePt has been found in the presence of ZnO nanoparticles. • A plausible mechanism for the growth of the flower-shaped nanoparticles is in accordance with the energy-minimization principle. • The core–shell structure (FePt@ZnO) exhibits bifunctional properties. - Abstract: Flower-shaped FePt and ZnO-coated FePt (FePt@ZnO) core–shell nanostructures are synthesized by a facile solvothermal procedure. Two different compositions (molar ratios) of FePt and ZnO (FePt:ZnO = 1:3 and FePt:ZnO = 1:6) core–shells with different ZnO shell thicknesses were synthesized. The hybrid FePt@ZnO core–shell flower structure with the lower FePt concentration (FePt:ZnO = 1:6) exhibited bifunctionality, including near-room-temperature ferromagnetism and photoluminescence at ambient conditions. X-ray diffraction patterns of pristine FePt showed a partially ordered face-centred tetragonal (fct) L1{sub 0} phase, whereas ZnO-coated FePt (FePt@ZnO) nanostructures showed hexagonal ZnO and a disordered phase of FePt with fcc structure. The phase transition of fct FePt to the fcc phase occurring in the presence of ZnO is further confirmed by transmission electron microscopy and magnetic measurement studies. The formation of the nanoflowers was possibly due to growth along the [0 1 1] or [0 0 1] direction, keeping the core nearly spherical in accordance with the

  19. A tree-decomposed transfer matrix for computing exact Potts model partition functions for arbitrary graphs, with applications to planar graph colourings

    International Nuclear Information System (INIS)

    Bedini, Andrea; Jacobsen, Jesper Lykke

    2010-01-01

    Combining tree decomposition and transfer matrix techniques provides a very general algorithm for computing exact partition functions of statistical models defined on arbitrary graphs. The algorithm is particularly efficient in the case of planar graphs. We illustrate it by computing the Potts model partition functions and chromatic polynomials (the number of proper vertex colourings using Q colours) for large samples of random planar graphs with up to N = 100 vertices. In the latter case, our algorithm yields a sub-exponential average running time of ∼exp(1.516√N), a substantial improvement over the exponential running time ∼exp(0.245N) provided by the hitherto best-known algorithm. We study the statistics of chromatic roots of random planar graphs in some detail, comparing the findings with results for finite pieces of a regular lattice.
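
    For intuition, the chromatic polynomial being computed can be evaluated for toy graphs with the textbook deletion-contraction recursion. This sketch is exponential-time and unrelated to the paper's tree-decomposed transfer-matrix algorithm; it only illustrates the quantity itself:

    ```python
    def chromatic_polynomial(edges, n, q):
        """Number of proper q-colourings of a graph on n vertices, by
        deletion-contraction: P(G, q) = P(G - e, q) - P(G / e, q).
        Exponential time, so only sensible for small graphs."""
        start = frozenset(frozenset(e) for e in edges if len(set(e)) == 2)

        def rec(es, nv):
            if not es:
                return q ** nv          # edgeless graph: q^nv colourings
            e = next(iter(es))
            u, v = tuple(e)
            deleted = es - {e}
            # Contract e: relabel v as u, then drop self-loops.
            contracted = frozenset(
                frozenset(u if x == v else x for x in f) for f in deleted)
            contracted = frozenset(f for f in contracted if len(f) == 2)
            return rec(deleted, nv) - rec(contracted, nv - 1)

        return rec(start, n)

    # Triangle K3: P(K3, q) = q(q-1)(q-2)
    print(chromatic_polynomial([(0, 1), (1, 2), (0, 2)], 3, 3))  # 6
    ```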

  20. Development of microgravity, full body functional reach envelope using 3-D computer graphic models and virtual reality technology

    Science.gov (United States)

    Lindsey, Patricia F.

    1994-01-01

    In microgravity conditions mobility is greatly enhanced and body stability is difficult to achieve. Because of these difficulties, optimum placement and accessibility of objects and controls can be critical to required tasks on board shuttle flights or on the proposed space station. Anthropometric measurements of the maximum reach of occupants of a microgravity environment provide knowledge about maximum functional placement for tasking situations. Calculations for a full-body functional reach envelope for microgravity environments are therefore imperative. To this end, three-dimensional computer-modeled human figures, providing a method of anthropometric measurement, were used to locate the data points that define the full-body functional reach envelope. Virtual reality technology was utilized to enable an occupant of the microgravity environment to experience movement within the reach envelope while immersed in a simulated microgravity environment.

  1. The relationship between lung function impairment and quantitative computed tomography in chronic obstructive pulmonary disease

    Energy Technology Data Exchange (ETDEWEB)

    Mets, O.M. [Radiology, University Medical Center Utrecht (Netherlands); University Medical Center Utrecht, Department of Radiology, Utrecht (Netherlands); Murphy, K. [Image Sciences Institute, University Medical Center Utrecht (Netherlands); Zanen, P.; Lammers, J.W. [Pulmonology, University Medical Center Utrecht (Netherlands); Gietema, H.A.; Jong, P.A. de [Radiology, University Medical Center Utrecht (Netherlands); Ginneken, B. van [Image Sciences Institute, University Medical Center Utrecht (Netherlands); Radboud University Nijmegen Medical Centre, Diagnostic Image Analysis Group, Radiology, Nijmegen (Netherlands); Prokop, M. [Radiology, University Medical Center Utrecht (Netherlands); Radiology, Radboud University Nijmegen Medical Centre (Netherlands)

    2012-01-15

    To determine the relationship between lung function impairment and quantitative computed tomography (CT) measurements of air trapping and emphysema in a population of current and former heavy smokers with and without airflow limitation. In 248 subjects (50 normal smokers; 50 mild obstruction; 50 moderate obstruction; 50 severe obstruction; 48 very severe obstruction) CT emphysema and CT air trapping were quantified on paired inspiratory and end-expiratory CT examinations using several available quantification methods. CT measurements were related to lung function (FEV{sub 1}, FEV{sub 1}/FVC, RV/TLC, Kco) by univariate and multivariate linear regression analysis. Quantitative CT measurements of emphysema and air trapping were strongly correlated to airflow limitation (univariate r-squared up to 0.72, p < 0.001). In multivariate analysis, the combination of CT emphysema and CT air trapping explained 68-83% of the variability in airflow limitation in subjects covering the total range of airflow limitation (p < 0.001). The combination of quantitative CT air trapping and emphysema measurements is strongly associated with lung function impairment in current and former heavy smokers with a wide range of airflow limitation. (orig.)
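
    Such quantitative CT indices are typically voxel-threshold fractions. A minimal sketch follows; the -950 HU (inspiratory emphysema) and -856 HU (expiratory air trapping) cut-offs are common literature values assumed here for illustration, not necessarily the exact quantification methods used in this study:

    ```python
    import numpy as np

    def low_attenuation_fraction(hu_voxels, threshold_hu):
        """Fraction of lung voxels with attenuation below a threshold (HU)."""
        v = np.asarray(hu_voxels)
        return float((v < threshold_hu).mean())

    # Toy 'lung' of 8 voxels (Hounsfield units).
    inspiratory = [-980, -960, -920, -890, -870, -840, -700, -500]
    expiratory  = [-920, -880, -870, -860, -820, -700, -600, -400]

    print(low_attenuation_fraction(inspiratory, -950))  # emphysema index: 0.25
    print(low_attenuation_fraction(expiratory, -856))   # air-trapping index: 0.5
    ```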

  2. Computational Analysis of SAXS Data Acquisition.

    Science.gov (United States)

    Dong, Hui; Kim, Jin Seob; Chirikjian, Gregory S

    2015-09-01

    Small-angle x-ray scattering (SAXS) is an experimental biophysical method used for gaining insight into the structure of large biomolecular complexes. Under appropriate chemical conditions, the information obtained from a SAXS experiment can be equated to the pair distribution function, which is the distribution of distances between every pair of points in the complex. Here we develop a mathematical model to calculate the pair distribution function for a structure of known density, and analyze the computational complexity of these calculations. Efficient recursive computation of this forward model is an important step in solving the inverse problem of recovering the three-dimensional density of biomolecular structures from their pair distribution functions. In particular, we show that integrals of products of three spherical-Bessel functions arise naturally in this context. We then develop an algorithm for the efficient recursive computation of these integrals.
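
    For a structure represented as discrete weighted points, the pair distribution function reduces to a weighted histogram of pairwise distances. A minimal sketch of that discrete stand-in follows; the paper's forward model works with continuous densities and integrals of spherical Bessel functions, which this toy version does not implement:

    ```python
    import numpy as np

    def pair_distribution(points, weights, bins=50, r_max=None):
        """Weighted histogram of pairwise distances: a discrete stand-in
        for the SAXS pair distribution function p(r)."""
        pts = np.asarray(points, dtype=float)
        w = np.asarray(weights, dtype=float)
        diff = pts[:, None, :] - pts[None, :, :]
        d = np.sqrt((diff ** 2).sum(axis=-1))
        iu = np.triu_indices(len(pts), k=1)          # each pair counted once
        dists, pair_w = d[iu], (w[:, None] * w[None, :])[iu]
        if r_max is None:
            r_max = float(dists.max())
        return np.histogram(dists, bins=bins, range=(0.0, r_max), weights=pair_w)

    # A 3-4-5 right triangle of unit scatterers: one pair each at r = 3, 4, 5.
    hist, edges = pair_distribution([[0, 0, 0], [3, 0, 0], [0, 4, 0]],
                                    [1, 1, 1], bins=5, r_max=6.0)
    print(hist)  # [0. 0. 1. 1. 1.]
    ```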

  3. Use of time space Green's functions in the computation of transient eddy current fields

    International Nuclear Information System (INIS)

    Davey, K.; Turner, L.

    1988-01-01

    The utility of integral equations to solve eddy current problems has been borne out by numerous computations in the past few years, principally in sinusoidal steady-state problems. This paper attempts to examine the applicability of the integral approaches in both time and space for the more generic transient problem. The basic formulation for the time-space Green's function approach is laid out. A technique employing Gauss-Laguerre integration is employed to realize the temporal solution, while Gauss-Legendre integration is used to resolve the spatial field character. The technique is then applied to the fusion electromagnetic induction experiments (FELIX) cylinder experiments in both two and three dimensions. It is found that quite accurate solutions can be obtained using rather coarse time steps and very few unknowns; the three-dimensional field solution worked out in this context used basically only four unknowns. The solution appears to be somewhat sensitive to the choice of time step, a consequence of a numerical instability embedded in the Green's function near the origin.
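
    The two quadratures named can be exercised directly with NumPy's Gauss-Laguerre and Gauss-Legendre nodes. This is a minimal numerical check of the rules themselves, not of the eddy-current formulation:

    ```python
    import numpy as np
    from numpy.polynomial.laguerre import laggauss
    from numpy.polynomial.legendre import leggauss

    # Temporal part: Gauss-Laguerre quadrature evaluates integrals of the
    # form  int_0^inf e^{-t} f(t) dt,  natural for decaying transients.
    t, wt = laggauss(8)
    print(np.sum(wt * t))       # int_0^inf e^{-t} t dt = 1

    # Spatial part: Gauss-Legendre quadrature handles finite intervals [-1, 1].
    x, wx = leggauss(8)
    print(np.sum(wx * x ** 2))  # int_{-1}^{1} x^2 dx = 2/3
    ```

    Both rules are exact for polynomial integrands up to degree 15 at 8 nodes, which is why the two printed values match the analytic integrals to machine precision.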

  4. Design, functioning and possible applications of process computers

    International Nuclear Information System (INIS)

    Kussl, V.

    1975-01-01

    Process computers are useful as automation instruments a) when large numbers of data are processed in analog or digital form, b) for low data flow (data rate), and c) when data must be stored over short or long periods of time. (orig./AK)

  5. Computability theory

    CERN Document Server

    Weber, Rebecca

    2012-01-01

    What can we compute--even with unlimited resources? Is everything within reach? Or are computations necessarily drastically limited, not just in practice, but theoretically? These questions are at the heart of computability theory. The goal of this book is to give the reader a firm grounding in the fundamentals of computability theory and an overview of currently active areas of research, such as reverse mathematics and algorithmic randomness. Turing machines and partial recursive functions are explored in detail, and vital tools and concepts including coding, uniformity, and diagonalization are described explicitly. From there the material continues with universal machines, the halting problem, parametrization and the recursion theorem, and thence to computability for sets, enumerability, and Turing reduction and degrees. A few more advanced topics round out the book before the chapter on areas of research. The text is designed to be self-contained, with an entire chapter of preliminary material including re...

  6. Density functional computational studies on the glucose and glycine Maillard reaction: Formation of the Amadori rearrangement products

    Science.gov (United States)

    Jalbout, Abraham F.; Roy, Amlan K.; Shipar, Abul Haider; Ahmed, M. Samsuddin

    Theoretical energy changes of various intermediates leading to the formation of the Amadori rearrangement products (ARPs) under different mechanistic assumptions have been calculated, using open-chain glucose (O-Glu), closed-chain glucose (A-Glu and B-Glu), and glycine (Gly) as a model for the Maillard reaction. Density functional theory (DFT) computations have been applied to the proposed mechanisms under different pH conditions. Thus, the possibility of the formation of different compounds and the electronic energy changes for different steps in the proposed mechanisms have been evaluated. B-Glu has been found to be more efficient than A-Glu, and A-Glu more efficient than O-Glu, in the reaction. The reaction under basic conditions is the most favorable for the formation of ARPs. Other reaction pathways have been computed and discussed in this work.

  7. The computer graphics interface

    CERN Document Server

    Steinbrugge Chauveau, Karla; Niles Reed, Theodore; Shepherd, B

    2014-01-01

    The Computer Graphics Interface provides a concise discussion of computer graphics interface (CGI) standards. The title comprises seven chapters that cover the concepts of the CGI standard. Figures and examples are also included. The first chapter provides a general overview of CGI; this chapter covers graphics standards, functional specifications, and syntactic interfaces. Next, the book discusses the basic concepts of CGI, such as inquiry, profiles, and registration. The third chapter covers the CGI concepts and functions, while the fourth chapter deals with the concept of graphic objects.

  8. Proteins of unknown function in the Protein Data Bank (PDB): an inventory of true uncharacterized proteins and computational tools for their analysis.

    Science.gov (United States)

    Nadzirin, Nurul; Firdaus-Raih, Mohd

    2012-10-08

    Proteins of uncharacterized function form a large part of many of the currently available biological databases, and this situation exists even in the Protein Data Bank (PDB). Our analysis of recent PDB data revealed that only 42.53% of PDB entries (1084 coordinate files) categorized under "unknown function" are true examples of proteins of unknown function at this point in time. The remaining 1465 entries, also annotated as such, appear to be candidates for annotation re-assessment, based on the availability of direct functional characterization experiments for the protein itself, or for homologous sequences or structures, thus enabling computational function inference.
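
    The quoted figures are internally consistent and can be checked directly from the counts in the abstract:

    ```python
    true_unknown = 1084   # coordinate files confirmed as genuinely uncharacterized
    reassessable = 1465   # entries whose "unknown function" tag could be revisited

    total = true_unknown + reassessable
    print(total)                                  # 2549
    print(round(100 * true_unknown / total, 2))   # 42.53
    ```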

  9. Multimetallic nanoparticle catalysts with enhanced electrooxidation

    Science.gov (United States)

    Sun, Shouheng; Zhang, Sen; Zhu, Huiyuan; Guo, Shaojun

    2015-07-28

    A new structure-control strategy to optimize nanoparticle catalysis is provided. The presence of Au in FePtAu facilitates FePt structure transformation from chemically disordered face centered cubic (fcc) structure to chemically ordered face centered tetragonal (fct) structure, and further promotes formic acid oxidation reaction (FAOR). The fct-FePtAu nanoparticles show high CO poisoning resistance, achieve mass activity as high as about 2810 mA/mg Pt, and retain greater than 90% activity after a 13 hour stability test.

  10. Selective bond cleavage in potassium collisions with pyrimidine bases of DNA

    OpenAIRE

    Almeida, Diogo; Ferreira da Silva, F.; García, Gustavo; Limão-Vieira, P.

    2013-01-01

    Portuguese Foundation for Science and Technology (FCT-MEC) (SFRH/BD/61645/2009) FCT-MEC (PEst-OE/FIS/UI0068/2011) Spanish Ministerio de Economia y Competitividad (FIS 2009-10245; SFRH/BPD/68979/2010) Electron transfer in alkali-molecule collisions to gas phase thymine and uracil yielding H- formation is selectively controlled in the energy range between 5.3 and 66.1 eV. By tuning the collision energy, electron transfer from the alkali to partly deuterated thymine, methylated thymine at the...

  11. A comparative approach for the investigation of biological information processing: An examination of the structure and function of computer hard drives and DNA

    OpenAIRE

    D'Onofrio, David J; An, Gary

    2010-01-01

    Abstract Background The robust storage, updating and utilization of information are necessary for the maintenance and perpetuation of dynamic systems. These systems can exist as constructs of metal-oxide semiconductors and silicon, as in a digital computer, or in the "wetware" of organic compounds, proteins and nucleic acids that make up biological organisms. We propose that there are essential functional properties of centralized information-processing systems; for digital computers these pr...

  12. MACKLIB-IV: a library of nuclear response functions generated with the MACK-IV computer program from ENDF/B-IV

    International Nuclear Information System (INIS)

    Gohar, Y.; Abdou, M.A.

    1978-03-01

    MACKLIB-IV employs the CTR energy group structure of 171 neutron groups and 36 gamma groups. A retrieval computer program is included with the library to permit collapsing into any other energy group structure. The library is in the new format of the "MACK-Activity Table", which uses a fixed position for each specific response function. This permits the user, when employing the library with present transport codes, to obtain directly the nuclear responses (e.g. the total nuclear heating) summed over all isotopes and integrated over any geometrical volume. The response functions included in the library are the neutron kerma factor, gamma kerma factor, gas-production and tritium-breeding functions, and all important reaction cross sections. Pertinent information about the library and a graphical display of six response functions for all materials in the library are given.
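
    A group collapse of the kind the retrieval program performs is, in its simplest form, a flux-weighted average over fine groups. The weighting scheme below is a standard textbook choice assumed for illustration, not necessarily what the MACK retrieval code actually does:

    ```python
    import numpy as np

    def collapse_groups(fine_response, fine_flux, group_map, n_coarse):
        """Collapse a fine-group response function into a coarser structure
        with flux weighting: R_G = sum_g(R_g * phi_g) / sum_g(phi_g), g in G."""
        r = np.asarray(fine_response, dtype=float)
        phi = np.asarray(fine_flux, dtype=float)
        gmap = np.asarray(group_map)
        coarse = np.zeros(n_coarse)
        for G in range(n_coarse):
            sel = gmap == G
            coarse[G] = np.sum(r[sel] * phi[sel]) / np.sum(phi[sel])
        return coarse

    # 4 fine groups collapsed into 2 coarse groups.
    print(collapse_groups([2.0, 4.0, 1.0, 3.0],   # fine-group response
                          [1.0, 1.0, 3.0, 1.0],   # assumed weighting flux
                          [0, 0, 1, 1], 2))       # [3.  1.5]
    ```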

  13. Computer-aided training sensorimotor cortex functions in humans before the upper limb transplantation using virtual reality and sensory feedback.

    Science.gov (United States)

    Kurzynski, Marek; Jaskolska, Anna; Marusiak, Jaroslaw; Wolczowski, Andrzej; Bierut, Przemyslaw; Szumowski, Lukasz; Witkowski, Jerzy; Kisiel-Sajewicz, Katarzyna

    2017-08-01

    One of the biggest problems of upper limb transplantation is the lack of certainty as to whether a patient will be able to control voluntary movements of the transplanted hands. Based on findings of recent research on brain cortex plasticity, a premise can be drawn that mental training supported with visual and sensory feedback can cause structural and functional reorganization of the sensorimotor cortex, which leads to recovery of the function associated with the control of movements performed by the upper limbs. In this study the authors, based on the above observations, propose the computer-aided training (CAT) system, which, by generating visual and sensory stimuli, should enhance the effectiveness of mental training applied to humans before upper limb transplantation. The basis for the concept of the computer-aided training system is a virtual hand whose reaching and grasping movements the trained patient can observe on the VR headset screen (visual feedback) and whose contact with virtual objects the patient can feel as a touch (sensory feedback). The computer training system is composed of three main components: (1) the system generating the 3D virtual world in which the patient sees the virtual limb from the perspective as if it were his/her own hand; (2) sensory feedback transforming information about the interaction of the virtual hand with the grasped object into mechanical vibration; (3) the therapist's panel for controlling the training course. Results of the case study demonstrate that mental training supported with visual and sensory stimuli generated by the computer system leads to a beneficial change of the brain activity related to motor control of reaching in a patient with bilateral upper limb congenital transverse deficiency. Copyright © 2017 Elsevier Ltd. All rights reserved.

  14. Computing and cognition in future power plant operations

    International Nuclear Information System (INIS)

    Kisner, R.A.; Sheridan, T.B.

    1983-01-01

    The intent of this paper is to speculate on the nature of future interactions between people and computers in the operation of power plants. In particular, the authors offer a taxonomy for examining the differing functions of operators in interacting with the plant and its computers, and the differing functions of the computers in interacting with the plant and its operators

  16. The 6th International Conference on Computer Science and Computational Mathematics (ICCSCM 2017)

    Science.gov (United States)

    2017-09-01

    The ICCSCM 2017 (The 6th International Conference on Computer Science and Computational Mathematics) has aimed to provide a platform to discuss computer science and mathematics related issues including Algebraic Geometry, Algebraic Topology, Approximation Theory, Calculus of Variations, Category Theory; Homological Algebra, Coding Theory, Combinatorics, Control Theory, Cryptology, Geometry, Difference and Functional Equations, Discrete Mathematics, Dynamical Systems and Ergodic Theory, Field Theory and Polynomials, Fluid Mechanics and Solid Mechanics, Fourier Analysis, Functional Analysis, Functions of a Complex Variable, Fuzzy Mathematics, Game Theory, General Algebraic Systems, Graph Theory, Group Theory and Generalizations, Image Processing, Signal Processing and Tomography, Information Fusion, Integral Equations, Lattices, Algebraic Structures, Linear and Multilinear Algebra; Matrix Theory, Mathematical Biology and Other Natural Sciences, Mathematical Economics and Financial Mathematics, Mathematical Physics, Measure Theory and Integration, Neutrosophic Mathematics, Number Theory, Numerical Analysis, Operations Research, Optimization, Operator Theory, Ordinary and Partial Differential Equations, Potential Theory, Real Functions, Rings and Algebras, Statistical Mechanics, Structure Of Matter, Topological Groups, Wavelets and Wavelet Transforms, 3G/4G Network Evolutions, Ad-Hoc, Mobile, Wireless Networks and Mobile Computing, Agent Computing & Multi-Agents Systems, All topics related Image/Signal Processing, Any topics related Computer Networks, Any topics related ISO SC-27 and SC- 17 standards, Any topics related PKI(Public Key Intrastructures), Artifial Intelligences(A.I.) & Pattern/Image Recognitions, Authentication/Authorization Issues, Biometric authentication and algorithms, CDMA/GSM Communication Protocols, Combinatorics, Graph Theory, and Analysis of Algorithms, Cryptography and Foundation of Computer Security, Data Base(D.B.) Management & Information

  17. GammaCHI: a package for the inversion and computation of the gamma and chi-square cumulative distribution functions (central and noncentral)

    NARCIS (Netherlands)

    A. Gil (Amparo); J. Segura (Javier); N.M. Temme (Nico)

    2015-01-01

    A Fortran 90 module GammaCHI for computing and inverting the gamma and chi-square cumulative distribution functions (central and noncentral) is presented. The main novelty of this package is the reliable and accurate inversion routines for the noncentral cumulative distribution functions.
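
    GammaCHI itself is a Fortran 90 package. For the central case only, the regularized lower incomplete gamma function P(a, x) and its inverse can be sketched in pure Python with a power series plus bisection. This is a slow stand-in to illustrate the quantities involved, not the package's algorithms:

    ```python
    import math

    def gammap(a, x, terms=1000):
        """Regularized lower incomplete gamma P(a, x) via its power series:
        P(a, x) = x^a e^{-x} / Gamma(a) * sum_n x^n / (a (a+1) ... (a+n))."""
        if x <= 0:
            return 0.0
        total, term = 0.0, 1.0 / a
        for n in range(1, terms):
            total += term
            term *= x / (a + n)
        return total * math.exp(a * math.log(x) - x - math.lgamma(a))

    def gammap_inv(a, p, lo=0.0, hi=100.0, iters=200):
        """Invert P(a, x) = p in x by bisection (central gamma/chi-square
        quantile only; the noncentral case is not handled by this sketch)."""
        for _ in range(iters):
            mid = 0.5 * (lo + hi)
            if gammap(a, mid) < p:
                lo = mid
            else:
                hi = mid
        return 0.5 * (lo + hi)

    x = gammap_inv(2.5, 0.7)
    print(x, gammap(2.5, x))   # round trip recovers p = 0.7
    ```

    Since the chi-square CDF with ν degrees of freedom equals P(ν/2, x/2), the same pair of routines also inverts the central chi-square distribution.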

  18. Quantitative computed tomography for the prediction of pulmonary function after lung cancer surgery: a simple method using simulation software.

    Science.gov (United States)

    Ueda, Kazuhiro; Tanaka, Toshiki; Li, Tao-Sheng; Tanaka, Nobuyuki; Hamano, Kimikazu

    2009-03-01

    The prediction of pulmonary functional reserve is mandatory in therapeutic decision-making for patients with resectable lung cancer, especially those with underlying lung disease. Volumetric analysis in combination with densitometric analysis of the affected lung lobe or segment with quantitative computed tomography (CT) helps to identify residual pulmonary function, although the utility of this modality needs investigation. The subjects of this prospective study were 30 patients with resectable lung cancer. A three-dimensional CT lung model was created with voxels representing normal lung attenuation (-600 to -910 Hounsfield units). Residual pulmonary function was predicted by drawing a boundary line between the lung to be preserved and that to be resected, directly on the lung model. The predicted values were correlated with the postoperative measured values. The predicted and measured values corresponded well (r=0.89, p …). … lung cancer surgery and helped to identify patients whose functional reserves are likely to be underestimated. Hence, this modality should be utilized for patients with marginal pulmonary function.
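
    The prediction scheme described (scaling the preoperative value by the retained fraction of normal-attenuation voxels, here -910 to -600 HU as stated in the abstract) can be sketched as follows. The helper function and toy data are purely illustrative, not the study's software:

    ```python
    import numpy as np

    def predicted_postop_fev1(fev1_pre, hu_lung, hu_resected):
        """Scale preoperative FEV1 by the fraction of 'functional' voxels
        (normal attenuation, -910 to -600 HU) remaining after resection."""
        def functional(voxels):
            v = np.asarray(voxels)
            return int(((v >= -910) & (v <= -600)).sum())
        total = functional(hu_lung)            # whole lung
        kept = total - functional(hu_resected) # functional voxels preserved
        return fev1_pre * kept / total

    # Toy data: 10 of 15 voxels are functional; resection removes 2 of them.
    whole = [-800] * 10 + [-400] * 5
    resected = [-800, -800, -400]
    print(predicted_postop_fev1(2.0, whole, resected))  # 1.6 (litres)
    ```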

  19. Secure Two-Party Computation with Low Communication

    DEFF Research Database (Denmark)

    Damgård, Ivan Bjerre; Faust, Sebastian; Hazay, Carmit

    2011-01-01

    We propose a 2-party UC-secure computation protocol that can compute any function securely. The protocol requires only two messages, communication that is poly-logarithmic in the size of the circuit description of the function, and the workload for one of the parties is also only poly-logarithmic...

  20. Simulation of quantum computers

    NARCIS (Netherlands)

    De Raedt, H; Michielsen, K; Hams, AH; Miyashita, S; Saito, K; Landau, DP; Lewis, SP; Schuttler, HB

    2001-01-01

    We describe a simulation approach to study the functioning of Quantum Computer hardware. The latter is modeled by a collection of interacting spin-1/2 objects. The time evolution of this spin system maps one-to-one to a quantum program carried out by the Quantum Computer. Our simulation software
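
    The spin-1/2 state-vector picture can be sketched in a few lines: a gate is a 2x2 unitary applied to one tensor factor of the n-qubit state. This is a generic simulator sketch, not the authors' software:

    ```python
    import numpy as np

    # Single-qubit gates as 2x2 unitaries acting on a spin-1/2 state vector.
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard
    X = np.array([[0.0, 1.0], [1.0, 0.0]])         # spin flip (NOT)

    def apply(gate, state, target, n_qubits):
        """Apply a one-qubit gate to the `target` axis of an n-qubit state."""
        psi = np.asarray(state).reshape([2] * n_qubits)
        psi = np.moveaxis(psi, target, 0)
        psi = np.tensordot(gate, psi, axes=([1], [0]))
        return np.moveaxis(psi, 0, target).reshape(-1)

    # |0> -H-> (|0> + |1>)/sqrt(2): equal measurement probabilities.
    state = apply(H, np.array([1.0, 0.0]), 0, 1)
    print(np.abs(state) ** 2)                       # [0.5 0.5]

    # Two qubits: flipping qubit 1 of |00> gives |01>.
    print(apply(X, np.array([1.0, 0.0, 0.0, 0.0]), 1, 2))  # [0. 1. 0. 0.]
    ```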

  1. Simulation of quantum computers

    NARCIS (Netherlands)

    Raedt, H. De; Michielsen, K.; Hams, A.H.; Miyashita, S.; Saito, K.

    2000-01-01

    We describe a simulation approach to study the functioning of Quantum Computer hardware. The latter is modeled by a collection of interacting spin-1/2 objects. The time evolution of this spin system maps one-to-one to a quantum program carried out by the Quantum Computer. Our simulation software

  2. Cloud Computing-An Ultimate Technique to Minimize Computing cost for Developing Countries

    OpenAIRE

    Narendra Kumar; Shikha Jain

    2012-01-01

    The presented paper deals with how remotely managed computing and IT resources can be beneficial in developing countries like India and the Asian subcontinent. This paper not only defines the architectures and functionalities of cloud computing but also strongly indicates the current demand for cloud computing to achieve organizational and personal levels of IT support at very minimal cost with high flexibility. The power of the cloud can be used to reduce the cost of IT - r...

  3. Comparison and selection of client computer in nuclear instrument

    International Nuclear Information System (INIS)

    Ma Guizhen; Xie Yanhui; Peng Jing; Xu Feiyan

    2012-01-01

    Modern nuclear instruments provide many functions and place high demands on information processing. Through close matching of the host computer and the client computer, the data-processing functions can be carried out. This article puts forward several options for the client computer of a general-purpose nuclear instrument. The functions and features of several common client computers, such as FPGA, ARM and DSP, are analyzed and compared, and their scopes of application are discussed. Using a practical design as an example, the selection criteria for the client computer are described. This article can serve as a reference for the hardware design of the data acquisition and processing unit in a nuclear instrument. (authors)

  4. Five-year clinical and functional multislice computed tomography angiographic results after coronary implantation of the fully resorbable polymeric everolimus-eluting scaffold in patients with de novo coronary artery disease

    DEFF Research Database (Denmark)

    Onuma, Yoshinobu; Dudek, Dariusz; Thuesen, Leif

    2013-01-01

This study sought to demonstrate the 5-year clinical and functional multislice computed tomography angiographic results after implantation of the fully resorbable everolimus-eluting scaffold (Absorb BVS, Abbott Vascular, Santa Clara, California).

  5. Technical Note. The Concept of a Computer System for Interpretation of Tight Rocks Using X-Ray Computed Tomography Results

    Directory of Open Access Journals (Sweden)

    Habrat Magdalena

    2017-03-01

Full Text Available The article presents the concept of a computer system for interpreting unconventional oil and gas deposits using X-ray computed tomography results, and describes the functional principles of the proposed solution. The main goal is to design a complex and useful tool in the form of specialist computer software for the qualitative and quantitative interpretation of images obtained from X-ray computed tomography, devoted to the prospecting and identification of unconventional hydrocarbon deposits. The article focuses on the use of X-ray computed tomography as a basis for the analysis of tight rocks, especially the functional principles of the system to be developed by the authors. These principles cover the graphical visualization of rock structure, a model for visualizing rock samples, qualitative and quantitative interpretation, and a description of the parameters handled by the quantitative-interpretation module.

  6. Ordinateur et communication (Computer and Communication).

    Science.gov (United States)

    Mangenot, Francois

    1994-01-01

    Because use of computers in second-language classrooms may tend to decrease interpersonal interaction, and therefore communication, ways to promote interaction are offered. These include small group computer projects, and suggestions are made for use with various computer functions and features: tutorials, word processing, voice recording,…

  7. Structure problems in the analog computation

    International Nuclear Information System (INIS)

    Braffort, P.L.

    1957-01-01

Recent mathematical developments have shown the importance of elementary structures (algebraic, topological, etc.) underlying the great domains of classical analysis. Such structures in analog computation are put in evidence and possible developments of applied mathematics are discussed. The topological structures of the standard representation of analog schemes, such as addition triangles, integrators, phase inverters and function generators, are also studied. The analog method yields only functions of the variable time as the results of its computations, but the course of a computation, for systems including reactive circuits, introduces order structures which are called 'chronological'. Finally, it is shown that the approximation methods of ordinary numerical and digital computation present the same structures as analog computation. This structure analysis permits fruitful comparisons between the several domains of applied mathematics and suggests important new domains of application for the analog method. (M.P.)

  8. Computation at the edge of chaos: Phase transition and emergent computation

    International Nuclear Information System (INIS)

    Langton, C.

    1990-01-01

    In order for computation to emerge spontaneously and become an important factor in the dynamics of a system, the material substrate must support the primitive functions required for computation: the transmission, storage, and modification of information. Under what conditions might we expect physical systems to support such computational primitives? This paper presents research on Cellular Automata which suggests that the optimal conditions for the support of information transmission, storage, and modification, are achieved in the vicinity of a phase transition. We observe surprising similarities between the behaviors of computations and systems near phase-transitions, finding analogs of computational complexity classes and the Halting problem within the phenomenology of phase-transitions. We conclude that there is a fundamental connection between computation and phase-transitions, and discuss some of the implications for our understanding of nature if such a connection is borne out. 31 refs., 16 figs

  9. Basic principles of computers

    International Nuclear Information System (INIS)

    Royal, H.D.; Parker, J.A.; Holmen, B.L.

    1988-01-01

    This chapter presents preliminary concepts of computer operations. It describes the hardware used in a nuclear medicine computer system. It discusses the software necessary for acquisition and analysis of nuclear medicine studies. The chapter outlines the integrated package of hardware and software that is necessary to perform specific functions in nuclear medicine

  10. A Robust Optimization Based Energy-Aware Virtual Network Function Placement Proposal for Small Cell 5G Networks with Mobile Edge Computing Capabilities

    OpenAIRE

    Blanco, Bego; Taboada, Ianire; Fajardo, Jose Oscar; Liberal, Fidel

    2017-01-01

    In the context of cloud-enabled 5G radio access networks with network function virtualization capabilities, we focus on the virtual network function placement problem for a multitenant cluster of small cells that provide mobile edge computing services. Under an emerging distributed network architecture and hardware infrastructure, we employ cloud-enabled small cells that integrate microservers for virtualization execution, equipped with additional hardware appliances. We develop an energy-awa...

  11. Present SLAC accelerator computer control system features

    International Nuclear Information System (INIS)

    Davidson, V.; Johnson, R.

    1981-02-01

    The current functional organization and state of software development of the computer control system of the Stanford Linear Accelerator is described. Included is a discussion of the distribution of functions throughout the system, the local controller features, and currently implemented features of the touch panel portion of the system. The functional use of our triplex of PDP11-34 computers sharing common memory is described. Also included is a description of the use of pseudopanel tables as data tables for closed loop control functions

  12. COPD phenotypes on computed tomography and its correlation with selected lung function variables in severe patients

    Directory of Open Access Journals (Sweden)

    da Silva SMD

    2016-03-01

Full Text Available Silvia Maria Doria da Silva, Ilma Aparecida Paschoal, Eduardo Mello De Capitani, Marcos Mello Moreira, Luciana Campanatti Palhares, Mônica Corso Pereira. Pneumology Service, Department of Internal Medicine, School of Medical Sciences, State University of Campinas (UNICAMP), Campinas, São Paulo, Brazil. Background: Computed tomography (CT) phenotypic characterization helps in understanding the clinical diversity of chronic obstructive pulmonary disease (COPD) patients, but its clinical relevance and its relationship with functional features are not clarified. Volumetric capnography (VC) uses the principle of gas washout and analyzes the pattern of CO2 elimination as a function of expired volume. The main variables analyzed were end-tidal concentration of carbon dioxide (ETCO2), slope of phase 2 (Slp2), and slope of phase 3 (Slp3) of the capnogram, the curve which represents the total amount of CO2 eliminated by the lungs during each breath. Objective: To investigate, in a group of patients with severe COPD, if the phenotypic analysis by CT could identify different subsets of patients, and if there was an association of CT findings and functional variables. Subjects and methods: Sixty-five patients with COPD Gold III–IV were admitted for clinical evaluation, high-resolution CT, and functional evaluation (spirometry, 6-minute walk test [6MWT], and VC). The presence and profusion of tomography findings were evaluated, and later, the patients were identified as having an emphysema (EMP) or airway disease (AWD) phenotype. EMP and AWD groups were compared; tomography finding scores were evaluated versus spirometric, 6MWT, and VC variables. Results: Bronchiectasis was found in 33.8% and peribronchial thickening in 69.2% of the 65 patients. Structural findings of the airways had no significant correlation with spirometric variables. Air trapping and EMP were strongly correlated with VC variables, but in opposite directions. There was some overlap between the EMP and AWD

  13. Technical and functional analysis of Spanish windmills: 3D modeling, computational-fluid-dynamics simulation and finite-element analysis

    International Nuclear Information System (INIS)

    Rojas-Sola, José Ignacio; Bouza-Rodríguez, José Benito; Menéndez-Díaz, Agustín

    2016-01-01

    Highlights: • Technical and functional analysis of the two typologies of windmills in Spain. • Spatial distribution of velocities and pressures by computational-fluid dynamics (CFD). • Finite-element analysis (FEA) of the rotors of these two types of windmills. • Validation of the operative functionality of these windmills. - Abstract: A detailed study has been made of the two typologies of windmills in Spain, specifically the rectangular-bladed type, represented by the windmill ‘Sardinero’, located near the town of Campo de Criptana (Ciudad Real province, Spain) and the type with triangular sails (lateens), represented by the windmill ‘San Francisco’, in the town of Vejer de la Frontera (Cádiz province, Spain). For this, an ad hoc research methodology has been applied on the basis of three aspects: three-dimensional geometric modeling, analysis by computational-fluid dynamics (CFD), and finite-element analysis (FEA). The results found with the CFD technique show the correct functioning of the two windmills in relation to the spatial distribution of the wind velocities and pressures to which each is normally exposed (4–7 m/s in the case of ‘Sardinero’, and 5–11 for ‘San Francisco’), thereby validating the operative functionality of both types. In addition, as a result of the FEA, the spatial distribution of stresses on the rotor has revealed that the greatest concentrations of these occurs in the teeth of the head wheel in ‘Sardinero’, reaching a value of 12 MPa, and at the base of the masts in the case of the ‘San Francisco’, with a value of 24 MPa. Also, this analysis evidences that simple, effective designs to reinforce the masts absorb a great concentration of stresses that would otherwise cause breakage. Furthermore, it was confirmed that the oak wood from which the rotors were made functioned properly, as the windmill never exceeded the maximum admissible working stress, demonstrating the effectiveness of the materials

  14. Rayleigh radiance computations for satellite remote sensing: accounting for the effect of sensor spectral response function.

    Science.gov (United States)

    Wang, Menghua

    2016-05-30

    To understand and assess the effect of the sensor spectral response function (SRF) on the accuracy of the top of the atmosphere (TOA) Rayleigh-scattering radiance computation, new TOA Rayleigh radiance lookup tables (LUTs) over global oceans and inland waters have been generated. The new Rayleigh LUTs include spectral coverage of 335-2555 nm, all possible solar-sensor geometries, and surface wind speeds of 0-30 m/s. Using the new Rayleigh LUTs, the sensor SRF effect on the accuracy of the TOA Rayleigh radiance computation has been evaluated for spectral bands of the Visible Infrared Imaging Radiometer Suite (VIIRS) on the Suomi National Polar-orbiting Partnership (SNPP) satellite and the Joint Polar Satellite System (JPSS)-1, showing some important uncertainties for VIIRS-SNPP particularly for large solar- and/or sensor-zenith angles as well as for large Rayleigh optical thicknesses (i.e., short wavelengths) and bands with broad spectral bandwidths. To accurately account for the sensor SRF effect, a new correction algorithm has been developed for VIIRS spectral bands, which improves the TOA Rayleigh radiance accuracy to ~0.01% even for the large solar-zenith angles of 70°-80°, compared with the error of ~0.7% without applying the correction for the VIIRS-SNPP 410 nm band. The same methodology that accounts for the sensor SRF effect on the Rayleigh radiance computation can be used for other satellite sensors. In addition, with the new Rayleigh LUTs, the effect of surface atmospheric pressure variation on the TOA Rayleigh radiance computation can be calculated precisely, and no specific atmospheric pressure correction algorithm is needed. 
There are some other important applications and advantages to using the new Rayleigh LUTs for satellite remote sensing, including an efficient and accurate TOA Rayleigh radiance computation for hyperspectral satellite remote sensing, detector-based TOA Rayleigh radiance computation, Rayleigh radiance calculations for high altitude
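The SRF effect discussed in this record amounts to band-averaging the spectral radiance with the sensor's response function as a weight. A minimal sketch of that weighting (function name hypothetical; a uniform wavelength grid is assumed, so the integrals reduce to weighted sums):

```python
import numpy as np

def band_average(radiance, srf):
    """SRF-weighted band average of spectral radiance.

    Both arrays are assumed to be sampled on the same uniform
    wavelength grid, so the integral ratio reduces to a weighted mean.
    """
    return float(np.sum(radiance * srf) / np.sum(srf))
```

Comparing the band average against the radiance evaluated at the band's nominal center wavelength gives the size of the SRF correction for that band.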

  15. Advanced computer-based training

    Energy Technology Data Exchange (ETDEWEB)

    Fischer, H D; Martin, H D

    1987-05-01

The paper presents new techniques of computer-based training for personnel of nuclear power plants. Training on full-scope simulators is further increased by use of dedicated computer-based equipment. An interactive communication system runs on a personal computer linked to a video disc; a part-task simulator runs on 32 bit process computers and shows two versions: as functional trainer or as on-line predictor with an interactive learning system (OPAL), which may be well-tailored to a specific nuclear power plant. The common goal of both developments is the optimization of the cost-benefit ratio for training and equipment.

  16. Advanced computer-based training

    International Nuclear Information System (INIS)

    Fischer, H.D.; Martin, H.D.

    1987-01-01

The paper presents new techniques of computer-based training for personnel of nuclear power plants. Training on full-scope simulators is further increased by use of dedicated computer-based equipment. An interactive communication system runs on a personal computer linked to a video disc; a part-task simulator runs on 32 bit process computers and shows two versions: as functional trainer or as on-line predictor with an interactive learning system (OPAL), which may be well-tailored to a specific nuclear power plant. The common goal of both developments is the optimization of the cost-benefit ratio for training and equipment. (orig.) [de

  17. Mirror neurons and imitation: a computationally guided review.

    Science.gov (United States)

    Oztop, Erhan; Kawato, Mitsuo; Arbib, Michael

    2006-04-01

    Neurophysiology reveals the properties of individual mirror neurons in the macaque while brain imaging reveals the presence of 'mirror systems' (not individual neurons) in the human. Current conceptual models attribute high level functions such as action understanding, imitation, and language to mirror neurons. However, only the first of these three functions is well-developed in monkeys. We thus distinguish current opinions (conceptual models) on mirror neuron function from more detailed computational models. We assess the strengths and weaknesses of current computational models in addressing the data and speculations on mirror neurons (macaque) and mirror systems (human). In particular, our mirror neuron system (MNS), mental state inference (MSI) and modular selection and identification for control (MOSAIC) models are analyzed in more detail. Conceptual models often overlook the computational requirements for posited functions, while too many computational models adopt the erroneous hypothesis that mirror neurons are interchangeable with imitation ability. Our meta-analysis underlines the gap between conceptual and computational models and points out the research effort required from both sides to reduce this gap.

  18. Progress and challenges in the computational prediction of gene function using networks [v1; ref status: indexed, http://f1000r.es/SqmJUM

    Directory of Open Access Journals (Sweden)

    Paul Pavlidis

    2012-09-01

    Full Text Available In this opinion piece, we attempt to unify recent arguments we have made that serious confounds affect the use of network data to predict and characterize gene function. The development of computational approaches to determine gene function is a major strand of computational genomics research. However, progress beyond using BLAST to transfer annotations has been surprisingly slow. We have previously argued that a large part of the reported success in using "guilt by association" in network data is due to the tendency of methods to simply assign new functions to already well-annotated genes. While such predictions will tend to be correct, they are generic; it is true, but not very helpful, that a gene with many functions is more likely to have any function. We have also presented evidence that much of the remaining performance in cross-validation cannot be usefully generalized to new predictions, making progressive improvement in analysis difficult to engineer. Here we summarize our findings about how these problems will affect network analysis, discuss some ongoing responses within the field to these issues, and consolidate some recommendations and speculation, which we hope will modestly increase the reliability and specificity of gene function prediction.

  19. A computer-assisted test for the electrophysiological and psychophysical measurement of dynamic visual function based on motion contrast.

    Science.gov (United States)

    Wist, E R; Ehrenstein, W H; Schrauf, M; Schraus, M

    1998-03-13

A new test is described that allows for electrophysiological and psychophysical measurement of visual function based on motion contrast. In a computer-generated random-dot display, completely camouflaged Landolt rings become visible only when dots within the target area are moved briefly while those of the background remain stationary. Thus, detection of contours and the location of the gap in the ring rely on motion contrast (form-from-motion) instead of luminance contrast. A standard version of this test has been used to assess visual performance in relation to age, in screening professional groups (truck drivers) and in clinical groups (glaucoma patients). Aside from this standard version, the computer program easily allows for various modifications. These include the option of a synchronizing trigger signal to allow for recording of time-locked motion-onset visual-evoked responses, the reversal of target and background motion, and the displacement of random-dot targets across stationary backgrounds. In all instances, task difficulty is manipulated by changing the percentage of moving dots within the target (or background). The present test offers a short, convenient method to probe dynamic visual functions relying on suprathreshold motion-contrast stimuli and complements other routine tests of form, contrast, depth, and color vision.

20. Proteins of Unknown Function in the Protein Data Bank (PDB): An Inventory of True Uncharacterized Proteins and Computational Tools for Their Analysis

    Directory of Open Access Journals (Sweden)

    Nurul Nadzirin

    2012-10-01

Full Text Available Proteins of uncharacterized function form a large part of many currently available biological databases, and this situation exists even in the Protein Data Bank (PDB). Our analysis of recent PDB data revealed that only 42.53% of the PDB entries categorized under “unknown function” (1084 coordinate files) are true examples of proteins of unknown function at this point in time. The remaining 1465 entries also annotated as such appear to be able to have their annotations re-assessed, based on the availability of direct functional characterization experiments for the protein itself, or for homologous sequences or structures, thus enabling computational function inference.

  1. An Applet-based Anonymous Distributed Computing System.

    Science.gov (United States)

    Finkel, David; Wills, Craig E.; Ciaraldi, Michael J.; Amorin, Kevin; Covati, Adam; Lee, Michael

    2001-01-01

    Defines anonymous distributed computing systems and focuses on the specifics of a Java, applet-based approach for large-scale, anonymous, distributed computing on the Internet. Explains the possibility of a large number of computers participating in a single computation and describes a test of the functionality of the system. (Author/LRW)

  2. COMPUTING

    CERN Multimedia

    M. Kasemann

Introduction A large fraction of the effort was focused during the last period on the preparation and monitoring of the February tests of Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads, and identify possible bottlenecks. Many tests are scheduled together with other VOs, allowing the full scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  3. Computing Cumulative Interest and Principal Paid For a Calendar Year

    OpenAIRE

    John O. MASON

    2011-01-01

This paper demonstrates how easy it is to use Microsoft Excel’s CUMPRINC and CUMIPMT functions to compute principal and interest paid for an entire year, even though the payments were made monthly. The CUMPRINC function computes the principal paid by a series of loan payments; the CUMIPMT function computes the interest paid. These two functions provide an alternative to preparing a monthly loan amortization schedule and adding up the amounts of monthly interest paid and principal paid for the ye...
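The same cumulative figures can be computed without a spreadsheet. A minimal Python sketch (function name hypothetical) of what CUMIPMT and CUMPRINC add up for a fixed-rate, fully amortizing loan:

```python
def yearly_interest_and_principal(principal, annual_rate, n_months, year):
    """Sum the interest and principal portions of the monthly payments
    falling in a given year (year 1 = payments 1..12) of a fixed-rate loan."""
    r = annual_rate / 12.0
    # fixed monthly payment for a fully amortizing loan
    payment = principal * r / (1.0 - (1.0 + r) ** -n_months)
    balance = principal
    interest_paid = principal_paid = 0.0
    for month in range(1, n_months + 1):
        interest = balance * r          # interest accrued this month
        reduction = payment - interest  # remainder retires principal
        balance -= reduction
        if (month - 1) // 12 + 1 == year:
            interest_paid += interest
            principal_paid += reduction
    return interest_paid, principal_paid
```

Summing the principal portions over every year of the loan recovers the original principal, which is a convenient sanity check on the schedule.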

  4. Computational methods for reversed-field equilibrium

    International Nuclear Information System (INIS)

    Boyd, J.K.; Auerbach, S.P.; Willmann, P.A.; Berk, H.L.; McNamara, B.

    1980-01-01

    Investigating the temporal evolution of reversed-field equilibrium caused by transport processes requires the solution of the Grad-Shafranov equation and computation of field-line-averaged quantities. The technique for field-line averaging and the computation of the Grad-Shafranov equation are presented. Application of Green's function to specify the Grad-Shafranov equation boundary condition is discussed. Hill's vortex formulas used to verify certain computations are detailed. Use of computer software to implement computational methods is described
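For reference, the Grad-Shafranov equation solved in such equilibrium computations has the standard axisymmetric form (standard notation, not quoted from the record), where ψ is the poloidal flux, p(ψ) the pressure, and F(ψ) = RB_φ:

```latex
\Delta^{*}\psi \;\equiv\; R\,\frac{\partial}{\partial R}\!\left(\frac{1}{R}\,\frac{\partial\psi}{\partial R}\right) + \frac{\partial^{2}\psi}{\partial Z^{2}} \;=\; -\,\mu_{0}R^{2}\,\frac{dp}{d\psi} \;-\; F\,\frac{dF}{d\psi}
```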

  5. The Effect of Functional Roles on Group Efficiency : Using Multilevel Modeling and Content Analysis to Investigate Computer-Supported Collaboration in Small Groups

    NARCIS (Netherlands)

    Strijbos, J.W.; Martens, R.L.; Jochems, W.M.G.; Broers, N.J.

    2004-01-01

The usefulness of roles for supporting small-group performance is often asserted; however, their effect is rarely assessed empirically. This article reports the effects of functional roles on group performance, efficiency, and collaboration during computer-supported collaborative learning. A comparison

  6. A functional RG equation for the c-function

    DEFF Research Database (Denmark)

    Codello, A.; D'Odorico, G.; Pagani, C.

    2014-01-01

..., local potential approximation and loop expansion. In each case we construct the relative approximate c-function and find it to be consistent with Zamolodchikov's c-theorem. Finally, we present a relation between the c-function and the (matter induced) beta function of Newton's constant, allowing us to use heat kernel techniques to compute the RG running of the c-function.

  7. Efficiently outsourcing multiparty computation under multiple keys

    NARCIS (Netherlands)

    Peter, Andreas; Tews, Erik; Tews, Erik; Katzenbeisser, Stefan

    2013-01-01

    Secure multiparty computation enables a set of users to evaluate certain functionalities on their respective inputs while keeping these inputs encrypted throughout the computation. In many applications, however, outsourcing these computations to an untrusted server is desirable, so that the server

  8. A History of Computer Numerical Control.

    Science.gov (United States)

    Haggen, Gilbert L.

    Computer numerical control (CNC) has evolved from the first significant counting method--the abacus. Babbage had perhaps the greatest impact on the development of modern day computers with his analytical engine. Hollerith's functioning machine with punched cards was used in tabulating the 1890 U.S. Census. In order for computers to become a…

  9. Michael Levitt and Computational Biology

    Science.gov (United States)

Michael Levitt, PhD, professor of structural biology at the Stanford University School of Medicine ... Levitt's early work pioneered computational structural biology, which helped to predict

  10. Theoretical computer science and the natural sciences

    Science.gov (United States)

    Marchal, Bruno

    2005-12-01

I present some fundamental theorems in computer science and illustrate their relevance in Biology and Physics. I do not assume prerequisites in mathematics or computer science beyond the set N of natural numbers, functions from N to N, the use of some notational conveniences to describe functions, and at some point, a minimal amount of linear algebra and logic. I start with Cantor's transcendental proof by diagonalization of the non-enumerability of the collection of functions from natural numbers to the natural numbers. I explain why this proof is not entirely convincing and show how, by restricting the notion of function in terms of discrete well-defined processes, we are led to the non-algorithmic enumerability of the computable functions, but also, through Church's thesis, to the algorithmic enumerability of partial computable functions. Such a notion of function constitutes, with respect to our purpose, a crucial generalization of that concept. This will make it easy to justify deep and astonishing (counter-intuitive) incompleteness results about computers and similar machines. The modified Cantor diagonalization will provide a theory of concrete self-reference, and I illustrate it by pointing toward an elementary theory of self-reproduction (in the Amoeba's way) and cellular self-regeneration (in the flatworm Planaria's way). To make it easier, I introduce a very simple and powerful formal system known as the Schoenfinkel-Curry combinators. I will use the combinators to illustrate in a more concrete way the notion introduced above. The combinators, thanks to their low-level fine-grained design, will also make it possible to give a rough but hopefully illuminating description of the main lessons gained by the careful observation of nature, and to describe some new relations, which should exist between computer science, the science of life and the science of inert matter, once some philosophical, if not theological, hypotheses are made in the cognitive sciences.
In the
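Cantor's diagonal construction mentioned in this record is short enough to state as code; a sketch over a finite enumeration of functions (illustrative only, names hypothetical):

```python
def diagonalize(fs):
    """Cantor's diagonal argument: given an enumeration fs of functions
    from N to N, return a function guaranteed to differ from fs[n] at n."""
    return lambda n: fs[n](n) + 1
```

Applied to any claimed complete enumeration of total functions, the result differs from the n-th function at input n, which is the contradiction driving the non-enumerability proof; for partial computable functions the construction fails precisely because fs[n](n) may never halt.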

  11. Digital computer structure and design

    CERN Document Server

    Townsend, R

    2014-01-01

Digital Computer Structure and Design, Second Edition discusses switching theory, counters, sequential circuits, number representation, and arithmetic functions. The book also describes computer memories, the processor, data flow system of the processor, the processor control system, and the input-output system. Switching theory, which is purely a mathematical concept, centers on the properties of interconnected networks of "gates." The theory deals with binary functions of 1 and 0 which can change instantaneously from one to the other without intermediate values. The binary number system is

  12. On reversible Turing machines and their function universality

    DEFF Research Database (Denmark)

    Axelsen, Holger Bock; Glück, Robert

    2016-01-01

We provide a treatment of the reversible Turing machines (RTMs) under a strict function semantics. Unlike many existing reversible computation models, we distinguish strictly between computing the function λx.f(x) and computing the function λx.(x, f(x))...
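The distinction drawn in this record between computing λx.f(x) and computing λx.(x, f(x)) can be made concrete: keeping the input alongside the output turns any function into an injective, hence reversible, map. A small illustrative sketch (names hypothetical):

```python
def embed(f):
    """Lift f to the map x -> (x, f(x)): retaining the input makes the
    computation injective (reversible) even when f itself is not."""
    return lambda x: (x, f(x))

def is_injective(g, domain):
    """Check injectivity of g over a finite domain."""
    images = [g(x) for x in domain]
    return len(images) == len(set(images))
```

The strict function semantics in the paper is stronger: an RTM computing f directly must realize λx.f(x) itself as an injection, not merely the embedded λx.(x, f(x)).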

  13. Ensemble-based computational approach discriminates functional activity of p53 cancer and rescue mutants.

    Directory of Open Access Journals (Sweden)

    Özlem Demir

    2011-10-01

Full Text Available The tumor suppressor protein p53 can lose its function upon single-point missense mutations in the core DNA-binding domain ("cancer mutants"). Activity can be restored by second-site suppressor mutations ("rescue mutants"). This paper relates the functional activity of p53 cancer and rescue mutants to their overall molecular dynamics (MD), without focusing on local structural details. A novel global measure of protein flexibility for the p53 core DNA-binding domain, the number of clusters at a certain RMSD cutoff, was computed by clustering over 0.7 µs of explicitly solvated all-atom MD simulations. For wild-type p53 and a sample of p53 cancer or rescue mutants, the number of clusters was a good predictor of in vivo p53 functional activity in cell-based assays. This number-of-clusters (NOC) metric was strongly correlated (r² = 0.77) with reported values of experimentally measured ΔΔG protein thermodynamic stability. Interpreting the number of clusters as a measure of protein flexibility: (i) p53 cancer mutants were more flexible than wild-type protein, (ii) second-site rescue mutations decreased the flexibility of cancer mutants, and (iii) negative controls of non-rescue second-site mutants did not. This new method reflects the overall stability of the p53 core domain and can discriminate which second-site mutations restore activity to p53 cancer mutants.
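The number-of-clusters idea can be illustrated with a greedy leader-clustering sketch (not the authors' protocol; it assumes frames are already optimally superposed, whereas real RMSD clustering aligns each pair first):

```python
import numpy as np

def count_clusters(frames, cutoff):
    """Count clusters of MD frames: a frame joins the first cluster whose
    representative lies within `cutoff` RMSD, else it founds a new cluster.
    frames: (n_frames, n_atoms, 3) array of pre-aligned coordinates."""
    reps = []
    for x in frames:
        if not any(np.sqrt(((x - r) ** 2).sum(axis=1).mean()) <= cutoff
                   for r in reps):
            reps.append(x)
    return len(reps)
```

Under the paper's interpretation, a trajectory that visits more distinct conformations at a fixed cutoff yields more clusters, i.e. a more flexible protein.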

  14. A Generalized Approach to Equational Unification.

    Science.gov (United States)

    1985-08-01

"Interpreter Working with Infinite Terms," Technical Report FCT/UNL-20/82, Faculdade de Ciencias e Tecnologia, November 1982, Quinta da Torre, 2825... x, y, z. For readability, we will use the symbols +, -, and * as binary infix operators. Examples of terms are f(x, a), (x - 0), and y + 1. Given a...
... = x . (y . z) 2. x . y = y . x. The integer operations of plus and times are only two of the many examples of associative and commutative functions about which we
  15. Introduction Of Computational Materials Science

    International Nuclear Information System (INIS)

    Lee, Jun Geun

    2006-08-01

This book gives descriptions of computer simulation and computational materials science, and covers the three typical approaches of computational materials science: empirical methods such as molecular dynamics (potential energy, Newton's equation of motion, data production and analysis of results); quantum mechanical methods (the wave equation, approximations, the Hartree method, and density functional theory); and methods for dealing with solids, such as the pseudopotential method, tight-binding methods, the embedded atom method, the Car-Parrinello method, and combined simulation.

  16. Computational models of neuromodulation.

    Science.gov (United States)

    Fellous, J M; Linster, C

    1998-05-15

    Computational modeling of neural substrates provides an excellent theoretical framework for the understanding of the computational roles of neuromodulation. In this review, we illustrate, with a large number of modeling studies, the specific computations performed by neuromodulation in the context of various neural models of invertebrate and vertebrate preparations. We base our characterization of neuromodulations on their computational and functional roles rather than on anatomical or chemical criteria. We review the main framework in which neuromodulation has been studied theoretically (central pattern generation and oscillations, sensory processing, memory and information integration). Finally, we present a detailed mathematical overview of how neuromodulation has been implemented at the single cell and network levels in modeling studies. Overall, neuromodulation is found to increase and control computational complexity.

  17. Computer Programming Education with Miranda

    NARCIS (Netherlands)

    Joosten, S.M.M.; van den Berg, Klaas

    During the past four years, an experiment has been carried out with an introductory course in computer programming, based on functional programming. This article describes the background of this approach, the aim of the computer programming course, the outline and subject matter of the course parts

  18. Railgun bore material test results

    International Nuclear Information System (INIS)

    Wang, S.Y.; Burton, R.L.; Witherspoon, F.D.; Bloomberg, H.W.; Goldstein, S.A.; Tidman, D.A.; Winsor, N.K.

    1987-01-01

    GT-Devices, Inc. has constructed a material test facility (MTF) to study the fundamental heat transfer problem of both railgun and electrothermal guns, and to test candidate gun materials under real plasma conditions. The MTF electrothermally produces gigawatt-level plasmas with pulse lengths of 10-30 microseconds. Circular bore and non-circular bore test barrels have been successfully operated under a wide range of simulated heating environments for EM launchers. Diagnostics include piezoelectric MHz pressure probes, time-of-flight probes, and current and voltage probes. Ablation measurements are accomplished by weighing and optical inspection, including borescope, optical microscope, and scanning electron microscope (SEM). From these measurements the ablation threshold for both the rail and insulator materials can be determined as a function of plasma heating. The MTF diagnostics are supported by an unsteady 1-D model of the MTF which uses the flux-corrected transport (FCT) algorithm to solve the fluid equations in conservative form. A major advantage of the FCT algorithm is that it can model gas-dynamic shock behaviour without the need for numerical diffusion. The principal use of the code is to predict the material surface temperature ΔT/α from the unsteady heat transfer q(t).
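The flux-corrected transport technique the MTF code relies on can be sketched for 1-D linear advection. This is a minimal Boris-Book-style scheme on a periodic grid (an illustrative assumption, not the actual GT-Devices code): a diffusive low-order update plus a limited antidiffusive correction that keeps shocks sharp without spurious oscillations.

```python
import numpy as np

def fct_advect(u, c):
    """One periodic-grid step of 1-D linear advection with a minimal
    Boris-Book flux-corrected transport (FCT) scheme."""
    # Low-order (upwind) transport, assuming 0 < c < 1 and rightward flow:
    f_low = c * u                              # flux through face i+1/2
    u_td = u - (f_low - np.roll(f_low, 1))     # transported-diffused field

    # Antidiffusive flux that would restore the high-order (Lax-Wendroff) update
    f_anti = 0.5 * c * (1.0 - c) * (np.roll(u, -1) - u)

    # Flux limiter: clip the correction so no new extrema appear in u_td
    s = np.sign(f_anti)
    f_anti = s * np.maximum(0.0, np.minimum.reduce([
        np.abs(f_anti),
        s * (np.roll(u_td, -2) - np.roll(u_td, -1)),
        s * (u_td - np.roll(u_td, 1)),
    ]))
    return u_td - (f_anti - np.roll(f_anti, 1))

# Advect a square wave: mass is conserved and no over/undershoots appear
u = np.zeros(100)
u[40:60] = 1.0
for _ in range(50):
    u = fct_advect(u, 0.5)
print(round(u.sum(), 6))  # prints 20.0
```

Because both flux arrays telescope on the periodic grid, the scheme is exactly conservative, which is the "conservative form" property the abstract refers to.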

  19. Theory and computation of spheroidal wavefunctions

    International Nuclear Information System (INIS)

    Falloon, P E; Abbott, P C; Wang, J B

    2003-01-01

    In this paper we report on a package, written in the Mathematica computer algebra system, which has been developed to compute the spheroidal wavefunctions of Meixner and Schaefke (1954 Mathieusche Funktionen und Sphaeroidfunktionen) and is available online (physics.uwa.edu.au/~falloon/spheroidal/spheroidal.html). This package represents a substantial contribution to the existing software, since it computes the spheroidal wavefunctions to arbitrary precision for general complex parameters μ, ν, γ and argument z; existing software can only handle integer μ, ν and does not give arbitrary precision. The package also incorporates various special cases and computes analytic power series and asymptotic expansions in the parameter γ. The spheroidal wavefunctions of Flammer (1957 Spheroidal Wave functions) are included as a special case of Meixner's more general functions. This paper presents a concise review of the general theory of spheroidal wavefunctions and a description of the formulae and algorithms used in their computation, and gives high precision numerical examples
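For the special case the abstract mentions as already covered by existing software, namely integer μ, ν at machine precision, SciPy exposes spheroidal wave functions directly; a minimal illustration (assuming SciPy is available):

```python
from scipy.special import pro_ang1

# Prolate spheroidal angular function of the first kind S_mn(c, x) and its
# derivative, for integer m=0, n=1, spheroidal parameter c=1.0, at x=0.5.
# SciPy handles only integer m, n in double precision -- exactly the
# limitation the arbitrary-precision Mathematica package described above removes.
val, dval = pro_ang1(0, 1, 1.0, 0.5)
print(val, dval)
```

General complex parameters or high-precision work still require a package like the one described in the record.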

  20. Computing correct truncated excited state wavefunctions

    Science.gov (United States)

    Bacalis, N. C.; Xiong, Z.; Zang, J.; Karaoulanis, D.

    2016-12-01

    We demonstrate that, if a wave function's truncated expansion is small, then the standard excited states computational method, of optimizing one "root" of a secular equation, may lead to an incorrect wave function - despite the correct energy according to the theorem of Hylleraas, Undheim and McDonald - whereas our proposed method [J. Comput. Meth. Sci. Eng. 8, 277 (2008)] (independent of orthogonality to lower lying approximants) leads to correct reliable small truncated wave functions. The demonstration is done in He excited states, using truncated series expansions in Hylleraas coordinates, as well as standard configuration-interaction truncated expansions.

  1. Computer aided optimum design of rubble-mound breakwater cross-sections : Manual of the RUMBA computer package, release 1

    NARCIS (Netherlands)

    De Haan, W.

    1989-01-01

    The computation of the optimum rubble-mound breakwater cross-section is executed on a micro-computer. The RUMBA computer package consists of two main parts: the optimization process is executed by a Turbo Pascal programme, while the second part consists of editing functions written in AutoLISP. AutoLISP is

  2. Computing Z-top

    International Nuclear Information System (INIS)

    Kashani-Poor, A.K.

    2014-01-01

    The topological string presents an arena in which many features of string theory proper, such as the interplay between world-sheet and target space descriptions or open-closed duality, can be distilled into computational techniques which yield results beyond perturbation theory. In this thesis, I will summarize my research activity in this area. The presentation is organized around computations of the topological string partition function Z-top based on various perspectives on the topological string. (author)

  3. Computer code MLCOSP for multiple-correlation and spectrum analysis with a hybrid computer

    International Nuclear Information System (INIS)

    Oguma, Ritsuo; Fujii, Yoshio; Usui, Hozumi; Watanabe, Koichi

    1975-10-01

    Usage of the computer code MLCOSP (Multiple Correlation and Spectrum) is described for the hybrid computer installed in JAERI. Functions of the hybrid computer and its terminal devices are utilized ingeniously in the code to reduce the complexity of the data handling that occurs in the analysis of multivariable experimental data and to perform the analysis in perspective. Features of the code are as follows: experimental data can be fed to the digital computer through the analog part of the hybrid computer by connecting a data recorder; the computed results are displayed in figures, and hardcopies are taken when necessary; messages from the code are shown on the terminal, so man-machine communication is possible; and, further, data can be entered through a keyboard, so case studies according to the results of analysis are possible. (auth.)

  4. Secure computing on reconfigurable systems

    OpenAIRE

    Fernandes Chaves, R.J.

    2007-01-01

    This thesis proposes a Secure Computing Module (SCM) for reconfigurable computing systems. The SCM provides a protected and reliable computational environment, where data security and protection against malicious attacks on the system are assured. The SCM is strongly based on encryption algorithms and on attestation of the executed functions. The use of the SCM on reconfigurable devices has the advantage of being highly adaptable to the application and the user requirements, while providing high performa...

  5. Application of invariant plane strain (IPS) theory to {gamma} hydride formation in dilute Zr-Nb alloys

    Energy Technology Data Exchange (ETDEWEB)

    Srivastava, D. [Materials Science Division, Bhabha Atomic Research Centre, Mumbai 400085, Maharashtra (India)]. E-mail: dsrivastavabarc@yahoo.co.in; Neogy, S. [Materials Science Division, Bhabha Atomic Research Centre, Mumbai 400085, Maharashtra (India); Dey, G.K. [Materials Science Division, Bhabha Atomic Research Centre, Mumbai 400085, Maharashtra (India); Banerjee, S. [Materials Science Division, Bhabha Atomic Research Centre, Mumbai 400085, Maharashtra (India); Ranganathan, S. [Materials Science Division, Bhabha Atomic Research Centre, Mumbai 400085, Maharashtra (India)

    2005-04-25

    The crystallographic aspects associated with the formation of the {gamma} hydride phase (fct) from the {alpha} (hcp) phase and the {beta} (bcc) phase in Zr-Nb alloys have been studied in two distinct situations, viz., in the {alpha} matrix in pure Zr and Zr-2.5Nb and in the {beta} matrix in {beta}-stabilized Zr-20Nb alloy. The {beta}-{gamma} formation can be treated primarily as a simple shear on the basal plane involving a change in the stacking sequence. A possible mechanism for the {alpha}-{gamma} transformation is presented in this paper, and the {beta}->{gamma} transformation is considered in terms of invariant plane strain (IPS) theory in order to predict the crystallographic features of the {gamma} hydride formed. The lattice invariant shear (LIS) (110){sub {beta}}[1-bar 10]{sub {beta}}||(111){sub {gamma}}[12-bar 1]{sub {gamma}} has been considered, and the crystallographic parameters associated with the bcc->fct transformation, such as the habit plane and the magnitudes of the LIS and the shape strain, have been computed. The predictions made in the present analysis have been compared with experimentally observed habit planes. The {alpha}/{gamma} and {beta}/{gamma} interfaces have been examined by the high resolution transmission electron microscopy (HRTEM) technique for comparison with the interfaces observed in martensitic transformations.

  6. Computer-based quantitative computed tomography image analysis in idiopathic pulmonary fibrosis: A mini review.

    Science.gov (United States)

    Ohkubo, Hirotsugu; Nakagawa, Hiroaki; Niimi, Akio

    2018-01-01

    Idiopathic pulmonary fibrosis (IPF) is the most common type of progressive idiopathic interstitial pneumonia in adults. Computer-based image analysis methods for chest computed tomography (CT) in patients with IPF include the mean CT value of the whole lungs, density histogram analysis, the density mask technique, and texture classification methods. Most of these methods offer good assessment of pulmonary function, disease progression, and mortality. Each method has merits that can be used in clinical practice. One of the texture classification methods is reported to be superior to visual CT scoring by radiologists for correlation with pulmonary function and prediction of mortality. In this mini review, we summarize the current literature on computer-based CT image analysis of IPF and discuss its limitations and several future directions. Copyright © 2017 The Japanese Respiratory Society. Published by Elsevier B.V. All rights reserved.
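Two of the simplest methods named here, the mean CT value of the lungs and the density mask, reduce to histogram statistics over segmented lung voxels in Hounsfield units. A toy numpy sketch with synthetic data (the distributions and the -500 HU threshold are illustrative assumptions, not clinical values):

```python
import numpy as np

# Synthetic stand-in for segmented lung voxels in Hounsfield units (HU);
# a real analysis would first segment the lungs from a DICOM CT volume.
rng = np.random.default_rng(1)
normal_lung = rng.normal(-850, 40, size=5000)   # aerated parenchyma
fibrosis = rng.normal(-300, 100, size=1000)     # denser fibrotic tissue
lung_hu = np.concatenate([normal_lung, fibrosis])

mean_ct = lung_hu.mean()                        # mean CT value of the lungs
# Density mask: fraction of voxels above an illustrative -500 HU threshold
fibrotic_fraction = (lung_hu > -500).mean()
print(round(mean_ct, 1), round(fibrotic_fraction, 3))
```

Texture classification methods go further by classifying local patches rather than single-voxel densities, which is why they track visual scoring more closely.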

  7. Development of a computer-adaptive physical function instrument for Social Security Administration disability determination.

    Science.gov (United States)

    Ni, Pengsheng; McDonough, Christine M; Jette, Alan M; Bogusz, Kara; Marfeo, Elizabeth E; Rasch, Elizabeth K; Brandt, Diane E; Meterko, Mark; Haley, Stephen M; Chan, Leighton

    2013-09-01

    To develop and test an instrument to assess physical function for Social Security Administration (SSA) disability programs, the SSA-Physical Function (SSA-PF) instrument. Item response theory (IRT) analyses were used to (1) create a calibrated item bank for each of the factors identified in prior factor analyses, (2) assess the fit of the items within each scale, (3) develop separate computer-adaptive testing (CAT) instruments for each scale, and (4) conduct initial psychometric testing. Design: cross-sectional data collection; IRT analyses; CAT simulation. Setting: telephone and Internet survey. Participants: two samples, SSA claimants (n=1017) and adults from the U.S. general population (n=999). Interventions: none. Main outcome measures: model fit statistics, correlation, and reliability coefficients. IRT analyses resulted in 5 unidimensional SSA-PF scales: Changing & Maintaining Body Position, Whole Body Mobility, Upper Body Function, Upper Extremity Fine Motor, and Wheelchair Mobility, for a total of 102 items. High CAT accuracy was demonstrated by strong correlations between simulated CAT scores and those from the full item banks. On comparing the simulated CATs with the full item banks, very little loss of reliability or precision was noted, except at the lower and upper ranges of each scale. No difference in response patterns by age or sex was noted. The distributions of claimant scores were shifted to the lower end of each scale compared with those of a sample of U.S. adults. The SSA-PF instrument contributes important new methodology for measuring the physical function of adults applying to the SSA disability programs. Initial evaluation revealed that the SSA-PF instrument achieved considerable breadth of coverage in each content domain and demonstrated noteworthy psychometric properties. Copyright © 2013 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
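The core loop of a computer-adaptive test such as the SSA-PF CAT is item selection by maximum Fisher information at the current ability estimate. A minimal 2PL sketch with a hypothetical five-item bank (the real instrument's item parameters are not reproduced here):

```python
import numpy as np

def p_correct(theta, a, b):
    """2PL IRT model: probability of endorsing an item at ability theta,
    given discriminations a and difficulties b."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

def item_information(theta, a, b):
    """Fisher information of each item at ability theta."""
    p = p_correct(theta, a, b)
    return a ** 2 * p * (1.0 - p)

def next_item(theta, a, b, administered):
    """Pick the unused item with maximum information at the current
    ability estimate -- the adaptive step of a CAT."""
    info = item_information(theta, a, b)
    info[list(administered)] = -np.inf      # exclude items already given
    return int(np.argmax(info))

# Hypothetical 5-item bank: discriminations a, difficulties b
a = np.array([1.0, 1.5, 2.0, 1.2, 0.8])
b = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
print(next_item(0.0, a, b, administered={2}))  # prints 1
```

After each response the ability estimate is updated (e.g. by maximum likelihood) and the loop repeats, which is why a short CAT can approach the reliability of the full 102-item bank.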

  8. Requirements for SSC central computing staffing (conceptual)

    International Nuclear Information System (INIS)

    Pfister, J.

    1985-01-01

    Given a computation center with ~10,000 MIPS supporting ~1,000 users, what are the staffing requirements? The attempt in this paper is to list the functions and staff size required in a central computing or centrally supported computing complex. The organization assumes that although considerable computing power would exist (mostly for online use) in the four interaction regions (IRs), there are functions/capabilities better performed outside the IRs, in this model at a ''central computing facility.'' What follows is one staffing approach, not necessarily optimal, with certain assumptions about the numbers of computer systems, media, networks and system controls; that is, one would get the best technology available. Thus, it is speculation about what the technology may bring and what it takes to operate it. From an end-user support standpoint it is less clear, given the geography of an SSC, where the consulting support should be located and what it should look like

  9. Chronic hypersensitivity pneumonitis: high resolution computed tomography patterns and pulmonary function indices as prognostic determinants

    International Nuclear Information System (INIS)

    Walsh, Simon L.F.; Devaraj, Anand; Hansell, David M.; Sverzellati, Nicola; Wells, Athol U.

    2012-01-01

    To investigate high resolution computed tomography (HRCT) and pulmonary function indices (PFTs) for determining prognosis in patients with chronic fibrotic hypersensitivity pneumonitis (CHP). Case records, PFTs (FEV1, FVC and DLco) and HRCTs of ninety-two patients with chronic hypersensitivity pneumonitis were evaluated. HRCT studies were scored by two observers for total disease extent, ground-glass opacification, fine and coarse reticulation, microcystic and macrocystic honeycombing, centrilobular emphysema and consolidation. Traction bronchiectasis within each pattern was graded. Using Cox proportional hazards regression models the prognostic strength of individual HRCT patterns and pulmonary function test variables were determined. There were forty-two deaths during the study period. Increasing severity of traction bronchiectasis was the strongest predictor of mortality (HR 1.10, P < 0.001, 95%CI 1.04-1.16). Increasing global interstitial disease extent (HR 1.02, P = 0.02, 95%CI 1.00-1.03), microcystic honeycombing (HR 1.09, P = 0.019, 95%CI 1.01-1.17) and macrocystic honeycombing (HR 1.06, P < 0.01, 95%CI 1.01-1.10) were also independent predictors of mortality. In contrast, no individual PFT variable was predictive of mortality once HRCT patterns were accounted for. HRCT patterns, in particular, severity of traction bronchiectasis and extent of honeycombing are superior to pulmonary function tests for predicting mortality in patients with CHP. (orig.)
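The Cox proportional hazards regression used in this study estimates log hazard ratios by maximizing a partial likelihood over risk sets. A self-contained numpy/scipy sketch on a tiny fabricated dataset (illustrative only; a real analysis would use a survival package and handle tied event times):

```python
import numpy as np
from scipy.optimize import minimize

def neg_log_partial_likelihood(beta, X, time, event):
    """Negative log Cox partial likelihood (no tied event times here)."""
    eta = X @ np.atleast_1d(beta)
    ll = 0.0
    for i in np.argsort(time):                 # walk failures in time order
        if event[i]:
            at_risk = time >= time[i]          # risk set at this failure
            ll += eta[i] - np.log(np.exp(eta[at_risk]).sum())
    return -ll

# Hypothetical data: one covariate (severity grade); higher grade tends to
# fail earlier, so the fitted log hazard ratio should come out positive.
X = np.array([[0.0], [1.0], [2.0], [3.0], [0.0], [2.0]])
time = np.array([10.0, 5.0, 7.0, 2.0, 12.0, 4.0])
event = np.array([1, 1, 1, 1, 0, 1], dtype=bool)   # 0 = censored

beta_hat = minimize(neg_log_partial_likelihood, x0=[0.0],
                    args=(X, time, event)).x[0]
print(beta_hat > 0.0)  # prints True
```

An HR of 1.10 per grade, as reported for traction bronchiectasis, corresponds to beta = ln(1.10) ≈ 0.095 in this parameterization.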

  10. Chronic hypersensitivity pneumonitis: high resolution computed tomography patterns and pulmonary function indices as prognostic determinants

    Energy Technology Data Exchange (ETDEWEB)

    Walsh, Simon L.F.; Devaraj, Anand; Hansell, David M. [Royal Brompton Hospital, Department of Radiology, London (United Kingdom); Sverzellati, Nicola [University of Parma, Department of Clinical Sciences, Section of Radiology, Parma (Italy); Wells, Athol U. [Royal Brompton Hospital, Interstitial Lung Diseases Unit, London (United Kingdom)

    2012-08-15

    To investigate high resolution computed tomography (HRCT) and pulmonary function indices (PFTs) for determining prognosis in patients with chronic fibrotic hypersensitivity pneumonitis (CHP). Case records, PFTs (FEV{sub 1}, FVC and DLco) and HRCTs of ninety-two patients with chronic hypersensitivity pneumonitis were evaluated. HRCT studies were scored by two observers for total disease extent, ground-glass opacification, fine and coarse reticulation, microcystic and macrocystic honeycombing, centrilobular emphysema and consolidation. Traction bronchiectasis within each pattern was graded. Using Cox proportional hazards regression models the prognostic strength of individual HRCT patterns and pulmonary function test variables were determined. There were forty-two deaths during the study period. Increasing severity of traction bronchiectasis was the strongest predictor of mortality (HR 1.10, P < 0.001, 95%CI 1.04-1.16). Increasing global interstitial disease extent (HR 1.02, P = 0.02, 95%CI 1.00-1.03), microcystic honeycombing (HR 1.09, P = 0.019, 95%CI 1.01-1.17) and macrocystic honeycombing (HR 1.06, P < 0.01, 95%CI 1.01-1.10) were also independent predictors of mortality. In contrast, no individual PFT variable was predictive of mortality once HRCT patterns were accounted for. HRCT patterns, in particular, severity of traction bronchiectasis and extent of honeycombing are superior to pulmonary function tests for predicting mortality in patients with CHP. (orig.)

  11. The NASA computer science research program plan

    Science.gov (United States)

    1983-01-01

    A taxonomy of computer science is included, and the state of the art of each of the major computer science categories is summarized. A functional breakdown of NASA programs under aeronautics R and D, space R and T, and institutional support is also included. These areas were assessed against the computer science categories. Concurrent processing, highly reliable computing, and information management are identified.

  12. Management Needs for Computer Support.

    Science.gov (United States)

    Irby, Alice J.

    University management has many and varied needs for effective computer services in support of their processing and information functions. The challenge for the computer center managers is to better understand these needs and assist in the development of effective and timely solutions. Management needs can range from accounting and payroll to…

  13. Sensitivity to Social Contingency in Adults with High-Functioning Autism during Computer-Mediated Embodied Interaction.

    Science.gov (United States)

    Zapata-Fonseca, Leonardo; Froese, Tom; Schilbach, Leonhard; Vogeley, Kai; Timmermans, Bert

    2018-02-08

    Autism Spectrum Disorder (ASD) can be understood as a social interaction disorder. This makes the emerging "second-person approach" to social cognition a more promising framework for studying ASD than classical approaches focusing on mindreading capacities in detached, observer-based arrangements. According to the second-person approach, embodied, perceptual, and embedded or interactive capabilities are also required for understanding others, and these are hypothesized to be compromised in ASD. We therefore recorded the dynamics of real-time sensorimotor interaction in pairs of control participants and participants with High-Functioning Autism (HFA), using the minimalistic human-computer interface paradigm known as "perceptual crossing" (PC). We investigated whether HFA is associated with impaired detection of social contingency, i.e., a reduced sensitivity to the other's responsiveness to one's own behavior. Surprisingly, our analysis reveals that, at least under the conditions of this highly simplified, computer-mediated, embodied form of social interaction, people with HFA perform equally well as controls. This finding supports the increasing use of virtual reality interfaces for helping people with ASD to better compensate for their social disabilities. Further dynamical analyses are necessary for a better understanding of the mechanisms that are leading to the somewhat surprising results here obtained.

  14. Sensitivity to Social Contingency in Adults with High-Functioning Autism during Computer-Mediated Embodied Interaction

    Directory of Open Access Journals (Sweden)

    Leonardo Zapata-Fonseca

    2018-02-01

    Full Text Available Autism Spectrum Disorder (ASD) can be understood as a social interaction disorder. This makes the emerging “second-person approach” to social cognition a more promising framework for studying ASD than classical approaches focusing on mindreading capacities in detached, observer-based arrangements. According to the second-person approach, embodied, perceptual, and embedded or interactive capabilities are also required for understanding others, and these are hypothesized to be compromised in ASD. We therefore recorded the dynamics of real-time sensorimotor interaction in pairs of control participants and participants with High-Functioning Autism (HFA), using the minimalistic human-computer interface paradigm known as “perceptual crossing” (PC). We investigated whether HFA is associated with impaired detection of social contingency, i.e., a reduced sensitivity to the other’s responsiveness to one’s own behavior. Surprisingly, our analysis reveals that, at least under the conditions of this highly simplified, computer-mediated, embodied form of social interaction, people with HFA perform equally well as controls. This finding supports the increasing use of virtual reality interfaces for helping people with ASD to better compensate for their social disabilities. Further dynamical analyses are necessary for a better understanding of the mechanisms that are leading to the somewhat surprising results here obtained.

  15. Computer-aided system design

    Science.gov (United States)

    Walker, Carrie K.

    1991-01-01

    A technique has been developed for combining features of a systems architecture design and assessment tool and a software development tool. This technique reduces simulation development time and expands simulation detail. The Architecture Design and Assessment System (ADAS), developed at the Research Triangle Institute, is a set of computer-assisted engineering tools for the design and analysis of computer systems. The ADAS system is based on directed graph concepts and supports the synthesis and analysis of software algorithms mapped to candidate hardware implementations. Greater simulation detail is provided by the ADAS functional simulator. With the functional simulator, programs written in either Ada or C can be used to provide a detailed description of graph nodes. A Computer-Aided Software Engineering tool developed at the Charles Stark Draper Laboratory (CSDL CASE) automatically generates Ada or C code from engineering block diagram specifications designed with an interactive graphical interface. A technique to use the tools together has been developed, which further automates the design process.

  16. Safety Metrics for Human-Computer Controlled Systems

    Science.gov (United States)

    Leveson, Nancy G; Hatanaka, Iwao

    2000-01-01

    The rapid growth of computer technology and innovation has played a significant role in the rise of computer automation of human tasks in modern production systems across all industries. Although the rationale for automation has been to eliminate "human error" or to relieve humans from manual repetitive tasks, various computer-related hazards and accidents have emerged as a direct result of increased system complexity attributed to computer automation. The risk assessment techniques utilized for electromechanical systems are not suitable for today's software-intensive systems or complex human-computer controlled systems. This thesis will propose a new systemic model-based framework for analyzing risk in safety-critical systems where both computers and humans are controlling safety-critical functions. A new systems accident model will be developed based upon modern systems theory and human cognitive processes to better characterize system accidents, the role of human operators, and the influence of software in its direct control of significant system functions. Better risk assessments will then be achievable through the application of this new framework to complex human-computer controlled systems.

  17. Optimising Job-Shop Functions Utilising the Score-Function Method

    DEFF Research Database (Denmark)

    Nielsen, Erland Hejn

    2000-01-01

    During the last 1-2 decades, simulation optimisation of discrete event dynamic systems (DEDS) has made considerable theoretical progress with respect to computational efficiency. The score-function (SF) method and the infinitesimal perturbation analysis (IPA) are two candidates belonging to this ... of a Job-Shop can be handled by the SF method.

  18. Computer-assisted spectral design and synthesis

    Science.gov (United States)

    Vadakkumpadan, Fijoy; Wang, Qiqi; Sun, Yinlong

    2005-01-01

    In this paper, we propose a computer-assisted approach for spectral design and synthesis. This approach starts with some initial spectrum, modifies it interactively, evaluates the change, and decides on the optimal spectrum. Given a requested change as a function of wavelength, we model the change using a Gaussian function. Under a metameric constraint, we propose a method to generate, from the Gaussian function of the requested change, a change function such that the resulting spectrum has the same color as the initial spectrum. We have tested the proposed method with different initial spectra and change functions, and implemented an interactive graphics environment for spectral design and synthesis. The proposed approach and graphics implementation can be helpful for a number of applications such as the lighting of building interiors, textile coloration, and pigment development for automobile paints, as well as spectral computer graphics.
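One standard way to satisfy a metameric constraint (an assumption for illustration, not necessarily the paper's exact construction) is to project the requested Gaussian change onto the "metameric black" subspace, the null space of the 3 x N color-matching matrix, so the modified spectrum keeps the same tristimulus values:

```python
import numpy as np

wavelengths = np.linspace(400, 700, 31)        # nm, 10 nm sampling

def gauss(mu, sigma):
    return np.exp(-0.5 * ((wavelengths - mu) / sigma) ** 2)

# Hypothetical smooth stand-ins for the three color-matching functions;
# a real system would use tabulated CIE 1931 data instead.
A = np.vstack([gauss(600, 40), gauss(550, 40), gauss(450, 30)])   # 3 x N

spectrum = 0.5 + 0.3 * gauss(520, 60)          # initial spectrum
delta = 0.2 * gauss(580, 20)                   # requested Gaussian change

# Project the change onto the metameric-black space (null space of A),
# so adding it leaves the tristimulus values A @ spectrum unchanged.
P = A.T @ np.linalg.solve(A @ A.T, A)          # projector onto row space of A
delta_black = delta - P @ delta
new_spectrum = spectrum + delta_black

print(np.allclose(A @ new_spectrum, A @ spectrum))  # prints True
```

The spectrum still changes shape (delta_black is nonzero), but its color under the assumed observer is exactly preserved.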

  19. Machine Learning Classification to Identify the Stage of Brain-Computer Interface Therapy for Stroke Rehabilitation Using Functional Connectivity

    Directory of Open Access Journals (Sweden)

    Rosaleena Mohanty

    2018-05-01

    Full Text Available Interventional therapy using brain-computer interface (BCI) technology has shown promise in facilitating motor recovery in stroke survivors; however, the impact of this form of intervention on functional networks outside of the motor network specifically is not well understood. Here, we investigated resting-state functional connectivity (rs-FC) in stroke participants undergoing BCI therapy across stages, namely pre- and post-intervention, to identify discriminative functional changes using a machine learning classifier with the goal of categorizing participants into one of the two therapy stages. Twenty chronic stroke participants with persistent upper-extremity motor impairment received neuromodulatory training using a closed-loop neurofeedback BCI device, and resting-state functional MRI (rs-fMRI) scans were collected at four time points: pre-, mid-, post-, and 1 month post-therapy. To evaluate the peak effects of this intervention, rs-FC was analyzed from two specific stages, namely pre- and post-therapy. In total, 236 seeds spanning both motor and non-motor regions of the brain were computed at each stage. A univariate feature selection was applied to reduce the number of features, followed by a principal component-based data transformation used by a linear binary support vector machine (SVM) classifier to classify each participant into a therapy stage. The SVM classifier achieved a cross-validation accuracy of 92.5% using a leave-one-out method. Outside of the motor network, seeds from the fronto-parietal task control, default mode, subcortical, and visual networks emerged as important contributors to the classification. Furthermore, a higher number of functional changes were observed to be strengthening from the pre- to post-therapy stage than the ones weakening, both of which involved motor and non-motor regions of the brain. These findings may provide new evidence to support the potential clinical utility of BCI therapy as a form of stroke
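The analysis pipeline described here, univariate feature selection, a principal-component transform, a linear SVM, and leave-one-out cross-validation, can be sketched with scikit-learn on synthetic stand-in data (the data, seed value, and all pipeline parameters below are illustrative assumptions, not the study's):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

# Synthetic stand-in: 20 "participants" x 236 seed features, two classes
# (pre- vs post-therapy), with a class effect planted on a few features.
rng = np.random.default_rng(42)
X = rng.normal(size=(20, 236))
y = np.repeat([0, 1], 10)
X[y == 1, :5] += 2.0                   # hypothetical therapy-stage effect

clf = make_pipeline(
    SelectKBest(f_classif, k=20),      # univariate feature selection
    PCA(n_components=5),               # principal-component transformation
    SVC(kernel="linear"),              # linear binary SVM
)
acc = cross_val_score(clf, X, y, cv=LeaveOneOut()).mean()
print(acc)
```

Wrapping the selection and PCA steps inside the cross-validated pipeline, rather than fitting them on all 20 participants first, is what keeps the leave-one-out accuracy estimate honest.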

  20. Using Amazon's Elastic Compute Cloud to scale CMS' compute hardware dynamically.

    CERN Document Server

    Melo, Andrew Malone

    2011-01-01

    Large international scientific collaborations such as the Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider have traditionally addressed their data reduction and analysis needs by building and maintaining dedicated computational infrastructure. Emerging cloud-computing services such as Amazon's Elastic Compute Cloud (EC2) offer short-term CPU and storage resources with costs based on usage. These services allow experiments to purchase computing resources as needed, without significant prior planning and without long-term investments in facilities and their management. We have demonstrated that services such as EC2 can successfully be integrated into the production-computing model of CMS, and find that they work very well as worker nodes. The cost structure and transient nature of EC2 services make them inappropriate for some CMS production services and functions. We also found that the resources are not truly on-demand, as limits and caps on usage are imposed. Our trial workflows allow us t...