WorldWideScience

Sample records for fct functional computed

  1. FCT (functional computed tomography) evaluation of lung volumes at different PEEP (positive end-expiratory pressure) ventilation patterns in mechanically ventilated patients

    Papi, M.G.; Di Segni, R.; Mazzetti, G.; Staffa, F. [Dept. of Radiology, S. Giovanni HS, Rome (Italy); Conforto, F.; Calimici, R.; Salvi, A. [Dept. of Anesthesiology, S. Giovanni HS, Rome (Italy); Matteucci, G. [Dept. of Pneumology, S. Giovanni HS, Rome (Italy)

    2007-06-15

    Purpose: To evaluate with FCT (functional computed tomography) total lung volume and fractional lung volumes at different PEEP (positive end-expiratory pressure) values in acutely ill, mechanically ventilated patients. Methods: Nine ICU (intensive care unit) patients (1 lung pneumonia, 2 polytrauma, 2 sepsis, 3 brain surgery, 1 pulmonary embolism); mean age 48 ± 15 years, 6 male, 3 female. GE 16-slice MDCT scanning was performed, with acquisition from apex to diaphragm in seven seconds at different PEEP values. Raw CT data were analysed on an Advantage workstation to obtain volume density masks and histograms of both lungs together and of each lung, applying these density ranges: -1000 to -950 HU = hyperventilated lung; -900 to -650 HU = well-aerated lung; -950 to -500 HU = all aerated lung; -500 to +200 HU = lung tissue. Total and fractional lung volumes and Hounsfield units (HU) were calculated and compared at different PEEP values (0, 5, 10, 15 cm H2O). In four patients lung volumes were compared between the more and the less involved lung at increased PEEP. Statistical analysis: comparison of means and medians. Results: Data calculated at a PEEP of 5 cm H2O showed an unexpected decrease of total lung volume and an increase of lung density (HU), with proportionally no significant improvement of oxygenation. (orig.)
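
    The density-mask analysis above amounts to thresholding voxel HU values and counting voxels. A minimal sketch of that step (assuming numpy; the array, names and call are illustrative, not the workstation software used in the study):

      import numpy as np

      def lung_volume_fractions(hu, voxel_ml):
          """Bucket lung voxels by the HU ranges quoted in the record."""
          masks = {
              "hyperventilated": (hu >= -1000) & (hu < -950),
              "well_aerated":    (hu >= -900)  & (hu < -650),
              "all_aerated":     (hu >= -950)  & (hu < -500),
              "lung_tissue":     (hu >= -500)  & (hu <= 200),
          }
          volumes_ml = {k: float(m.sum()) * voxel_ml for k, m in masks.items()}
          return volumes_ml, hu.size * voxel_ml  # fractional volumes, total volume

      # Illustrative call with random data standing in for a segmented lung:
      hu = np.random.uniform(-1000.0, 200.0, size=(64, 64, 64))
      fractions, total_ml = lung_volume_fractions(hu, voxel_ml=0.001)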

  2. Numerical computation of FCT equilibria by inverse equilibrium method

    Tokuda, Shinji; Tsunematsu, Toshihide; Takeda, Tatsuoki

    1986-11-01

    FCT (Flux Conserving Tokamak) equilibria were obtained numerically by the inverse equilibrium method. The high-beta tokamak ordering was used to obtain explicit boundary conditions for FCT equilibria. The partial differential equation was reduced to simultaneous quasi-linear ordinary differential equations using the moment method. The regularity conditions for solutions at the singular point of the equations can be expressed correctly by this reduction, and the problem to be solved becomes a tractable boundary value problem for the quasi-linear ordinary differential equations. This boundary value problem was solved by quasi-linearization, one of the shooting methods. Test calculations show that this method provides high-beta tokamak equilibria with sufficiently high accuracy for MHD stability analysis. (author)
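
    The shooting idea behind quasi-linearization can be illustrated on a generic two-point boundary value problem rather than the FCT equilibrium equations themselves (a sketch assuming scipy; the test problem y'' = 1.5 y^2, y(0) = 4, y(1) = 1 has the exact solution y = 4/(1+x)^2 with initial slope -8):

      import numpy as np
      from scipy.integrate import solve_ivp
      from scipy.optimize import brentq

      def rhs(x, y):
          # y[0] = y, y[1] = y'
          return [y[1], 1.5 * y[0] ** 2]

      def endpoint_miss(slope):
          """Integrate from x = 0 with a trial slope; return y(1) - 1."""
          sol = solve_ivp(rhs, (0.0, 1.0), [4.0, slope], rtol=1e-10, atol=1e-12)
          return sol.y[0, -1] - 1.0

      slope = brentq(endpoint_miss, -20.0, -5.0)  # converges to -8
      print(slope)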

  3. A systematic review of Functional Communication Training (FCT) interventions involving augmentative and alternative communication in school settings.

    Walker, Virginia L; Lyon, Kristin J; Loman, Sheldon L; Sennott, Samuel

    2018-06-01

    The purpose of this meta-analysis was to summarize single-case intervention studies in which Functional Communication Training (FCT) involving augmentative and alternative communication (AAC) was implemented in school settings. Overall, the findings suggest that FCT involving AAC was effective in reducing challenging behaviour and promoting aided or unaided AAC use among participants with disabilities. FCT was more effective for participants who engaged in less severe forms of challenging behaviour prior to intervention. Additionally, FCT was more effective when informed by a descriptive functional behaviour assessment and delivered within inclusive school settings. Implications for practice and directions for future research related to FCT for students who use AAC are addressed.

  4. Assembly mechanism of FCT region type 1 pili in serotype M6 Streptococcus pyogenes.

    Nakata, Masanobu; Kimura, Keiji Richard; Sumitomo, Tomoko; Wada, Satoshi; Sugauchi, Akinari; Oiki, Eiji; Higashino, Miharu; Kreikemeyer, Bernd; Podbielski, Andreas; Okahashi, Nobuo; Hamada, Shigeyuki; Isoda, Ryutaro; Terao, Yutaka; Kawabata, Shigetada

    2011-10-28

    The human pathogen Streptococcus pyogenes produces diverse pili depending on the serotype. We investigated the assembly mechanism of FCT type 1 pili in a serotype M6 strain. The pili were found to be assembled from two precursor proteins, the backbone protein T6 and ancillary protein FctX, and anchored to the cell wall in a manner that requires both a housekeeping sortase enzyme (SrtA) and pilus-associated sortase enzyme (SrtB). SrtB is primarily required for efficient formation of the T6 and FctX complex and subsequent polymerization of T6, whereas proper anchoring of the pili to the cell wall is mainly mediated by SrtA. Because motifs essential for polymerization of pilus backbone proteins in other Gram-positive bacteria are not present in T6, we sought to identify the functional residues involved in this process. Our results showed that T6 encompasses the novel VAKS pilin motif conserved in streptococcal T6 homologues and that the lysine residue (Lys-175) within the motif and cell wall sorting signal of T6 are prerequisites for isopeptide linkage of T6 molecules. Because Lys-175 and the cell wall sorting signal of FctX are indispensable for substantial incorporation of FctX into the T6 pilus shaft, FctX is suggested to be located at the pilus tip, which was also implied by immunogold electron microscopy findings. Thus, the elaborate assembly of FCT type 1 pili is potentially organized by sortase-mediated cross-linking between sorting signals and the amino group of Lys-175 positioned in the VAKS motif of T6, thereby displaying T6 and FctX in a temporospatial manner.

  5. Purification, crystallization and preliminary crystallographic analysis of the minor pilin FctB from Streptococcus pyogenes

    Linke, Christian; Young, Paul G.; Kang, Hae Joo; Proft, Thomas; Baker, Edward N.

    2010-01-01

    The minor pilin FctB from S. pyogenes strain 90/306S was expressed in E. coli, purified and crystallized. The hexagonal FctB crystals diffracted to 2.9 Å resolution. The minor pilin FctB is an integral part of the pilus assembly expressed by Streptococcus pyogenes. Since it is located at the cell wall, it can be hypothesized that it functions as a cell-wall anchor for the streptococcal pilus. In order to elucidate its structure, the genes for FctB from the S. pyogenes strains 90/306S and SF370 were cloned for overexpression in Escherichia coli. FctB from strain 90/306S was crystallized by the sitting-drop vapour-diffusion method using sodium citrate as a precipitant. The hexagonal FctB crystals belonged to space group P6₁ or P6₅, with unit-cell parameters a = b = 95.15, c = 100.25 Å, and diffracted to 2.9 Å resolution.

  6. Effects of Functional Communication Training (FCT) on the Communicative, Self-Initiated Toileting Behavior for Students with Developmental Disabilities in a School Setting

    Kim, Jinnie

    2012-01-01

    Far less is known about the effects of functional communication-based toileting interventions for students with developmental disabilities in a school setting. Furthermore, the currently available toileting interventions for students with disabilities include some undesirable procedures such as the use of punishment, unnatural clinic/university…

  7. 18 December 2012 - Portuguese President of FCT M. Seabra visits CERN

    Samuel Morier-Genoud

    2012-01-01

    18 December 2012 - Portuguese President of FCT M. Seabra visiting the Computing Centre with IT Department Head F. Hemmer, the ATLAS experimental area with Collaboration Spokesperson F. Gianotti and A. Henriques Correia, the LHC tunnel at Point 2 and the CMS experimental area with Deputy Spokesperson J. Varela, and signing an administrative agreement with Director-General R. Heuer; LIP President J. M. Gago and Delegate to CERN Council G. Barreira were present.

  8. Magnetoacoustic heating and FCT-equilibria in the belt pinch

    Erckmann, V.

    1983-02-01

    In the HECTOR belt pinch, a high-β plasma is produced by magnetic compression in a tokamak geometry. After compression the initial β value can be varied between 0.2 and 0.8. For 5 μs the plasma is further heated by a fast magnetoacoustic wave with a frequency near the first harmonic of the ion cyclotron frequency. For the first time the β-value of a pinch plasma could be increased further, from 0.34 after compression to 0.46 at the end of the rf-heating cycle. By proper selection of the final β-value the region of resonance absorption of the heating wave can be shifted. Strong heating (200 MW) has been observed in the cases where the resonance region was located in the center of the plasma. In deuterium discharges an increase in ion temperature is observed during the heating process, whereas the electrons are energetically decoupled, showing no temperature increase. Strong plasma losses are found in the 200 MW range after the rf-heating process; the dominant mechanism is charge exchange collisions with neutral gas atoms. During rf-heating and the subsequent cooling phase the magnetic flux is frozen in due to the high conductivity of the plasma. The observed equilibria could be identified as flux-conserving tokamak (FCT) equilibria. Based on a two-dimensional code, the time evolution of the equilibria has been calculated. The q-profiles are time-independent; with increasing β the magnetic axis of the plasma is shifted towards the outer boundary of the torus, and finally the linear relation between β and β_pol, which is characteristic of low-β equilibria, is no longer valid. Thus the existence of FCT equilibria at high β has been demonstrated experimentally for the first time, together with qualitative agreement with FCT theory. (orig./AH)

  9. Software For Computing Selected Functions

    Grant, David C.

    1992-01-01

    Technical memorandum presents collection of software packages in Ada implementing mathematical functions used in science and engineering. Provides programmer with function support in Pascal and FORTRAN, plus support for extended-precision arithmetic and complex arithmetic. Valuable for testing new computers, writing computer code, or developing new computer integrated circuits.

  10. Computing the functional proteome

    O'Brien, Edward J.; Palsson, Bernhard

    2015-01-01

    Constraint-based models enable the computation of feasible, optimal, and realized biological phenotypes from reaction network reconstructions and constraints on their operation. To date, stoichiometric reconstructions have largely focused on metabolism, resulting in genome-scale metabolic models (M...

  11. Toxic heavy metals in local and foreign brands of lipsticks in FCT, Abuja, Nigeria


    This study determined toxic heavy metal concentrations in local and foreign brands of lipsticks sold in FCT ... ten (10) local and ten (10) foreign brands for lead using flame atomic absorption ... flame atomic absorption spectrometry.

  12. Computational Methods and Function Theory

    Saff, Edward; Salinas, Luis; Varga, Richard

    1990-01-01

    The volume is devoted to the interaction of modern scientific computation and classical function theory. Many problems in pure and more applied function theory can be tackled using modern computing facilities: numerically as well as in the sense of computer algebra. On the other hand, computer algorithms are often based on complex function theory, and dedicated research on their theoretical foundations can lead to great enhancements in performance. The contributions - original research articles, a survey and a collection of problems - cover a broad range of such problems.

  13. Functional programming for computer vision

    Breuel, Thomas M.

    1992-04-01

    Functional programming is a style of programming that avoids the use of side effects (like assignment) and uses functions as first-class data objects. Compared with imperative programs, functional programs can be parallelized better, and provide better encapsulation, type checking, and abstractions. This is important for building and integrating large vision software systems. In the past, efficiency has been an obstacle to the application of functional programming techniques in computationally intensive areas such as computer vision. We discuss and evaluate several 'functional' data structures for efficiently representing the data structures and objects common in computer vision. In particular, we will address: automatic storage allocation and reclamation issues; abstraction of control structures; efficient sequential update of large data structures; representing images as functions; and object-oriented programming. Our experience suggests that functional techniques are feasible for high-performance vision systems, and that a functional approach greatly simplifies the implementation and integration of vision systems. Examples in C++ and SML are given.
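
    One idea in the list above, representing images as functions, fits in a few lines; a Python rendering for illustration (the paper's own examples are in C++ and SML):

      # An image as a pure function from coordinates to intensity; transforms
      # compose without mutating any pixel buffer.
      def constant(v):
          return lambda x, y: v

      def shift(img, dx, dy):
          return lambda x, y: img(x - dx, y - dy)

      def blend(a, b, t):
          return lambda x, y: (1 - t) * a(x, y) + t * b(x, y)

      img = blend(constant(0.0), shift(constant(1.0), 3, 4), 0.5)
      print(img(10, 10))  # 0.5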

  14. Automatic computation of transfer functions

    Atcitty, Stanley; Watson, Luke Dale

    2015-04-14

    Technologies pertaining to the automatic computation of transfer functions for a physical system are described herein. The physical system is one of an electrical system, a mechanical system, an electromechanical system, an electrochemical system, or an electromagnetic system. A netlist in the form of a matrix comprises data that is indicative of elements in the physical system, values for the elements in the physical system, and structure of the physical system. Transfer functions for the physical system are computed based upon the netlist.
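
    The patent's netlist format is not given in this record, but the underlying step, assembling a nodal equation from element values and solving for an input-output ratio, can be sketched for a series-R, shunt-C low-pass (assuming numpy; the single-node circuit and all names are illustrative simplifications):

      import numpy as np

      def transfer_function(R, C, freqs_hz):
          """H(s) = Vout/Vin for a series-R, shunt-C divider via nodal analysis."""
          H = []
          for f in freqs_hz:
              s = 2j * np.pi * f
              # Single unknown node (the output): (1/R + s*C) * Vout = Vin / R
              H.append((1.0 / R) / (1.0 / R + s * C))
          return np.array(H)

      H = transfer_function(R=1e3, C=1e-6, freqs_hz=[10.0, 159.0, 1e4])
      # |H| falls to ~0.707 near f = 1/(2*pi*R*C) ≈ 159 Hz.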

  15. Assessment Report Sandia National Laboratories Fuel Cycle Technologies Quality Assurance Evaluation of FY15 SNL FCT M2 Milestone Deliverables

    Appel, Gordon John [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-05-01

    Sandia National Laboratories (SNL) Fuel Cycle Technologies (FCT) program activities are conducted in accordance with FCT Quality Assurance Program Document (FCT-QAPD) requirements. The FCT-QAPD interfaces with the SNL-approved Quality Assurance Program Description (SNL-QAPD) as explained in the Sandia National Laboratories QA Program Interface Document for FCT Activities (Interface Document). This plan describes SNL's FY16 assessment of SNL's FY15 FCT M2 milestone deliverables' compliance with program QA requirements, including SNL R&A requirements. The assessment is intended to confirm that SNL's FY15 milestone deliverables contain the appropriate authenticated review documentation and that there is a copy marked with SNL R&A numbers.

  16. Functional Programming in Computer Science

    Anderson, Loren James [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Davis, Marion Kei [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-01-19

    We explore functional programming through a 16-week internship at Los Alamos National Laboratory. Functional programming is a branch of computer science that has exploded in popularity over the past decade due to its high-level syntax, ease of parallelization, and abundant applications. First, we summarize functional programming by listing the advantages of functional programming languages over the usual imperative languages, and we introduce the concept of parsing. Second, we discuss the importance of lambda calculus in the theory of functional programming. Lambda calculus was invented by Alonzo Church in the 1930s to formalize the concept of effective computability, and every functional language is essentially some implementation of lambda calculus. Finally, we display the lasting products of the internship: additions to a compiler and runtime system for the pure functional language STG, including both a set of tests that indicate the validity of updates to the compiler and a compiler pass that checks for illegal instances of duplicate names.
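
    The lambda-calculus foundation mentioned above can be shown directly: Church numerals encode natural numbers as functions, and arithmetic is function application (a Python sketch for illustration; the internship's actual target language is the STG compiler's):

      # Church numerals: the numeral n applies f to x exactly n times.
      zero = lambda f: lambda x: x
      succ = lambda n: lambda f: lambda x: f(n(f)(x))
      add = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

      def to_int(n):
          return n(lambda k: k + 1)(0)

      two = succ(succ(zero))
      three = succ(two)
      print(to_int(add(two)(three)))  # 5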

  17. Computation of hyperspherical Bessel functions

    Tram, Thomas

    2013-01-01

    In this paper we present a fast and accurate numerical algorithm for the computation of hyperspherical Bessel functions of large order and real arguments. For the hyperspherical Bessel functions of closed type, no stable algorithm existed so far due to the lack of a backwards recurrence. We solved this problem by establishing a relation to Gegenbauer polynomials. All our algorithms are written in C and are publicly available at Github [https://github.com/lesgourg/class_public]. A Python wrapp...
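
    The stabilizing role of a backward recurrence, whose absence for the closed type motivated the paper's Gegenbauer relation, can be seen with ordinary spherical Bessel functions (Miller's algorithm; a generic sketch assuming numpy, not the paper's code):

      import numpy as np

      def spherical_jn_downward(nmax, x, start_offset=30):
          """j_0..j_nmax via downward recurrence, normalized by j_0 = sin(x)/x."""
          m = nmax + start_offset
          j = np.zeros(m + 2)
          j[m], j[m + 1] = 1e-30, 0.0    # arbitrary small seed values
          for n in range(m, 0, -1):      # j_{n-1} = (2n+1)/x * j_n - j_{n+1}
              j[n - 1] = (2 * n + 1) / x * j[n] - j[n + 1]
          return j[: nmax + 1] * (np.sin(x) / x) / j[0]

      print(spherical_jn_downward(5, 2.0)[5])  # ≈ 0.0026342, cf. scipy.special.spherical_jn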

  18. Deterministic computation of functional integrals

    Lobanov, Yu.Yu.

    1995-09-01

    A new method of numerical integration in functional spaces is described. This method is based on the rigorous definition of a functional integral in complete separable metric space and on the use of approximation formulas which we constructed for this kind of integral. The method is applicable to the solution of some partial differential equations and to the calculation of various characteristics in quantum physics. No preliminary discretization of space and time is required in this method, nor are simplifying assumptions like semi-classical or mean field approximations, collective excitations, or the introduction of 'short-time' propagators necessary in our approach. The constructed approximation formulas satisfy the condition of being exact on a given class of functionals, namely polynomial functionals of a given degree. The employment of these formulas replaces the evaluation of a functional integral by the computation of an 'ordinary' (Riemannian) integral of low dimension, thus allowing the use of the more preferable deterministic algorithms (normally Gaussian quadratures) in computations rather than the traditional stochastic (Monte Carlo) methods which are commonly used for the problem under consideration. The results of applying the method to the computation of the Green function of the Schroedinger equation in imaginary time, as well as the study of some models of Euclidean quantum mechanics, are presented. Comparison with the results of other authors shows that our method gives significant (by an order of magnitude) economy of computer time and memory versus other known methods while providing results with the same or better accuracy. The functional measure of the Gaussian type is considered and some of its particular cases, namely the conditional Wiener measure in quantum statistical mechanics and a functional measure in a Schwartz distribution space in two-dimensional quantum field theory, are studied in detail. Numerical examples demonstrating the
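
    The deterministic quadratures the method favors can be illustrated with a Gauss-Hermite rule, exact for polynomials of degree up to 2n-1 against the Gaussian weight (a generic sketch assuming numpy, not the paper's functional-integral formulas):

      import numpy as np

      # ∫ f(t) e^{-t^2} dt ≈ Σ w_k f(t_k)
      t, w = np.polynomial.hermite.hermgauss(20)

      approx = np.sum(w * np.cos(t))
      exact = np.sqrt(np.pi) * np.exp(-0.25)  # ∫ cos(t) e^{-t^2} dt = √π e^{-1/4}
      print(approx, exact)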

  19. FCT: a fully-distributed context-aware trust model for location based service recommendation

    Zhiquan LIU; Jianfeng MA; Zhongyuan JIANG; Yinbin MIAO

    2017-01-01

    With the popularity of location based service (LBS), a vast number of trust models for LBS recommendation (LBSR) have been proposed. These trust models are centralized in essence, and the trusted third party may collude with malicious service providers or cause the single-point failure problem. This work improves the classic certified reputation (CR) model and proposes a novel fully-distributed context-aware trust (FCT) model for LBSR. Recommendation operations are conducted by service providers directly, and the trusted third party is no longer required in our FCT model. Besides, our FCT model also supports the movements of service providers due to its self-certified characteristic. Moreover, for easing the collusion attack and value imbalance attack, we comprehensively consider four kinds of factor weights, namely number, time decay, preference and context weights. Finally, a fully-distributed service recommendation scenario is deployed, and comprehensive experiments and analysis are conducted. The results indicate that our FCT model significantly outperforms the CR model in terms of robustness against the collusion attack and value imbalance attack, as well as service recommendation performance in improving the successful trading rates of honest service providers and reducing the risks of trading with malicious service providers.
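
    The record does not reproduce the model's formulas, so the sketch below only shows the general shape of an aggregation with number, time-decay, preference and context weights (all names and the exponential decay form are assumptions, not the authors' definitions):

      import math

      def trust_score(ratings, half_life_days=30.0):
          """Weighted mean of ratings; each item: (value, age_days, pref_w, ctx_w)."""
          num = den = 0.0
          for value, age, pref, ctx in ratings:
              decay = math.exp(-math.log(2.0) * age / half_life_days)  # time decay
              w = decay * pref * ctx
              num += w * value
              den += w
          return num / den if den else 0.0

      print(trust_score([(1.0, 2.0, 0.9, 0.8), (0.2, 90.0, 0.5, 0.6)]))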

  20. Accurate computation of Mathieu functions

    Bibby, Malcolm M

    2013-01-01

    This lecture presents a modern approach for the computation of Mathieu functions. These functions find application in boundary value analysis such as electromagnetic scattering from elliptic cylinders and flat strips, as well as the analogous acoustic and optical problems, and many other applications in science and engineering. The authors review the traditional approach used for these functions, show its limitations, and provide an alternative 'tuned' approach enabling improved accuracy and convergence. The performance of this approach is investigated for a wide range of parameters and mach
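
    Library routines implementing the traditional algorithms the book critiques are available for comparison; a minimal check with scipy (note that scipy's Mathieu routines take the angular argument in degrees):

      from scipy import special

      ce, ce_prime = special.mathieu_cem(2, 1.0, 45.0)  # even Mathieu function ce_2
      a2 = special.mathieu_a(2, 1.0)                    # characteristic value a_2(q)
      print(ce, a2)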

  21. Computational complexity of Boolean functions

    Korshunov, Aleksei D [Sobolev Institute of Mathematics, Siberian Branch of the Russian Academy of Sciences, Novosibirsk (Russian Federation)

    2012-02-28

    Boolean functions are among the fundamental objects of discrete mathematics, especially in those of its subdisciplines which fall under mathematical logic and mathematical cybernetics. The language of Boolean functions is convenient for describing the operation of many discrete systems such as contact networks, Boolean circuits, branching programs, and some others. An important parameter of discrete systems of this kind is their complexity. This characteristic has been actively investigated starting from Shannon's works. There is a large body of scientific literature presenting many fundamental results. The purpose of this survey is to give an account of the main results over the last sixty years related to the complexity of computation (realization) of Boolean functions by contact networks, Boolean circuits, and Boolean circuits without branching. Bibliography: 165 titles.

  22. Normal Functions As A New Way Of Defining Computable Functions

    Leszek Dubiel

    2004-01-01

    The report sets out a new method of defining computable functions. It formalizes traditional function descriptions, so it allows functions to be defined in a very intuitive way. The discovery of the Ackermann function proved that not every function that can be easily computed can be so easily described within Hilbert's system of primitive recursive functions. Normal functions lack this disadvantage.
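
    The Ackermann function referred to above is short to write down yet grows faster than any primitive recursive function; a direct sketch:

      from functools import lru_cache

      @lru_cache(maxsize=None)
      def ackermann(m, n):
          if m == 0:
              return n + 1
          if n == 0:
              return ackermann(m - 1, 1)
          return ackermann(m - 1, ackermann(m, n - 1))

      print(ackermann(2, 3))  # 9
      print(ackermann(3, 3))  # 61; m >= 4 quickly becomes intractable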

  23. RATGRAPH: Computer Graphing of Rational Functions.

    Minch, Bradley A.

    1987-01-01

    Presents an easy-to-use Applesoft BASIC program that graphs rational functions and any asymptotes that the functions might have. Discusses the nature of rational functions, graphing them manually, employing a computer to graph rational functions, and describes how the program works. (TW)
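
    A present-day analogue of such a program is a few lines of Python (numpy and matplotlib assumed; the original is Applesoft BASIC):

      import numpy as np
      import matplotlib.pyplot as plt

      num = [1.0, 0.0, -1.0]   # numerator   x^2 - 1
      den = [1.0, -1.0, -6.0]  # denominator x^2 - x - 6 = (x - 3)(x + 2)

      x = np.linspace(-8, 8, 2000)
      y = np.polyval(num, x) / np.polyval(den, x)
      y[np.abs(np.polyval(den, x)) < 1e-2] = np.nan  # break the curve at the poles

      plt.plot(x, y)
      for pole in np.roots(den):                     # vertical asymptotes
          plt.axvline(pole.real, linestyle="--")
      plt.axhline(1.0, linestyle=":")                # horizontal asymptote (equal degrees)
      plt.ylim(-10, 10)
      plt.show()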

  24. Two dimensional numerical simulation of gas discharges: comparison between particle-in-cell and FCT techniques

    Soria-Hoyo, C; Castellanos, A [Departamento de Electronica y Electromagnetismo, Facultad de Fisica, Universidad de Sevilla, Avda. Reina Mercedes s/n, 41012 Sevilla (Spain); Pontiga, F [Departamento de Fisica Aplicada II, EUAT, Universidad de Sevilla, Avda. Reina Mercedes s/n, 41012 Sevilla (Spain)], E-mail: cshoyo@us.es

    2008-10-21

    Two different numerical techniques have been applied to the numerical integration of equations modelling gas discharges: a finite-difference flux corrected transport (FD-FCT) technique and a particle-in-cell (PIC) technique. The PIC technique here implemented has been specifically designed for the simulation of 2D electrical discharges using cylindrical coordinates. The development and propagation of a streamer between two parallel electrodes has been used as a convenient test to compare the performance of both techniques. In particular, the phase velocity of the cathode directed streamer has been used to check the internal consistency of the numerical simulations. The results obtained from the two techniques are in reasonable agreement with each other, and both techniques have proved their ability to follow the high gradients of charge density and electric field present in this type of problems. Moreover, the streamer velocities predicted by the simulation are in accordance with the typical experimental values.
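
    The flux-corrected transport idea, a monotone low-order flux plus a limited antidiffusive correction, reads most clearly in one dimension; a Boris-Book-style sketch for linear advection with periodic boundaries (far simpler than the paper's 2-D cylindrical discharge model; numpy assumed):

      import numpy as np

      def fct_step(rho, c):
          """One FCT step for rho_t + u*rho_x = 0; c = u*dt/dx in (0, 1]."""
          right = lambda a: np.roll(a, -1)
          left = lambda a: np.roll(a, 1)

          f_low = c * rho                               # donor-cell (upwind) flux
          f_high = 0.5 * c * (rho + right(rho)) - 0.5 * c**2 * (right(rho) - rho)
          rho_td = rho - (f_low - left(f_low))          # monotone transported state

          a = f_high - f_low                            # antidiffusive flux
          s = np.sign(a)
          a = s * np.maximum(0.0, np.minimum.reduce([   # Boris-Book limiter
              np.abs(a),
              s * (right(right(rho_td)) - right(rho_td)),
              s * (rho_td - left(rho_td)),
          ]))
          return rho_td - (a - left(a))

      x = np.linspace(0.0, 1.0, 200, endpoint=False)
      rho = np.where(np.abs(x - 0.3) < 0.1, 1.0, 0.0)   # square pulse
      for _ in range(100):
          rho = fct_step(rho, c=0.5)                    # advects with no new extrema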

  25. Computing the zeros of analytic functions

    Kravanja, Peter

    2000-01-01

    Computing all the zeros of an analytic function and their respective multiplicities, locating clusters of zeros of analytic functions, computing zeros and poles of meromorphic functions, and solving systems of analytic equations are problems in computational complex analysis that lead to a rich blend of mathematics and numerical analysis. This book treats these four problems in a unified way. It contains not only theoretical results (based on formal orthogonal polynomials or rational interpolation) but also numerical analysis and algorithmic aspects, implementation heuristics, and polished software (the package ZEAL) that is available via the CPC Program Library. Graduate students and researchers in numerical mathematics will find this book very readable.
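
    The contour-integral machinery behind such methods starts from the argument principle: the number of zeros of f inside a closed contour equals (1/2πi) ∮ f'/f dz. A numerical sketch on a circle (numpy assumed; illustrative only, not the ZEAL algorithm):

      import numpy as np

      def count_zeros(f, df, center=0.0, radius=2.0, n=2000):
          """Zeros of f inside |z - center| = radius, by the argument principle."""
          theta = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
          z = center + radius * np.exp(1j * theta)
          dz = 1j * radius * np.exp(1j * theta) * (2.0 * np.pi / n)
          return np.sum(df(z) / f(z) * dz) / (2j * np.pi)

      f = lambda z: z**3 - 1.0
      df = lambda z: 3.0 * z**2
      print(count_zeros(f, df))  # ≈ 3: the three cube roots of unity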

  26. Function Package for Computing Quantum Resource Measures

    Huang, Zhiming

    2018-05-01

    In this paper, we present a function package to calculate quantum resource measures and the dynamics of open systems. Our package includes common operators and operator lists, and frequently-used functions for computing quantum entanglement, quantum correlation, quantum coherence, quantum Fisher information and dynamics in noisy environments. We briefly explain the functions of the package and illustrate how to use it with several typical examples. We expect this package to be a useful tool for future research and education.
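
    The package's API is not reproduced in this record; typical building blocks, however, are a few lines of numpy, e.g. the von Neumann entropy of a reduced state (an illustrative sketch, not the authors' functions):

      import numpy as np

      def von_neumann_entropy(rho):
          """S(rho) = -Tr(rho log2 rho), from the eigenvalues of a density matrix."""
          p = np.linalg.eigvalsh(rho)
          p = p[p > 1e-12]
          return float(-np.sum(p * np.log2(p)))

      bell = np.zeros((4, 4))
      bell[0, 0] = bell[0, 3] = bell[3, 0] = bell[3, 3] = 0.5   # |Φ+><Φ+|
      rho_A = bell.reshape(2, 2, 2, 2).trace(axis1=1, axis2=3)  # trace out qubit B
      print(von_neumann_entropy(rho_A))  # 1.0 bit: maximal entanglement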

  27. On computing special functions in marine engineering

    Constantinescu, E.; Bogdan, M.

    2015-11-01

    Important modeling applications in marine engineering lead to a special class of solutions of difficult differential equations with variable coefficients. In order to solve and implement such models (in wave theory, in acoustics, in hydrodynamics, in electromagnetic waves, but also in many other engineering fields), it is necessary to compute so-called special functions: Bessel functions, modified Bessel functions, spherical Bessel functions, Hankel functions. The aim of this paper is to develop numerical solutions in Matlab for the above-mentioned special functions. Taking into account the main properties of Bessel and modified Bessel functions, we briefly present analytical solutions (where possible) in the form of series. In particular, the behavior of these special functions is studied using Matlab facilities: numerical solutions and plotting. Finally, the behavior of the special functions is compared, and other directions for investigating properties of Bessel and spherical Bessel functions are pointed out. The asymptotic forms of Bessel functions and modified Bessel functions allow determination of important properties of these functions. The modified Bessel functions tend to look more like decaying and growing exponentials.

  28. Computer-assisted functional analysis

    Schmidt, H.A.E.; Roesler, H.

    1982-01-01

    The latest developments in computer-assisted functional analysis (CFA) in nuclear medicine are presented in about 250 papers of the 19th international annual meeting of the Society of Nuclear Medicine (Bern, September 1981). Apart from the mathematical and instrumental aspects of CFA, computerized emission tomography is given particular attention. Advances in nuclear medical diagnosis in the fields of radiopharmaceuticals, cardiology, angiology, neurology, ophthalmology, pulmonology, gastroenterology, nephrology, endocrinology, oncology and osteology are discussed.

  29. Dynamics and computation in functional shifts

    Namikawa, Jun; Hashimoto, Takashi

    2004-07-01

    We introduce a new type of shift dynamics as an extended model of symbolic dynamics, and investigate the characteristics of shift spaces from the viewpoints of both dynamics and computation. This shift dynamics is called a functional shift, which is defined by a set of bi-infinite sequences of some functions on a set of symbols. To analyse the complexity of functional shifts, we measure them in terms of topological entropy, and locate their languages in the Chomsky hierarchy. Through this study, we argue that considering functional shifts from the viewpoints of both dynamics and computation gives us opposite results about the complexity of systems. We also describe a new class of shift spaces whose languages are not recursively enumerable.

  30. BLUES function method in computational physics

    Indekeu, Joseph O.; Müller-Nedebock, Kristian K.

    2018-04-01

    We introduce a computational method in physics that goes ‘beyond linear use of equation superposition’ (BLUES). A BLUES function is defined as a solution of a nonlinear differential equation (DE) with a delta source that is at the same time a Green’s function for a related linear DE. For an arbitrary source, the BLUES function can be used to construct an exact solution to the nonlinear DE with a different, but related source. Alternatively, the BLUES function can be used to construct an approximate piecewise analytical solution to the nonlinear DE with an arbitrary source. For this alternative use the related linear DE need not be known. The method is illustrated in a few examples using analytical calculations and numerical computations. Areas for further applications are suggested.

  31. Computing complex Airy functions by numerical quadrature

    A. Gil (Amparo); J. Segura (Javier); N.M. Temme (Nico)

    2001-01-01

    Integral representations are considered of solutions of the Airy differential equation w'' − zw = 0 for computing Airy functions for complex values of z. In a first method, contour integral representations of the Airy functions are written as non-oscillating

  32. Computer Games Functioning as Motivation Stimulants

    Lin, Grace Hui Chin; Tsai, Tony Kung Wan; Chien, Paul Shih Chieh

    2011-01-01

    Numerous scholars have recommended computer games can function as influential motivation stimulants of English learning, showing benefits as learning tools (Clarke and Dede, 2007; Dede, 2009; Klopfer and Squire, 2009; Liu and Chu, 2010; Mitchell, Dede & Dunleavy, 2009). This study aimed to further test and verify the above suggestion,…

  33. Function Follows Performance in Evolutionary Computational Processing

    Pasold, Anke; Foged, Isak Worre

    2011-01-01

    As the title ‘Function Follows Performance in Evolutionary Computational Processing’ suggests, this paper explores the potentials of employing multiple design and evaluation criteria within one processing model in order to account for a number of performative parameters desired within varied...

  34. HCP to FCT + precipitate transformations in lamellar gamma-titanium aluminide alloys

    Karadge, Mallikarjun Baburao

    Fully lamellar gamma-TiAl [alpha2 (HCP) + gamma (FCT)] based alloys are potential structural materials for aerospace engine applications. Lamellar structure stabilization and additional strengthening mechanisms are major issues in the ongoing development of titanium aluminides, due to the microstructural instability resulting from decomposition of the strengthening alpha2 phase. This work addresses characterization of multi-component TiAl systems to identify the mechanism of lamellar structure refinement and assess the effects of light element additions (C and Si) on creep deformation behavior. Transmission electron microscopy studies directly confirmed for the first time that the fine lamellar structure is formed by the nucleation and growth of a large number of basal stacking faults on the 1/6 dislocations cross-slipping repeatedly into and out of basal planes. This lamellar structure can be tailored by modifying jog heights through chemistry and thermal processing. The alpha2 → gamma transformation during heating (investigated by differential scanning calorimetry and X-ray diffraction) is a two-step process involving the formation of a novel disordered FCC gamma' TiAl [with a(gamma') = c(gamma)] as an intermediate phase, followed by ordering. Addition of carbon and silicon induced Ti2AlC H-type carbide precipitation inside the alpha2 laths and Ti5(Al,Si)3 zeta-type silicide precipitation at the alpha2/gamma interface. The H-carbides preserve alpha2/gamma type interfaces, while zeta-silicide precipitates restrict ledge growth and interfacial sliding, enabling strong resistance to creep deformation.

  35. Computation of the Complex Probability Function

    Trainer, Amelia Jo [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Ledwith, Patrick John [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-22

    The complex probability function is important in many areas of physics, and many techniques have been developed in an attempt to compute it for some z quickly and efficiently. Most prominent are the methods that use Gauss-Hermite quadrature, which uses the roots of the nth-degree Hermite polynomial and corresponding weights to approximate the complex probability function. This document serves as an overview and discussion of the use, shortcomings, and potential improvements of the Gauss-Hermite quadrature for the complex probability function.
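
    The quadrature in question follows from the representation w(z) = (i/π) ∫ e^{-t²}/(z−t) dt, valid for Im z > 0, so the Gauss-Hermite rule applies directly; a sketch compared against scipy's reference implementation:

      import numpy as np
      from scipy.special import wofz

      def faddeeva_gh(z, n=40):
          """Gauss-Hermite approximation of w(z); requires Im(z) > 0."""
          t, w = np.polynomial.hermite.hermgauss(n)
          return (1j / np.pi) * np.sum(w / (z - t))

      z = 1.5 + 0.8j
      print(faddeeva_gh(z), wofz(z))  # accuracy degrades as z nears the real axis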

  36. Functions of the computer management games

    Kočí, Josef

    2016-01-01

    This thesis discusses the possibilities of using managerial games, their purpose, meaning and functions, and focuses specifically on management computer games: how they differ from classic games and what their advantages and disadvantages are. The theoretical part of the thesis also covers why these games are discussed, why they are accepted or sometimes rejected, and why they have become so popular with some managers and public gamers. For this purpose I use a survey conducted on 11 April 20...

  37. Versatile Density Functionals for Computational Surface Science

    Wellendorff, Jess

    Density functional theory (DFT) emerged almost 50 years ago. Since then DFT has established itself as the central electronic structure methodology for simulating atomic-scale systems from a few atoms to a few hundred atoms. This success of DFT is due to a very favorable accuracy-to-computational cost ... resampling techniques, thereby systematically avoiding problems with overfitting. The first ever density functional presenting both reliable accuracy and convincing error estimation is generated. The methodology is general enough to be applied to more complex functional forms with higher-dimensional fitting...

  38. New Computer Simulations of Macular Neural Functioning

    Ross, Muriel D.; Doshay, D.; Linton, S.; Parnas, B.; Montgomery, K.; Chimento, T.

    1994-01-01

    We use high performance graphics workstations and supercomputers to study the functional significance of the three-dimensional (3-D) organization of gravity sensors. These sensors have a prototypic architecture foreshadowing more complex systems. Scaled-down simulations run on a Silicon Graphics workstation and scaled-up, 3-D versions run on a Cray Y-MP supercomputer. A semi-automated method of reconstruction of neural tissue from serial sections studied in a transmission electron microscope has been developed to eliminate tedious conventional photography. The reconstructions use a mesh as a step in generating a neural surface for visualization. Two meshes are required to model calyx surfaces. The meshes are connected and the resulting prisms represent the cytoplasm and the bounding membranes. A finite volume analysis method is employed to simulate voltage changes along the calyx in response to synapse activation on the calyx or on calyceal processes. The finite volume method insures that charge is conserved at the calyx-process junction. These and other models indicate that efferent processes act as voltage followers, and that the morphology of some afferent processes affects their functioning. In a final application, morphological information is symbolically represented in three dimensions in a computer. The possible functioning of the connectivities is tested using mathematical interpretations of physiological parameters taken from the literature. Symbolic, 3-D simulations are in progress to probe the functional significance of the connectivities. This research is expected to advance computer-based studies of macular functioning and of synaptic plasticity.

  39. Computer network defense through radial wave functions

    Malloy, Ian J.

    The purpose of this research is to synthesize basic and fundamental findings in quantum computing, as applied to the attack and defense of conventional computer networks. The concept focuses on uses of radio waves as a shield for, and attack against, traditional computers. A logic bomb is analogous to a landmine in a computer network, and if one were to implement it as a non-trivial mitigation, it would aid computer network defense. As has been seen in kinetic warfare, the use of landmines has been devastating to geopolitical regions in that they are severely difficult for a civilian to avoid triggering given the unknown position of a landmine. Thus, the importance of understanding a logic bomb is relevant and has corollaries to quantum mechanics as well. The research synthesizes quantum logic phase shifts in certain respects using the Dynamic Data Exchange protocol in software written for this work, as well as a C-NOT gate applied to a virtual quantum circuit environment by implementing a Quantum Fourier Transform. The research focus applies the principles of coherence and entanglement from quantum physics, the concept of expert systems in artificial intelligence, principles of prime-number-based cryptography with trapdoor functions, and modeling radio wave propagation against an event from unknown parameters. This comes as a program relying on the artificial intelligence concept of an expert system in conjunction with trigger events for a trapdoor function relying on infinite recursion, as well as system mechanics for elliptic curve cryptography along orbital angular momenta. Here trapdoor denotes both the form of cipher and the implied relationship to logic bombs.

  40. Computational network design from functional specifications

    Peng, Chi Han; Yang, Yong Liang; Bao, Fan; Fink, Daniel; Yan, Dongming; Wonka, Peter; Mitra, Niloy J.

    2016-07-11

    Connectivity and layout of underlying networks largely determine agent behavior and usage in many environments. For example, transportation networks determine the flow of traffic in a neighborhood, whereas building floorplans determine the flow of people in a workspace. Designing such networks from scratch is challenging as even local network changes can have large global effects. We investigate how to computationally create networks starting from only high-level functional specifications. Such specifications can be in the form of network density, travel time versus network length, traffic type, destination location, etc. We propose an integer programming-based approach that guarantees that the resultant networks are valid by fulfilling all the specified hard constraints and that they score favorably in terms of the objective function. We evaluate our algorithm in two different design settings, street layout and floorplans to demonstrate that diverse networks can emerge purely from high-level functional specifications.
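
    A toy version of the integer-programming formulation, choosing candidate segments of minimum total length so that every site touches at least one chosen segment, fits scipy's MILP interface (an illustration of the approach only, not the authors' full model; scipy >= 1.9 assumed):

      import numpy as np
      from scipy.optimize import Bounds, LinearConstraint, milp

      # Candidate edges between 4 sites: (i, j, length); x_e = 1 if edge is built.
      edges = [(0, 1, 1.0), (1, 2, 1.2), (2, 3, 0.9), (0, 3, 2.5), (1, 3, 1.7)]
      n_sites = 4

      A = np.zeros((n_sites, len(edges)))  # site-edge incidence matrix
      for e, (i, j, _) in enumerate(edges):
          A[i, e] = A[j, e] = 1.0

      res = milp(
          c=np.array([length for _, _, length in edges]),    # minimize total length
          constraints=LinearConstraint(A, lb=1, ub=np.inf),  # cover every site
          integrality=np.ones(len(edges)),                   # binary decisions
          bounds=Bounds(0, 1),
      )
      print(res.x)  # [1, 0, 1, 0, 0]: edges (0,1) and (2,3) suffice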

  41. Discrete Wigner functions and quantum computation

    Galvao, E.

    2005-01-01

    Gibbons et al. have recently defined a class of discrete Wigner functions W to represent quantum states in a finite Hilbert space dimension d. I characterize the set C_d of states having non-negative W simultaneously in all definitions of W in this class. I then argue that states in this set behave classically in a well-defined computational sense. I show that one-qubit states in C_2 do not provide for universal computation in a recent model proposed by Bravyi and Kitaev [quant-ph/0403025]. More generally, I show that the only pure states in C_d are stabilizer states, which have an efficient description using the stabilizer formalism. This result shows that two different notions of 'classical' states coincide: states with non-negative Wigner functions are those which have an efficient description. This suggests that negativity of W may be necessary for exponential speed-up in pure-state quantum computation. (author)
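
    For d = 2 the claim can be checked directly with phase-point operators; the sketch below shows a stabilizer state with non-negative W and a 'magic' state with a negative entry (assuming one standard single-qubit convention; numpy assumed):

      import numpy as np

      I = np.eye(2)
      X = np.array([[0.0, 1.0], [1.0, 0.0]])
      Y = np.array([[0.0, -1.0j], [1.0j, 0.0]])
      Z = np.diag([1.0, -1.0])

      def wigner(rho):
          """W(a1,a2) = Tr(rho A)/2, A = (I + (-1)^a1 Z + (-1)^a2 X + (-1)^(a1+a2) Y)/2."""
          W = np.zeros((2, 2))
          for a1 in range(2):
              for a2 in range(2):
                  A = 0.5 * (I + (-1)**a1 * Z + (-1)**a2 * X + (-1)**(a1 + a2) * Y)
                  W[a1, a2] = np.real(np.trace(rho @ A)) / 2.0
          return W

      print(wigner(0.5 * (I + Z)))                     # |0><0|: all entries >= 0
      print(wigner(0.5 * (I + (X + Y) / np.sqrt(2))))  # magic state: one entry < 0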

  42. Numerical computation of generalized importance functions

    Gomit, J.M.; Nasr, M.; Ngyuen van Chi, G.; Pasquet, J.P.; Planchard, J.

    1981-01-01

    Thus far, an important effort has been devoted to developing and applying generalized perturbation theory in reactor physics analysis. In this work we are interested in the calculation of importance functions by the method of A. Gandini. We have noted that in this method the convergence of the adopted iterative procedure is not rapid. Hence, to accelerate this convergence we have used the semi-iterative technique. Two computer codes have been developed for one- and two-dimensional calculations (SPHINX-1D and SPHINX-2D). The advantage of our calculation was confirmed by comparative tests in which the iteration number and the computing time were greatly reduced with respect to the classical calculation (CIAP-1D and CIAP-2D). (orig.)

  43. New Method to Synthesize Highly Active and Durable Chemically Ordered fct-PtCo Cathode Catalyst for PEMFCs.

    Jung, Won Suk; Popov, Branko N

    2017-07-19

    In the bottom-up synthesis strategy performed in this study, the Co-catalyzed pyrolysis of a chelate complex and activated carbon black at high temperatures triggers a graphitization reaction which introduces Co particles into the N-doped graphitic carbon matrix and immobilizes N-modified active sites for the oxygen reduction reaction (ORR) on the carbon surface. The Co particles encapsulated within the N-doped graphitic carbon shell diffuse up to the Pt surface under the polymer protective layer and form a chemically ordered face-centered tetragonal (fct) PtCo/CCCS catalyst, as evidenced by structural and compositional studies. The fct-structured PtCo/CCCS at low Pt loading (0.1 mg Pt cm-2) shows 6% higher power density than the state-of-the-art commercial Pt/C catalyst. After an MEA durability test of 30,000 potential cycles, the performance loss of the catalyst is negligible. The electrochemical surface area loss is less than 40%, while that of commercial Pt/C is nearly 80%. After the accelerated stress test, the uniform catalyst distribution is retained and the mean particle size increases by approximately 1 nm. The results obtained in this study indicate that the highly stable compositional and structural properties of the chemically ordered PtCo/CCCS catalyst contribute to its exceptional durability.

  44. Silicide induced surface defects in FePt nanoparticle fcc-to-fct thermally activated phase transition

    Chen, Shu; Lee, Stephen L. [School of Physics and Astronomy, SUPA, University of St Andrews, St Andrews KY16 9SS (United Kingdom); André, Pascal, E-mail: pjpandre@riken.jp [School of Physics and Astronomy, SUPA, University of St Andrews, St Andrews KY16 9SS (United Kingdom); RIKEN, Wako 351-0198 (Japan); Department of Physics, CNRS-Ewha International Research Center (CERC), Ewha W. University, Seoul 120-750 (Korea, Republic of)

    2016-11-01

    Magnetic nanoparticles (MNPs) are relevant to a wide range of applications including high density information storage and magnetic resonance imaging, to name but a few. Among the materials available to prepare MNPs, FePt is attracting growing attention. However, to harvest the strongest magnetic properties of FePt MNPs, a thermal annealing is often required to convert as-synthesized face-centered cubic (fcc) nanoparticles into the tetragonal (fct) phase. Rarely addressed are the potential side effects of such treatments on the magnetic properties. In this study, we focus on the impact of the silica shells often used in strategies aiming at overcoming MNP coalescence during the thermal annealing. While we show that this shell does prevent sintering, and that fcc-to-fct conversion does occur, we also reveal the formation of silicide, which can prevent the stronger magnetic properties of fct-FePt MNPs from being fully realised. This report therefore sheds light on poorly investigated and understood interfacial phenomena occurring during the thermal annealing of MNPs and, by doing so, also highlights the benefits of developing new strategies to avoid silicide formation.

  45. International assessment of functional computer abilities

    Anderson, Ronald E.; Collis, Betty

    1993-01-01

    After delineating the major rationale for computer education, data are presented from Stage 1 of the IEA Computers in Education Study showing international comparisons that may reflect differential priorities. Rapid technological change and the lack of consensus on the goals of computer education impede the establishment of stable curricula for 'general computer education' or computer literacy. In this context the construction of instruments for student assessment remains a challenge. Seeking to...

  46. Functional requirements for gas characterization system computer software

    Tate, D.D.

    1996-01-01

    This document provides the functional requirements for the computer software operating the Gas Characterization System (GCS), which monitors the combustible gases in the vapor space of selected tanks. Necessary computer functions are defined to support design, testing, operation, and change control. The GCS requires several individual computers to address the control and data acquisition functions of instruments and sensors. These computers are networked for communication, and must multi-task to accommodate operation in parallel.

  47. A summary of numerical computation for special functions

    Zhang Shanjie

    1992-01-01

    In the paper, special functions frequently encountered in science and engineering calculations are introduced. The computation of the values of Bessel functions and elliptic integrals is taken as an example, and some common algorithms for computing most special functions, such as series expansion for small argument, asymptotic approximations for large argument, polynomial approximations, recurrence formulas and iteration methods, are discussed. In addition, the determination of zeros of some special functions and other questions related to numerical computation are also discussed.
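
    The first two regimes named above can be shown for J_0: a power series for small argument and the leading asymptotic form for large argument, checked against scipy:

      import numpy as np
      from scipy.special import j0

      def j0_series(x, terms=30):
          """J_0(x) = Σ (-1)^k (x²/4)^k / (k!)² — accurate for small |x|."""
          total, term = 0.0, 1.0
          for k in range(terms):
              total += term
              term *= -(x * x / 4.0) / ((k + 1) ** 2)
          return total

      def j0_asymptotic(x):
          """Leading large-x form: sqrt(2/(πx)) cos(x - π/4)."""
          return np.sqrt(2.0 / (np.pi * x)) * np.cos(x - np.pi / 4.0)

      print(j0_series(1.0), j0(1.0))        # series regime
      print(j0_asymptotic(50.0), j0(50.0))  # asymptotic regime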

  48. Functional Communication Training: A Contemporary Behavior Analytic Intervention for Problem Behaviors.

    Durand, V. Mark; Merges, Eileen

    2001-01-01

    This article describes functional communication training (FCT) with students who have autism. FCT involves teaching alternative communication strategies to replace problem behaviors. The article reviews the conditions under which this intervention is successful and compares the method with other behavioral approaches. It concludes that functional…

  49. Computer program for Bessel and Hankel functions

    Kreider, Kevin L.; Saule, Arthur V.; Rice, Edward J.; Clark, Bruce J.

    1991-01-01

    A set of FORTRAN subroutines for calculating Bessel and Hankel functions is presented. The routines calculate Bessel and Hankel functions of the first and second kinds, as well as their derivatives, for wide ranges of integer order and real or complex argument in single or double precision. Depending on the order and argument, one of three evaluation methods is used: the power series definition, an Airy function expansion, or an asymptotic expansion. Routines to calculate Airy functions and their derivatives are also included.

  50. Evaluating the Treatment Fidelity of Parents Who Conduct In-Home Functional Communication Training with Coaching via Telehealth

    Suess, Alyssa N.; Romani, Patrick W.; Wacker, David P.; Dyson, Shannon M.; Kuhle, Jennifer L.; Lee, John F.; Lindgren, Scott D.; Kopelman, Todd G.; Pelzel, Kelly E.; Waldron, Debra B.

    2014-01-01

    We conducted a retrospective, descriptive evaluation of the fidelity with which parents of three children with autism spectrum disorders conducted functional communication training (FCT) in their homes. All training was provided to the parents via telehealth by a behavior consultant in a tertiary-level hospital setting. FCT trials coached by the…

  51. Special software for computing the special functions of wave catastrophes

    Andrey S. Kryukovsky

    2015-01-01

    The method of ordinary differential equations in the context of calculating the special functions of wave catastrophes is considered. Complementary numerical methods and algorithms are described. The paper shows approaches to accelerating such calculations using the capabilities of modern computing systems. Methods for calculating the special functions of wave catastrophes are considered in the framework of parallel computing and distributed systems. The paper covers the development process of special software for the calculation of special functions, and questions of portability, extensibility and interoperability.
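
    The ODE method named above can be tried on the simplest special function of a wave catastrophe, the Airy function, by integrating w'' = zw from exact values at z = 0 (scipy assumed; forward integration is only usable for moderate z before the growing companion solution contaminates it, which is precisely why dedicated algorithms exist):

      import numpy as np
      from scipy.integrate import solve_ivp
      from scipy.special import airy, gamma

      Ai0 = 3.0 ** (-2.0 / 3.0) / gamma(2.0 / 3.0)      # Ai(0)
      Aip0 = -(3.0 ** (-1.0 / 3.0)) / gamma(1.0 / 3.0)  # Ai'(0)

      sol = solve_ivp(lambda z, w: [w[1], z * w[0]], (0.0, 2.0), [Ai0, Aip0],
                      rtol=1e-11, atol=1e-14)
      print(sol.y[0, -1], airy(2.0)[0])  # both ≈ 0.0349241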

  52. Accurate and efficient computation of synchrotron radiation functions

    MacLeod, Allan J.

    2000-01-01

    We consider the computation of three functions which appear in the theory of synchrotron radiation. These are F(x) = x ∫_x^∞ K_{5/3}(y) dy, F_p(x) = x K_{2/3}(x) and G_p(x) = x^{1/3} K_{1/3}(x), where K_ν denotes a modified Bessel function. Chebyshev series coefficients are given which enable the functions to be computed with an accuracy of up to 15 significant figures.
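
    As a correctness check on any such Chebyshev fit, the defining integrals can be evaluated directly, if slowly, by adaptive quadrature (scipy assumed):

      import numpy as np
      from scipy.integrate import quad
      from scipy.special import kv

      def F(x):
          """F(x) = x ∫_x^∞ K_{5/3}(y) dy via adaptive quadrature."""
          val, _ = quad(lambda y: kv(5.0 / 3.0, y), x, np.inf)
          return x * val

      def F_p(x):
          return x * kv(2.0 / 3.0, x)

      print(F(0.29))  # near the maximum of F, ≈ 0.92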

  20. A large-scale evaluation of computational protein function prediction

    Radivojac, P.; Clark, W.T.; Oron, T.R.; Schnoes, A.M.; Wittkop, T.; Kourmpetis, Y.A.I.; Dijk, van A.D.J.; Friedberg, I.

    2013-01-01

    Automated annotation of protein function is challenging. As the number of sequenced genomes rapidly grows, the overwhelming majority of protein products can only be annotated computationally. If computational predictions are to be relied upon, it is crucial that the accuracy of these methods be

  1. Numerical computation of special functions with applications to physics

    Motsepe, K

    2008-09-01

    Students of mathematical physics, engineering, natural and biological sciences sometimes need to use special functions that are not found in ordinary mathematical software. In this paper a simple universal numerical algorithm is developed to compute...

  2. Supporting executive functions during children's preliteracy learning with the computer

    Sande, E. van de; Segers, P.C.J.; Verhoeven, L.T.W.

    2016-01-01

    The present study examined how embedded activities to support executive functions helped children to benefit from a computer intervention that targeted preliteracy skills. Three intervention groups were compared on their preliteracy gains in a randomized controlled trial design: an experimental

  3. Computer Forensic Function Testing: Media Preparation, Write Protection And Verification

    Yinghua (David) Guo

    2010-06-01

    The growth in the computer forensic field has created a demand for new software (or increased functionality in existing software) and a means to verify that this software is truly forensic, i.e. capable of meeting the requirements of the trier of fact. In this work, we review our previous work: a function-oriented testing framework for validation and verification of computer forensic tools. This framework consists of three parts: function mapping, requirements specification and reference set development. Through function mapping, we give a scientific and systematic description of the fundamentals of the computer forensic discipline, i.e. what functions are needed in the computer forensic investigation process. We focus this paper on the functions of media preparation, write protection and verification. Specifically, we complete the function mapping of these functions and specify their requirements. Based on this work, future work can be conducted to develop corresponding reference sets to test any tools that possess these functions.

  4. Algebraic Functions, Computer Programming, and the Challenge of Transfer

    Schanzer, Emmanuel Tanenbaum

    2015-01-01

    Students' struggles with algebra are well documented. Prior to the introduction of functions, mathematics is typically focused on applying a set of arithmetic operations to compute an answer. The introduction of functions, however, marks the point at which mathematics begins to focus on building up abstractions as a way to solve complex problems.…

  5. Numerical computation of aeroacoustic transfer functions for realistic airfoils

    De Santana, Leandro Dantas; Miotto, Renato Fuzaro; Wolf, William Roberto

    2017-01-01

    Based on Amiet's theory formalism, we propose a numerical framework to compute the aeroacoustic transfer function of realistic airfoil geometries. The aeroacoustic transfer function relates the amplitude and phase of an incoming periodic gust to the respective unsteady lift response permitting,

  6. Geometric optical transfer function and its computation method

    Wang Qi

    1992-01-01

    The geometric optical transfer function formula is derived after expounding some points that are easily overlooked, and a computation method is given based on the Bessel function of order zero, numerical integration and spline interpolation. The method has the advantage of ensuring accuracy while saving calculation time.
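
    The record is terse, so as a generic illustration only (an assumed example, not the paper's derivation): a classic OTF involving a Bessel function is that of a uniform circular blur spot of diameter d, the jinc function 2*J1(pi*d*f)/(pi*d*f):

        import numpy as np
        from scipy.special import j1

        def disc_otf(f, d):
            # OTF of a uniform circular blur spot of diameter d at spatial frequency f
            x = np.pi * d * np.asarray(f, dtype=float)
            safe = np.where(x == 0.0, 1.0, x)        # avoid 0/0 at f = 0
            return np.where(x == 0.0, 1.0, 2.0 * j1(safe) / safe)

        print(disc_otf([0.0, 5.0, 10.0], d=0.1))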

  7. A computer program for the pointwise functions generation

    Caldeira, Alexandre D.

    1995-01-01

    A computer program is presented that generates pointwise functions from a combination of tabulated values and/or mathematical expressions, to be used as weighting functions for nuclear data. This simple program can be an important tool for researchers involved in the generation of group constants. (author). 5 refs, 2 figs
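
    A minimal sketch of the general idea, with an invented energy grid, table and analytic 1/E factor standing in for whatever combinations the program actually supports:

        import numpy as np

        energy = np.logspace(-5, 7, 200)              # eV grid (assumed)
        tab_e = np.array([1e-5, 1.0, 1e3, 1e7])       # tabulated energies (made up)
        tab_w = np.array([1.0, 0.8, 0.5, 0.1])        # tabulated weights (made up)
        tabulated = np.interp(np.log(energy), np.log(tab_e), tab_w)  # log-linear interpolation
        weight = tabulated / energy                   # combine with an analytic 1/E expression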

  8. Positive Wigner functions render classical simulation of quantum computation efficient.

    Mari, A; Eisert, J

    2012-12-07

    We show that quantum circuits where the initial state and all the following quantum operations can be represented by positive Wigner functions can be classically efficiently simulated. This is true both for continuous-variable as well as discrete variable systems in odd prime dimensions, two cases which will be treated on entirely the same footing. Noting the fact that Clifford and Gaussian operations preserve the positivity of the Wigner function, our result generalizes the Gottesman-Knill theorem. Our algorithm provides a way of sampling from the output distribution of a computation or a simulation, including the efficient sampling from an approximate output distribution in the case of sampling imperfections for initial states, gates, or measurements. In this sense, this work highlights the role of the positive Wigner function as separating classically efficiently simulable systems from those that are potentially universal for quantum computing and simulation, and it emphasizes the role of negativity of the Wigner function as a computational resource.

  9. Inferring biological functions of guanylyl cyclases with computational methods

    Alquraishi, May Majed; Meier, Stuart Kurt

    2013-01-01

    A number of studies have shown that functionally related genes are often co-expressed and that computational based co-expression analysis can be used to accurately identify functional relationships between genes and by inference, their encoded proteins. Here we describe how a computational based co-expression analysis can be used to link the function of a specific gene of interest to a defined cellular response. Using a worked example we demonstrate how this methodology is used to link the function of the Arabidopsis Wall-Associated Kinase-Like 10 gene, which encodes a functional guanylyl cyclase, to host responses to pathogens. © Springer Science+Business Media New York 2013.

  11. The Computational Processing of Intonational Prominence: A Functional Prosody Perspective

    Nakatani, Christine Hisayo

    1997-01-01

    Intonational prominence, or accent, is a fundamental prosodic feature that is said to contribute to discourse meaning. This thesis outlines a new, computational theory of the discourse interpretation of prominence, from a FUNCTIONAL PROSODY perspective. Functional prosody makes the following two important assumptions: first, there is an aspect of prominence interpretation that centrally concerns discourse processes, namely the discourse focusing nature of prominence; and second, the role of p...

  12. Quantum computing without wavefunctions: time-dependent density functional theory for universal quantum computation.

    Tempel, David G; Aspuru-Guzik, Alán

    2012-01-01

    We prove that the theorems of TDDFT can be extended to a class of qubit Hamiltonians that are universal for quantum computation. The theorems of TDDFT applied to universal Hamiltonians imply that single-qubit expectation values can be used as the basic variables in quantum computation and information theory, rather than wavefunctions. From a practical standpoint this opens the possibility of approximating observables of interest in quantum computations directly in terms of single-qubit quantities (i.e. as density functionals). Additionally, we also demonstrate that TDDFT provides an exact prescription for simulating universal Hamiltonians with other universal Hamiltonians that have different, and possibly easier-to-realize two-qubit interactions. This establishes the foundations of TDDFT for quantum computation and opens the possibility of developing density functionals for use in quantum algorithms.

  13. Computational design of proteins with novel structure and functions

    Yang Wei; Lai Lu-Hua

    2016-01-01

    Computational design of proteins is a relatively new field, where scientists search the enormous sequence space for sequences that can fold into desired structure and perform desired functions. With the computational approach, proteins can be designed, for example, as regulators of biological processes, novel enzymes, or as biotherapeutics. These approaches not only provide valuable information for understanding of sequence–structure–function relations in proteins, but also hold promise for applications to protein engineering and biomedical research. In this review, we briefly introduce the rationale for computational protein design, then summarize the recent progress in this field, including de novo protein design, enzyme design, and design of protein–protein interactions. Challenges and future prospects of this field are also discussed. (topical review)

  14. Fast computation of complete elliptic integrals and Jacobian elliptic functions

    Fukushima, Toshio

    2009-12-01

    As a preparation step to compute Jacobian elliptic functions efficiently, we created a fast method to calculate the complete elliptic integrals of the first and second kinds, K(m) and E(m), for the standard domain of the elliptic parameter, 0 < m < 1. We then developed a procedure to compute simultaneously three Jacobian elliptic functions, sn(u|m), cn(u|m), and dn(u|m), by repeated usage of the double argument formulae starting from the Maclaurin series expansions with respect to the elliptic argument, u, after its domain is reduced to the standard range, 0 ≤ u < K(m). The new procedure is 25-70% faster than the methods based on the Gauss transformation such as Bulirsch's algorithm, sncndn, quoted in the Numerical Recipes, even if the acceleration of the computation of K(m) is not taken into account.
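
    For comparison, the Gauss-transformation baseline that the paper reports beating can be written in a few lines using the arithmetic-geometric mean; this is the textbook AGM scheme for K(m) and E(m), not Fukushima's faster method:

        import math

        def ellip_KE(m, eps=1e-15):
            # AGM iteration: a, b converge quadratically; E/K = 1 - sum 2^(n-1) c_n^2
            a, b, c = 1.0, math.sqrt(1.0 - m), math.sqrt(m)
            csum, n = 0.5 * c * c, 0                  # n = 0 term of the sum
            while abs(c) > eps:
                a, b, c = 0.5 * (a + b), math.sqrt(a * b), 0.5 * (a - b)
                n += 1
                csum += 2.0 ** (n - 1) * c * c
            K = math.pi / (2.0 * a)
            return K, K * (1.0 - csum)

        print(ellip_KE(0.5))  # ~(1.85407, 1.35064), cf. scipy.special.ellipk/ellipe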

  15. Computing exact bundle compliance control charts via probability generating functions.

    Chen, Binchao; Matis, Timothy; Benneyan, James

    2016-06-01

    Compliance to evidence-based practices, individually and in 'bundles', remains an important focus of healthcare quality improvement for many clinical conditions. The exact probability distribution of composite bundle compliance measures used to develop corresponding control charts and other statistical tests is based on a fairly large convolution whose direct calculation can be computationally prohibitive. Various series expansions and other approximation approaches have been proposed, each with computational and accuracy tradeoffs, especially in the tails. This same probability distribution also arises in other important healthcare applications, such as for risk-adjusted outcomes and bed demand prediction, with the same computational difficulties. As an alternative, we use probability generating functions to rapidly obtain exact results and illustrate the improved accuracy and detection over other methods. Numerical testing across a wide range of applications demonstrates the computational efficiency and accuracy of this approach.
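
    The distribution in question is the Poisson-binomial law of a sum of independent Bernoulli indicators, and the probability generating function approach amounts to an exact polynomial convolution; a minimal sketch with made-up per-item compliance rates:

        import numpy as np

        def exact_pmf(probs):
            # Multiply the per-item PGFs (1 - p_i) + p_i * z; coefficient k is P(count = k)
            pmf = np.array([1.0])
            for p in probs:
                pmf = np.convolve(pmf, [1.0 - p, p])
            return pmf

        pmf = exact_pmf([0.95, 0.90, 0.92, 0.85])   # 4-item bundle, illustrative rates
        print(pmf, pmf.sum())                       # exact probabilities, summing to 1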

  16. Fast and accurate computation of projected two-point functions

    Grasshorn Gebhardt, Henry S.; Jeong, Donghui

    2018-01-01

    We present the two-point function from the fast and accurate spherical Bessel transformation (2-FAST) algorithm (our code is available at https://github.com/hsgg/twoFAST) for a fast and accurate computation of integrals involving one or two spherical Bessel functions. These types of integrals occur when projecting the galaxy power spectrum P(k) onto the configuration space, ξ_ℓ^ν(r), or spherical harmonic space, C_ℓ(χ, χ′). First, we employ the FFTLog transformation of the power spectrum to divide the calculation into P(k)-dependent coefficients and P(k)-independent integrations of basis functions multiplied by spherical Bessel functions. We find analytical expressions for the latter integrals in terms of special functions, for which recursion provides a fast and accurate evaluation. The algorithm, therefore, circumvents direct integration of highly oscillating spherical Bessel functions.
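
    For contrast, the naive highly oscillatory integral that 2-FAST circumvents can be brute-forced on a toy spectrum; the normalization below follows one common convention (prefactors such as i^ℓ vary between papers) and the power law is invented, not real data:

        import numpy as np
        from scipy.integrate import trapezoid
        from scipy.special import spherical_jn

        def xi_naive(ell, r, kmax=20.0, n=200_000):
            # xi_l(r) = (1 / 2 pi^2) * integral of k^2 P(k) j_l(k r) dk, by direct quadrature
            k = np.linspace(1e-4, kmax, n)
            Pk = k / (1.0 + k ** 2) ** 2            # toy power spectrum (assumed)
            integrand = k ** 2 * Pk * spherical_jn(ell, k * r)
            return trapezoid(integrand, k) / (2.0 * np.pi ** 2)

        print(xi_naive(0, r=10.0))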

  17. Computing the hadronic vacuum polarization function by analytic continuation

    Feng, Xu [KEK National High Energy Physics, Tsukuba (Japan); Hashimoto, Shoji [KEK National High Energy Physics, Tsukuba (Japan); The Graduate Univ. for Advanced Studies, Tsukuba (Japan). School of High Energy Accelerator Science; Hotzel, Grit [Humboldt-Universitaet, Berlin (Germany). Inst. fuer Physik; Jansen, Karl [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany). John von Neumann-Inst. fuer Computing NIC; Cyprus Univ., Nicosia (Cyprus). Dept. of Physics; Petschlies, Marcus [The Cyprus Institute, Nicosia (Cyprus); Renner, Dru B. [Thomas Jefferson National Accelerator Facility, Newport News, VA (United States)

    2013-07-15

    We propose a method to compute the hadronic vacuum polarization function on the lattice at continuous values of photon momenta bridging between the space-like and time-like regions. We provide two independent derivations of this method showing that it leads to the desired hadronic vacuum polarization function in Minkowski space-time. We show with the example of the leading-order QCD correction to the muon anomalous magnetic moment that this approach can provide a valuable alternative method for calculations of physical quantities where the hadronic vacuum polarization function enters.

  18. Variance computations for functionals of absolute risk estimates.

    Pfeiffer, R M; Petracci, E

    2011-07-01

    We present a simple influence function based approach to compute the variances of estimates of absolute risk and functions of absolute risk. We apply this approach to criteria that assess the impact of changes in the risk factor distribution on absolute risk for an individual and at the population level. As an illustration we use an absolute risk prediction model for breast cancer that includes modifiable risk factors in addition to standard breast cancer risk factors. Influence function based variance estimates for absolute risk and the criteria are compared to bootstrap variance estimates.
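
    The influence-function recipe is easiest to see on the simplest functional, the sample mean, where IF(x) = x - mean and the variance estimate is the sum of squared influence contributions over n^2; this toy sketch is ours, not the paper's breast cancer application:

        import numpy as np

        x = np.random.default_rng(0).normal(size=500)   # toy data, not patient data
        inf_contrib = x - x.mean()                      # influence function values
        var_if = np.sum(inf_contrib ** 2) / len(x) ** 2
        print(var_if, x.var(ddof=1) / len(x))           # agree up to a factor n/(n-1)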

  19. Computing three-point functions for short operators

    Bargheer, Till; Minahan, Joseph A.; Pereira, Raul

    2013-11-01

    We compute the three-point structure constants for short primary operators of N=4 super Yang-Mills theory to leading order in 1/√(λ) by mapping the problem to a flat-space string theory calculation. We check the validity of our procedure by comparing to known results for three chiral primaries. We then compute the three-point functions for any combination of chiral and non-chiral primaries, with the non-chiral primaries all dual to string states at the first massive level. Along the way we find many cancellations that leave us with simple expressions, suggesting that integrability is playing an important role.

  1. Structure, function, and behaviour of computational models in systems biology.

    Knüpfer, Christian; Beckstein, Clemens; Dittrich, Peter; Le Novère, Nicolas

    2013-05-31

    Systems Biology develops computational models in order to understand biological phenomena. The increasing number and complexity of such "bio-models" necessitate computer support for the overall modelling task. Computer-aided modelling has to be based on a formal semantic description of bio-models. But, even if computational bio-models themselves are represented precisely in terms of mathematical expressions their full meaning is not yet formally specified and only described in natural language. We present a conceptual framework - the meaning facets - which can be used to rigorously specify the semantics of bio-models. A bio-model has a dual interpretation: On the one hand it is a mathematical expression which can be used in computational simulations (intrinsic meaning). On the other hand the model is related to the biological reality (extrinsic meaning). We show that in both cases this interpretation should be performed from three perspectives: the meaning of the model's components (structure), the meaning of the model's intended use (function), and the meaning of the model's dynamics (behaviour). In order to demonstrate the strengths of the meaning facets framework we apply it to two semantically related models of the cell cycle. Thereby, we make use of existing approaches for computer representation of bio-models as much as possible and sketch the missing pieces. The meaning facets framework provides a systematic in-depth approach to the semantics of bio-models. It can serve two important purposes: First, it specifies and structures the information which biologists have to take into account if they build, use and exchange models. Secondly, because it can be formalised, the framework is a solid foundation for any sort of computer support in bio-modelling. The proposed conceptual framework establishes a new methodology for modelling in Systems Biology and constitutes a basis for computer-aided collaborative research.

  3. Image reconstruction of computed tomograms using functional algebra

    Bradaczek, M.; Bradaczek, H.

    1997-01-01

    A detailed presentation of the process for calculating computed tomograms from the measured data by means of functional algebra is given, and an attempt is made to make the relationships clear to readers inexperienced in mathematics. Suggestions are also made to the manufacturers for improving tomography software, although the authors cannot exclude the possibility that some of the recommendations may have already been realized. An interpolation in Fourier space to right-angled coordinates was not employed, so that additional computer time and errors resulting from the interpolation are avoided. The savings in calculation time can only be estimated but should amount to about 25%. The error-correction calculation is merely a suggestion, since it depends considerably on the apparatus used. Functional algebra is introduced here because it is not so well known but provides appreciable simplifications in comparison to an explicit presentation. Didactic reasons as well as the possibility of reducing calculation time provided the foundation for this work. (orig.)

  4. FCJ-131 Pervasive Computing and Prosopopoietic Modelling – Notes on computed function and creative action

    Anders Michelsen

    2011-12-01

    This article treats the philosophical underpinnings of the notions of ubiquity and pervasive computing from a historical perspective. The current focus on these notions reflects the ever increasing impact of new media and the underlying complexity of computed function in the broad sense of ICT that has spread vertiginously since Mark Weiser coined the term 'pervasive', e.g., digitalised sensing, monitoring, effectuation, intelligence, and display. Whereas Weiser's original perspective may seem fulfilled since computing is everywhere, in his and Seely Brown's (1997) terms, 'invisible', on the horizon, 'calm', it also points to a much more important and slightly different perspective: that of creative action upon novel forms of artifice. Most importantly for this article, ubiquity and pervasive computing are seen to point to the continuous existence throughout the computational heritage since the mid-20th century of a paradoxical distinction/complicity between the technical organisation of computed function and the human Being, in the sense of creative action upon such function. This paradoxical distinction/complicity promotes a chiastic (Merleau-Ponty) relationship of extension of one into the other. It also indicates a generative creation that itself points to important issues of ontology with methodological implications for the design of computing. In this article these implications will be conceptualised as prosopopoietic modeling on the basis of Bernward Joerges' introduction of the classical rhetoric term of 'prosopopoeia' into the debate on large technological systems. First, the paper introduces the paradoxical distinction/complicity by debating Gilbert Simondon's notion of a 'margin of indeterminacy' vis-a-vis computing. Second, it debates the idea of prosopopoietic modeling, pointing to a principal role of the paradoxical distinction/complicity within the computational heritage in three cases: a. Prosopopoietic

  5. Using computational models to relate structural and functional brain connectivity

    Hlinka, Jaroslav; Coombes, S.

    2012-01-01

    Vol. 36, No. 2 (2012), pp. 2137-2145 ISSN 0953-816X R&D Projects: GA MŠk 7E08027 EU Projects: European Commission(XE) 200728 - BRAINSYNC Institutional research plan: CEZ:AV0Z10300504 Keywords: brain disease * computational modelling * functional connectivity * graph theory * structural connectivity Subject RIV: FH - Neurology Impact factor: 3.753, year: 2012

  6. Brookhaven Reactor Experiment Control Facility, a distributed function computer network

    Dimmler, D.G.; Greenlaw, N.; Kelley, M.A.; Potter, D.W.; Rankowitz, S.; Stubblefield, F.W.

    1975-11-01

    A computer network for real-time data acquisition, monitoring and control of a series of experiments at the Brookhaven High Flux Beam Reactor has been developed and has been set into routine operation. This reactor experiment control facility presently services nine neutron spectrometers and one x-ray diffractometer. Several additional experiment connections are in progress. The architecture of the facility is based on a distributed function network concept. A statement of implementation and results is presented

  7. Computation of Bessel functions in light scattering studies.

    Ross, W D

    1972-09-01

    Computations of light scattering require finding Bessel functions of a series of orders. These are found most easily by recurrence, but excessive rounding errors may accumulate. Satisfactory procedures for cylinder and sphere functions are described. If the argument z is real, find Y_n(z) by recurrence to high orders. From two high orders of Y_n(z), estimate J_n(z). Use backward recurrence to the maximum J_n(z). Correct by forward recurrence to the maximum. If z is complex, estimate high orders of J_n(z) without Y_n(z) and use backward recurrence.
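
    The backward-recurrence step described here is essentially Miller's algorithm; a minimal real-argument sketch, with an arbitrary starting-order margin and the usual normalization identity J_0(x) + 2*(J_2(x) + J_4(x) + ...) = 1 in place of the Y_n-based scaling:

        def bessel_j_backward(nmax, x, start_extra=20):
            # Recurse J_{n-1} = (2n/x) J_n - J_{n+1} downward from a tiny seed,
            # then rescale with the identity J_0 + 2*(J_2 + J_4 + ...) = 1.
            m = nmax + start_extra
            jp, j = 0.0, 1e-30
            out = [0.0] * (nmax + 1)
            norm = 0.0
            for n in range(m, 0, -1):
                jp, j = j, 2.0 * n / x * j - jp
                if n - 1 <= nmax:
                    out[n - 1] = j
                if (n - 1) % 2 == 0:
                    norm += 2.0 * j if n - 1 > 0 else j
            return [v / norm for v in out]

        print(bessel_j_backward(5, 2.0)[0])  # ~0.22389, i.e. J_0(2)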

  8. Computations of nuclear response functions with MACK-IV

    Abdou, M.A.; Gohar, Y.

    1978-01-01

    The MACK computer program calculates energy pointwise and multigroup nuclear response functions from basic nuclear data in ENDF/B format. The new version of the program, MACK-IV, incorporates major developments and improvements aimed at maximizing the utilization of available nuclear data and ensuring energy conservation in nuclear heating calculations. A new library, MACKLIB-IV, of nuclear response functions was generated in the CTR energy group structure of 171 neutron groups and 36 gamma groups. The library was prepared using MACK-IV and ENDF/B-IV and is suitable for fusion, fusion-fission hybrids, and fission applications

  9. Efficient quantum algorithm for computing n-time correlation functions.

    Pedernales, J S; Di Candia, R; Egusquiza, I L; Casanova, J; Solano, E

    2014-07-11

    We propose a method for computing n-time correlation functions of arbitrary spinorial, fermionic, and bosonic operators, consisting of an efficient quantum algorithm that encodes these correlations in an initially added ancillary qubit for probe and control tasks. For spinorial and fermionic systems, the reconstruction of arbitrary n-time correlation functions requires the measurement of two ancilla observables, while for bosonic variables time derivatives of the same observables are needed. Finally, we provide examples applicable to different quantum platforms in the frame of the linear response theory.

  11. Optimized Kaiser-Bessel Window Functions for Computed Tomography.

    Nilchian, Masih; Ward, John Paul; Vonesch, Cedric; Unser, Michael

    2015-11-01

    Kaiser-Bessel window functions are frequently used to discretize tomographic problems because they have two desirable properties: 1) their short support leads to a low computational cost and 2) their rotational symmetry makes their imaging transform independent of the direction. In this paper, we aim at optimizing the parameters of these basis functions. We present a formalism based on the theory of approximation and point out the importance of the partition-of-unity condition. While we prove that, for compact-support functions, this condition is incompatible with isotropy, we show that minimizing the deviation from the partition of unity condition is highly beneficial. The numerical results confirm that the proposed tuning of the Kaiser-Bessel window functions yields the best performance.
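
    For reference, the generalized Kaiser-Bessel (blob) window of Lewitt has the closed form b(r) = (sqrt(1-(r/a)^2))^m * I_m(alpha*sqrt(1-(r/a)^2)) / I_m(alpha) for |r| <= a; the parameter values below are commonly quoted defaults, not the optimized values derived in the paper:

        import numpy as np
        from scipy.special import iv

        def kaiser_bessel(r, a=2.0, alpha=10.4, m=2):
            r = np.asarray(r, dtype=float)
            s = np.sqrt(np.clip(1.0 - (r / a) ** 2, 0.0, None))   # 0 outside the support
            return np.where(np.abs(r) <= a, s ** m * iv(m, alpha * s) / iv(m, alpha), 0.0)

        print(kaiser_bessel([0.0, 1.0, 2.0, 3.0]))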

  12. Global sensitivity analysis of computer models with functional inputs

    Iooss, Bertrand; Ribatet, Mathieu

    2009-01-01

    Global sensitivity analysis is used to quantify the influence of uncertain model inputs on the response variability of a numerical model. The common quantitative methods are appropriate for computer codes with scalar model inputs. This paper aims at illustrating different variance-based sensitivity analysis techniques, based on the so-called Sobol' indices, when some model inputs are functional, such as stochastic processes or random spatial fields. In this work, we focus on CPU-intensive computer codes which need a preliminary metamodeling step before performing the sensitivity analysis. We propose the use of the joint modeling approach, i.e., modeling simultaneously the mean and the dispersion of the code outputs using two interlinked generalized linear models (GLMs) or generalized additive models (GAMs). The 'mean model' allows one to estimate the sensitivity indices of each scalar model input, while the 'dispersion model' allows one to derive the total sensitivity index of the functional model inputs. The proposed approach is compared to some classical sensitivity analysis methodologies on an analytical function. Lastly, the new methodology is applied to an industrial computer code that simulates nuclear fuel irradiation.
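
    A bare-bones pick-freeze Monte Carlo estimate of first-order Sobol' indices on the Ishigami test function shows the quantity being estimated for scalar inputs; the paper's joint GLM/GAM metamodeling is deliberately not reproduced here:

        import numpy as np

        def ishigami(X, a=7.0, b=0.1):
            return np.sin(X[:, 0]) + a * np.sin(X[:, 1]) ** 2 \
                   + b * X[:, 2] ** 4 * np.sin(X[:, 0])

        rng = np.random.default_rng(1)
        N, d = 100_000, 3
        A = rng.uniform(-np.pi, np.pi, (N, d))      # two independent sample matrices
        B = rng.uniform(-np.pi, np.pi, (N, d))
        yA, yB = ishigami(A), ishigami(B)
        V = np.var(np.concatenate([yA, yB]))
        for i in range(d):
            ABi = A.copy()
            ABi[:, i] = B[:, i]                     # "freeze" column i from B
            Si = np.mean(yB * (ishigami(ABi) - yA)) / V   # Saltelli 2010 estimator
            print(f"S{i + 1} ~ {Si:.3f}")           # ~0.31, 0.44, 0.00 analytically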

  13. A hybrid method for the parallel computation of Green's functions

    Petersen, Dan Erik; Li Song; Stokbro, Kurt; Sorensen, Hans Henrik B.; Hansen, Per Christian; Skelboe, Stig; Darve, Eric

    2009-01-01

    Quantum transport models for nanodevices using the non-equilibrium Green's function method require the repeated calculation of the block tridiagonal part of the Green's and lesser Green's function matrices. This problem is related to the calculation of the inverse of a sparse matrix. Because of the large number of times this calculation needs to be performed, this is computationally very expensive even on supercomputers. The classical approach is based on recurrence formulas which cannot be efficiently parallelized. This practically prevents the solution of large problems with hundreds of thousands of atoms. We propose new recurrences for a general class of sparse matrices to calculate Green's and lesser Green's function matrices which extend formulas derived by Takahashi and others. We show that these recurrences may lead to a dramatically reduced computational cost because they only require computing a small number of entries of the inverse matrix. Then, we propose a parallelization strategy for block tridiagonal matrices which involves a combination of Schur complement calculations and cyclic reduction. It achieves good scalability even on problems of modest size.

  14. Assessing Mand Topography Preference When Developing a Functional Communication Training Intervention.

    Kunnavatana, S Shanun; Wolfe, Katie; Aguilar, Alexandra N

    2018-05-01

    Functional communication training (FCT) is a common function-based behavioral intervention used to decrease problem behavior by teaching an alternative communication response. Therapists often arbitrarily select the topography of the alternative response, which may influence long-term effectiveness of the intervention. Assessing individual mand topography preference may increase treatment effectiveness and promote self-determination in the development of interventions. This study sought to reduce arbitrary selection of FCT mand topography by determining preference during response training and acquisition for two adults with autism who had no functional communication skills. Both participants demonstrated a clear preference for one mand topography during choice probes, and the preferred topography was then reinforced during FCT to reduce problem behavior and increase independent communication. The implications of the results for future research on mand selection during FCT are discussed.

  15. Functional Communication Training

    Durand, V. Mark; Moskowitz, Lauren

    2015-01-01

    Thirty years ago, the first experimental demonstration was published showing that educators could improve significant challenging behavior in children with disabilities by replacing these behaviors with forms of communication that served the same purpose, a procedure called functional communication training (FCT). Since the publication of that…

  16. Symbolic Computation, Number Theory, Special Functions, Physics and Combinatorics

    Ismail, Mourad

    2001-01-01

    These are the proceedings of the conference "Symbolic Computation, Number Theory, Special Functions, Physics and Combinatorics" held at the Department of Mathematics, University of Florida, Gainesville, from November 11 to 13, 1999. The main emphasis of the conference was Computer Algebra (i.e. symbolic computation) and how it related to the fields of Number Theory, Special Functions, Physics and Combinatorics. A subject that is common to all of these fields is q-series. We brought together those who do symbolic computation with q-series and those who need q-series, including workers in Physics and Combinatorics. The goal of the conference was to inform mathematicians and physicists who use q-series of the latest developments in the field of q-series and especially how symbolic computation has aided these developments. Over 60 people were invited to participate in the conference. We ended up having 45 participants at the conference, including six one-hour plenary speakers and 28 half-hour speakers. T...

  17. 21 CFR 870.1435 - Single-function, preprogrammed diagnostic computer.

    2010-04-01

    Title 21, Food and Drugs. Single-function, preprogrammed diagnostic computer. (a) Identification. A single-function, preprogrammed diagnostic computer is a hard-wired computer that calculates a specific physiological or blood-flow parameter...

  18. On the effectiveness of and preference for punishment and extinction components of function-based interventions.

    Hanley, Gregory P; Piazza, Cathleen C; Fisher, Wayne W; Maglieri, Kristen A

    2005-01-01

    The current study describes an assessment sequence that may be used to identify individualized, effective, and preferred interventions for severe problem behavior in lieu of relying on a restricted set of treatment options that are assumed to be in the best interest of consumers. The relative effectiveness of functional communication training (FCT) with and without a punishment component was evaluated with 2 children for whom functional analyses demonstrated behavioral maintenance via social positive reinforcement. The results showed that FCT plus punishment was more effective than FCT in reducing problem behavior. Subsequently, participants' relative preference for each treatment was evaluated in a concurrent-chains arrangement, and both participants demonstrated a clear preference for FCT with punishment. These findings suggest that the treatment-selection process may be guided by person-centered and evidence-based values.

  19. Computer functions in overall plant control of CANDU generating stations

    Chou, Q.B.; Stokes, H.W.

    1976-01-01

    System Planning Specifications form the basic requirements for the performance of the plant, including its response to abnormal situations. The rules for the computer control programs are derived from these, taking into account limitations imposed by the reactor, heat transport and turbine-generator systems. The paper outlines these specifications and the limitations imposed by the major items of plant equipment. It describes the functions of each of the main programs, their interactions and the control modes used in Ontario Hydro's existing nuclear stations or proposed for future stations. Some simulation results showing the performance of the overall unit control system and plans for future studies are discussed. (orig.)

  20. Computing the effective action with the functional renormalization group

    Codello, Alessandro [CP3-Origins and the Danish IAS University of Southern Denmark, Odense (Denmark); Percacci, Roberto [SISSA, Trieste (Italy); INFN, Sezione di Trieste, Trieste (Italy); Rachwal, Leslaw [Fudan University, Department of Physics, Center for Field Theory and Particle Physics, Shanghai (China); Tonero, Alberto [ICTP-SAIFR and IFT, Sao Paulo (Brazil)

    2016-04-15

    The "exact" or "functional" renormalization group equation describes the renormalization group flow of the effective average action Γ_k. The ordinary effective action Γ_0 can be obtained by integrating the flow equation from an ultraviolet scale k = Λ down to k = 0. We give several examples of such calculations at one loop, both in renormalizable and in effective field theories. We reproduce the four-point scattering amplitude in the case of a real scalar field theory with quartic potential and in the case of the pion chiral Lagrangian. In the case of gauge theories, we reproduce the vacuum polarization of QED and of Yang-Mills theory. We also compute the two-point functions for scalars and gravitons in the effective field theory of scalar fields minimally coupled to gravity. (orig.)

  1. Computational Models for Calcium-Mediated Astrocyte Functions

    Tiina Manninen

    2018-04-01

    The computational neuroscience field has heavily concentrated on the modeling of neuronal functions, largely ignoring other brain cells, including one type of glial cell, the astrocytes. Despite the short history of modeling astrocytic functions, we were delighted about the hundreds of models developed so far to study the role of astrocytes, most often in calcium dynamics, synchronization, information transfer, and plasticity in vitro, but also in vascular events, hyperexcitability, and homeostasis. Our goal here is to present the state-of-the-art in computational modeling of astrocytes in order to facilitate better understanding of the functions and dynamics of astrocytes in the brain. Due to the large number of models, we concentrated on a hundred models that include biophysical descriptions for calcium signaling and dynamics in astrocytes. We categorized the models into four groups: single astrocyte models, astrocyte network models, neuron-astrocyte synapse models, and neuron-astrocyte network models to ease their use in future modeling projects. We characterized the models based on which earlier models were used for building the models and which type of biological entities were described in the astrocyte models. Features of the models were compared and contrasted so that similarities and differences were more readily apparent. We discovered that most of the models were basically generated from a small set of previously published models with small variations. However, neither citations to all the previous models with similar core structure nor explanations of what was built on top of the previous models were provided, which made it possible, in some cases, to have the same models published several times without an explicit intention to make new predictions about the roles of astrocytes in brain functions. Furthermore, only a few of the models are available online, which makes it difficult to reproduce the simulation results and further develop the models.

  3. An evolutionary computation approach to examine functional brain plasticity

    Arnab Roy

    2016-04-01

    One common research goal in systems neurosciences is to understand how the functional relationship between a pair of regions of interest (ROIs) evolves over time. Examining neural connectivity in this way is well-suited for the study of developmental processes, learning, and even recovery or treatment designs in response to injury. For most fMRI based studies, the strength of the functional relationship between two ROIs is defined as the correlation between the average signals representing each region. The drawback to this approach is that much information is lost due to averaging heterogeneous voxels, and therefore a functional relationship between an ROI pair that evolves at a spatial scale much finer than the ROIs remains undetected. To address this shortcoming, we introduce a novel evolutionary computation (EC) based voxel-level procedure to examine functional plasticity between an investigator-defined ROI pair by simultaneously using subject-specific BOLD-fMRI data collected from two sessions separated by a finite duration of time. This data-driven procedure detects a sub-region composed of spatially connected voxels from each ROI (a so-called sub-regional pair) such that the pair shows a significant gain/loss of functional relationship strength across the two time points. The procedure is recursive and iteratively finds all statistically significant sub-regional pairs within the ROIs. Using this approach, we examine functional plasticity between the default mode network (DMN) and the executive control network (ECN) during recovery from traumatic brain injury (TBI); the study includes 14 TBI and 12 healthy control subjects. We demonstrate that the EC based procedure is able to detect functional plasticity where a traditional averaging based approach fails. The subject-specific plasticity estimates obtained using the EC procedure are highly consistent across multiple runs. Group-level analyses using these plasticity estimates showed an increase in

  4. Computer Modeling of the Earliest Cellular Structures and Functions

    Pohorille, Andrew; Chipot, Christophe; Schweighofer, Karl

    2000-01-01

    In the absence of an extinct or extant record of protocells (the earliest ancestors of contemporary cells), the most direct way to test our understanding of the origin of cellular life is to construct laboratory models of protocells. Such efforts are currently underway in the NASA Astrobiology Program. They are accompanied by computational studies aimed at explaining the self-organization of simple molecules into ordered structures and developing designs for molecules that perform proto-cellular functions. Many of these functions, such as import of nutrients, capture and storage of energy, and response to changes in the environment, are carried out by proteins bound to membranes. The simulations address (a) how peptides organize at water-membrane interfaces and insert into membranes, (b) how these peptides aggregate to form membrane-spanning structures (e.g. channels), and (c) by what mechanisms such aggregates perform essential proto-cellular functions, such as the transport of protons across cell walls, a key step in cellular bioenergetics. The simulations were performed using the molecular dynamics method, in which Newton's equations of motion for each atom in the system are solved iteratively. The problems of interest required simulations on multi-nanosecond time scales, which corresponded to 10^6-10^8 time steps.

  5. Trial-Based Functional Analysis and Functional Communication Training in an Early Childhood Setting

    Lambert, Joseph M.; Bloom, Sarah E.; Irvin, Jennifer

    2012-01-01

    Problem behavior is common in early childhood special education classrooms. Functional communication training (FCT; Carr & Durand, 1985) may reduce problem behavior but requires identification of its function. The trial-based functional analysis (FA) is a method that can be used to identify problem behavior function in schools. We conducted…

  6. Computing the Partition Function for Kinetically Trapped RNA Secondary Structures

    Lorenz, William A.; Clote, Peter

    2011-01-01

    An RNA secondary structure is locally optimal if there is no lower energy structure that can be obtained by the addition or removal of a single base pair, where energy is defined according to the widely accepted Turner nearest neighbor model. Locally optimal structures form kinetic traps, since any evolution away from a locally optimal structure must involve energetically unfavorable folding steps. Here, we present a novel, efficient algorithm to compute the partition function over all locally optimal secondary structures of a given RNA sequence. Our software, RNAlocopt, runs in O(n³) time and O(n²) space. Additionally, RNAlocopt samples a user-specified number of structures from the Boltzmann subensemble of all locally optimal structures. We apply RNAlocopt to show that (1) the number of locally optimal structures is far fewer than the total number of structures – indeed, the number of locally optimal structures is approximately equal to the square root of the number of all structures, (2) the structural diversity of this subensemble may be either similar to or quite different from the structural diversity of the entire Boltzmann ensemble, a situation that depends on the type of input RNA, (3) the (modified) maximum expected accuracy structure, computed by taking into account base pairing frequencies of locally optimal structures, is a more accurate prediction of the native structure than other current thermodynamics-based methods. The software RNAlocopt constitutes a technical breakthrough in our study of the folding landscape for RNA secondary structures. For the first time, locally optimal structures (kinetic traps in the Turner energy model) can be rapidly generated for long RNA sequences, previously impossible with methods that involved exhaustive enumeration. Use of locally optimal structures leads to state-of-the-art secondary structure prediction, as benchmarked against methods involving the computation of minimum free energy and of maximum expected accuracy.
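
    The Boltzmann bookkeeping behind such a subensemble is easy to state for a toy, explicitly enumerated set of structure energies (the values below are invented); RNAlocopt's contribution is doing this implicitly over an exponentially large set of locally optimal structures:

        import numpy as np

        energies = np.array([-12.3, -11.8, -9.5, -7.1])   # kcal/mol, made-up values
        RT = 0.6163                                       # kcal/mol at 37 C
        weights = np.exp(-energies / RT)
        Z = weights.sum()                                 # partition function
        probs = weights / Z                               # Boltzmann probabilities
        sample = np.random.default_rng(0).choice(len(energies), size=5, p=probs)
        print(Z, probs, sample)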

  8. Automatic quantitative analysis of liver functions by a computer system

    Shinpo, Takako

    1984-01-01

    In the previous paper, we confirmed the clinical usefulness of hepatic clearance (hepatic blood flow), which is given by the hepatic uptake and blood disappearance rate coefficients. These were obtained from the initial slope index of each minute during a period of five frames of a hepatogram after injecting 37 MBq of 99mTc-Sn-colloid. To analyze the information simply, rapidly and accurately, we developed an automatic quantitative analysis for liver functions. Information was obtained every quarter minute during a period of 60 frames of the sequential image. The sequential counts were measured for the heart, the whole liver, and the left and right lobes using a computer connected to a scintillation camera. We measured the effective hepatic blood flow from the disappearance rate multiplied by the percentage of hepatic uptake, defined as (liver counts)/(total counts of the field). Our method of analysis automatically recorded the reappearance graph of the disappearance curve and uptake curve on the basis of the heart and the whole liver, respectively, and computed them using the BASIC language. This method makes it possible to obtain the image of the initial uptake of 99mTc-Sn-colloid into the liver with a small dose. (author)
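
    A minimal sketch of the computation described, with invented counts: the blood disappearance rate comes from a log-linear fit to the heart-ROI curve, and is scaled by the hepatic uptake fraction, (liver counts)/(total counts of the field):

        import numpy as np

        t = np.arange(0.25, 5.25, 0.25)                    # minutes, quarter-minute frames
        heart = 9000.0 * np.exp(-0.35 * t)                 # made-up heart ROI counts
        k_disappear = -np.polyfit(t, np.log(heart), 1)[0]  # blood disappearance rate coefficient
        uptake_fraction = 52000.0 / 80000.0                # liver counts / total field counts
        print(k_disappear * uptake_fraction)               # effective hepatic blood flow index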

  9. Imaging local brain function with emission computed tomography

    Kuhl, D.E.

    1984-01-01

    Positron emission tomography (PET) using 18F-fluorodeoxyglucose (FDG) was used to map local cerebral glucose utilization in the study of local cerebral function. This information differs fundamentally from structural assessment by means of computed tomography (CT). In normal human volunteers, the FDG scan was used to determine the cerebral metabolic response to controlled sensory stimulation and the effects of aging. Cerebral metabolic patterns are distinctive among depressed and demented elderly patients. The FDG scan appears normal in the depressed patient and studded with multiple metabolic defects in patients with multiple infarct dementia; in patients with Alzheimer disease, metabolism is particularly reduced in the parietal cortex, but only slightly reduced in the caudate and thalamus. The interictal FDG scan effectively detects hypometabolic brain zones that are sites of onset for seizures in patients with partial epilepsy, even though these zones usually appear normal on CT scans. The future prospects of PET are discussed.

  10. The Impact of Computer Use on Learning of Quadratic Functions

    Pihlap, Sirje

    2017-01-01

    Studies of the impact of various types of computer use on the results of learning and student motivation have indicated that the use of computers can increase learning motivation, and that computers can have a positive effect, a negative effect, or no effect at all on learning outcomes. Some results indicate that it is not computer use itself that…

  11. A computer vision based candidate for functional balance test.

    Nalci, Alican; Khodamoradi, Alireza; Balkan, Ozgur; Nahab, Fatta; Garudadri, Harinath

    2015-08-01

    Balance in humans is a motor skill based on complex multimodal sensing, processing and control. Ability to maintain balance in activities of daily living (ADL) is compromised due to aging, diseases, injuries and environmental factors. Center for Disease Control and Prevention (CDC) estimate of the costs of falls among older adults was $34 billion in 2013 and is expected to reach $54.9 billion in 2020. In this paper, we present a brief review of balance impairments followed by subjective and objective tools currently used in clinical settings for human balance assessment. We propose a novel computer vision (CV) based approach as a candidate for functional balance test. The test will take less than a minute to administer and expected to be objective, repeatable and highly discriminative in quantifying ability to maintain posture and balance. We present an informal study with preliminary data from 10 healthy volunteers, and compare performance with a balance assessment system called BTrackS Balance Assessment Board. Our results show high degree of correlation with BTrackS. The proposed system promises to be a good candidate for objective functional balance tests and warrants further investigations to assess validity in clinical settings, including acute care, long term care and assisted living care facilities. Our long term goals include non-intrusive approaches to assess balance competence during ADL in independent living environments.

  12. Specific features of vocal fold paralysis in functional computed tomography

    Laskowska, K.; Mackiewicz-Nartowicz, H.; Serafin, Z.; Nawrocka, E.

    2008-01-01

    Vocal fold paralysis is usually recognized on laryngological examination, and detailed vocal fold function may be established based on laryngovideostroboscopy. Additional imaging should exclude any morphological causes of the paresis, which should be treated pharmacologically or surgically. The aim of this paper was to analyze the computed tomography (CT) images of the larynx in patients with unilateral vocal fold paralysis. CT examinations of the larynx were performed in 10 patients with clinically defined unilateral vocal fold paralysis. The examinations consisted of an unenhanced acquisition and an enhanced 3-phase acquisition: during free breathing, Valsalva maneuver, and phonation. The analysis included the following morphologic features of the paresis: the deepened epiglottic vallecula, the deepened piriform recess, the thickened and medially positioned aryepiglottic fold, the widened laryngeal pouch, the anteriorly positioned arytenoid cartilage, the thickened vocal fold, and the filled infraglottic space in frontal CT reconstruction. CT images were compared to laryngovideostroboscopy. The most common symptoms of vocal cord paralysis in CT were the deepened epiglottic vallecula and piriform recess, the widened laryngeal pouch with the filled infraglottic space, and the thickened aryepiglottic fold. Regarding the efficiency of paralysis determination, the three functional CT imaging techniques used did not differ significantly, and laryngovideostroboscopy demonstrated its advantage over CT. CT of the larynx is a supplementary examination in the diagnosis of vocal fold paralysis, which may enable topographic analysis of the fold dysfunction. Knowledge of the morphological CT features of the paralysis may help to prevent false-positive diagnoses of laryngeal cancer. (author)

  13. Functional magnetic resonance maps obtained by personal computer

    Gomez, F. j.; Manjon, J. V.; Robles, M.; Marti-Bonmati, L.; Dosda, R.; Molla, E.

    2001-01-01

    Functional magnetic resonance (fMR) is of special relevance in the analysis of certain types of brain activation. The present report describes the development of a simple software program for use with personal computers (PCs) that analyzes these images and provides functional activation maps. Activation maps are based on the temporal differences in oxyhemoglobin in tomographic images. To detect these differences, intensities registered repeatedly during brain control and activation states are compared. The experiments were performed with a 1.5-Tesla MR unit. To verify the reliability of the program, fMR studies were carried out in 4 healthy individuals (12 contiguous slices, 80 images per slice every 3.1 seconds, for a total of 960 images). All the images were transferred to a PC and were processed pixel by pixel within each sequence to obtain an intensity/time curve. The statistical study of the results (Student's t-test and cross-correlation analysis) made it possible to establish the activation of each pixel. The images were prepared using spatial filtering, temporal filtering, baseline correction, normalization and segmentation of the parenchyma. The postprocessing of the results involved the elimination of single pixels, superposition on an anatomical image of greater spatial resolution, and anti-aliasing. The application (Xfun 1.0, Valencia, Spain) was developed in Microsoft Visual C++ 5.0 Developer Studio for Windows NT Workstation. As a representative example, the program took 8.2 seconds to calculate and present the results of an entire study (12 functional maps). In the motor and visual activation experiments, activation was observed in regions proximal to the central sulcus of the hemisphere contralateral to the moving hand and in the occipital cortex. While programs that calculate activation maps are available, the development of software for PCs running Microsoft Windows ensures several key features for its use on a daily basis: it is easy
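
    A minimal sketch of this kind of pixel-by-pixel block comparison, assuming a block design encoded as a boolean vector and a per-pixel Student's t-test (the function and threshold names are ours, not Xfun's):

        import numpy as np
        from scipy import stats

        def activation_map(series, is_active, alpha=0.001):
            # series: (T, H, W) image time series; is_active: boolean (T,) vector,
            # True for images acquired during activation, False during control.
            t, p = stats.ttest_ind(series[is_active], series[~is_active], axis=0)
            return np.where(p < alpha, t, 0.0)   # keep only significant pixels

        rng = np.random.default_rng(0)
        series = rng.normal(size=(80, 64, 64))           # 80 images of one slice
        design = np.tile([False] * 10 + [True] * 10, 4)  # alternating rest/task blocks
        series[design, 30:34, 30:34] += 1.0              # synthetic activated region
        print((activation_map(series, design) != 0).sum(), "pixels flagged active")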

  14. Computation of Value Functions in Nonlinear Differential Games with State Constraints

    Botkin, Nikolai; Hoffmann, Karl-Heinz; Mayer, Natalie; Turova, Varvara

    2013-01-01

    Finite-difference schemes for the computation of value functions of nonlinear differential games with non-terminal payoff functional and state constraints are proposed. The solution method is based on the fact that the value function is a

  15. Computational Benchmarking for Ultrafast Electron Dynamics: Wave Function Methods vs Density Functional Theory.

    Oliveira, Micael J T; Mignolet, Benoit; Kus, Tomasz; Papadopoulos, Theodoros A; Remacle, F; Verstraete, Matthieu J

    2015-05-12

    Attosecond electron dynamics in small- and medium-sized molecules, induced by an ultrashort strong optical pulse, is studied computationally for a frozen nuclear geometry. The importance of exchange and correlation effects on the nonequilibrium electron dynamics induced by the interaction of the molecule with the strong optical pulse is analyzed by comparing the solution of the time-dependent Schrödinger equation based on the correlated field-free stationary electronic states computed with the equation-of-motion coupled cluster singles and doubles and the complete active space multi-configurational self-consistent field methodologies on the one hand, and various functionals in real-time time-dependent density functional theory (TDDFT) on the other. We aim to evaluate the performance of the latter approach, which is very widely used for nonlinear absorption processes and whose computational cost has a more favorable scaling with the system size. We focus on LiH as a toy model for a nontrivial molecule and show that our conclusions carry over to larger molecules, exemplified by ABCU (C10H19N). The molecules are probed with IR and UV pulses whose intensities are not strong enough to significantly ionize the system. By comparing the evolution of the time-dependent field-free electronic dipole moment, as well as its Fourier power spectrum, we show that TDDFT performs qualitatively well in most cases. Contrary to previous studies, we find almost no changes in the TDDFT excitation energies when excited states are populated. Transitions between states of different symmetries are induced using pulses polarized in different directions. We observe that the performance of TDDFT does not depend on the symmetry of the states involved in the transition.
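
    The dipole-based comparison described above amounts, in practice, to Fourier analysis of a scalar time series; a small illustration of that step (our own construction, not the authors' code):

        import numpy as np

        def dipole_power_spectrum(mu, dt):
            # mu: dipole moment sampled every dt; a window reduces spectral leakage
            mu = (mu - mu.mean()) * np.hanning(len(mu))
            return np.fft.rfftfreq(len(mu), d=dt), np.abs(np.fft.rfft(mu)) ** 2

        t = np.arange(4096) * 0.05
        mu = 0.3 * np.sin(2 * np.pi * 0.8 * t) + 0.1 * np.sin(2 * np.pi * 2.1 * t)
        freqs, power = dipole_power_spectrum(mu, 0.05)
        print(freqs[np.argmax(power)])   # ~0.8, the dominant spectral line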

  16. COMPUTING

    I. Fisk

    2011-01-01

    Introduction The CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year's, which results in longer reconstruction times and more demanding simulation. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load are close to the planning predictions. All computing centre tiers performed their expected functions. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference, where a large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  17. Discrete-Trial Functional Analysis and Functional Communication Training with Three Individuals with Autism and Severe Problem Behavior

    Schmidt, Jonathan D.; Drasgow, Erik; Halle, James W.; Martin, Christian A.; Bliss, Sacha A.

    2014-01-01

    Discrete-trial functional analysis (DTFA) is an experimental method for determining the variables maintaining problem behavior in the context of natural routines. Functional communication training (FCT) is an effective method for replacing problem behavior, once identified, with a functionally equivalent response. We implemented these procedures…

  18. A brain-computer interface to support functional recovery

    Kjaer, Troels W; Sørensen, Helge Bjarup Dissing

    2013-01-01

    Brain-computer interfaces (BCI) register changes in brain activity and utilize this to control computers. The most widely used method is based on registration of electrical signals from the cerebral cortex using extracranially placed electrodes, also called electroencephalography (EEG). The features extracted from the EEG may, besides controlling the computer, also be fed back to the patient, for instance as visual input. This facilitates a learning process. BCI allow us to utilize brain activity in the rehabilitation of patients after stroke. The activity of the cerebral cortex varies with the type of movement we imagine, and by letting the patient know the type of brain activity best associated with the intended movement the rehabilitation process may be faster and more efficient. The focus of BCI utilization in medicine has changed in recent years. While we previously focused on devices facilitating...

  19. The analysis of gastric function using computational techniques

    Young, Paul

    2002-01-01

    The work presented in this thesis was carried out at the Magnetic Resonance Centre, Department of Physics and Astronomy, University of Nottingham, between October 1996 and June 2000. This thesis describes the application of computerised techniques to the analysis of gastric function, in relation to Magnetic Resonance Imaging data. The implementation of a computer program enabling the measurement of motility in the lower stomach is described in Chapter 6. This method allowed the dimensional reduction of multi-slice image data sets into a 'Motility Plot', from which the motility parameters - the frequency, velocity and depth of contractions - could be measured. The technique was found to be simple, accurate and involved substantial time savings, when compared to manual analysis. The program was subsequently used in the measurement of motility in three separate studies, described in Chapter 7. In Study 1, four different meal types of varying viscosity and nutrient value were consumed by 12 volunteers. The aim of the study was (i) to assess the feasibility of using the motility program in a volunteer study and (ii) to determine the effects of the meals on motility. The results showed that the parameters were remarkably consistent between the 4 meals. However, for each meal, velocity and percentage occlusion were found to increase as contractions propagated along the antrum. The first clinical application of the motility program was carried out in Study 2. Motility from three patients was measured, after they had been referred to the Magnetic Resonance Centre with gastric problems. The results showed that one of the patients displayed an irregular motility, compared to the results of the volunteer study. This result had not been observed using other investigative techniques. In Study 3, motility was measured in Low Viscosity and High Viscosity liquid/solid meals, with the solid particulate consisting of agar beads of varying breakdown strength. The results showed that
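
    As a toy illustration of the 'Motility Plot' idea (a sketch of the general approach, not the thesis software): sample the image intensity along a line through the antrum in every frame, stack the samples into a position-time map, and read the contraction frequency off its temporal spectrum.

        import numpy as np

        def motility_plot(frames, axis_points):
            # frames: (T, H, W) image time series; axis_points: (P, 2) integer
            # (row, col) samples along the antral axis -> (T, P) position-time map
            rows, cols = axis_points[:, 0], axis_points[:, 1]
            return frames[:, rows, cols]

        def contraction_frequency(plot, dt):
            # dominant temporal frequency at each position (DC bin excluded)
            spec = np.abs(np.fft.rfft(plot - plot.mean(axis=0), axis=0))
            freqs = np.fft.rfftfreq(plot.shape[0], d=dt)
            return freqs[1 + np.argmax(spec[1:], axis=0)]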

  20. Spaceborne computer executive routine functional design specification. Volume 2: Computer executive design for space station/base

    Kennedy, J. R.; Fitzpatrick, W. S.

    1971-01-01

    The computer executive functional system design concepts derived from study of the Space Station/Base are presented. Information Management System hardware configuration as directly influencing the executive design is reviewed. The hardware configuration and generic executive design requirements are considered in detail in a previous report (System Configuration and Executive Requirements Specifications for Reusable Shuttle and Space Station/Base, 9/25/70). This report defines basic system primitives and delineates processes and process control. Supervisor states are considered for describing basic multiprogramming and multiprocessing systems. A high-level computer executive including control of scheduling, allocation of resources, system interactions, and real-time supervisory functions is defined. The description is oriented to provide a baseline for a functional simulation of the computer executive system.

  1. Computer-controlled mechanical lung model for application in pulmonary function studies

    A.F.M. Verbraak (Anton); J.E.W. Beneken; J.M. Bogaard (Jan); A. Versprille (Adrian)

    1995-01-01

    A computer-controlled mechanical lung model has been developed for testing lung function equipment, validation of computer programs and simulation of impaired pulmonary mechanics. The construction, function and some applications are described. The physical model is constructed from two

  2. Functional requirements for design of the Space Ultrareliable Modular Computer (SUMC) system simulator

    Curran, R. T.; Hornfeck, W. A.

    1972-01-01

    The functional requirements for the design of an interpretive simulator for the space ultrareliable modular computer (SUMC) are presented. A review of applicable existing computer simulations is included along with constraints on the SUMC simulator functional design. Input requirements, output requirements, and language requirements for the simulator are discussed in terms of a SUMC configuration which may vary according to the application.

  3. Functional Automata - Formal Languages for Computer Science Students

    Marco T. Morazán

    2014-12-01

    An introductory formal languages course exposes advanced undergraduate and early graduate students to automata theory, grammars, constructive proofs, computability, and decidability. Programming students find these topics to be challenging or, in many cases, overwhelming and on the fringe of Computer Science. The existence of this perception is not completely absurd, since students are asked to design and prove correct machines and grammars without being able to experiment nor get immediate feedback, which is essential in a learning context. This article puts forth the thesis that the theory of computation ought to be taught using tools for actually building computations. It describes the implementation and the classroom use of a library, FSM, designed to provide students with the opportunity to experiment and test their designs using state machines, grammars, and regular expressions. Students are able to perform random testing before proceeding with a formal proof of correctness. That is, students can test their designs much like they do in a programming course. In addition, the library easily allows students to implement the algorithms they develop as part of the constructive proofs they write. Providing students with this ability ought to be a new trend in the formal languages classroom.
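
    FSM itself is a Racket library; purely to illustrate the 'random testing before a formal proof' workflow the article advocates, here is a deterministic finite automaton sketched and fuzzed in Python (all names are ours, not FSM's API):

        import random

        def make_dfa(delta, start, accept):
            # delta: dict mapping (state, symbol) -> state
            def accepts(word):
                q = start
                for ch in word:
                    q = delta[(q, ch)]
                return q in accept
            return accepts

        # DFA accepting binary strings with an even number of 1s
        even_ones = make_dfa(
            delta={("even", "0"): "even", ("even", "1"): "odd",
                   ("odd", "0"): "odd", ("odd", "1"): "even"},
            start="even", accept={"even"})

        for _ in range(1000):   # random testing before attempting a proof
            w = "".join(random.choice("01") for _ in range(10))
            assert even_ones(w) == (w.count("1") % 2 == 0)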

  4. Design, functioning and possible applications of process computers

    Kussl, V.

    1975-01-01

    Process computers are useful as automation instruments: (a) when large numbers of data are processed in analog or digital form, (b) for low data flow (data rate), and (c) when data must be stored over short or long periods of time. (orig./AK)

  5. EDF: Computing electron number probability distribution functions in real space from molecular wave functions

    Francisco, E.; Pendás, A. Martín; Blanco, M. A.

    2008-04-01

    Given an N-electron molecule and an exhaustive partition of real space R³ into m arbitrary regions Ω1, Ω2, …, Ωm (⋃ Ωi = R³), the edf program computes all the probabilities P(n1, n2, …, nm) of having exactly n1 electrons in Ω1, n2 electrons in Ω2, …, and nm electrons (n1 + n2 + ⋯ + nm = N) in Ωm. Each Ωi may correspond to a single basin (atomic domain) or several such basins (functional group). In the latter case, each atomic domain must belong to a single Ωi. The program can manage both single- and multi-determinant wave functions, which are read in from an aimpac-like wave function description (.wfn) file (T.A. Keith et al., The AIMPAC95 programs, http://www.chemistry.mcmaster.ca/aimpac, 1995). For multi-determinant wave functions a generalization of the original .wfn file has been introduced. The new format is completely backwards compatible, adding to the previous structure a description of the configuration interaction (CI) coefficients and the determinants of correlated wave functions. Besides the .wfn file, edf only needs the overlap integrals between the molecular orbitals (MOs) over all the atomic domains. After the P(n1, n2, …, nm) probabilities are computed, edf obtains from them several magnitudes relevant to chemical bonding theory, such as average electronic populations and localization/delocalization indices. Regarding spin, edf may be used in two ways: with or without a splitting of the P(n1, n2, …, nm) probabilities into α and β spin components. Program summary: Program title: edf. Catalogue identifier: AEAJ_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEAJ_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 5387. No. of bytes in distributed program, including test data, etc.: 52 381. Distribution format: tar.gz. Programming language: Fortran 77. Computer
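
    For the special case of a single-determinant wave function and a two-domain partition, the probabilities reduce to a Poisson-binomial distribution built from the eigenvalues of the occupied-MO overlap matrix restricted to the domain; a sketch under that assumption (not the edf program itself):

        import numpy as np

        def domain_probabilities(S_omega):
            # S_omega: overlap matrix of the occupied (same-spin) MOs integrated
            # over the domain Omega; its eigenvalues lie in [0, 1]
            lam = np.linalg.eigvalsh(S_omega)
            p = np.array([1.0])              # coefficients of prod_i ((1-l_i) + l_i*t)
            for l in lam:
                p = np.convolve(p, [1.0 - l, l])
            return p                         # p[n] = P(n electrons in Omega)

        print(domain_probabilities(np.diag([0.9, 0.5])))   # [0.05, 0.5, 0.45]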

  6. Computer-assisted modeling: Contributions of computational approaches to elucidating macromolecular structure and function: Final report

    Walton, S.

    1987-01-01

    The Committee, asked to provide an assessment of computer-assisted modeling of molecular structure, has highlighted the signal successes and the significant limitations for a broad panoply of technologies and has projected plausible paths of development over the next decade. As with any assessment of such scope, differing opinions about present or future prospects were expressed. The conclusions and recommendations, however, represent a consensus of our views of the present status of computational efforts in this field

  7. A FUNCTIONAL MODEL OF COMPUTER-ORIENTED LEARNING ENVIRONMENT OF A POST-DEGREE PEDAGOGICAL EDUCATION

    Kateryna R. Kolos

    2014-06-01

    The study substantiates the need for a systematic study of the functioning of the computer-oriented learning environment of post-degree pedagogical education; it defines the term “functional model of a computer-oriented learning environment of a post-degree pedagogical education”; and it builds such a functional model in accordance with the functions of the business, information and communication technology, academic and administrative staff, and the peculiarities of the training courses for teachers.

  8. A brain-computer interface to support functional recovery.

    Kjaer, Troels W; Sørensen, Helge B

    2013-01-01

    Brain-computer interfaces (BCI) register changes in brain activity and utilize this to control computers. The most widely used method is based on registration of electrical signals from the cerebral cortex using extracranially placed electrodes also called electroencephalography (EEG). The features extracted from the EEG may, besides controlling the computer, also be fed back to the patient for instance as visual input. This facilitates a learning process. BCI allow us to utilize brain activity in the rehabilitation of patients after stroke. The activity of the cerebral cortex varies with the type of movement we imagine, and by letting the patient know the type of brain activity best associated with the intended movement the rehabilitation process may be faster and more efficient. The focus of BCI utilization in medicine has changed in recent years. While we previously focused on devices facilitating communication in the rather few patients with locked-in syndrome, much interest is now devoted to the therapeutic use of BCI in rehabilitation. For this latter group of patients, the device is not intended to be a lifelong assistive companion but rather a 'teacher' during the rehabilitation period. Copyright © 2013 S. Karger AG, Basel.

  9. Computational study on the functionalization of BNNC with pyrrole molecule

    Payvand, Akram; Tavangar, Zahra

    2018-05-01

    The functionalization of a boron nitride nanocone (BNNC) by a pyrrole molecule was studied at the B3LYP/6-311+G(d) level of theory. The reaction was studied via three routes at different layers of the nanocone: Diels-Alder cycloaddition, quartet cycloaddition, and the reaction of the nitrogen atom of the pyrrole molecule with a boron or nitrogen atom of the BNNC. Thermodynamic quantities, chemical hardness, chemical potential, and the electrophilicity index of the functionalized BNNC were studied. The results show that the tip of the nanocone has a higher tendency to participate in the reaction, and that the most favorable product is produced by the reaction of the N atom of pyrrole with a B atom of the BNNC. The reaction decreases the energy-gap value, which increases the reactivity and conductivity of the functionalized nanocone. The calculated NICS values confirm the aromaticity of the pristine nanocone as well as of the functionalized nanocone.

  10. Computer Corner: Spreadsheets, Power Series, Generating Functions, and Integers.

    Snow, Donald R.

    1989-01-01

    Implements a table algorithm on a spreadsheet program and obtains functions for several number sequences such as the Fibonacci and Catalan numbers. Considers other applications of the table algorithm to integers represented in various number bases. (YP)

  11. On the Hierarchy of Functioning Rules in Distributed Computing

    Bui , Alain; Bui , Marc; Lavault , Christian

    1999-01-01

    In previous papers, we used a Markovian model to determine the optimal functioning rules of a distributed system in various settings. Searching for optimal functioning rules amounts to solving an optimization problem under constraints. The hierarchy of solutions arising from the above problem is called the “first order hierarchy”, and may possibly yield equivalent solutions. The present paper emphasizes a specific technique for deciding between two equivalent solutions, whic...

  12. Individual renal function study using dynamic computed tomography

    Fukuda, Yutaka; Kiya, Keiichi; Suzuki, Yoshiharu

    1990-01-01

    Dynamic CT scans of individual kidneys were obtained after an intravenous bolus injection of contrast agent. Time-density curves measured from the renal cortex, medulla and pelvis revealed the changes in density produced by the contrast agent, reflecting the differential phases of renal function. Renal cortical density increased rapidly after bolus administration, and renal medullary and pelvic density then increased continuously. In analyzing the time-density curves, the cortico-medullary junction time, the time at which the cortical and medullary curves cross, was 57±8 seconds in patients with normal renal function. The cortico-medullary junction time was delayed in patients with a decreased glomerular filtration rate. The cortico-pelvic junction time, the time at which the cortical and pelvic curves cross, was 104±33 seconds in patients with normal renal function. The cortico-pelvic junction time was delayed in patients with reduced urinary concentrating capacity. In patients with unilateral renal agenesis and patients who were treated surgically by ureteral sprits, the relationship between individual renal function and these junction times was examined. As a result, there were significant inverse correlations between the C-M junction time and unilateral GFR, and between the C-P junction time and urinary concentrating capacity. These studies indicate that dynamic CT scanning is an effective way to monitor and evaluate individual renal function. (author)
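
    A minimal sketch of how such a junction time could be read off sampled time-density curves, assuming linear interpolation between scan times (toy numbers, not patient data):

        import numpy as np

        def junction_time(t, curve_a, curve_b):
            # first crossing of two sampled curves, found by linear interpolation
            d = np.asarray(curve_a, float) - np.asarray(curve_b, float)
            crossings = np.where(d[:-1] * d[1:] < 0)[0]
            if crossings.size == 0:
                return None
            i = crossings[0]
            return t[i] + (t[i + 1] - t[i]) * d[i] / (d[i] - d[i + 1])

        t = np.array([0, 20, 40, 60, 80])           # seconds after injection
        cortex = np.array([10, 80, 70, 55, 45])     # enhancement in HU, toy values
        medulla = np.array([5, 20, 40, 60, 70])
        print(junction_time(t, cortex, medulla))    # ~57 s, between samples 40 and 60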

  13. Bread dough rheology: Computing with a damage function model

    Tanner, Roger I.; Qi, Fuzhong; Dai, Shaocong

    2015-01-01

    We describe an improved damage function model for bread dough rheology. The model has relatively few parameters, all of which can easily be found from simple experiments. Small deformations in the linear region are described by a gel-like power-law memory function. A set of large non-reversing deformations (stress relaxation after a step of shear, steady shearing and elongation beginning from rest, and biaxial stretching) is used to test the model. With the introduction of a revised strain measure which includes a Mooney-Rivlin term, all of these motions can be well described by the damage function described in previous papers. For reversing step strains, larger-amplitude oscillatory shearing and recoil, reasonable predictions have been found. The numerical methods used are discussed and we give some examples.

  14. Efficient and Flexible Computation of Many-Electron Wave Function Overlaps.

    Plasser, Felix; Ruckenbauer, Matthias; Mai, Sebastian; Oppel, Markus; Marquetand, Philipp; González, Leticia

    2016-03-08

    A new algorithm for the computation of the overlap between many-electron wave functions is described. This algorithm allows for the extensive use of recurring intermediates and thus provides high computational efficiency. Because of the general formalism employed, overlaps can be computed for varying wave function types, molecular orbitals, basis sets, and molecular geometries. This paves the way for efficiently computing nonadiabatic interaction terms for dynamics simulations. In addition, other application areas can be envisaged, such as the comparison of wave functions constructed at different levels of theory. Aside from explaining the algorithm and evaluating the performance, a detailed analysis of the numerical stability of wave function overlaps is carried out, and strategies for overcoming potential severe pitfalls due to displaced atoms and truncated wave functions are presented.
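
    In the simplest case, two single-determinant wave functions with a common spin factorization, the overlap is just the determinant of the occupied-MO overlap block; a sketch under that assumption (the paper's algorithm generalizes this to CI expansions using recurring intermediates):

        import numpy as np

        def determinant_overlap(C1, C2, S_ao):
            # C1, C2: (n_ao, n_occ) occupied MO coefficients of the two wave
            # functions; S_ao: mixed atomic-orbital overlap matrix
            return np.linalg.det(C1.T @ S_ao @ C2)

        # toy check: identical orthonormal orbitals give overlap 1
        n_ao, n_occ = 6, 3
        C = np.linalg.qr(np.random.default_rng(1).normal(size=(n_ao, n_occ)))[0]
        print(determinant_overlap(C, C, np.eye(n_ao)))   # ~1.0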

  15. Computation of load functions for different types of aircraft

    Siefert, Alexander; Henkel, Fritz-Otto

    2013-01-01

    In the presentation, the influence of different parameters on the Ft-function was shown. Increasing the impact velocity yields, for all aircraft, a higher maximum load value and a reduced impact time; due to the structural setup of the aircraft, the intensity of these effects differs. Comparing the Ft-functions of the A320, A340 and A380 for impact velocities of 100 and 175 m/s, no constant relation between them can be determined. The variation of the flight direction with respect to the vertical axis has a great influence on the Ft-function; an approximation by the cosine is not correct, especially for larger rotations. The influence of the rotation about the horizontal axis can be neglected. Finally, the SPH method was applied for the modelling of the fuel. The comparison with the discrete modelling approach was carried out for the Phantom F4; no large influence on the Ft-function was observed. For evaluating this modelling approach with respect to local damage, the loaded area must be determined in further investigations

  16. Computational approaches to identify functional genetic variants in cancer genomes

    Gonzalez-Perez, Abel; Mustonen, Ville; Reva, Boris

    2013-01-01

    The International Cancer Genome Consortium (ICGC) aims to catalog genomic abnormalities in tumors from 50 different cancer types. Genome sequencing reveals hundreds to thousands of somatic mutations in each tumor, but only a minority of these drive tumor progression. We present the result of discussions within the ICGC on how to address the challenge of identifying mutations that contribute to oncogenesis, tumor maintenance or response to therapy, and recommend computational techniques to annotate somatic variants and predict their impact on cancer phenotype.

  17. Effects of Computer-Based Training on Procedural Modifications to Standard Functional Analyses

    Schnell, Lauren K.; Sidener, Tina M.; DeBar, Ruth M.; Vladescu, Jason C.; Kahng, SungWoo

    2018-01-01

    Few studies have evaluated methods for training decision-making when functional analysis data are undifferentiated. The current study evaluated computer-based training to teach 20 graduate students to arrange functional analysis conditions, analyze functional analysis data, and implement procedural modifications. Participants were exposed to…

  18. Spaceborne computer executive routine functional design specification. Volume 1: Functional design of a flight computer executive program for the reusable shuttle

    Curran, R. T.

    1971-01-01

    A flight computer functional executive design for the reusable shuttle is presented. The design is given in the form of functional flowcharts and prose description. Techniques utilized in the regulation of process flow to accomplish activation, resource allocation, suspension, termination, and error masking based on process primitives are considered. Preliminary estimates of main storage utilization by the Executive are furnished. Conclusions and recommendations for timely, effective software-hardware integration in the reusable shuttle avionics system are proposed.

  19. Memory intensive functional architecture for distributed computer control systems

    Dimmler, D.G.

    1983-10-01

    A memory-intensive functional architecture for distributed data-acquisition, monitoring, and control systems with large numbers of nodes has been conceptually developed and applied in several large-scale and some smaller systems. This discussion concentrates on: (1) the basic architecture; (2) recent expansions of the architecture which now become feasible in view of the rapidly developing component technologies in microprocessors and functional large-scale integration circuits; and (3) implementation of some key hardware and software structures and one system implementation: a system for control and data acquisition for a neutron spectrometer at the Brookhaven High Flux Beam Reactor. The spectrometer is equipped with a large-area position-sensitive neutron detector.

  20. A functional analytic approach to computer-interactive mathematics.

    Ninness, Chris; Rumph, Robin; McCuller, Glen; Harrison, Carol; Ford, Angela M; Ninness, Sharon K

    2005-01-01

    Following a pretest, 11 participants who were naive with regard to various algebraic and trigonometric transformations received an introductory lecture regarding the fundamentals of the rectangular coordinate system. Following the lecture, they took part in a computer-interactive matching-to-sample procedure in which they received training on particular formula-to-formula and formula-to-graph relations as these formulas pertain to reflections and vertical and horizontal shifts. In training A-B, standard formulas served as samples and factored formulas served as comparisons. In training B-C, factored formulas served as samples and graphs served as comparisons. Subsequently, the program assessed for mutually entailed B-A and C-B relations as well as combinatorially entailed C-A and A-C relations. After all participants demonstrated mutual entailment and combinatorial entailment, we employed a test of novel relations to assess 40 different and complex variations of the original training formulas and their respective graphs. Six of 10 participants who completed training demonstrated perfect or near-perfect performance in identifying novel formula-to-graph relations. Three of the 4 participants who made more than three incorrect responses during the assessment of novel relations showed some commonality among their error patterns. Derived transfer of stimulus control using mathematical relations is discussed.

  1. Performance of a computer-based assessment of cognitive function measures in two cohorts of seniors

    Computer-administered assessment of cognitive function is being increasingly incorporated in clinical trials; however, its performance in these settings has not been systematically evaluated. The Seniors Health and Activity Research Program (SHARP) pilot trial (N=73) developed a computer-based tool f...

  2. A Functional Specification for a Programming Language for Computer Aided Learning Applications.

    National Research Council of Canada, Ottawa (Ontario).

    In 1972 there were at least six different course authoring languages in use in Canada with little exchange of course materials between Computer Assisted Learning (CAL) centers. In order to improve facilities for producing "transportable" computer based course materials, a working panel undertook the definition of functional requirements of a user…

  3. Logical and physical resource management in the common node of a distributed function laboratory computer network

    Stubblefield, F.W.

    1976-01-01

    A scheme for managing resources required for transaction processing in the common node of a distributed function computer system has been given. The scheme has been found to be satisfactory for all common node services provided so far

  4. Computer Processing and Display of Positron Scintigrams and Dynamic Function Curves

    Wilensky, S.; Ashare, A. B.; Pizer, S. M.; Hoop, B. Jr.; Brownell, G. L. [Massachusetts General Hospital, Boston, MA (United States)]

    1969-01-15

    A computer processing and display system for handling radioisotope data is described. The system has been used to upgrade and display brain scans and to process dynamic function curves. The hardware and software are described, and results are presented. (author)

  5. Functions and Requirements and Specifications for Replacement of the Computer Automated Surveillance System (CASS)

    SCAIEF, C.C.

    1999-01-01

    This functions, requirements and specifications document defines the baseline requirements and criteria for the design, purchase, fabrication, construction, installation, and operation of the system to replace the Computer Automated Surveillance System (CASS) alarm monitoring

  6. Solitary pulmonary nodules: impact of functional CT on the cost-effectiveness of FDG-PET

    Miles, K.A.; Keith, C.J.; Wong, D.C.; Griffiths, M.R.

    2002-01-01

    FDG-PET has been shown to be cost-effective for the evaluation of solitary pulmonary nodules (SPNs) in Australia. This study evaluates the impact on cost-effectiveness of incorporating a novel CT technique, functional CT, into diagnostic algorithms for the characterisation of SPNs. Four diagnostic strategies were evaluated using decision-tree sensitivity analysis. The first strategy comprised patients undergoing conventional CT alone (CT). The second comprised conventional CT followed by a functional CT study (FCT) when the SPN was not benign on conventional CT. The third strategy comprised conventional CT which, if positive, is followed by FDG-PET (PET), and a fourth strategy in which patients with a positive conventional CT undergo functional CT and, if that is also positive, FDG-PET (FCT+PET). Values for disease prevalence and the diagnostic accuracy of PET, CT and functional CT were obtained from a literature review, using Australian values where available. Procedure costs were derived from the Medicare Benefits Schedule and DRG Cost Weights for Australian public hospitals. The cost per patient, accuracy and Incremental Cost-Accuracy Ratio (ICAR) were determined for each strategy. Sensitivity analysis evaluated the effect of disease prevalence on cost-effectiveness. Results: At the prevalence of malignancy reported in Australian series (54%), the FCT strategy incurs the least cost ($5,560/patient), followed by FCT+PET ($5,910/patient). The FCT+PET strategy is the most cost-effective, with an ICAR of $12,059/patient, followed by the PET strategy with an ICAR of $12,300/patient. At levels of disease prevalence below 54%, the above cost-effectiveness relationship remains the same. At high levels of disease prevalence, CT or FCT are found to be more cost-effective. At typical prevalence of malignancy the cost-effectiveness of PET is enhanced by the addition of functional CT, but at high prevalence functional CT alone is most cost

  7. Functional diagnostics of the cervical spine by using computer tomography

    Dvorak, J.; Hayek, J.; Grob, D.

    1988-01-01

    35 healthy adults and 137 patients after cervical spine injury were examined by functional CT. The range of axial rotation at the level occiput/atlas, atlas/axis and the segment below was measured in all subjects. A rotation occiput/atlas of more than 7°, and C1/C2 of more than 54°, could refer to segmental hypermobility, a rotation at the segment C1/C2 of less than 29° to hypomobility. According to the postulated normal values based upon a 98% confidence level, out of 137 patients examined after cervical spine injury and with therapy-resistant neck pain, 45 showed signs of segmental hypermobility of the upper cervical spine, 17 showed hyper- or hypomobility at different levels, and 10 patients presented segmental hypomobility at the C1/C2 level alone. In all patients, according to the clinical assessment, functional pathology was suspected in the upper cervical spine. Surgical correction of rotatory instability should be considered as a possible therapeutic procedure after successful diagnostic stabilisation of the cervical spine by minerva cast. (orig.)

  8. Functional diagnostics of the cervical spine by using computer tomography

    Dvorak, J; Hayek, J; Grob, D; Penning, L; Panjabi, M M; Zehnder, R

    1988-04-01

    35 healthy adults and 137 patients after cervical spine injury were examined by functional CT. The range of axial rotation at the level occiput/atlas, atlas/axis and the segment below was measured in all subjects. A rotation occiput/atlas of more than 7°, and C1/C2 of more than 54°, could refer to segmental hypermobility, a rotation at the segment C1/C2 of less than 29° to hypomobility. According to the postulated normal values based upon a 98% confidence level, out of 137 patients examined after cervical spine injury and with therapy-resistant neck pain, 45 showed signs of segmental hypermobility of the upper cervical spine, 17 showed hyper- or hypomobility at different levels, and 10 patients presented segmental hypomobility at the C1/C2 level alone. In all patients, according to the clinical assessment, functional pathology was suspected in the upper cervical spine. Surgical correction of rotatory instability should be considered as a possible therapeutic procedure after successful diagnostic stabilisation of the cervical spine by minerva cast.

  9. Algorithms: economical computation of functions of real matrices

    Weiss, Z.

    1991-01-01

    An algorithm is presented which economizes on the calculation of F(A), where A is a real matrix and F(x) a real-valued function of x, using spectral analysis. Assuming the availability of software for the calculation of the complete set of eigenvalues and eigenvectors of A, it is shown that the complex matrix arithmetic involved in the subsequent operations leading from A to F(A) can be reduced to a size comparable with the analogous problem in real matrix arithmetic. Savings in CPU time and storage have been achieved by explicitly exploiting the property that the complex eigenvalues of a real matrix appear in complex-conjugate pairs. (author)
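
    A sketch of the spectral route in NumPy rather than the paper's implementation: because the complex eigenvalues of a real matrix occur in conjugate pairs, F(A) is real whenever f maps conjugates to conjugates, and the imaginary round-off can simply be discarded.

        import numpy as np

        def funm_real(A, f):
            # f(A) via spectral decomposition; assumes A is diagonalizable and
            # f maps conjugate eigenvalues to conjugate values (real-analytic f)
            w, V = np.linalg.eig(A)
            return (V @ np.diag(f(w)) @ np.linalg.inv(V)).real

        A = np.array([[0.0, 1.0], [-1.0, 0.0]])   # eigenvalues +i and -i
        print(funm_real(A, np.exp))               # [[cos 1, sin 1], [-sin 1, cos 1]]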

  10. Computational complexity of time-dependent density functional theory

    Whitfield, J D; Yung, M-H; Tempel, D G; Aspuru-Guzik, A; Boixo, S

    2014-01-01

    Time-dependent density functional theory (TDDFT) is rapidly emerging as a premier method for solving dynamical many-body problems in physics and chemistry. The mathematical foundations of TDDFT are established through the formal existence of a fictitious non-interacting system (known as the Kohn–Sham system), which can reproduce the one-electron reduced probability density of the actual system. We build upon these works and show that on the interior of the domain of existence, the Kohn–Sham system can be efficiently obtained given the time-dependent density. We introduce a V-representability parameter which diverges at the boundary of the existence domain and serves to quantify the numerical difficulty of constructing the Kohn–Sham potential. For bounded values of V-representability, we present a polynomial time quantum algorithm to generate the time-dependent Kohn–Sham potential with controllable error bounds. (paper)

  11. Target localization on standard axial images in computed tomography (CT) stereotaxis for functional neurosurgery - a technical note

    Patil, A.-A.

    1986-01-01

    A simple technique for marking a functional neurosurgery target on a computed tomography (CT) axial image is described. This permits the use of standard axial images for CT stereotaxis in functional neurosurgery. (Author)

  12. Business Process Quality Computation : Computing Non-Functional Requirements to Improve Business Processes

    Heidari, F.

    2015-01-01

    Business process modelling is an important part of system design. When designing or redesigning a business process, stakeholders specify, negotiate, and agree on business requirements to be satisfied, including non-functional requirements that concern the quality of the business process. This thesis

  13. Renormalization group improved computation of correlation functions in theories with nontrivial phase diagram

    Codello, Alessandro; Tonero, Alberto

    2016-01-01

    We present a simple and consistent way to compute correlation functions in interacting theories with nontrivial phase diagram. As an example we show how to consistently compute the four-point function in three-dimensional Z2-scalar theories. The idea is to perform the path integral by weighting the momentum modes that contribute to it according to their renormalization group (RG) relevance, i.e. we weight each mode according to the value of the running couplings at that scale. In this way, we are able to encode in a loop computation the information regarding the RG trajectory along which we...

  14. Construction of renormalized coefficient functions of the Feynman diagrams by means of a computer

    Tarasov, O.V.

    1978-01-01

    An algorithm and a short description of a computer program, written in SCHOONSCHIP, are given. The program constructs the integrands of renormalized coefficient functions of Feynman diagrams in scalar theories for an arbitrary subtraction point. For a given Feynman graph, the computer completely realizes the R-operation of Bogolubov-Parasjuk and gives the result as an integral over Feynman parameters. With the help of the program, the construction of a whole renormalized coefficient function takes approximately 30 s on the CDC-6500 computer

  15. COMPUTING

    M. Kasemann

    Introduction A large fraction of the effort was focused during the last period on the preparation and monitoring of the February tests of the Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads, and identify possible bottlenecks. Many tests are scheduled together with other VOs, allowing a full-scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  16. PERFORMANCE OF A COMPUTER-BASED ASSESSMENT OF COGNITIVE FUNCTION MEASURES IN TWO COHORTS OF SENIORS

    Espeland, Mark A.; Katula, Jeffrey A.; Rushing, Julia; Kramer, Arthur F.; Jennings, Janine M.; Sink, Kaycee M.; Nadkarni, Neelesh K.; Reid, Kieran F.; Castro, Cynthia M.; Church, Timothy; Kerwin, Diana R.; Williamson, Jeff D.; Marottoli, Richard A.; Rushing, Scott; Marsiske, Michael; Rapp, Stephen R.

    2013-01-01

    Background: Computer-administered assessment of cognitive function is being increasingly incorporated in clinical trials; however, its performance in these settings has not been systematically evaluated. Design: The Seniors Health and Activity Research Program (SHARP) pilot trial (N=73) developed a computer-based tool for assessing memory performance and executive functioning. The Lifestyle Interventions and Independence for Seniors (LIFE) investigators incorporated this battery in a full-scale multicenter clinical trial (N=1635). We describe the relationships that test scores have with those from interviewer-administered cognitive function tests and with risk factors for cognitive deficits, and we describe performance measures (completeness, intra-class correlations). Results: Computer-based assessments of cognitive function had consistent relationships across the pilot and full-scale trial cohorts with interviewer-administered assessments of cognitive function, age, and a measure of physical function. In the LIFE cohort, their external validity was further demonstrated by associations with other risk factors for cognitive dysfunction: education, hypertension, diabetes, and physical function. Acceptable levels of data completeness (>83%) were achieved on all computer-based measures; however, rates of missing data were higher among older participants (odds ratio=1.06 for each additional year; p<0.001) and those who reported no current computer use (odds ratio=2.71; p<0.001). Intra-class correlations among clinics were at least as low (ICC≤0.013) as for interviewer measures (ICC≤0.023), reflecting good standardization. All cognitive measures loaded onto the first principal component (global cognitive function), which accounted for 40% of the overall variance. Conclusion: Our results support the use of computer-based tools for assessing cognitive function in multicenter clinical trials of older individuals. PMID:23589390

  17. Computing the Kummer function $U(a,b,z)$ for small values of the arguments

    A. Gil (Amparo); J. Segura (Javier); N.M. Temme (Nico)

    2015-01-01

    We describe methods for computing the Kummer function $U(a,b,z)$ for small values of $z$, with special attention to small values of $b$. For these values of $b$ the connection formula that represents $U(a,b,z)$ as a linear combination of two ${}_1F_1$-functions needs a limiting
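
    For routine use (as opposed to the small-$b$ regime analyzed in the paper), $U(a,b,z)$ is available in standard libraries, which is convenient for cross-checking; for example:

        from scipy.special import hyperu   # Tricomi's confluent function U(a, b, x)
        import mpmath                      # arbitrary-precision cross-check

        a, b, z = 0.3, 0.001, 0.1          # small b and z, the regime discussed above
        print(hyperu(a, b, z))
        print(mpmath.hyperu(a, b, z))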

  18. Systemic functional grammar in natural language generation linguistic description and computational representation

    Teich, Elke

    1999-01-01

    This volume deals with the computational application of systemic functional grammar (SFG) for natural language generation. In particular, it describes the implementation of a fragment of the grammar of German in the computational framework of KOMET-PENMAN for multilingual generation. The text also presents a specification of explicit well-formedness constraints on syntagmatic structure which are defined in the form of typed feature structures. It thus achieves a model of systemic functional grammar that unites both the strengths of systemics, such as stratification, functional diversification

  19. Studies on the zeros of Bessel functions and methods for their computation

    Kerimov, M. K.

    2014-09-01

    The zeros of Bessel functions play an important role in computational mathematics, mathematical physics, and other areas of natural sciences. Studies addressing these zeros (their properties, computational methods) can be found in various sources. This paper offers a detailed overview of the results concerning the real zeros of the Bessel functions of the first and second kinds and general cylinder functions. The author intends to publish several overviews on this subject. In this first publication, works dealing with real zeros are analyzed. Primary emphasis is placed on classical results, which are still important. Some of the most recent publications are also discussed.
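
    In practice the real zeros surveyed here can be obtained numerically from standard libraries; for instance, with SciPy:

        from scipy.special import jn_zeros, yn_zeros, jv

        j0_zeros = jn_zeros(0, 5)    # first five positive zeros of J_0
        y1_zeros = yn_zeros(1, 3)    # first three positive zeros of Y_1
        print(j0_zeros)              # 2.4048, 5.5201, 8.6537, 11.7915, 14.9309
        print(jv(0, j0_zeros))       # ~0 to machine precision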

  20. Functioning strategy study on control systems of large physical installations used with a digital computer

    Bel'man, L.B.; Lavrikov, S.A.; Lenskij, O.D.

    1975-01-01

    Criteria are proposed to evaluate the efficiency of the functioning of control systems of large physical installations that use a control computer: the object utilization factor and the computer load factor. Different strategies of control system functioning are described, and a comparative analysis of them is made. The choice of such important parameters as the sampling time and the parameter correction time is discussed. A single factor for evaluating system functioning efficiency is introduced, and its dependence on the sampling interval is given. Using the attached diagrams, it is easy to find the optimum value of the sampling interval and the corresponding maximum value of the proposed single efficiency factor.

  1. NBS for Drought risks reduction in the Algarve (Portugal): selected achievements from PT FCT ProWaterMan and from EU FP7 MARSOL projects

    Lobo-Ferreira, João-Paulo

    2017-04-01

    Southern Europe and the Mediterranean region are facing the challenge of managing their water resources under conditions of increasing scarcity and/or floods, besides concerns about water quality. Innovative water management strategies with nature-based solutions (NBS), such as the storage of excess water during floods in Managed Aquifer Recharge (MAR) schemes, can greatly decrease the risk of floods while increasing water availability for future use, eventually in drought periods. The Algarve is the southernmost region of mainland Portugal. It has an area of 4,997 km2 and about 451 thousand permanent inhabitants. Selected achievements of two research projects (the Portuguese FCT-sponsored ProWaterMan project and the EU FP7-sponsored MARSOL project) will be addressed, regarding the Campina de Faro and Querença-Silves aquifers in the Algarve. In Faro, the idea of harvesting rainwater from greenhouse rooftops and using this water to recharge aquifers is not new. However, using this NBS as a climate mitigation and adaptation tool, with its overall impact and wide range of benefits, is a step forward in innovative methodologies. This NBS can have particularly positive impacts under Mediterranean conditions with (new) precipitation patterns, more intense but less frequent. A potential greenhouse surface area of about 2.74 km2 can be used by connecting these infrastructures to several large wells, aiming to infiltrate 1.63 hm3/year of harvested water. There is strong support from the Portuguese Water Agency (Agência Portuguesa do Ambiente, I.P., Algarve branch), the water supply and wastewater utility (Águas do Algarve, S.A.), and the local farmers and land owners, who have frequent flood and/or drought problems and are represented in this project by HUBEL (an SME that produces most greenhouses for the Faro area). During the EU FP7 INO-DEMO MARSOL project, a survey about the protection and preservation of groundwater was conducted with a sample of Portuguese farmers of the Algarve

  2. Heuristic lipophilicity potential for computer-aided rational drug design: Optimizations of screening functions and parameters

    Du, Qishi; Mezey, Paul G.

    1998-09-01

    In this research we test and compare three possible atom-based screening functions used in the heuristic molecular lipophilicity potential (HMLP). Screening function 1 is a power distance-dependent function, b_i / |R_i - r|^γ; screening function 2 is an exponential distance-dependent function, b_i exp(-|R_i - r| / d_0); and screening function 3 is a weighted distance-dependent function, sign(b_i) exp[ξ(|R_i - r| / |b_i|)]. For every screening function, the parameters (γ, d_0, and ξ) are optimized using 41 common organic molecules of 4 types of compounds: aliphatic alcohols, aliphatic carboxylic acids, aliphatic amines, and aliphatic alkanes. The results of the calculations show that screening function 3 cannot give chemically reasonable results, whereas both the power screening function and the exponential screening function give chemically satisfactory results. There are two notable differences between screening functions 1 and 2. First, the exponential screening function has larger values at short distance than the power screening function, therefore more influence from the nearest neighbors is involved using screening function 2 than screening function 1. Second, the power screening function has larger values at long distance than the exponential screening function, therefore screening function 1 is affected by atoms at long distance more than screening function 2. For screening function 2, the suitable range of the parameter d_0 is 1.5 < d_0 < 3.0, and d_0 = 2.0 is recommended. The HMLP developed in this research provides a potential tool for computer-aided three-dimensional drug design.
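
    A sketch of the exponential screening function (screening function 2) with the recommended d_0 = 2.0, purely to make the functional form concrete; the atomic parameters b_i are inputs, and the full HMLP of the paper involves further bookkeeping:

        import numpy as np

        def hmlp_exponential(r, R, b, d0=2.0):
            # screening function 2: sum_i b_i * exp(-|R_i - r| / d0)
            # r: (3,) probe point; R: (N, 3) atom positions; b: (N,) parameters
            d = np.linalg.norm(R - r, axis=1)
            return np.sum(b * np.exp(-d / d0))

        R = np.array([[0.0, 0.0, 0.0], [1.5, 0.0, 0.0]])
        b = np.array([0.5, -0.3])    # toy lipophilic/hydrophilic atom pair
        print(hmlp_exponential(np.array([0.5, 0.5, 0.0]), R, b))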

  3. COMPUTING

    I. Fisk

    2012-01-01

    Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users; we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently. Operations Office [Figure 2: Number of events per month for 2012.] Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  4. Computation of Galois field expressions for quaternary logic functions on GPUs

    Gajić Dušan B.

    2014-01-01

    Galois field (GF) expressions are polynomials used as representations of multiple-valued logic (MVL) functions. For this purpose, MVL functions are considered as functions defined over a finite (Galois) field of order p, GF(p). The problem of computing these functional expressions has an important role in areas such as digital signal processing and logic design. The time needed for computing GF-expressions increases exponentially with the number of variables in MVL functions and, as a result, it often represents a limiting factor in applications. This paper proposes a method for the accelerated computation of GF(4)-expressions for quaternary (four-valued) logic functions using graphics processing units (GPUs). The method is based on the spectral interpretation of GF-expressions, permitting the use of fast Fourier transform (FFT)-like algorithms for their computation. These algorithms are then adapted for highly parallel processing on GPUs. The performance of the proposed solutions is compared with reference C/C++ implementations of the same algorithms processed on central processing units (CPUs). Experimental results confirm that the presented approach leads to a significant reduction in processing times (up to 10.86 times when compared to CPU processing). Therefore, the proposed approach widens the set of problem instances which can be efficiently handled in practice. [Project of the Ministry of Science of the Republic of Serbia, No. ON174026 and No. III44006]
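
    To make the computed object concrete: over GF(4) every single-variable quaternary function has a unique degree-3 polynomial representation, which a naive brute force can recover (the paper's contribution is computing such expressions with FFT-like transforms on GPUs, not this toy search):

        import itertools

        # GF(4) on {0, 1, 2, 3}: addition is bitwise XOR; multiplication follows
        # from x^2 = x + 1 for the generator (encoded here as the element 2)
        MUL = [[0, 0, 0, 0],
               [0, 1, 2, 3],
               [0, 2, 3, 1],
               [0, 3, 1, 2]]

        def gf4_eval(coeffs, x):
            # evaluate c0 + c1*x + c2*x^2 + c3*x^3 over GF(4)
            acc, xp = 0, 1
            for c in coeffs:
                acc ^= MUL[c][xp]    # GF(4) addition is XOR
                xp = MUL[xp][x]
            return acc

        def gf4_expression(f):
            # brute-force the unique degree-3 polynomial matching f on GF(4)
            for coeffs in itertools.product(range(4), repeat=4):
                if all(gf4_eval(coeffs, x) == f(x) for x in range(4)):
                    return coeffs

        print(gf4_expression(lambda x: MUL[x][x]))   # x^2 -> (0, 0, 1, 0)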

  5. Development and functional demonstration of a wireless intraoral inductive tongue computer interface for severely disabled persons.

    N S Andreasen Struijk, Lotte; Lontis, Eugen R; Gaihede, Michael; Caltenco, Hector A; Lund, Morten Enemark; Schioeler, Henrik; Bentsen, Bo

    2017-08-01

    Individuals with tetraplegia depend on alternative interfaces in order to control computers and other electronic equipment. Current interfaces are often limited in the number of available control commands, and may compromise the social identity of an individual due to their undesirable appearance. The purpose of this study was to implement an alternative computer interface which was fully embedded in the oral cavity and which provided multiple control commands. The development of a wireless, intraoral, inductive tongue computer interface is described. The interface encompassed a 10-key keypad area and a mouse pad area, and was embedded wirelessly into the oral cavity of the user. Its functionality was demonstrated in two tetraplegic individuals and two able-bodied individuals. Results: The system was invisible during use and allowed the user to type on a computer using either the keypad area or the mouse pad area. At best, repetitively typing a correct character took 1.8 s with the keypad area and 1.4 s with the mouse pad area. The results suggest that this inductive tongue computer interface provides an esthetically acceptable and functionally efficient environmental control for a severely disabled user. Implications for Rehabilitation: New design, implementation and detection methods for intraoral assistive devices. Demonstration of wireless powering and encapsulation techniques suitable for intraoral embedding of assistive devices. Demonstration of the functionality of a rechargeable and fully embedded intraoral tongue-controlled computer input device.

  6. Extended Krylov subspaces approximations of matrix functions. Application to computational electromagnetics

    Druskin, V.; Lee, Ping [Schlumberger-Doll Research, Ridgefield, CT (United States)]; Knizhnerman, L. [Central Geophysical Expedition, Moscow (Russian Federation)]

    1996-12-31

    There is now growing interest in using Krylov subspace approximations to compute the actions of matrix functions. The main application of this approach is the solution of ODE systems obtained after discretization of partial differential equations by the method of lines. In the event that computing the matrix inverse is relatively inexpensive, it is sometimes attractive to solve the ODE using the extended Krylov subspaces, generated by actions of both positive and negative matrix powers. Examples of such problems can be found frequently in computational electromagnetics.
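
    A sketch of the standard (non-extended) Krylov approximation of exp(A)b via the Arnoldi process; the extended subspaces of the paper additionally mix in actions of negative powers of A when the inverse is cheap (an illustrative toy, not the authors' code):

        import numpy as np
        from scipy.linalg import expm

        def arnoldi_expm(A, b, m=30):
            # approximate exp(A) @ b from an m-dimensional Krylov subspace
            n = len(b)
            Q = np.zeros((n, m + 1)); H = np.zeros((m + 1, m))
            beta = np.linalg.norm(b)
            Q[:, 0] = b / beta
            for j in range(m):
                v = A @ Q[:, j]
                for i in range(j + 1):          # modified Gram-Schmidt step
                    H[i, j] = Q[:, i] @ v
                    v -= H[i, j] * Q[:, i]
                H[j + 1, j] = np.linalg.norm(v)
                if H[j + 1, j] < 1e-12:         # invariant subspace found
                    m = j + 1
                    break
                Q[:, j + 1] = v / H[j + 1, j]
            e1 = np.zeros(m); e1[0] = 1.0
            return beta * Q[:, :m] @ (expm(H[:m, :m]) @ e1)

        A = np.diag(-np.linspace(0.1, 5.0, 200))   # a toy stable stiff system
        approx = arnoldi_expm(A, np.ones(200))
        print(np.max(np.abs(approx - np.exp(np.diag(A)))))   # tiny error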

  7. Application of computer-generated functional (parametric) maps in radionuclide renography

    Agress, H. Jr.; Levenson, S.M.; Gelfand, M.J.; Green, M.V.; Bailey, J.J.; Johnston, G.S.

    1975-01-01

    A functional (parametric) map is a single visual display of regional dynamic phenomena which facilitates interpretation of the nature of focal abnormalities in renal function. Methods for producing several kinds of functional maps based on computer calculations of radionuclide scan data are briefly described. Three abnormal cases are presented to illustrate the use of functional maps to separate focal lesions and to specify the dynamic nature of the abnormalities in a way which is difficult to achieve with conventional sequential renal scans and renograms alone.

  8. On the Computation and Applications of Bessel Functions with Pure Imaginary Indices

    Matyshev, A. A.; Fohtung, E.

    2009-01-01

    Bessel functions with pure imaginary index (order) play an important role in corpuscular optics where they govern the dynamics of charged particles in isotrajectory quadrupoles. Recently they were found to be of great importance in semiconductor material characterization as they are manifested in the strain state of crystalline material. A new algorithm which can be used for the computation of the normal and modified Bessel functions with pure imaginary index is proposed. The developed algorit...
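
    For illustration, the modified Bessel function of pure imaginary order admits the real integral representation K_{i mu}(x) = int_0^inf exp(-x cosh t) cos(mu t) dt, which straightforward quadrature can evaluate for moderate mu (this is not the paper's algorithm, which targets speed and robustness beyond this naive approach):

    ```python
    import numpy as np
    from scipy.integrate import quad
    from scipy.special import k0

    def K_imag_order(mu, x):
        """K_{i*mu}(x) for x > 0 from the real integral representation
        K_{i mu}(x) = int_0^inf exp(-x cosh t) * cos(mu t) dt.
        Plain quadrature is adequate for moderate mu; the oscillatory cosine
        makes large mu hard, which is what dedicated algorithms address."""
        val, _ = quad(lambda t: np.exp(-x * np.cosh(t)) * np.cos(mu * t), 0.0, np.inf)
        return val

    # Sanity check: mu = 0 reduces to the ordinary K_0.
    print(K_imag_order(0.0, 1.5), k0(1.5))
    ```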

  9. MRIVIEW: An interactive computational tool for investigation of brain structure and function

    Ranken, D.; George, J.

    1993-01-01

    MRIVIEW is a software system which uses image processing and visualization to provide neuroscience researchers with an integrated environment for combining functional and anatomical information. Key features of the software include semi-automated segmentation of volumetric head data and an interactive coordinate reconciliation method which utilizes surface visualization. The current system is a precursor to a computational brain atlas. We describe features this atlas will incorporate, including methods under development for visualizing brain functional data obtained from several different research modalities

  10. Applications of computed nuclear structure functions to inclusive scattering, R-ratios and their moments

    Rinat, A.S.

    2000-01-01

    We discuss applications of previously computed nuclear structure functions (SF) to inclusive cross sections, compare predictions with recent CEBAF data and perform two scaling tests. We mention that the large Q² plateau of scaling functions may only in part be due to the asymptotic limit of SF, which prevents the extraction of the nucleon momentum distribution in a model-independent way. We show that there may be sizable discrepancies between computed and semi-heuristic estimates of SF ratios. We compute ratios of moments of nuclear SF and show these to be in reasonable agreement with data. We speculate that an effective theory may underlie the model for the nuclear SF, which produces overall agreement with several observables. (author)

  11. Computer-aided Nonlinear Control System Design Using Describing Function Models

    Nassirharand, Amir

    2012-01-01

    A systematic computer-aided approach provides a versatile setting for the control engineer to overcome the complications of controller design for highly nonlinear systems. Computer-aided Nonlinear Control System Design provides such an approach based on the use of describing functions. The text deals with a large class of nonlinear systems without restrictions on the system order, the number of inputs and/or outputs or the number, type or arrangement of nonlinear terms. The strongly software-oriented methods detailed facilitate fulfillment of tight performance requirements and help the designer to think in purely nonlinear terms, avoiding the expedient of linearization which can impose substantial and unrealistic model limitations and drive up the cost of the final product. Design procedures are presented in a step-by-step algorithmic format, each step being a functional unit with outputs that drive the other steps. This procedure may be easily implemented on a digital computer with example problems from mecha...
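
    As a small illustration of the describing-function idea the book builds on, the first-harmonic gain of a static odd nonlinearity can be computed numerically; the unit-saturation example and its closed form are standard, while the book's design procedures go well beyond this:

    ```python
    import numpy as np

    def describing_function(f, A, n=4096):
        """First-harmonic (sinusoidal-input) describing function N(A) = b1/A of a
        static odd nonlinearity f, with b1 the fundamental Fourier sine coefficient."""
        theta = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
        b1 = (2.0 / n) * np.sum(f(A * np.sin(theta)) * np.sin(theta))
        return b1 / A

    # Unit saturation: the numerical result matches the textbook closed form for A > 1.
    sat = lambda u: np.clip(u, -1.0, 1.0)
    A = 2.0
    closed_form = (2 / np.pi) * (np.arcsin(1 / A) + (1 / A) * np.sqrt(1 - 1 / A**2))
    print(describing_function(sat, A), closed_form)
    ```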

  12. Fast Computation of the Two-Point Correlation Function in the Age of Big Data

    Pellegrino, Andrew; Timlin, John

    2018-01-01

    We present a new code which quickly computes the two-point correlation function for large sets of astronomical data. This code combines the ease of use of Python with the speed of parallel shared libraries written in C. We include the capability to compute the auto- and cross-correlation statistics, and allow the user to calculate the three-dimensional and angular correlation functions. Additionally, the code automatically divides the user-provided sky masks into contiguous subsamples of similar size, using the HEALPix pixelization scheme, for the purpose of resampling. Errors are computed using jackknife and bootstrap resampling in a way that adds negligible extra runtime, even with many subsamples. We demonstrate speed comparable to other clustering codes, and accuracy against known analytic results.
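
    A purely illustrative Landy-Szalay estimator with naive O(N^2) pair counting; the code in the record achieves its speed with parallel C libraries and adds jackknife/bootstrap errors on HEALPix subsamples, none of which is reproduced here:

    ```python
    import numpy as np

    def landy_szalay(data, randoms, edges):
        """Two-point correlation via the Landy-Szalay estimator
        xi = (DD - 2 DR + RR) / RR, with naive O(N^2) pair counting."""
        def counts(a, b=None):
            if b is None:   # auto-pairs: each unordered pair once
                d = np.linalg.norm(a[:, None, :] - a[None, :, :], axis=-1)
                d = d[np.triu_indices(len(a), k=1)]
            else:           # cross-pairs
                d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1).ravel()
            return np.histogram(d, bins=edges)[0]

        dd = counts(data) / (len(data) * (len(data) - 1) / 2)
        rr = counts(randoms) / (len(randoms) * (len(randoms) - 1) / 2)
        dr = counts(data, randoms) / (len(data) * len(randoms))
        return (dd - 2 * dr + rr) / rr

    # Uniform points correlated against uniform randoms: xi should be near zero.
    rng = np.random.default_rng(1)
    xi = landy_szalay(rng.random((500, 3)), rng.random((2000, 3)),
                      np.linspace(0.05, 0.5, 10))
    ```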

  13. A fast computation method for MUSIC spectrum function based on circular arrays

    Du, Zhengdong; Wei, Ping

    2015-02-01

    The large computational cost of the multiple signal classification (MUSIC) spectrum function seriously affects the timeliness of direction-finding systems using the MUSIC algorithm, especially in two-dimensional direction-of-arrival (DOA) estimation of azimuth and elevation with a large antenna array. This paper proposes a fast computation method for the MUSIC spectrum that is suitable for any circular array. First, the circular array is transformed into a virtual uniform circular array; then, in calculating the MUSIC spectrum, the cyclic structure of the steering vector is exploited so that the inner products in the spatial spectrum are realised by cyclic convolution. The computational cost of the MUSIC spectrum is thereby considerably lower than that of the conventional method. It is a very practical approach to MUSIC spectrum computation for circular arrays.
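
    For reference, the conventional MUSIC pseudospectrum that the proposed method accelerates can be sketched as follows for a uniform circular array; the paper's contribution, replacing the steering-vector inner products with cyclic convolutions, is not reproduced in this sketch, and the snapshot data below are placeholders:

    ```python
    import numpy as np

    def music_spectrum_uca(X, n_sources, radius, wavelength, az_grid):
        """Conventional azimuth-only MUSIC pseudospectrum for a uniform circular
        array from snapshot matrix X (elements x snapshots)."""
        n_el = X.shape[0]
        R = X @ X.conj().T / X.shape[1]              # sample covariance
        _, v = np.linalg.eigh(R)                     # eigenvalues in ascending order
        En = v[:, :-n_sources]                       # noise subspace
        phi = 2 * np.pi * np.arange(n_el) / n_el     # element angular positions
        k = 2 * np.pi / wavelength
        P = np.empty_like(az_grid)
        for i, th in enumerate(az_grid):
            a = np.exp(1j * k * radius * np.cos(th - phi))   # UCA steering vector
            P[i] = 1.0 / np.linalg.norm(En.conj().T @ a) ** 2
        return P

    rng = np.random.default_rng(3)
    X = rng.standard_normal((8, 200)) + 1j * rng.standard_normal((8, 200))  # placeholder data
    P = music_spectrum_uca(X, 2, radius=0.5, wavelength=1.0,
                           az_grid=np.linspace(0, 2 * np.pi, 360))
    ```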

  14. Computer-Based Techniques for Collection of Pulmonary Function Variables during Rest and Exercise.

    1991-03-01

    routinely included in experimental protocols involving hyper- and hypobaric excursions. Unfortunately, the full potential of those tests is often not...for a Pulmonary Function data acquisition system that has proven useful in the hyperbaric research laboratory. It illustrates how computers can

  15. A Computational Model Quantifies the Effect of Anatomical Variability on Velopharyngeal Function

    Inouye, Joshua M.; Perry, Jamie L.; Lin, Kant Y.; Blemker, Silvia S.

    2015-01-01

    Purpose: This study predicted the effects of velopharyngeal (VP) anatomical parameters on VP function to provide a greater understanding of speech mechanics and aid in the treatment of speech disorders. Method: We created a computational model of the VP mechanism using dimensions obtained from magnetic resonance imaging measurements of 10 healthy…

  16. Maple (Computer Algebra System) in Teaching Pre-Calculus: Example of Absolute Value Function

    Tuluk, Güler

    2014-01-01

    Modules in Computer Algebra Systems (CAS) make Mathematics interesting and easy to understand. The present study focused on the implementation of the algebraic, tabular (numerical), and graphical approaches used for the construction of the concept of absolute value function in teaching mathematical content knowledge along with Maple 9. The study…

  17. Effects of a Computer-Based Intervention Program on the Communicative Functions of Children with Autism

    Hetzroni, Orit E.; Tannous, Juman

    2004-01-01

    This study investigated the use of computer-based intervention for enhancing communication functions of children with autism. The software program was developed based on daily life activities in the areas of play, food, and hygiene. The following variables were investigated: delayed echolalia, immediate echolalia, irrelevant speech, relevant…

  18. On algorithmic equivalence of instruction sequences for computing bit string functions

    Bergstra, J.A.; Middelburg, C.A.

    2015-01-01

    Every partial function from bit strings of a given length to bit strings of a possibly different given length can be computed by a finite instruction sequence that contains only instructions to set and get the content of Boolean registers, forward jump instructions, and a termination instruction. We

  20. Computer-mediated communication in adults with high-functioning autism spectrum disorders and controls

    van der Aa, Christine; Pollmann, Monique; Plaat, Aske; van der Gaag, Rutger Jan

    2016-01-01

    It has been suggested that people with Autism Spectrum Disorders (ASD) are attracted to computer-mediated communication (CMC). In this study, we compare CMC use in adults with high-functioning ASD (N = 113) and a control group (N = 72). We find that people with ASD spend more time on CMC than

  1. An analysis of functional communication training as an empirically supported treatment for problem behavior displayed by individuals with intellectual disabilities.

    Kurtz, Patricia F; Boelter, Eric W; Jarmolowicz, David P; Chin, Michelle D; Hagopian, Louis P

    2011-01-01

    This paper examines the literature on the use of functional communication training (FCT) as a treatment for problem behavior displayed by individuals with intellectual disabilities (ID). Criteria for empirically supported treatments developed by Divisions 12 and 16 of the American Psychological Association (Kratochwill & Stoiber, 2002; Task Force, 1995) and adapted by Jennett and Hagopian (2008) for evaluation of single-case research studies were used to examine the support for FCT. Results indicated that FCT far exceeds criteria to be designated as a well-established treatment for problem behavior exhibited by children with ID and children with autism spectrum disorder, and can be characterized as probably efficacious with adults.

  2. Conical : An extended module for computing a numerically satisfactory pair of solutions of the differential equation for conical functions

    T.M. Dunster (Mark); A. Gil (Amparo); J. Segura (Javier); N.M. Temme (Nico)

    2017-01-01

    Conical functions appear in a large number of applications in physics and engineering. In this paper we describe an extension of our module Conical (Gil et al., 2012) for the computation of conical functions. Specifically, the module now includes a routine for computing the function

  3. The role of dual-energy computed tomography in the assessment of pulmonary function

    Hwang, Hye Jeon [Department of Radiology, Hallym University College of Medicine, Hallym University Sacred Heart Hospital, 22, Gwanpyeong-ro 170beon-gil, Dongan-gu, Anyang-si, Gyeonggi-do 431-796 (Korea, Republic of); Hoffman, Eric A. [Departments of Radiology, Medicine, and Biomedical Engineering, University of Iowa, 200 Hawkins Dr, CC 701 GH, Iowa City, IA 52241 (United States); Lee, Chang Hyun; Goo, Jin Mo [Department of Radiology, Seoul National University College of Medicine, 103 Daehak-ro, Jongno-gu, Seoul 110-799 (Korea, Republic of); Levin, David L. [Department of Radiology, Mayo Clinic College of Medicine, 200 First Street, SW, Rochester, MN 55905 (United States); Kauczor, Hans-Ulrich [Diagnostic and Interventional Radiology, University Hospital Heidelberg, Im Neuenheimer Feld 400, 69120 Heidelberg (Germany); Translational Lung Research Center Heidelberg (TLRC), Member of the German Center for Lung Research (DZL), Im Neuenheimer Feld 400, 69120 Heidelberg (Germany); Seo, Joon Beom, E-mail: seojb@amc.seoul.kr [Department of Radiology and Research Institute of Radiology, Asan Medical Center, University of Ulsan College of Medicine, 388-1, Pungnap 2-dong, Songpa-ku, Seoul, 05505 (Korea, Republic of)

    2017-01-15

    Highlights: • The dual-energy CT technique enables the differentiation of contrast materials with a material decomposition algorithm. • Pulmonary functional information can be evaluated using dual-energy CT simultaneously with anatomic CT information. • Pulmonary functional information from dual-energy CT can improve diagnosis and severity assessment of diseases. Abstract: The assessment of pulmonary function, including ventilation and perfusion status, is important in addition to the evaluation of structural changes of the lung parenchyma in various pulmonary diseases. The dual-energy computed tomography (DECT) technique can provide pulmonary functional information and high-resolution anatomic information simultaneously. The application of DECT for the evaluation of pulmonary function has been investigated in various pulmonary diseases, such as pulmonary embolism, asthma, and chronic obstructive lung disease. In this review article, we present the principles and technical aspects of DECT, along with clinical applications for the assessment of pulmonary function in various lung diseases.

  4. On computation and use of Fourier coefficients for associated Legendre functions

    Gruber, Christian; Abrykosov, Oleh

    2016-06-01

    The computation of spherical harmonic series at very high resolution is known to be delicate in terms of performance and numerical stability. A major problem is to keep results inside the numerical range of the data type used during calculations, as under-/overflow arises. Extended data types are currently not desirable since the arithmetic complexity grows exponentially with higher resolution levels. If the associated Legendre functions are computed in the spectral domain, then regular grid transformations can be applied, which are highly efficient and convenient for derived quantities as well. In this article, we compare three recursive computations of the associated Legendre functions as trigonometric series, thereby ensuring a defined numerical range for each constituent wave number separately. The results to high degree and order show the numerical strength of the proposed method. First, the evaluation of Fourier coefficients of the associated Legendre functions is examined with respect to floating-point precision requirements. Second, the numerical accuracy in the cases of standard double and long double precision arithmetic is demonstrated. Following Bessel's inequality, the obtained accuracy estimates of the Fourier coefficients transfer directly to the associated Legendre functions themselves, and to derived functionals as well. They can therefore provide essential insight for modern geodetic applications that depend on efficient spherical harmonic analysis and synthesis beyond 5 × 5 arcmin resolution.
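
    For context, one standard forward-column recursion for fully normalized associated Legendre functions, whose under-/overflow at high degree near the poles motivates the paper's Fourier-domain approach, can be sketched as follows (4-pi normalization assumed; the degree and colatitude below are illustrative):

    ```python
    import numpy as np

    def alf_normalized(nmax, theta):
        """Fully (4-pi) normalized associated Legendre functions P[n, m](cos theta)
        by the standard forward-column recursion; values under/overflow for very
        high degree near the poles, which the Fourier-domain approach avoids."""
        t, u = np.cos(theta), np.sin(theta)
        P = np.zeros((nmax + 1, nmax + 1))
        P[0, 0] = 1.0
        if nmax >= 1:
            P[1, 1] = np.sqrt(3.0) * u
        for m in range(2, nmax + 1):                 # sectoral seeds
            P[m, m] = u * np.sqrt((2 * m + 1) / (2.0 * m)) * P[m - 1, m - 1]
        for m in range(nmax + 1):
            for n in range(m + 1, nmax + 1):
                a = np.sqrt((2 * n - 1) * (2 * n + 1) / ((n - m) * (n + m)))
                b = 0.0 if n == m + 1 else np.sqrt(
                    (2 * n + 1) * (n + m - 1) * (n - m - 1)
                    / ((n - m) * (n + m) * (2 * n - 3)))
                P[n, m] = a * t * P[n - 1, m] - b * P[n - 2, m]
        return P

    P = alf_normalized(60, np.deg2rad(40.0))         # modest degree, no range issues yet
    ```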

  5. COMPUTING

    M. Kasemann

    CMS relies on a well functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental for site commissioning, increasing the number of sites available to participate in CSA07 and ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned; it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a 4-times increase in throughput with respect to the LCG Resource Broker is observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and to keep exercised all data transfer routes in the CMS PhEDEx topology. Since mid-February, a transfer volume of about 12 P...

  6. Renal parenchyma thickness: a rapid estimation of renal function on computed tomography

    Kaplon, Daniel M.; Lasser, Michael S.; Sigman, Mark; Haleblian, George E.; Pareek, Gyan

    2009-01-01

    Purpose: To define the relationship between renal parenchyma thickness (RPT) on computed tomography and renal function on nuclear renography in chronically obstructed renal units (ORUs), and to define a minimal thickness ratio associated with adequate function. Materials and Methods: Twenty-eight consecutive patients undergoing both nuclear renography and CT during a six-month period between 2004 and 2006 were included. All patients with a diagnosis of unilateral obstruction were included for analysis. RPT was measured in the following manner: the parenchyma thickness at three discrete levels of each kidney was measured using calipers on a CT workstation, and the mean of these three measurements was defined as RPT. The RPT ratio of the ORUs and non-obstructed renal units (NORUs) was calculated and compared to the observed function on MAG-3 Lasix renography. Results: A total of 28 patients were evaluated. Mean parenchyma thickness was 1.82 cm and 2.25 cm in the ORUs and NORUs, respectively. The mean relative renal function of ORUs was 39%. Linear regression analysis comparing renogram function to the RPT ratio revealed a correlation coefficient of 0.48, with renogram function increasing with the RPT ratio. A thickness ratio of 0.68 corresponded to 20% renal function. Conclusion: RPT on computed tomography appears to be a powerful predictor of relative renal function in ORUs. Assessment of RPT is a useful and readily available clinical tool for surgical decision making (renal salvage therapy versus nephrectomy) in patients with ORUs. (author)

  7. Quantum computation and analysis of Wigner and Husimi functions: toward a quantum image treatment.

    Terraneo, M; Georgeot, B; Shepelyansky, D L

    2005-06-01

    We study the efficiency of quantum algorithms which aim at obtaining phase-space distribution functions of quantum systems. Wigner and Husimi functions are considered. Different quantum algorithms are envisioned to build these functions, and compared with the classical computation. Different procedures to extract more efficiently information from the final wave function of these algorithms are studied, including coarse-grained measurements, amplitude amplification, and measure of wavelet-transformed wave function. The algorithms are analyzed and numerically tested on a complex quantum system showing different behavior depending on parameters, namely the kicked rotator. The results for the Wigner function show in particular that the use of the quantum wavelet transform gives a polynomial gain over classical computation. For the Husimi distribution, the gain is much larger than for the Wigner function and is larger with the help of amplitude amplification and wavelet transforms. We discuss the generalization of these results to the simulation of other quantum systems. We also apply the same set of techniques to the analysis of real images. The results show that the use of the quantum wavelet transform allows one to lower dramatically the number of measurements needed, but at the cost of a large loss of information.

  8. Three-dimensional computed tomographic volumetry precisely predicts the postoperative pulmonary function.

    Kobayashi, Keisuke; Saeki, Yusuke; Kitazawa, Shinsuke; Kobayashi, Naohiro; Kikuchi, Shinji; Goto, Yukinobu; Sakai, Mitsuaki; Sato, Yukio

    2017-11-01

    It is important to accurately predict the patient's postoperative pulmonary function. The aim of this study was to compare the accuracy of predictions of the postoperative residual pulmonary function obtained with three-dimensional computed tomographic (3D-CT) volumetry with that of predictions obtained with the conventional segment-counting method. Fifty-three patients scheduled to undergo lung cancer resection, pulmonary function tests, and computed tomography were enrolled in this study. The postoperative residual pulmonary function was predicted based on the segment-counting and 3D-CT volumetry methods. The predicted postoperative values were compared with the results of postoperative pulmonary function tests. Regarding the linear correlation coefficients between the predicted postoperative values and the measured values, those obtained using the 3D-CT volumetry method tended to be higher than those acquired using the segment-counting method. In addition, the variations between the predicted and measured values were smaller with the 3D-CT volumetry method than with the segment-counting method. These results were more obvious in COPD patients than in non-COPD patients. Our findings suggested that the 3D-CT volumetry was able to predict the residual pulmonary function more accurately than the segment-counting method, especially in patients with COPD. This method might lead to the selection of appropriate candidates for surgery among patients with a marginal pulmonary function.
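
    Both prediction methods reduce to scaling the preoperative value by the fraction of functional lung removed; they differ only in how that fraction is estimated. A sketch with hypothetical numbers (the 19-segment convention is common, but conventions vary between clinics):

    ```python
    def ppo_value(preop_value, resected_fraction):
        """Predicted postoperative lung function: preop x (1 - fraction removed)."""
        return preop_value * (1.0 - resected_fraction)

    preop_fev1 = 2.4  # liters, hypothetical patient
    # Segment counting: fraction = segments resected / 19 functioning segments.
    ppo_seg = ppo_value(preop_fev1, 3 / 19)
    # 3D-CT volumetry: fraction = functional lung volume inside the resected region
    # over total functional lung volume, both measured on CT (values hypothetical).
    ppo_vol = ppo_value(preop_fev1, 310.0 / 2100.0)
    print(ppo_seg, ppo_vol)
    ```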

  9. Computations of zeros of special functions and eigenvalues of differential equations by matrix method

    Miyazaki, Yoshinori

    2000-01-01

    This paper is strongly based on two powerful general theorems proved by Ikebe et al. in 1993 [15] and 1996 [13], which will be referred to as Theorem A and Theorem B in this paper. They were recently published and justify the approximate computation of simple eigenvalues of infinite matrices of certain types by truncation, giving extremely accurate error estimates. So far, they have been applied to some important problems in engineering, such as computing the zeros of some special functions, an...

  10. Computer algebra in quantum field theory integration, summation and special functions

    Schneider, Carsten

    2013-01-01

    The book focuses on advanced computer algebra methods and special functions that have striking applications in the context of quantum field theory. It presents the state of the art and new methods for (infinite) multiple sums, multiple integrals, in particular Feynman integrals, difference and differential equations in the format of survey articles. The presented techniques emerge from interdisciplinary fields: mathematics, computer science and theoretical physics; the articles are written by mathematicians and physicists with the goal that both groups can learn from the other field, including

  11. FUNCTIONING FEATURES OF COMPUTER TECHNOLOGY WHILE FORMING PRIMARY SCHOOLCHILDREN’S COMMUNICATIVE COMPETENCE

    Olena Beskorsa

    2017-04-01

    The article addresses the functioning features of computer technology in forming primary schoolchildren's communicative competence, a problem whose relevance is underscored by the increasing role of a foreign language as a means of communication and by the modernization of foreign-language education. A great deal of publications is devoted to foreign-language learning at primary school (N. Biriukevych, O. Kolominova, O. Metolkina, O. Petrenko, V. Redko, S. Roman). Implementing innovative technology, including computer technology, is meant to intensify the language-learning process and to improve young learners' communicative skills. The aim of the article is to identify the functioning features of computer technology in forming primary schoolchildren's communicative competence. In this study, computer technology is defined as an information technology whose implementation may be accompanied by a computer as one of its tools, excluding the use of audio and video equipment, projectors and other technical tools. Computer technologies are realized through tools divided into two main groups: electronic learning materials and computer testing software. An analysis of current textbooks and learning-methodological complexes shows that teachers prefer authentic electronic materials to national ones; the most available English learning materials are free and on the Internet. The author discloses several online English-learning tools and depicts how they can be used in forming primary schoolchildren's communicative competence. Special attention is also paid to multimedia technology, its functioning features and multimedia lesson structure. Computer testing software provides tools for ongoing and final assessment of mastery of language material, communicative skills, and self-assessment in an interactive way, including tests for assessing English skill

  12. 76 FR 65197 - Statement of Organization, Functions, and Delegations of Authority

    2011-10-20

    ... Information and Insurance Oversight (FCR) Office of Public Engagement (FCS) Office of Communications (FCT... for Medicaid and CHIP Services (CMCS), and (2) realign the governmental relations function from the.... In conjunction with the Office of Public Engagement, oversees all CMS interactions and collaboration...

  13. Resolution function in deep inelastic neutron scattering using the Foil Cycling Technique

    Pietropaolo, A.; Andreani, C.; Filabozzi, A.; Pace, E.; Senesi, R.

    2007-01-01

    New perspectives for epithermal neutron spectroscopy are being opened up by the development of the Resonance Detector (RD) and its use on inverse-geometry time-of-flight (TOF) spectrometers at spallation sources. The most recent result is the Foil Cycling Technique (FCT), which has been developed and applied on the VESUVIO spectrometer operating in the RD configuration. This technique has demonstrated its capability to improve the resolution function of the spectrometer and to provide an effective neutron and gamma background subtraction method. This paper reports a detailed analysis of the line shape of the resolution function in Deep Inelastic Neutron Scattering (DINS) measurements on the VESUVIO spectrometer, operating in the RD configuration and employing the FCT. The aim is to provide an analytical approximation for the analyzer energy transfer function, a useful tool for data analysis on VESUVIO. Simulated and experimental results of DINS measurements on a lead sample are compared. The line shape analysis shows that the most reliable analytical approximation of the energy transfer function is a sum of a Gaussian and a power of a Lorentzian. A comparison with the Double Difference Method (DDM) is also discussed. It is shown that the energy resolution improvement for the FCT and the DDM is almost the same, while the counting efficiency is a factor of about 1.4 higher for the FCT
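
    The analytical approximation identified in the paper, a sum of a Gaussian and a power of a Lorentzian, is easy to state as code; all parameter values below are placeholders, not VESUVIO's fitted values:

    ```python
    import numpy as np

    def resolution_lineshape(E, E0, A, sigma, B, gamma, n):
        """Analyzer energy-transfer function modeled as the paper concludes:
        a Gaussian plus a power of a Lorentzian, centered at E0."""
        gauss = A * np.exp(-0.5 * ((E - E0) / sigma) ** 2)
        lorentz_pow = B / (1.0 + ((E - E0) / gamma) ** 2) ** n
        return gauss + lorentz_pow

    E = np.linspace(-2.0, 2.0, 401)  # energy-transfer grid, arbitrary units
    R = resolution_lineshape(E, E0=0.0, A=1.0, sigma=0.15, B=0.3, gamma=0.25, n=1.5)
    ```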

  14. Using Multiple Schedules during Functional Communication Training to Promote Rapid Transfer of Treatment Effects

    Fisher, Wayne W.; Greer, Brian D.; Fuhrman, Ashley M.; Querim, Angie C.

    2015-01-01

    Multiple schedules with signaled periods of reinforcement and extinction have been used to thin reinforcement schedules during functional communication training (FCT) to make the intervention more practical for parents and teachers. We evaluated whether these signals would also facilitate rapid transfer of treatment effects across settings and…

  15. Indirect Effects of Functional Communication Training on Non-Targeted Disruptive Behavior

    Schieltz, Kelly M.; Wacker, David P.; Harding, Jay W.; Berg, Wendy K.; Lee, John F.; Padilla Dalmau, Yaniz C.; Mews, Jayme; Ibrahimovic, Muska

    2011-01-01

    The purpose of this study was to evaluate the effects of functional communication training (FCT) on the occurrence of non-targeted disruptive behavior. The 10 participants were preschool-aged children with developmental disabilities who engaged in both destructive (property destruction, aggression, self-injury) and disruptive (hand flapping,…

  16. The Long-Term Effects of Functional Communication Training Conducted in Young Children's Home Settings

    Wacker, David P.; Schieltz, Kelly M.; Berg, Wendy K.; Harding, Jay W.; Padilla Dalmau, Yaniz C.; Lee, John F.

    2017-01-01

    This article describes the results of a series of studies that involved functional communication training (FCT) conducted in children's homes by their parents. The 103 children who participated were six years old or younger, had developmental delays, and engaged in destructive behaviors such as self-injury. The core procedures used in each study…

  17. Computational Methods for Large Spatio-temporal Datasets and Functional Data Ranking

    Huang, Huang

    2017-07-16

    This thesis focuses on two topics, computational methods for large spatial datasets and functional data ranking. Both are tackling the challenges of big and high-dimensional data. The first topic is motivated by the prohibitive computational burden in fitting Gaussian process models to large and irregularly spaced spatial datasets. Various approximation methods have been introduced to reduce the computational cost, but many rely on unrealistic assumptions about the process and retaining statistical efficiency remains an issue. We propose a new scheme to approximate the maximum likelihood estimator and the kriging predictor when the exact computation is infeasible. The proposed method provides different types of hierarchical low-rank approximations that are both computationally and statistically efficient. We explore the improvement of the approximation theoretically and investigate the performance by simulations. For real applications, we analyze a soil moisture dataset with 2 million measurements with the hierarchical low-rank approximation and apply the proposed fast kriging to fill gaps for satellite images. The second topic is motivated by rank-based outlier detection methods for functional data. Compared to magnitude outliers, it is more challenging to detect shape outliers as they are often masked among samples. We develop a new notion of functional data depth by taking the integration of a univariate depth function. Having a form of the integrated depth, it shares many desirable features. Furthermore, the novel formation leads to a useful decomposition for detecting both shape and magnitude outliers. Our simulation studies show the proposed outlier detection procedure outperforms competitors in various outlier models. We also illustrate our methodology using real datasets of curves, images, and video frames. Finally, we introduce the functional data ranking technique to spatio-temporal statistics for visualizing and assessing covariance properties, such as

  18. VAT: a computational framework to functionally annotate variants in personal genomes within a cloud-computing environment.

    Habegger, Lukas; Balasubramanian, Suganthi; Chen, David Z; Khurana, Ekta; Sboner, Andrea; Harmanci, Arif; Rozowsky, Joel; Clarke, Declan; Snyder, Michael; Gerstein, Mark

    2012-09-01

    The functional annotation of variants obtained through sequencing projects is generally assumed to be a simple intersection of genomic coordinates with genomic features. However, complexities arise for several reasons, including the differential effects of a variant on alternatively spliced transcripts, as well as the difficulty in assessing the impact of small insertions/deletions and large structural variants. Taking these factors into consideration, we developed the Variant Annotation Tool (VAT) to functionally annotate variants from multiple personal genomes at the transcript level as well as obtain summary statistics across genes and individuals. VAT also allows visualization of the effects of different variants, integrates allele frequencies and genotype data from the underlying individuals and facilitates comparative analysis between different groups of individuals. VAT can either be run through a command-line interface or as a web application. Finally, in order to enable on-demand access and to minimize unnecessary transfers of large data files, VAT can be run as a virtual machine in a cloud-computing environment. VAT is implemented in C and PHP. The VAT web service, Amazon Machine Image, source code and detailed documentation are available at vat.gersteinlab.org.

  19. Sequential designs for sensitivity analysis of functional inputs in computer experiments

    Fruth, J.; Roustant, O.; Kuhnt, S.

    2015-01-01

    Computer experiments are nowadays commonly used to analyze industrial processes aiming at a desired outcome. Sensitivity analysis plays an important role in exploring the actual impact of adjustable parameters on the response variable. In this work we focus on sensitivity analysis of a scalar-valued output of a time-consuming computer code depending on scalar and functional input parameters. We investigate a sequential methodology, based on piecewise constant functions and sequential bifurcation, which is both economical and fully interpretable. The new approach is applied to a sheet metal forming problem in three sequential steps, resulting in new insights into the behavior of the forming process over time. Highlights: • A sensitivity analysis method for functional and scalar inputs is presented. • We focus on the discovery of the most influential parts of the functional domain. • We investigate an economical sequential methodology based on piecewise constant functions. • Normalized sensitivity indices are introduced and investigated theoretically. • Successful application to sheet metal forming on two functional inputs

  20. Storing files in a parallel computing system based on user-specified parser function

    Faibish, Sorin; Bent, John M; Tzelnic, Percy; Grider, Gary; Manzanares, Adam; Torres, Aaron

    2014-10-21

    Techniques are provided for storing files in a parallel computing system based on a user-specified parser function. A plurality of files generated by a distributed application in a parallel computing system are stored by obtaining a parser from the distributed application for processing the plurality of files prior to storage; and storing one or more of the plurality of files in one or more storage nodes of the parallel computing system based on the processing by the parser. The plurality of files comprise one or more of a plurality of complete files and a plurality of sub-files. The parser can optionally store only those files that satisfy one or more semantic requirements of the parser. The parser can also extract metadata from one or more of the files and the extracted metadata can be stored with one or more of the plurality of files and used for searching for files.

  1. Management of Liver Cancer Argon-helium Knife Therapy with Functional Computer Tomography Perfusion Imaging.

    Wang, Hongbo; Shu, Shengjie; Li, Jinping; Jiang, Huijie

    2016-02-01

    The objective of this study was to observe the change in blood perfusion of liver cancer following argon-helium knife treatment with functional computer tomography perfusion imaging. Twenty-seven patients with primary liver cancer treated with the argon-helium knife were included in this study. Plain computer tomography (CT) and computer tomography perfusion (CTP) imaging were conducted in all patients before and after treatment. Perfusion parameters including blood flow, blood volume, hepatic artery perfusion fraction, hepatic artery perfusion, and hepatic portal venous perfusion were used for evaluating the therapeutic effect. All parameters in liver cancer were significantly decreased after argon-helium knife treatment. Therefore, CTP imaging would play an important role in liver cancer management following argon-helium knife therapy.

  2. Implementation of the Two-Point Angular Correlation Function on a High-Performance Reconfigurable Computer

    Volodymyr V. Kindratenko

    2009-01-01

    We present a parallel implementation of an algorithm for calculating the two-point angular correlation function as applied in the field of computational cosmology. The algorithm has been specifically developed for a reconfigurable computer. Our implementation utilizes a microprocessor and two reconfigurable processors on a dual-MAP SRC-6 system. The two reconfigurable processors are used as two application-specific co-processors. Two independent computational kernels are simultaneously executed on the reconfigurable processors while data pre-fetching from disk and initial data pre-processing are executed on the microprocessor. The overall end-to-end algorithm execution speedup achieved by this implementation is over 90× as compared to a sequential implementation of the algorithm executed on a single 2.8 GHz Intel Xeon microprocessor.

  3. Cognitive assessment of executive functions using brain computer interface and eye-tracking

    P. Cipresso

    2013-03-01

    New technologies enabling augmentative and alternative communication in Amyotrophic Lateral Sclerosis (ALS) have recently been used in several studies. However, a comprehensive battery for cognitive assessment has not yet been implemented. Brain computer interfaces are innovative systems able to generate a control signal from brain responses, conveying messages directly to a computer. Another available technology for communication purposes is the eye-tracker system, which conveys messages from eye movements to a computer. In this study we explored the use of these two technologies for the cognitive assessment of executive functions in a healthy population and in an ALS patient, also verifying usability, pleasantness, fatigue, and emotional aspects related to the setting. Our preliminary results may have interesting implications for both clinical practice (the availability of an effective tool for neuropsychological evaluation of ALS patients) and ethical issues.

  4. Can Expanded Bacteriochlorins Act as Photosensitizers in Photodynamic Therapy? Good News from Density Functional Theory Computations

    Gloria Mazzone

    2016-02-01

    The main photophysical properties of a series of recently synthesized expanded bacteriochlorins have been investigated by means of DFT and TD-DFT methods. Absorption spectra computed with different exchange-correlation functionals, B3LYP, M06 and ωB97XD, have been compared with the experimental ones. In good agreement, all the considered systems show a maximum absorption wavelength that falls in the therapeutic window (600–800 nm). The obtained singlet-triplet energy gaps are large enough to ensure the production of cytotoxic singlet molecular oxygen. The computed spin-orbit matrix elements suggest a good probability of intersystem spin-crossing between singlet and triplet excited states, since they turn out to be higher than those computed for 5,10,15,20-tetrakis(m-hydroxyphenyl)chlorin (Foscan©), already used in the photodynamic therapy (PDT) protocol. Because of the investigated properties, these expanded bacteriochlorins can be proposed as PDT agents.
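
    The kind of TD-DFT excitation calculation behind such absorption predictions can be sketched with an open-source package; the water molecule below is only a cheap placeholder for the expanded bacteriochlorins, and B3LYP is one of the three functionals the study compares:

    ```python
    from pyscf import gto, dft, tddft

    # Water as a stand-in geometry; the study's molecules are far larger.
    mol = gto.M(atom="O 0 0 0; H 0 0 0.96; H 0.93 0 -0.24", basis="def2-svp")
    mf = dft.RKS(mol)
    mf.xc = "b3lyp"
    mf.kernel()

    td = tddft.TDA(mf)       # Tamm-Dancoff excited states
    td.nstates = 5           # td.singlet = False would target triplets instead,
    td.kernel()              # giving a singlet-triplet gap estimate
    for e_hartree in td.e:   # excitation energies (Hartree) -> wavelengths (nm)
        print(1239.84 / (e_hartree * 27.2114))
    ```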

  5. Stochastic methods for uncertainty treatment of functional variables in computer codes: application to safety studies

    Nanty, Simon

    2015-01-01

    This work relates to the framework of uncertainty quantification for numerical simulators, and more precisely studies two industrial applications linked to the safety studies of nuclear plants. These two applications have several common features. The first one is that the computer code inputs are functional and scalar variables, functional ones being dependent. The second feature is that the probability distribution of functional variables is known only through a sample of their realizations. The third feature, relative to only one of the two applications, is the high computational cost of the code, which limits the number of possible simulations. The main objective of this work was to propose a complete methodology for the uncertainty analysis of numerical simulators for the two considered cases. First, we have proposed a methodology to quantify the uncertainties of dependent functional random variables from a sample of their realizations. This methodology enables to both model the dependency between variables and their link to another variable, called co-variate, which could be, for instance, the output of the considered code. Then, we have developed an adaptation of a visualization tool for functional data, which enables to simultaneously visualize the uncertainties and features of dependent functional variables. Second, a method to perform the global sensitivity analysis of the codes used in the two studied cases has been proposed. In the case of a computationally demanding code, the direct use of quantitative global sensitivity analysis methods is intractable. To overcome this issue, the retained solution consists in building a surrogate model or meta model, a fast-running model approximating the computationally expensive code. An optimized uniform sampling strategy for scalar and functional variables has been developed to build a learning basis for the meta model. Finally, a new approximation approach for expensive codes with functional outputs has been

  6. Automated Quantitative Computed Tomography Versus Visual Computed Tomography Scoring in Idiopathic Pulmonary Fibrosis: Validation Against Pulmonary Function.

    Jacob, Joseph; Bartholmai, Brian J; Rajagopalan, Srinivasan; Kokosi, Maria; Nair, Arjun; Karwoski, Ronald; Raghunath, Sushravya M; Walsh, Simon L F; Wells, Athol U; Hansell, David M

    2016-09-01

    The aim of the study was to determine whether a novel computed tomography (CT) postprocessing software technique (CALIPER) is superior to visual CT scoring as judged by functional correlations in idiopathic pulmonary fibrosis (IPF). A total of 283 consecutive patients with IPF had CT parenchymal patterns evaluated quantitatively with CALIPER and by visual scoring. These 2 techniques were evaluated against: forced expiratory volume in 1 second (FEV1), forced vital capacity (FVC), diffusing capacity for carbon monoxide (DLco), carbon monoxide transfer coefficient (Kco), and a composite physiological index (CPI), with regard to extent of interstitial lung disease (ILD), extent of emphysema, and pulmonary vascular abnormalities. CALIPER-derived estimates of ILD extent demonstrated stronger univariate correlations than visual scores for most pulmonary function tests (PFTs): (FEV1: CALIPER R=0.29, visual R=0.18; FVC: CALIPER R=0.41, visual R=0.27; DLco: CALIPER R=0.31, visual R=0.35; CPI: CALIPER R=0.48, visual R=0.44). Correlations between CT measures of emphysema extent and PFTs were weak and did not differ significantly between CALIPER and visual scoring. Intriguingly, the pulmonary vessel volume provided similar correlations to total ILD extent scored by CALIPER for FVC, DLco, and CPI (FVC: R=0.45; DLco: R=0.34; CPI: R=0.53). CALIPER was superior to visual scoring as validated by functional correlations with PFTs. The pulmonary vessel volume, a novel CALIPER CT parameter with no visual scoring equivalent, has the potential to be a CT feature in the assessment of patients with IPF and requires further exploration.

  7. Encoding neural and synaptic functionalities in electron spin: A pathway to efficient neuromorphic computing

    Sengupta, Abhronil; Roy, Kaushik

    2017-12-01

    Present day computers expend orders of magnitude more computational resources to perform various cognitive and perception related tasks that humans routinely perform every day. This has recently resulted in a seismic shift in the field of computation where research efforts are being directed to develop a neurocomputer that attempts to mimic the human brain by nanoelectronic components and thereby harness its efficiency in recognition problems. Bridging the gap between neuroscience and nanoelectronics, this paper attempts to provide a review of the recent developments in the field of spintronic device based neuromorphic computing. Description of various spin-transfer torque mechanisms that can be potentially utilized for realizing device structures mimicking neural and synaptic functionalities is provided. A cross-layer perspective extending from the device to the circuit and system level is presented to envision the design of an All-Spin neuromorphic processor enabled with on-chip learning functionalities. Device-circuit-algorithm co-simulation framework calibrated to experimental results suggest that such All-Spin neuromorphic systems can potentially achieve almost two orders of magnitude energy improvement in comparison to state-of-the-art CMOS implementations.

  8. Recent progress in orbital-free density functional theory (recent advances in computational chemistry)

    Wesolowski, Tomasz A

    2013-01-01

    This is a comprehensive overview of state-of-the-art computational methods based on orbital-free formulation of density functional theory completed by the most recent developments concerning the exact properties, approximations, and interpretations of the relevant quantities in density functional theory. The book is a compilation of contributions stemming from a series of workshops which had been taking place since 2002. It not only chronicles many of the latest developments but also summarises some of the more significant ones. The chapters are mainly reviews of sub-domains but also include original research. Readership: Graduate students, academics and researchers in computational chemistry. Atomic & molecular physicists, theoretical physicists, theoretical chemists, physical chemists and chemical physicists.

  9. Structure, dynamics, and function of the monooxygenase P450 BM-3: insights from computer simulations studies

    Roccatano, Danilo

    2015-01-01

    The monooxygenase P450 BM-3 is a NADPH-dependent fatty acid hydroxylase enzyme isolated from soil bacterium Bacillus megaterium. As a pivotal member of cytochrome P450 superfamily, it has been intensely studied for the comprehension of structure–dynamics–function relationships in this class of enzymes. In addition, due to its peculiar properties, it is also a promising enzyme for biochemical and biomedical applications. However, despite the efforts, the full understanding of the enzyme structure and dynamics is not yet achieved. Computational studies, particularly molecular dynamics (MD) simulations, have importantly contributed to this endeavor by providing new insights at an atomic level regarding the correlations between structure, dynamics, and function of the protein. This topical review summarizes computational studies based on MD simulations of the cytochrome P450 BM-3 and gives an outlook on future directions. (topical review)

  10. Evaluation of the optimum region for mammographic system using computer simulation to study modulation transfer functions

    Oliveira, Isaura N. Sombra; Schiable, Homero; Porcel, Naider T.; Frere, Annie F.; Marques, Paulo M.A.

    1996-01-01

    The 'optimum region' of the radiation field for mammographic systems is investigated. Such a region was defined in previous works as the field range where the system has its best performance and sharpest images. This study is based on a correlation of two methods for evaluating radiologic imaging systems, both using computer simulation to determine modulation transfer functions (MTFs) due to the X-ray tube focal spot at several field orientations and locations
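
    A common way to obtain such MTFs in simulation is to Fourier-transform a line spread function produced by the modeled focal spot; a minimal sketch, where the Gaussian spot widths and sampling are illustrative:

    ```python
    import numpy as np

    def mtf_from_lsf(lsf, pixel_mm):
        """MTF as the normalized magnitude of the Fourier transform of the
        line spread function (LSF)."""
        otf = np.fft.rfft(lsf / lsf.sum())              # unit-area normalization
        freqs = np.fft.rfftfreq(len(lsf), d=pixel_mm)   # spatial frequency, cycles/mm
        return freqs, np.abs(otf)

    # A Gaussian focal-spot blur as a toy LSF: a wider spot rolls the MTF off faster.
    x = np.arange(-128, 128) * 0.05                     # 0.05 mm sampling
    for spot_sigma in (0.1, 0.3):                       # hypothetical spot widths, mm
        f, mtf = mtf_from_lsf(np.exp(-0.5 * (x / spot_sigma) ** 2), 0.05)
    ```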

  11. First results with twisted mass fermions towards the computation of parton distribution functions on the lattice

    Alexandrou, Constantia; Cyprus Institute, Nicosia; Deutsches Elektronen-Synchrotron; Cichy, Krzysztof; Poznan Univ.; Drach, Vincent; Garcia-Ramos, Elena; Humboldt-Universitaet, Berlin; Hadjiyiannakou, Kyriakos; Jansen, Karl; Steffens, Fernanda; Wiese, Christian

    2014-11-01

    We report on our exploratory study for the evaluation of the parton distribution functions from lattice QCD, based on a new method proposed in Ref. arXiv:1305.1539. Using the example of the nucleon, we compare two different methods to compute the matrix elements needed, and investigate the application of gauge link smearing. We also present first results from a large production ensemble and discuss the future challenges related to this method.

  12. Using computer graphics to preserve function in resection of malignant melanoma of the foot.

    Kaufman, M; Vantuyl, A; Japour, C; Ghosh, B C

    2001-08-01

    The increasing incidence of malignant melanoma challenges physicians to find innovative ways to preserve function and appearance in affected areas that require partial resection. We carefully planned the resection of a malignant lesion between the third and fourth toes of a 77-year-old man with the aid of computer technology. The subsequent excision of the third, fourth, and fifth digits was executed such that the new metatarsal arc formed would approximate the dimensions of the optimal hyperbola, thereby minimizing gait disturbance.

  13. Gaussian Radial Basis Function for Efficient Computation of Forest Indirect Illumination

    Abbas, Fayçal; Babahenini, Mohamed Chaouki

    2018-06-01

    Global illumination of natural scenes such as forests in real time is one of the most complex problems to solve, because of the multiple inter-reflections between the light and the materials of the objects composing the scene. The major problem that arises is visibility computation. In fact, visibility is computed for the whole set of leaves visible from the center of a given leaf; given the enormous number of leaves present in a tree, this computation performed for each leaf of the tree reduces performance. We describe a new approach that approximates visibility queries, which proceeds in two steps. The first step is to generate a point cloud representing the foliage. We assume that the point cloud is composed of two classes (visible, not visible) that are non-linearly separable. The second step is to classify the point cloud by applying the Gaussian radial basis function, which measures similarity in terms of distance between each leaf and a landmark leaf. This approximates the visibility queries needed to extract the leaves used to calculate the amount of indirect illumination exchanged between neighboring leaves. Our approach treats light exchanges in a forest scene efficiently, computes quickly, and produces images of good visual quality, taking advantage of the immense computational power of the GPU.
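
    The core of the second step, Gaussian RBF similarity between each leaf and a landmark leaf, can be sketched in a few lines; the leaf positions, gamma, and the visibility threshold are all illustrative:

    ```python
    import numpy as np

    def rbf_similarity(leaves, landmark, gamma):
        """Gaussian RBF similarity exp(-gamma * ||x - landmark||^2) of every
        leaf to a landmark leaf; thresholding it splits the cloud into the
        two classes (visible / not visible)."""
        d2 = np.sum((leaves - landmark) ** 2, axis=1)
        return np.exp(-gamma * d2)

    rng = np.random.default_rng(7)
    leaves = rng.random((10000, 3)) * 5.0            # hypothetical leaf centers in a canopy
    visible = rbf_similarity(leaves, leaves[0], gamma=2.0) > 0.5  # illustrative cutoff
    ```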

  14. A new Fortran 90 program to compute regular and irregular associated Legendre functions (new version announcement)

    Schneider, Barry I.; Segura, Javier; Gil, Amparo; Guan, Xiaoxu; Bartschat, Klaus

    2018-04-01

    This is a revised and updated version of a modern Fortran 90 code to compute the regular Plm(x) and irregular Qlm(x) associated Legendre functions for all x ∈ (−1, +1) (on the cut) and |x| > 1 and integer degree (l) and order (m). The necessity to revise the code comes as a consequence of comments by Prof. James Bremer of the UC Davis Mathematics Department, who discovered that there were errors in the code for large integer degree and order for the normalized regular Legendre functions on the cut.

  15. Computation of Value Functions in Nonlinear Differential Games with State Constraints

    Botkin, Nikolai

    2013-01-01

    Finite-difference schemes for the computation of value functions of nonlinear differential games with non-terminal payoff functional and state constraints are proposed. The solution method is based on the fact that the value function is a generalized viscosity solution of the corresponding Hamilton-Jacobi-Bellman-Isaacs equation. Such a viscosity solution is defined as a function satisfying differential inequalities introduced by M. G. Crandall and P. L. Lions. The difference from the classical case is that these inequalities hold on a subset of the state space that is not known in advance. The convergence rate of the numerical schemes is given. A numerical solution to a non-trivial three-dimensional example is presented.

  16. A review of Green's function methods in computational fluid mechanics: Background, recent developments and future directions

    Dorning, J.

    1981-01-01

    The research and development over the past eight years on local Green's function methods for the high-accuracy, high-efficiency numerical solution of nuclear engineering problems is reviewed. The basic concepts and key ideas are presented by starting with an expository review of the original fully two-dimensional local Green's function methods developed for neutron diffusion and heat conduction, and continuing through the progressively more complicated and more efficient nodal Green's function methods for neutron diffusion, heat conduction and neutron transport to establish the background for the recent development of Green's function methods in computational fluid mechanics. Some of the impressive numerical results obtained via these classes of methods for nuclear engineering problems are briefly summarized. Finally, speculations are proffered on future directions in which the development of these types of methods in fluid mechanics and other areas might lead. (orig.)

  17. Modeling of edge effect in subaperture tool influence functions of computer controlled optical surfacing.

    Wan, Songlin; Zhang, Xiangchao; He, Xiaoying; Xu, Min

    2016-12-20

    Computer controlled optical surfacing requires an accurate tool influence function (TIF) for reliable path planning and deterministic fabrication. Near the edge of the workpieces, the TIF has a nonlinear removal behavior, which will cause a severe edge-roll phenomenon. In the present paper, a new edge pressure model is developed based on the finite element analysis results. The model is represented as the product of a basic pressure function and a correcting function. The basic pressure distribution is calculated according to the surface shape of the polishing pad, and the correcting function is used to compensate the errors caused by the edge effect. Practical experimental results demonstrate that the new model can accurately predict the edge TIFs with different overhang ratios. The relative error of the new edge model can be reduced to 15%.
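
    A toy version of the paper's decomposition, with removal following Preston's law and the pressure written as a basic profile times an edge-correcting function; both profiles and all constants below are illustrative, not the fitted finite-element model:

    ```python
    import numpy as np

    def tif(r, k, velocity, dwell, overhang_boost=0.0):
        """Removal per Preston's law, k * P * V * t, with pressure split into a
        basic pad profile times an edge-correcting factor near r = 1."""
        basic = np.clip(1.0 - r ** 2, 0.0, None)                   # parabolic pad pressure
        correction = 1.0 + overhang_boost * np.exp(-((r - 1.0) / 0.2) ** 2)
        return k * basic * correction * velocity * dwell

    r = np.linspace(0.0, 1.0, 101)
    interior_tif = tif(r, k=1e-7, velocity=0.5, dwell=1.0)          # away from the edge
    edge_tif = tif(r, k=1e-7, velocity=0.5, dwell=1.0, overhang_boost=0.4)
    ```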

  18. Wigner functions and density matrices in curved spaces as computational tools

    Habib, S.; Kandrup, H.E.

    1989-01-01

    This paper contrasts two alternative approaches to statistical quantum field theory in curved spacetimes, namely (1) a canonical Hamiltonian approach, in which the basic object is a density matrix ρ characterizing the noncovariant, but globally defined, modes of the field; and (2) a Wigner function approach, in which the basic object is a Wigner function f defined quasilocally from the Hadamard, or correlation, function G₁(x₁, x₂). The key objective is to isolate the conceptual biases underlying each of these approaches and then to assess their utility and limitations in effecting concrete calculations. The following questions are therefore addressed and largely answered. What sorts of spacetimes (e.g., de Sitter or Friedmann-Robertson-Walker) are comparatively easy to consider? What sorts of objects (e.g., average fields or renormalized stress energies) are easy to compute approximately? What, if anything, can be computed exactly? What approximations are intrinsic to each approach or convenient as computational tools? What sorts of 'field entropies' are natural to define?

  19. Combining computer modelling and cardiac imaging to understand right ventricular pump function.

    Walmsley, John; van Everdingen, Wouter; Cramer, Maarten J; Prinzen, Frits W; Delhaas, Tammo; Lumens, Joost

    2017-10-01

    Right ventricular (RV) dysfunction is a strong predictor of outcome in heart failure and is a key determinant of exercise capacity. Despite these crucial findings, the RV remains understudied in the clinical, experimental, and computer modelling literature. This review outlines how recent advances in using computer modelling and cardiac imaging synergistically help to understand RV function in health and disease. We begin by highlighting the complexity of interactions that make modelling the RV both challenging and necessary, and then summarize the multiscale modelling approaches used to date to simulate RV pump function in the context of these interactions. We go on to demonstrate how these modelling approaches in combination with cardiac imaging have improved understanding of RV pump function in pulmonary arterial hypertension, arrhythmogenic right ventricular cardiomyopathy, dyssynchronous heart failure and cardiac resynchronization therapy, hypoplastic left heart syndrome, and repaired tetralogy of Fallot. We conclude with a perspective on key issues to be addressed by computational models of the RV in the near future.

  20. COMPUTING

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  1. Computational Fluid Dynamics Simulation of Combustion Instability in Solid Rocket Motor : Implementation of Pressure Coupled Response Function

    S. Saha; D. Chakraborty

    2016-01-01

    Combustion instability in a solid propellant rocket motor is numerically simulated by implementing the propellant response function with a quasi-steady homogeneous one-dimensional formulation. The convolution integral of the propellant response with the pressure history is implemented through a user-defined function in commercial computational fluid dynamics software. The methodology is validated against motor tests and other simulation results reported in the literature. Computed amplitude of pressure fluctuations ...
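
    A minimal sketch of the convolution integral itself (kernel and pressure history are invented here; in the paper this is evaluated inside a CFD user-defined function):

      import numpy as np

      dt = 1e-5                                  # time step in seconds (hypothetical)
      t = np.arange(0.0, 0.02, dt)
      p_fluct = np.sin(2 * np.pi * 500 * t)      # hypothetical pressure fluctuation p'(t)
      kernel = np.exp(-t / 2e-3) / 2e-3          # hypothetical response kernel R(t)

      # Discrete form of m'(t) = Int_0^t R(t - tau) p'(tau) dtau, the burn-rate
      # feedback a user-defined function would return to the CFD solver
      burn_rate_fluct = np.convolve(p_fluct, kernel)[: len(t)] * dt
      print(burn_rate_fluct[:5])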

  2. COMPUTING

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of the Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in the computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies, ramping to 50M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS specific tests to be included in Service Availa...

  3. COMPUTING

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and reinforcing shift and operational procedures for data production and transfer, MC production and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity to discuss the impact of, and to address issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  4. Cranial computed tomography associated with development of functional dependence in a community-based elderly population

    Tsukishima, Eri; Shido, Koichi

    2002-01-01

    The purpose of this study was to investigate whether changes at computed tomography (CT) imaging in the ageing brain are associated with future risk of functional dependence. One hundred sixty residents aged 69 years and older at the time of cranial CT were living independently in a rural community in Hokkaido, Japan. Cranial CT was performed between 1991 and 1993 and graded for ventricular enlargement, sulcal enlargement, white matter change, and small infarction. Functional status was reassessed in 1998 in each participant. Multiple logistic regression analysis was performed to estimate the association of CT changes in the ageing brain with development of functional dependence over six years. Functional dependence was found in 19 residents at the second survey. After adjusting for age, sex, medical conditions, and cognitive functioning, small infarction and ventricular enlargement were significantly associated with development of functional dependence (adjusted odds ratios = 9.27 and 4.62, respectively). After controlling for age, the age-related changes on cranial CT retained a significant association with development of functional dependence. (author)
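
    A minimal sketch of how such adjusted odds ratios come out of a multiple logistic regression (synthetic data, not the study's cohort):

      import numpy as np
      import statsmodels.api as sm

      # Adjusted odds ratios are exp(coefficients) of a fitted logit model
      rng = np.random.default_rng(0)
      n = 160
      age = rng.normal(75, 5, n)
      infarct = rng.integers(0, 2, n)                  # small infarction on CT (0/1)
      logit = -20 + 0.25 * age + 1.5 * infarct         # invented true model
      y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

      X = sm.add_constant(np.column_stack([age, infarct]))
      fit = sm.Logit(y, X).fit(disp=0)
      print(np.exp(fit.params[1:]))                    # odds ratios for age, infarct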

  5. Neuromorphological and wiring pattern alterations effects on brain function: a mixed experimental and computational approach.

    Linus Manubens-Gil

    2015-04-01

    In addition, the study of fixed intact brains (by means of the state-of-the-art CLARITY technique) brings us closer to biologically and medically relevant situations, allowing us not only to confirm whether the functional links in neuronal cultures are also present in vivo, but also enabling the introduction of functional information (like behavioral studies and functional imaging) and another layer of structural alterations such as brain region morphology, neuronal density, and long-range connectivity. Taking together the experimental information from these systems, we want to feed self-developed computational models that allow us to understand what the fundamental characteristics of the observed connectivity patterns are and the impact of each of the alterations on neuronal network function. These models will also provide a framework able to account for the emergent properties that bridge the gap between spontaneous electrical activity arousal/transmission and higher-order information processing and memory storage capacities in the brain. As an additional part of the project, we are now working on the application of the clearing, labeling and imaging protocols to human biopsy samples. Our aim is to obtain neuronal architecture and connectivity information from focal cortical dysplasia microcircuits using samples from intractable temporal lobe epilepsy patients who undergo deep-brain electrode recording diagnosis and posterior surgical extraction of the tissue. Our computational models can allow us to discern the contributions of the observed abnormalities to neuronal hyperactivity and epileptic seizure generation.

  6. Computer-Based Cognitive Training for Executive Functions after Stroke: A Systematic Review

    van de Ven, Renate M.; Murre, Jaap M. J.; Veltman, Dick J.; Schmand, Ben A.

    2016-01-01

    Background: Stroke commonly results in cognitive impairments in working memory, attention, and executive function, which may be restored with appropriate training programs. Our aim was to systematically review the evidence for computer-based cognitive training of executive dysfunctions. Methods: Studies were included if they concerned adults who had suffered stroke or other types of acquired brain injury, if the intervention was computer training of executive functions, and if the outcome was related to executive functioning. We searched in MEDLINE, PsycINFO, Web of Science, and The Cochrane Library. Study quality was evaluated based on the CONSORT Statement. Treatment effect was evaluated based on differences compared to pre-treatment and/or to a control group. Results: Twenty studies were included. Two were randomized controlled trials that used an active control group. The other studies included multiple baselines, a passive control group, or were uncontrolled. Improvements were observed in tasks similar to the training (near transfer) and in tasks dissimilar to the training (far transfer). However, these effects were not larger in trained than in active control groups. Two studies evaluated neural effects and found changes in both functional and structural connectivity. Most studies suffered from methodological limitations (e.g., lack of an active control group and no adjustment for multiple testing) hampering differentiation of training effects from spontaneous recovery, retest effects, and placebo effects. Conclusions: The positive findings of most studies, including neural changes, warrant continuation of research in this field, but only if its methodological limitations are addressed. PMID:27148007

  7. A new algorithm to compute conjectured supply function equilibrium in electricity markets

    Diaz, Cristian A.; Villar, Jose; Campos, Fco Alberto; Rodriguez, M. Angel

    2011-01-01

    Several types of market equilibrium approaches, such as Cournot, Conjectural Variation (CVE), Supply Function Equilibrium (SFE) or Conjectured Supply Function Equilibrium (CSFE), have been used to model electricity markets for the medium and long term. Among them, CSFE has been proposed as a generalization of the classic Cournot. It computes the equilibrium considering the reaction of the competitors against changes in their strategy, combining several characteristics of both CVE and SFE. Unlike linear SFE approaches, strategies are linearized only at the equilibrium point, using their first-order Taylor approximation. But to solve CSFE, the slope or the intercept of the linear approximations must be given, which has proved to be very restrictive. This paper proposes a new algorithm to compute CSFE. Unlike previous approaches, the main contribution is that the competitors' strategies for each generator are initially unknown (both slope and intercept) and endogenously computed by this new iterative algorithm. To show its applicability, the proposed approach has been applied to several case examples where its qualitative behavior has been analyzed in detail. (author)
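
    As a much-simplified stand-in for the paper's iterative computation (a Cournot best-response iteration with linear demand and quadratic costs, not the CSFE algorithm itself), a sketch of how repeated strategy updates converge to an equilibrium:

      import numpy as np

      # Inverse demand p = A - B * sum(q), costs c_i * q_i^2 (all values invented)
      A, B = 100.0, 1.0
      c = np.array([1.0, 2.0, 0.5])
      q = np.zeros_like(c)                   # initial strategies

      for _ in range(200):
          q_new = q.copy()
          for i in range(len(c)):
              rivals = q_new.sum() - q_new[i]
              # Best response from d/dq_i [(A - B*(q_i + rivals))*q_i - c_i*q_i^2] = 0
              q_new[i] = (A - B * rivals) / (2 * (B + c[i]))
          if np.max(np.abs(q_new - q)) < 1e-10:   # converged to the fixed point
              break
          q = q_new

      print(q, A - B * q.sum())   # equilibrium quantities and market price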

  8. COMPUTING

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, helped by the use of opportunistic resources like the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 TB. Figure 3: Number of events per month (data) In LS1, our emphasis is to increase efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  9. Multiple exciton generation in chiral carbon nanotubes: Density functional theory based computation

    Kryjevski, Andrei; Mihaylov, Deyan; Kilina, Svetlana; Kilin, Dmitri

    2017-10-01

    We use a Boltzmann transport equation (BE) to study the time evolution of a photo-excited state in a nanoparticle, including phonon-mediated exciton relaxation and the multiple exciton generation (MEG) processes, such as exciton-to-biexciton multiplication and biexciton-to-exciton recombination. BE collision integrals are computed using Kadanoff-Baym-Keldysh many-body perturbation theory based on density functional theory simulations, including exciton effects. We compute internal quantum efficiency (QE), which is the number of excitons generated from an absorbed photon in the course of the relaxation. We apply this approach to chiral single-wall carbon nanotubes (SWCNTs), such as (6,2) and (6,5). We predict efficient MEG in the (6,2) and (6,5) SWCNTs within the solar spectrum range starting at the 2Eg energy threshold and with QE reaching ~1.6 at about 3Eg, where Eg is the electronic gap.
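
    As a heavily simplified caricature of the rate picture (a two-population model with invented rates, standing in for the paper's DFT-based Boltzmann collision integrals):

      import numpy as np
      from scipy.integrate import solve_ivp

      k_mult, k_rec = 5.0, 1.0          # hypothetical rates, 1/ps

      def rhs(t, y):
          n_x, n_xx = y                 # exciton and biexciton populations
          return [-k_mult * n_x + k_rec * n_xx,   # exciton -> biexciton (MEG)
                  k_mult * n_x - k_rec * n_xx]    # biexciton -> exciton

      sol = solve_ivp(rhs, (0.0, 10.0), [1.0, 0.0])   # one absorbed photon
      n_x, n_xx = sol.y[:, -1]
      print("QE ~", n_x + 2 * n_xx)     # excitons generated per absorbed photon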

  10. Computational medical imaging and hemodynamics framework for functional analysis and assessment of cardiovascular structures.

    Wong, Kelvin K L; Wang, Defeng; Ko, Jacky K L; Mazumdar, Jagannath; Le, Thu-Thao; Ghista, Dhanjoo

    2017-03-21

    Cardiac dysfunction constitutes common cardiovascular health issues in the society, and has been an investigation topic of strong focus by researchers in the medical imaging community. Diagnostic modalities based on echocardiography, magnetic resonance imaging, chest radiography and computed tomography are common techniques that provide cardiovascular structural information to diagnose heart defects. However, functional information of cardiovascular flow, which can in fact be used to support the diagnosis of many cardiovascular diseases with a myriad of hemodynamics performance indicators, remains unexplored to its full potential. Some of these indicators constitute important cardiac functional parameters affecting the cardiovascular abnormalities. With the advancement of computer technology that facilitates high speed computational fluid dynamics, the realization of a support diagnostic platform of hemodynamics quantification and analysis can be achieved. This article reviews the state-of-the-art medical imaging and high fidelity multi-physics computational analyses that together enable reconstruction of cardiovascular structures and hemodynamic flow patterns within them, such as of the left ventricle (LV) and carotid bifurcations. The combined medical imaging and hemodynamic analysis enables us to study the mechanisms of cardiovascular disease-causing dysfunctions, such as how (1) cardiomyopathy causes left ventricular remodeling and loss of contractility leading to heart failure, and (2) modeling of LV construction and simulation of intra-LV hemodynamics can enable us to determine the optimum procedure of surgical ventriculation to restore its contractility and health. This combined medical imaging and hemodynamics framework can potentially extend medical knowledge of cardiovascular defects and associated hemodynamic behavior and their surgical restoration, by means of an integrated medical image diagnostics and hemodynamic performance analysis framework.

  11. Computational prediction of drug-drug interactions based on drugs functional similarities.

    Ferdousi, Reza; Safdari, Reza; Omidi, Yadollah

    2017-06-01

    Therapeutic activities of drugs are often influenced by co-administration of drugs that may cause inevitable drug-drug interactions (DDIs) and inadvertent side effects. Prediction and identification of DDIs are extremely vital for patient safety and the success of treatment modalities. A number of computational methods have been employed for the prediction of DDIs based on drugs' structures and/or functions. Here, we report on a computational method for DDI prediction based on functional similarity of drugs. The model was set based on key biological elements including carriers, transporters, enzymes and targets (CTET). The model was applied to 2189 approved drugs. For each drug, all the associated CTETs were collected, and the corresponding binary vectors were constructed to determine the DDIs. Various similarity measures were conducted to detect DDIs. Of the examined similarity methods, the inner product-based similarity measures (IPSMs) were found to provide improved prediction values. Altogether, 2,394,766 potential drug pair interactions were studied. The model was able to predict over 250,000 unknown potential DDIs. Based on our findings, we propose the current method as a robust, yet simple and fast, universal in silico approach for identification of DDIs. We envision that this proposed method can be used as a practical technique for the detection of possible DDIs based on the functional similarities of drugs. Copyright © 2017. Published by Elsevier Inc.
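
    A minimal sketch of the core idea, assuming invented binary CTET vectors (the drug names, feature columns and values here are hypothetical):

      import numpy as np

      # One row per drug, one column per carrier/transporter/enzyme/target feature
      drugs = {
          "drugA": np.array([1, 0, 1, 1, 0, 0, 1]),
          "drugB": np.array([1, 0, 1, 0, 0, 0, 1]),
          "drugC": np.array([0, 1, 0, 0, 1, 1, 0]),
      }

      def inner_product_similarity(u, v):
          # Normalized inner product (cosine), one member of the IPSM family
          return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

      names = list(drugs)
      for i in range(len(names)):
          for j in range(i + 1, len(names)):
              s = inner_product_similarity(drugs[names[i]], drugs[names[j]])
              print(names[i], names[j], round(float(s), 3))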

  12. Computing single step operators of logic programming in radial basis function neural networks

    Hamadneh, Nawaf; Sathasivam, Saratha; Choon, Ong Hong [School of Mathematical Sciences, Universiti Sains Malaysia, 11800 USM, Penang (Malaysia)

    2014-07-10

    Logic programming is the process that leads from an original formulation of a computing problem to executable programs. A normal logic program consists of a finite set of clauses. A valuation I of logic programming is a mapping from ground atoms to false or true. The single step operator of any logic programming is defined as a function (T_p: I → I). Logic programming is well-suited to building artificial intelligence systems. In this study, we established a new technique to compute the single step operators of logic programming in radial basis function neural networks. To do that, we proposed a new technique to generate the training data sets of single step operators. The training data sets are used to build the neural networks. We used recurrent radial basis function neural networks to reach the steady state (the fixed point of the operators). To improve the performance of the neural networks, we used the particle swarm optimization algorithm to train the networks.
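
    A minimal sketch of the single-step (immediate consequence) operator itself for a small propositional program (the RBF-network training that the paper performs is not attempted here):

      # Program: a.   b :- a.   c :- a, b.
      program = [
          ("a", []),
          ("b", ["a"]),
          ("c", ["a", "b"]),
      ]

      def t_p(interp):
          # One application of T_p: heads of clauses whose bodies are all true
          return {head for head, body in program
                  if all(atom in interp for atom in body)}

      # Iterate to the least fixed point, the steady state the recurrent
      # network is driven towards
      interp = set()
      while t_p(interp) != interp:
          interp = t_p(interp)
      print(sorted(interp))   # ['a', 'b', 'c']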

  13. Computing single step operators of logic programming in radial basis function neural networks

    Hamadneh, Nawaf; Sathasivam, Saratha; Choon, Ong Hong

    2014-07-01

    Logic programming is the process that leads from an original formulation of a computing problem to executable programs. A normal logic program consists of a finite set of clauses. A valuation I of logic programming is a mapping from ground atoms to false or true. The single step operator of any logic programming is defined as a function (T_p: I → I). Logic programming is well-suited to building artificial intelligence systems. In this study, we established a new technique to compute the single step operators of logic programming in radial basis function neural networks. To do that, we proposed a new technique to generate the training data sets of single step operators. The training data sets are used to build the neural networks. We used recurrent radial basis function neural networks to reach the steady state (the fixed point of the operators). To improve the performance of the neural networks, we used the particle swarm optimization algorithm to train the networks.

  14. FUNCTIONALITY OF STUDENTS WITH PHYSICAL DEFICIENCY IN WRITING AND COMPUTER USE ACTIVITIES

    Fernanda Matrigani Mercado Gutierres de Queiroz

    2017-08-01

    Educational inclusion focuses on the learning of all students who confront barriers to effective participation in school life. From the inclusive education perspective, students with disabilities should preferably be served in regular education, with special education offering specialized educational attendance to complement their educational needs. In this context, the objective of the research is to describe the functionality of students with physical disabilities in the Multifunctional Resource Rooms, for activities of writing and computer use, according to the perception of their teachers. The participants of this analysis were teachers of the Specialized Educational Service who serve students with disabilities. The School Function Assessment instrument was used for data collection. The data were organized into a single document and presented in two categories: (1) written work; (2) use of the computer and its equipment. The conclusion was that students with physical disabilities, especially those with impaired upper-limb functionality, can find it difficult to write using conventional materials, so they need Assistive Technology to develop their writing skills. It is therefore important to improve the profile analysis of each student so as to choose the most appropriate resource, and it is necessary to improve the materials of the Multifunctional Resource Rooms to meet the diversity of all students with physical disabilities, since the furniture, didactic-pedagogical materials and equipment do not currently favor use by students with serious motor disabilities.

  15. Computing single step operators of logic programming in radial basis function neural networks

    Hamadneh, Nawaf; Sathasivam, Saratha; Choon, Ong Hong

    2014-01-01

    Logic programming is the process that leads from an original formulation of a computing problem to executable programs. A normal logic program consists of a finite set of clauses. A valuation I of logic programming is a mapping from ground atoms to false or true. The single step operator of any logic programming is defined as a function (T_p: I → I). Logic programming is well-suited to building artificial intelligence systems. In this study, we established a new technique to compute the single step operators of logic programming in radial basis function neural networks. To do that, we proposed a new technique to generate the training data sets of single step operators. The training data sets are used to build the neural networks. We used recurrent radial basis function neural networks to reach the steady state (the fixed point of the operators). To improve the performance of the neural networks, we used the particle swarm optimization algorithm to train the networks.

  16. COMPUTING

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up, and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, which was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger, at roughly 11 MB per event of RAW. The central collisions are more complex and...

  17. COMPUTING

    M. Kasemann, P. McBride; edited by M-C. Sawley with contributions from P. Kreuzer, D. Bonacorsi, S. Belforte, F. Wuerthwein, L. Bauerdick, K. Lassila-Perini, M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  18. Efficient Server-Aided Secure Two-Party Function Evaluation with Applications to Genomic Computation

    Blanton, Marina

    2016-10-01

    Computation based on genomic data is becoming increasingly popular today, be it for medical or other purposes. Non-medical uses of genomic data in a computation often take place in a server-mediated setting where the server offers the ability for joint genomic testing between the users. Undeniably, genomic data is highly sensitive, which, in contrast to other biometry types, discloses a plethora of information not only about the data owner, but also about his or her relatives. Thus, there is an urgent need to protect genomic data. This is particularly true when the data is used in computation for what we call recreational non-health-related purposes. Towards this goal, in this work we put forward a framework for server-aided secure two-party computation with the security model motivated by genomic applications. One particular security setting that we treat in this work provides stronger security guarantees with respect to malicious users than the traditional malicious model. In particular, we incorporate certified inputs into secure computation based on garbled circuit evaluation to guarantee that a malicious user is unable to modify her inputs in order to learn unauthorized information about the other user’s data. Our solutions are general in the sense that they can be used to securely evaluate arbitrary functions and offer attractive performance compared to the state of the art. We apply the general constructions to three specific types of genomic tests: paternity, genetic compatibility, and ancestry testing and implement the constructions. The results show that all such private tests can be executed within a matter of seconds or less despite the large size of one’s genomic data.

  19. Accelerating Computation of DCM for ERP in MATLAB by External Function Calls to the GPU

    Wang, Wei-Jen; Hsieh, I-Fan; Chen, Chun-Chuan

    2013-01-01

    This study aims to improve the performance of Dynamic Causal Modelling for Event Related Potentials (DCM for ERP) in MATLAB by using external function calls to a graphics processing unit (GPU). DCM for ERP is an advanced method for studying neuronal effective connectivity. DCM utilizes an iterative procedure, the expectation maximization (EM) algorithm, to find the optimal parameters given a set of observations and the underlying probability model. As the EM algorithm is computationally demanding and the analysis faces possible combinatorial explosion of models to be tested, we propose a parallel computing scheme using the GPU to achieve a fast estimation of DCM for ERP. The computation of DCM for ERP is dynamically partitioned and distributed to threads for parallel processing, according to the DCM model complexity and the hardware constraints. The performance efficiency of this hardware-dependent thread arrangement strategy was evaluated using synthetic data. The experimental data were used to validate the accuracy of the proposed computing scheme and quantify the time saving in practice. The simulation results show that the proposed scheme can accelerate the computation by a factor of 155 for the parallel part. For experimental data, the speedup factor is about 7 per model on average, depending on the model complexity and the data. This GPU-based implementation of DCM for ERP gives qualitatively the same results as the original MATLAB implementation does at the group-level analysis. In conclusion, we believe that the proposed GPU-based implementation is very useful for users as a fast screening tool to select the most likely model and may provide implementation guidance for possible future clinical applications such as online diagnosis. PMID:23840507

  20. Probing the mutational interplay between primary and promiscuous protein functions: a computational-experimental approach.

    Garcia-Seisdedos, Hector; Ibarra-Molero, Beatriz; Sanchez-Ruiz, Jose M

    2012-01-01

    Protein promiscuity is of considerable interest due to its role in adaptive metabolic plasticity, its fundamental connection with molecular evolution and also because of its biotechnological applications. Current views on the relation between primary and promiscuous protein activities stem largely from laboratory evolution experiments aimed at increasing promiscuous activity levels. Here, on the other hand, we attempt to assess the main features of the simultaneous modulation of the primary and promiscuous functions during the course of natural evolution. The computational/experimental approach we propose for this task involves the following steps: a function-targeted, statistical coupling analysis of evolutionary data is used to determine a set of positions likely linked to the recruitment of a promiscuous activity for a new function; a combinatorial library of mutations on this set of positions is prepared and screened for both the primary and the promiscuous activities; a partial-least-squares reconstruction of the full combinatorial space is carried out; finally, an approximation to the Pareto set of variants with optimal primary/promiscuous activities is derived. Application of the approach to the emergence of folding catalysis in thioredoxin scaffolds reveals an unanticipated scenario: diverse patterns of primary/promiscuous activity modulation are possible, including a moderate (but likely significant in a biological context) simultaneous enhancement of both activities. We show that this scenario can be most simply explained on the basis of the conformational diversity hypothesis, although alternative interpretations cannot be ruled out. Overall, the results reported may help clarify the mechanisms of the evolution of new functions. From a different viewpoint, the partial-least-squares-reconstruction/Pareto-set-prediction approach we have introduced provides the computational basis for an efficient directed-evolution protocol aimed at the simultaneous

  1. Studies on the Zeroes of Bessel Functions and Methods for Their Computation: IV. Inequalities, Estimates, Expansions, etc., for Zeros of Bessel Functions

    Kerimov, M. K.

    2018-01-01

    This paper is the fourth in a series of survey articles concerning zeros of Bessel functions and methods for their computation. Various inequalities, estimates, expansions, etc. for positive zeros are analyzed, and some results are described in detail with proofs.
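
    For readers who want numerical values to check the surveyed inequalities and expansions against, SciPy computes these zeros directly:

      import numpy as np
      from scipy import special

      # First five positive zeros j_{0,k} of the Bessel function J_0
      zeros = special.jn_zeros(0, 5)
      print(zeros)
      # Leading McMahon-type behaviour: j_{0,k} ≈ (k - 1/4) * pi;
      # the residual shrinks as k grows
      print(zeros - (np.arange(1, 6) - 0.25) * np.pi)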

  2. Computational principles of syntax in the regions specialized for language: integrating theoretical linguistics and functional neuroimaging.

    Ohta, Shinri; Fukui, Naoki; Sakai, Kuniyoshi L

    2013-01-01

    The nature of computational principles of syntax remains to be elucidated. One promising approach to this problem would be to construct formal and abstract linguistic models that parametrically predict the activation modulations in the regions specialized for linguistic processes. In this article, we review recent advances in theoretical linguistics and functional neuroimaging in the following respects. First, we introduce the two fundamental linguistic operations: Merge (which combines two words or phrases to form a larger structure) and Search (which searches and establishes a syntactic relation of two words or phrases). We also illustrate certain universal properties of human language, and present hypotheses regarding how sentence structures are processed in the brain. Hypothesis I is that the Degree of Merger (DoM), i.e., the maximum depth of merged subtrees within a given domain, is a key computational concept to properly measure the complexity of tree structures. Hypothesis II is that the basic frame of the syntactic structure of a given linguistic expression is determined essentially by functional elements, which trigger Merge and Search. We then present our recent functional magnetic resonance imaging experiment, demonstrating that the DoM is indeed a key syntactic factor that accounts for syntax-selective activations in the left inferior frontal gyrus and supramarginal gyrus. Hypothesis III is that the DoM domain changes dynamically in accordance with iterative Merge applications, the Search distances, and/or task requirements. We confirm that the DoM accounts for activations in various sentence types. Hypothesis III successfully explains activation differences between object- and subject-relative clauses, as well as activations during explicit syntactic judgment tasks. Future research on the computational principles of syntax will further deepen our understanding of uniquely human mental faculties.
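
    One simple reading of the DoM, sketched on a nested-tuple encoding of a Merge-built structure (the encoding and the example sentence are illustrative only, not the authors' formalism):

      # Degree of Merger as the maximum depth of merged subtrees,
      # with lexical items counting as depth 0
      def degree_of_merger(node):
          if isinstance(node, str):        # a lexical item, not a Merger
              return 0
          return 1 + max(degree_of_merger(child) for child in node)

      # "[[the dog] [chased [the cat]]]" as a nested structure
      sentence = (("the", "dog"), ("chased", ("the", "cat")))
      print(degree_of_merger(sentence))    # 3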

  3. Air trapping in sarcoidosis on computed tomography: Correlation with lung function

    Davies, C.W.H.; Tasker, A.D.; Padley, S.P.G.; Davies, R.J.O.; Gleeson, F.V.

    2000-01-01

    AIMS: To document the presence and extent of air trapping on high resolution computed tomography (HRCT) in patients with pulmonary sarcoidosis and correlate HRCT features with pulmonary function tests. METHODS: Twenty-one patients with pulmonary sarcoidosis underwent HRCT and pulmonary function assessment at presentation. Inspiratory and expiratory HRCT were assessed for the presence and extent of air trapping, ground-glass opacification, nodularity, septal thickening, bronchiectasis and parenchymal distortion. HRCT features were correlated with pulmonary function tests. RESULTS: Air trapping on expiratory HRCT was present in 20/21 (95%) patients. The extent of air trapping correlated with percentage predicted residual volume (RV)/total lung capacity (TLC) (r = 0.499; P < 0.05) and percentage predicted maximal mid-expiratory flow rate between 25 and 75% of the vital capacity (r = -0.54; P < 0.05). Ground-glass opacification was present in four of 21 (19%), nodularity in 18/21 (86%), septal thickening in 18/21 (86%), traction bronchiectasis in 14/21 (67%) and distortion in 12/21 (57%) of patients; there were no significant relationships between these CT features and pulmonary function results. CONCLUSION: Air trapping is a common feature in sarcoidosis and correlates with evidence of small airways disease on pulmonary function testing. Davies, C.W.H. (2000). Clinical Radiology 55, 217-221.

  4. Distribution of computer functionality for accelerator control at the Brookhaven AGS

    Stevens, A.; Clifford, T.; Frankel, R.

    1985-01-01

    A set of physical and functional system components and their interconnection protocols have been established for all controls work at the AGS. Portions of these designs were tested as part of enhanced operation of the AGS as a source of polarized protons, and additional segments will be implemented during the continuing construction efforts which are adding heavy ion capability to our facility. Included in our efforts are the following computer and control system elements: a broadband local area network, which embodies modems, transmission systems and branch interface units; a hierarchical layer, which performs certain database and watchdog/alarm functions; a group of workstation processors (Apollos), which perform the function of traditional minicomputer hosts; and a layer which provides both real-time control and standardization functions for accelerator devices and instrumentation. Database and other accelerator functionality is assigned to the most appropriate level within our network for real-time performance, long-term utility, and orderly growth.

  5. Fast and accurate three-dimensional point spread function computation for fluorescence microscopy.

    Li, Jizhou; Xue, Feng; Blu, Thierry

    2017-06-01

    The point spread function (PSF) plays a fundamental role in fluorescence microscopy. A realistic and accurately calculated PSF model can significantly improve the performance in 3D deconvolution microscopy and also the localization accuracy in single-molecule microscopy. In this work, we propose a fast and accurate approximation of the Gibson-Lanni model, which has been shown to represent the PSF suitably under a variety of imaging conditions. We express the Kirchhoff's integral in this model as a linear combination of rescaled Bessel functions, thus providing an integral-free way for the calculation. The explicit approximation error in terms of parameters is given numerically. Experiments demonstrate that the proposed approach results in a significantly smaller computational time compared with current state-of-the-art techniques to achieve the same accuracy. This approach can also be extended to other microscopy PSF models.
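
    A minimal sketch of the general idea, approximating a radial profile by a linear combination of rescaled Bessel functions via least squares (the target profile and scaling factors here are invented; the paper's coefficients come from the Gibson-Lanni model itself):

      import numpy as np
      from scipy import special

      r = np.linspace(0.0, 1.0, 200)
      f = np.exp(-5.0 * r ** 2)                  # hypothetical target profile

      sigmas = np.linspace(1.0, 30.0, 15)        # hypothetical scaling factors
      basis = special.j0(np.outer(r, sigmas))    # columns are J_0(sigma_m * r)
      coeffs, *_ = np.linalg.lstsq(basis, f, rcond=None)

      approx = basis @ coeffs
      print(np.max(np.abs(approx - f)))          # worst-case approximation error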

  6. COMPUTING

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  7. COMPUTING

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave good indication about the readiness of the WLCG infrastructure with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated to be fully capable for performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  8. COMPUTING

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up, and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, which was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  9. COMPUTING

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  10. COMPUTING

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing readiness challenges (CCRC’08), the CMS computing team had achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files and with a high writing speed to tapes.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1’s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  11. COMPUTING

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  12. COMPUTING

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  13. A Practical Computational Method for the Anisotropic Redshift-Space 3-Point Correlation Function

    Slepian, Zachary; Eisenstein, Daniel J.

    2018-04-01

    We present an algorithm enabling computation of the anisotropic redshift-space galaxy 3-point correlation function (3PCF) scaling as N², with N the number of galaxies. Our previous work showed how to compute the isotropic 3PCF with this scaling by expanding the radially-binned density field around each galaxy in the survey into spherical harmonics and combining these coefficients to form multipole moments. The N² scaling occurred because this approach never explicitly required the relative angle between a galaxy pair about the primary galaxy. Here we generalize this work, demonstrating that in the presence of azimuthally-symmetric anisotropy produced by redshift-space distortions (RSD) the 3PCF can be described by two triangle side lengths, two independent total angular momenta, and a spin. This basis for the anisotropic 3PCF allows its computation with negligible additional work over the isotropic 3PCF. We also present the covariance matrix of the anisotropic 3PCF measured in this basis. Our algorithm tracks the full 5-D redshift-space 3PCF, uses an accurate line of sight to each triplet, is exact in angle, and easily handles edge correction. It will enable use of the anisotropic large-scale 3PCF as a probe of RSD in current and upcoming large-scale redshift surveys.

  14. Morphological and Functional Evaluation of Quadricuspid Aortic Valves Using Cardiac Computed Tomography

    Song, Inyoung; Park, Jung Ah; Choi, Bo Hwa; Ko, Sung Min [Department of Radiology, Konkuk University Medical Center, Konkuk University School of Medicine, Seoul 05030 (Korea, Republic of); Shin, Je Kyoun; Chee, Hyun Keun; Kim, Jun Seok [Department of Thoracic Surgery, Konkuk University Medical Center, Konkuk University School of Medicine, Seoul 05030 (Korea, Republic of)

    2016-11-01

    The aim of this study was to identify the morphological and functional characteristics of quadricuspid aortic valves (QAV) on cardiac computed tomography (CCT). We retrospectively enrolled 11 patients with QAV. All patients underwent CCT and transthoracic echocardiography (TTE), and 7 patients underwent cardiovascular magnetic resonance (CMR). The presence and classification of QAV assessed by CCT was compared with that of TTE and intraoperative findings. The regurgitant orifice area (ROA) measured by CCT was compared with severity of aortic regurgitation (AR) by TTE and the regurgitant fraction (RF) by CMR. All of the patients had AR; 9 had pure AR, 1 had combined aortic stenosis and regurgitation, and 1 had combined subaortic stenosis and regurgitation. Two patients had a subaortic fibrotic membrane and 1 of them showed a subaortic stenosis. One QAV was misdiagnosed as tricuspid aortic valve on TTE. In accordance with the Hurwitz and Roberts classification, consensus was reached on the QAV classification between the CCT and TTE findings in 7 of 10 patients. The patients were classified as type A (n = 1), type B (n = 3), type C (n = 1), type D (n = 4), and type F (n = 2) on CCT. A very high correlation existed between ROA by CCT and RF by CMR (r = 0.99), while a good correlation existed between ROA by CCT and regurgitant severity by TTE (r = 0.62). Cardiac computed tomography provides comprehensive anatomical and functional information about the QAV.

  15. Morphological and functional evaluation of quadricuspid aortic valves using cardiac computed tomography

    Song, In Young; Park, Jung Ah; Choi, Bo Hwa; Ko, Sung Min; Shin, Je Kyoun; Chee, Hyun Keun; Kim, Jun Seok [Konkuk University Medical Center, Konkuk University School of Medicine, Seoul (Korea, Republic of)

    2016-07-15

    The aim of this study was to identify the morphological and functional characteristics of quadricuspid aortic valves (QAV) on cardiac computed tomography (CCT). We retrospectively enrolled 11 patients with QAV. All patients underwent CCT and transthoracic echocardiography (TTE), and 7 patients underwent cardiovascular magnetic resonance (CMR). The presence and classification of QAV assessed by CCT was compared with that of TTE and intraoperative findings. The regurgitant orifice area (ROA) measured by CCT was compared with severity of aortic regurgitation (AR) by TTE and the regurgitant fraction (RF) by CMR. All of the patients had AR; 9 had pure AR, 1 had combined aortic stenosis and regurgitation, and 1 had combined subaortic stenosis and regurgitation. Two patients had a subaortic fibrotic membrane and 1 of them showed a subaortic stenosis. One QAV was misdiagnosed as tricuspid aortic valve on TTE. In accordance with the Hurwitz and Roberts classification, consensus was reached on the QAV classification between the CCT and TTE findings in 7 of 10 patients. The patients were classified as type A (n = 1), type B (n = 3), type C (n = 1), type D (n = 4), and type F (n = 2) on CCT. A very high correlation existed between ROA by CCT and RF by CMR (r = 0.99), while a good correlation existed between ROA by CCT and regurgitant severity by TTE (r = 0.62). Cardiac computed tomography provides comprehensive anatomical and functional information about the QAV.

  16. Gravity-supported exercise with computer gaming improves arm function in chronic stroke.

    Jordan, Kimberlee; Sampson, Michael; King, Marcus

    2014-08-01

    To investigate the effect of 4 to 6 weeks of exergaming with a computer mouse embedded within an arm skate on upper limb function in survivors of chronic stroke. Intervention study with a 4-week postintervention follow-up. In home. Survivors (N=13) of chronic (≥6 mo) stroke with hemiparesis of the upper limb with stable baseline Fugl-Meyer assessment scores received the intervention. One participant withdrew, and 2 participants were not reassessed at the 4-week follow-up. No participants withdrew as a result of adverse effects. Four to 6 weeks of exergaming using the arm skate where participants received either 9 (n=5) or 16 (n=7) hours of game play. Upper limb component of the Fugl-Meyer assessment. There was an average increase in the Fugl-Meyer upper limb assessment score from the beginning to end of the intervention of 4.9 points. At the end of the 4-week period after the intervention, the increase was 4.4 points. A 4- to 6-week intervention using the arm skate significantly improved arm function in survivors of chronic stroke by an average of 4.9 Fugl-Meyer upper limb assessment points. This research shows that a larger-scale randomized trial of this device is warranted and highlights the potential value of using virtual reality technology (eg, computer games) in a rehabilitation setting. Copyright © 2014 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.

  17. Rayleigh radiance computations for satellite remote sensing: accounting for the effect of sensor spectral response function.

    Wang, Menghua

    2016-05-30

    To understand and assess the effect of the sensor spectral response function (SRF) on the accuracy of the top of the atmosphere (TOA) Rayleigh-scattering radiance computation, new TOA Rayleigh radiance lookup tables (LUTs) over global oceans and inland waters have been generated. The new Rayleigh LUTs include spectral coverage of 335-2555 nm, all possible solar-sensor geometries, and surface wind speeds of 0-30 m/s. Using the new Rayleigh LUTs, the sensor SRF effect on the accuracy of the TOA Rayleigh radiance computation has been evaluated for spectral bands of the Visible Infrared Imaging Radiometer Suite (VIIRS) on the Suomi National Polar-orbiting Partnership (SNPP) satellite and the Joint Polar Satellite System (JPSS)-1, showing some important uncertainties for VIIRS-SNPP particularly for large solar- and/or sensor-zenith angles as well as for large Rayleigh optical thicknesses (i.e., short wavelengths) and bands with broad spectral bandwidths. To accurately account for the sensor SRF effect, a new correction algorithm has been developed for VIIRS spectral bands, which improves the TOA Rayleigh radiance accuracy to ~0.01% even for the large solar-zenith angles of 70°-80°, compared with the error of ~0.7% without applying the correction for the VIIRS-SNPP 410 nm band. The same methodology that accounts for the sensor SRF effect on the Rayleigh radiance computation can be used for other satellite sensors. In addition, with the new Rayleigh LUTs, the effect of surface atmospheric pressure variation on the TOA Rayleigh radiance computation can be calculated precisely, and no specific atmospheric pressure correction algorithm is needed. There are some other important applications and advantages to using the new Rayleigh LUTs for satellite remote sensing, including an efficient and accurate TOA Rayleigh radiance computation for hyperspectral satellite remote sensing, detector-based TOA Rayleigh radiance computation, Rayleigh radiance calculations for high altitude
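
    A minimal sketch of what band-averaging over an SRF means (both the spectrum and the Gaussian SRF are invented; the paper's LUTs and correction are far more detailed): ignoring the SRF amounts to taking the band-centre value instead of this weighted mean.

      import numpy as np

      wl = np.linspace(380, 440, 601)                    # wavelength grid, nm
      rayleigh = (wl / 410.0) ** -4.0                    # ~lambda^-4 spectral shape
      srf = np.exp(-0.5 * ((wl - 410.0) / 8.0) ** 2)     # Gaussian SRF, centre 410 nm

      # SRF-weighted band average versus the naive band-centre value
      band_avg = np.trapz(rayleigh * srf, wl) / np.trapz(srf, wl)
      print(band_avg, rayleigh[np.argmin(np.abs(wl - 410.0))])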

  18. Intersections between the Autism Spectrum and the Internet: Perceived Benefits and Preferred Functions of Computer-Mediated Communication

    Gillespie-Lynch, Kristen; Kapp, Steven K.; Shane-Simpson, Christina; Smith, David Shane; Hutman, Ted

    2014-01-01

    An online survey compared the perceived benefits and preferred functions of computer-mediated communication of participants with (N = 291) and without ASD (N = 311). Participants with autism spectrum disorder (ASD) perceived benefits of computer-mediated communication in terms of increased comprehension and control over communication, access to…

  19. An Algorithm Computing the Local $b$ Function by an Approximate Division Algorithm in $\hat{\mathcal{D}}$

    Nakayama, Hiromasa

    2006-01-01

    We give an algorithm to compute the local $b$ function. In this algorithm, we use the Mora division algorithm in the ring of differential operators and an approximate division algorithm in the ring of differential operators with power series coefficients.

  20. Exact fast computation of band depth for large functional datasets: How quickly can one million curves be ranked?

    Sun, Ying; Genton, Marc G.; Nychka, Douglas W.

    2012-01-01

    © 2012 John Wiley & Sons, Ltd. Band depth is an important nonparametric measure that generalizes order statistics and makes univariate methods based on order statistics possible for functional data. However, the computational burden of band depth
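
    The abstract is cut off, but the definition it gives (bands generalizing order statistics) supports a minimal sketch of the naive band depth with j = 2 bands; its O(n²)-per-dataset pair enumeration is the computational burden an exact fast algorithm must avoid. Data and sizes here are invented:

      import numpy as np
      from itertools import combinations

      def band_depth(curves):
          # Fraction of curve pairs whose pointwise envelope contains
          # each curve entirely (band depth with j = 2 bands)
          n = len(curves)
          pairs = list(combinations(range(n), 2))
          depths = np.zeros(n)
          for i in range(n):
              inside = sum(
                  np.all((np.minimum(curves[a], curves[b]) <= curves[i]) &
                         (curves[i] <= np.maximum(curves[a], curves[b])))
                  for a, b in pairs)
              depths[i] = inside / len(pairs)
          return depths

      t = np.linspace(0, 1, 50)
      curves = np.array([np.sin(2 * np.pi * t) + c for c in (-0.2, 0.0, 0.3, 1.5)])
      print(band_depth(curves))   # the central curves get the highest depth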

  1. Development of utility generic functional requirements for electronic work packages and computer-based procedures

    Oxstrand, Johanna [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2017-06-01

    The Nuclear Electronic Work Packages - Enterprise Requirements (NEWPER) initiative is a step toward a vision of implementing an eWP framework that includes many types of eWPs. This will enable immediate paper-related cost savings in work management and provide a path to future labor efficiency gains through enhanced integration and process improvement in support of the Nuclear Promise (Nuclear Energy Institute 2016). The NEWPER initiative was organized by the Nuclear Information Technology Strategic Leadership (NITSL) group, which is an organization that brings together leaders from the nuclear utility industry and regulatory agencies to address issues involved with information technology used in nuclear-power utilities. NITSL strives to maintain awareness of industry information technology-related initiatives and events and communicates those events to its membership. NITSL and LWRS Program researchers have been coordinating activities, including joint organization of NEWPER-related meetings and report development. The main goal of the NEWPER initiative was to develop a set of utility generic functional requirements for eWP systems. This set of requirements will support each utility in its process of identifying plant-specific functional and non-functional requirements. The NEWPER initiative has 140 members, the largest groups being 19 commercial U.S. nuclear utilities and eleven of the most prominent vendors of eWP solutions. Through the NEWPER initiative, two sets of functional requirements were developed: functional requirements for electronic work packages and functional requirements for computer-based procedures. This paper describes the development process and summarizes the requirements.

  2. COMPUTING

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  3. COMPUTING

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences. Operations Office [Figure 6: Transfers from all sites in the last 90 days.] For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  4. COMPUTING

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns led by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier1 and Tier2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking the roles of a Computing Run Coordinator and regular computing shifts monitoring the services and infrastructure as well as interfacing to the data operations tasks are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  5. COMPUTING

    P. MacBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format and the samples were run through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  6. COMPUTING

    I. Fisk

    2013-01-01

    Computing operation has been lower as the Run 1 samples are completing and smaller samples for upgrades and preparations are ramping up. Much of the computing activity is focusing on preparations for Run 2 and improvements in data access and flexibility of using resources. Operations Office Data processing was slow in the second half of 2013 with only the legacy re-reconstruction pass of 2011 data being processed at the sites. [Figure 1: MC production and processing was more in demand with a peak of over 750 million GEN-SIM events in a single month.] [Figure 2: The transfer system worked reliably and efficiently and transferred on average close to 520 TB per week with peaks at close to 1.2 PB.] [Figure 3: The volume of data moved between CMS sites in the last six months.] The tape utilisation was a focus for the operation teams with frequent deletion campaigns from deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...

  7. Computer Simulations Reveal Multiple Functions for Aromatic Residues in Cellulase Enzymes (Fact Sheet)

    2012-07-01

    NREL researchers use high-performance computing to demonstrate fundamental roles of aromatic residues in cellulase enzyme tunnels. National Renewable Energy Laboratory (NREL) computer simulations of a key industrial enzyme, the Trichoderma reesei Family 6 cellulase (Cel6A), predict that aromatic residues near the enzyme's active site and at the entrance and exit of the tunnel perform different functions in substrate binding and catalysis, depending on their location in the enzyme. These results suggest that nature employs aromatic-carbohydrate interactions with a wide variety of binding affinities for diverse functions. Outcomes also suggest that protein engineering strategies in which mutations are made around the binding sites may require tailoring specific to the enzyme family. Cellulase enzymes ubiquitously exhibit tunnels or clefts lined with aromatic residues for processing carbohydrate polymers to monomers, but the molecular-level role of these aromatic residues remains unknown. In silico mutation of the aromatic residues near the catalytic site of Cel6A has little impact on the binding affinity, but simulation suggests that these residues play a major role in the glucopyranose ring distortion necessary for cleaving glycosidic bonds to produce fermentable sugars. Removal of aromatic residues at the entrance and exit of the cellulase tunnel, however, dramatically impacts the binding affinity. This suggests that these residues play a role in acquiring cellulose chains from the cellulose crystal and stabilizing the reaction product, respectively. These results illustrate that the role of aromatic-carbohydrate interactions varies dramatically depending on the position in the enzyme tunnel. As aromatic-carbohydrate interactions are present in all carbohydrate-active enzymes, the results have implications for understanding protein structure-function relationships in carbohydrate metabolism and recognition, carbon turnover in nature, and protein engineering.

  8. A BASIC program for an IBM PC compatible computer for drawing the weak phase object contrast transfer function

    Olsen, A.; Skjerpe, P.

    1989-01-01

    This report describes a computer program which is useful in high resolution microscopy. The program is written in EBASIC and calculates the weak phase object contrast transfer function as a function of instrumental and imaging parameters. The function is plotted on the PC graphics screen, and by a Print Screen command the function can be copied to the printer. The program runs on both the Hercules graphic card and the IBM CGA card. 2 figs
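
    For orientation, the weak phase object contrast transfer function that such a program plots is, in its standard textbook form, sin χ(k) with χ(k) = πλk²(½Csλ²k² − Δf). A short Python equivalent of the report's BASIC plot might look like the following sketch; the microscope parameters are illustrative, not taken from the report, and the report's exact parameterization (apertures, damping envelopes) may differ.

        import numpy as np
        import matplotlib.pyplot as plt

        # illustrative instrument parameters (not from the report)
        lam = 2.51e-3   # electron wavelength at 200 kV, nm
        Cs = 1.2e6      # spherical aberration coefficient, nm
        df = 60.0       # underfocus, nm

        k = np.linspace(0.0, 8.0, 1000)   # spatial frequency, 1/nm
        chi = np.pi * lam * k**2 * (0.5 * Cs * lam**2 * k**2 - df)
        plt.plot(k, np.sin(chi))          # weak phase object CTF
        plt.xlabel("spatial frequency (1/nm)")
        plt.ylabel("CTF")
        plt.show()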

  9. COMPUTING

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, are provided. The GlideInWMS and components installation are now deployed at CERN, which is added to the GlideInWMS factory placed in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each other's time zones by monitoring/debugging pilot jobs sent from the facto...

  10. COPD phenotypes on computed tomography and its correlation with selected lung function variables in severe patients

    da Silva SMD

    2016-03-01

    Full Text Available Silvia Maria Doria da Silva, Ilma Aparecida Paschoal, Eduardo Mello De Capitani, Marcos Mello Moreira, Luciana Campanatti Palhares, Mônica Corso Pereira; Pneumology Service, Department of Internal Medicine, School of Medical Sciences, State University of Campinas (UNICAMP), Campinas, São Paulo, Brazil. Background: Computed tomography (CT) phenotypic characterization helps in understanding the clinical diversity of chronic obstructive pulmonary disease (COPD) patients, but its clinical relevance and its relationship with functional features are not clarified. Volumetric capnography (VC) uses the principle of gas washout and analyzes the pattern of CO2 elimination as a function of expired volume. The main variables analyzed were end-tidal concentration of carbon dioxide (ETCO2), Slope of phase 2 (Slp2), and Slope of phase 3 (Slp3) of the capnogram, the curve which represents the total amount of CO2 eliminated by the lungs during each breath. Objective: To investigate, in a group of patients with severe COPD, whether the phenotypic analysis by CT could identify different subsets of patients, and whether there was an association of CT findings and functional variables. Subjects and methods: Sixty-five patients with COPD Gold III–IV were admitted for clinical evaluation, high-resolution CT, and functional evaluation (spirometry, 6-minute walk test [6MWT], and VC). The presence and profusion of tomography findings were evaluated, and later, the patients were identified as having emphysema (EMP) or airway disease (AWD) phenotype. EMP and AWD groups were compared; tomography findings scores were evaluated versus spirometric, 6MWT, and VC variables. Results: Bronchiectasis was found in 33.8% and peribronchial thickening in 69.2% of the 65 patients. Structural findings of airways had no significant correlation with spirometric variables. Air trapping and EMP were strongly correlated with VC variables, but in opposite directions. There was some overlap between the EMP and AWD

  11. ABINIT: Plane-Wave-Based Density-Functional Theory on High Performance Computers

    Torrent, Marc

    2014-03-01

    For several years, a continuous effort has been made to adapt electronic structure codes based on Density-Functional Theory to future computing architectures. Among these codes, ABINIT is based on a plane-wave description of the wave functions which allows it to treat systems of any kind. Porting such a code to petascale architectures poses difficulties related to the many-body nature of the DFT equations. To improve the performance of ABINIT - especially for standard LDA/GGA ground-state and response-function calculations - several strategies have been followed: A full multi-level MPI parallelisation scheme has been implemented, exploiting all possible levels and distributing both computation and memory. It increases the number of distributed processes and could not have been achieved without a strong restructuring of the code. The core algorithm used to solve the eigenproblem ("Locally Optimal Blocked Conjugate Gradient"), a Blocked-Davidson-like algorithm, is based on a distribution of processes combining plane-waves and bands. In addition to the distributed-memory parallelization, a full hybrid scheme has been implemented, using standard shared-memory directives (OpenMP/OpenACC) or porting some consuming code sections to Graphics Processing Units (GPUs). As no simple performance model exists, the complexity of use has increased; the code efficiency strongly depends on the distribution of processes among the numerous levels. ABINIT is able to predict the performance of several process distributions and automatically choose the most favourable one. On the other hand, a big effort has been carried out to analyse the performance of the code on petascale architectures, showing which sections of code have to be improved; they are all related to matrix algebra (diagonalisation, orthogonalisation). The different strategies employed to improve the code's scalability will be described. They are based on an exploration of new diagonalization

  12. Passive Stretch Induces Structural and Functional Maturation of Engineered Heart Muscle as Predicted by Computational Modeling.

    Abilez, Oscar J; Tzatzalos, Evangeline; Yang, Huaxiao; Zhao, Ming-Tao; Jung, Gwanghyun; Zöllner, Alexander M; Tiburcy, Malte; Riegler, Johannes; Matsa, Elena; Shukla, Praveen; Zhuge, Yan; Chour, Tony; Chen, Vincent C; Burridge, Paul W; Karakikes, Ioannis; Kuhl, Ellen; Bernstein, Daniel; Couture, Larry A; Gold, Joseph D; Zimmermann, Wolfram H; Wu, Joseph C

    2018-02-01

    The ability to differentiate human pluripotent stem cells (hPSCs) into cardiomyocytes (CMs) makes them an attractive source for repairing injured myocardium, disease modeling, and drug testing. Although current differentiation protocols yield hPSC-CMs to >90% efficiency, hPSC-CMs exhibit immature characteristics. With the goal of overcoming this limitation, we tested the effects of varying passive stretch on engineered heart muscle (EHM) structural and functional maturation, guided by computational modeling. Human embryonic stem cells (hESCs, H7 line) or human induced pluripotent stem cells (IMR-90 line) were differentiated to hPSC-derived cardiomyocytes (hPSC-CMs) in vitro using a small-molecule-based protocol. hPSC-CMs were characterized by troponin+ flow cytometry as well as electrophysiological measurements. Afterwards, 1.2 × 10⁶ hPSC-CMs were mixed with 0.4 × 10⁶ human fibroblasts (IMR-90 line) (3:1 ratio) and type-I collagen. The blend was cast into custom-made 12-mm long polydimethylsiloxane reservoirs to vary nominal passive stretch of EHMs to 5, 7, or 9 mm. EHM characteristics were monitored for up to 50 days, with EHMs having a passive stretch of 7 mm giving the most consistent formation. Based on our initial macroscopic observations of EHM formation, we created a computational model that predicts the stress distribution throughout EHMs, which is a function of cellular composition, cellular ratio, and geometry. Based on this predictive modeling, we show cell alignment by immunohistochemistry and coordinated calcium waves by calcium imaging. Furthermore, coordinated calcium waves and mechanical contractions were apparent throughout entire EHMs. The stiffness and active forces of hPSC-derived EHMs are comparable with rat neonatal cardiomyocyte-derived EHMs. Three-dimensional EHMs display increased expression of mature cardiomyocyte genes including sarcomeric protein troponin-T, calcium and potassium ion channels, β-adrenergic receptors, and t

  13. Chronic hypersensitivity pneumonitis: high resolution computed tomography patterns and pulmonary function indices as prognostic determinants

    Walsh, Simon L.F.; Devaraj, Anand; Hansell, David M. [Royal Brompton Hospital, Department of Radiology, London (United Kingdom); Sverzellati, Nicola [University of Parma, Department of Clinical Sciences, Section of Radiology, Parma (Italy); Wells, Athol U. [Royal Brompton Hospital, Interstitial Lung Diseases Unit, London (United Kingdom)

    2012-08-15

    To investigate high resolution computed tomography (HRCT) and pulmonary function indices (PFTs) for determining prognosis in patients with chronic fibrotic hypersensitivity pneumonitis (CHP). Case records, PFTs (FEV1, FVC and DLco) and HRCTs of ninety-two patients with chronic hypersensitivity pneumonitis were evaluated. HRCT studies were scored by two observers for total disease extent, ground-glass opacification, fine and coarse reticulation, microcystic and macrocystic honeycombing, centrilobular emphysema and consolidation. Traction bronchiectasis within each pattern was graded. Using Cox proportional hazards regression models the prognostic strength of individual HRCT patterns and pulmonary function test variables were determined. There were forty-two deaths during the study period. Increasing severity of traction bronchiectasis was the strongest predictor of mortality (HR 1.10, P < 0.001, 95% CI 1.04-1.16). Increasing global interstitial disease extent (HR 1.02, P = 0.02, 95% CI 1.00-1.03), microcystic honeycombing (HR 1.09, P = 0.019, 95% CI 1.01-1.17) and macrocystic honeycombing (HR 1.06, P < 0.01, 95% CI 1.01-1.10) were also independent predictors of mortality. In contrast, no individual PFT variable was predictive of mortality once HRCT patterns were accounted for. HRCT patterns, in particular, severity of traction bronchiectasis and extent of honeycombing are superior to pulmonary function tests for predicting mortality in patients with CHP. (orig.)

  14. The relationship between lung function impairment and quantitative computed tomography in chronic obstructive pulmonary disease

    Mets, O.M. [Radiology, University Medical Center Utrecht (Netherlands); University Medical Center Utrecht, Department of Radiology, Utrecht (Netherlands); Murphy, K. [Image Sciences Institute, University Medical Center Utrecht (Netherlands); Zanen, P.; Lammers, J.W. [Pulmonology, University Medical Center Utrecht (Netherlands); Gietema, H.A.; Jong, P.A. de [Radiology, University Medical Center Utrecht (Netherlands); Ginneken, B. van [Image Sciences Institute, University Medical Center Utrecht (Netherlands); Radboud University Nijmegen Medical Centre, Diagnostic Image Analysis Group, Radiology, Nijmegen (Netherlands); Prokop, M. [Radiology, University Medical Center Utrecht (Netherlands); Radiology, Radboud University Nijmegen Medical Centre (Netherlands)

    2012-01-15

    To determine the relationship between lung function impairment and quantitative computed tomography (CT) measurements of air trapping and emphysema in a population of current and former heavy smokers with and without airflow limitation. In 248 subjects (50 normal smokers; 50 mild obstruction; 50 moderate obstruction; 50 severe obstruction; 48 very severe obstruction) CT emphysema and CT air trapping were quantified on paired inspiratory and end-expiratory CT examinations using several available quantification methods. CT measurements were related to lung function (FEV1, FEV1/FVC, RV/TLC, Kco) by univariate and multivariate linear regression analysis. Quantitative CT measurements of emphysema and air trapping were strongly correlated to airflow limitation (univariate r-squared up to 0.72, p < 0.001). In multivariate analysis, the combination of CT emphysema and CT air trapping explained 68-83% of the variability in airflow limitation in subjects covering the total range of airflow limitation (p < 0.001). The combination of quantitative CT air trapping and emphysema measurements is strongly associated with lung function impairment in current and former heavy smokers with a wide range of airflow limitation. (orig.)
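
    As one concrete illustration of such quantification methods, density thresholding is a common approach: emphysema is often scored as the percentage of inspiratory lung voxels below -950 HU and air trapping as the percentage of expiratory lung voxels below -856 HU. The sketch below uses these literature-conventional thresholds, which are assumptions here rather than values stated in this record, and all names are illustrative.

        import numpy as np

        def ct_air_measures(insp_hu, exp_hu, lung_mask_in, lung_mask_ex):
            """Density-threshold CT measures from inspiratory/expiratory HU
            volumes and boolean lung masks (illustrative thresholds)."""
            emph = (insp_hu[lung_mask_in] < -950).mean() * 100  # %LAA-950, inspiration
            trap = (exp_hu[lung_mask_ex] < -856).mean() * 100   # %LAA-856, expiration
            # expiratory-to-inspiratory ratio of mean lung density, another
            # air-trapping index used by several quantification methods
            ei_ratio = exp_hu[lung_mask_ex].mean() / insp_hu[lung_mask_in].mean()
            return emph, trap, ei_ratio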

  15. Ensemble-based computational approach discriminates functional activity of p53 cancer and rescue mutants.

    Özlem Demir

    2011-10-01

    Full Text Available The tumor suppressor protein p53 can lose its function upon single-point missense mutations in the core DNA-binding domain ("cancer mutants"). Activity can be restored by second-site suppressor mutations ("rescue mutants"). This paper relates the functional activity of p53 cancer and rescue mutants to their overall molecular dynamics (MD), without focusing on local structural details. A novel global measure of protein flexibility for the p53 core DNA-binding domain, the number of clusters at a certain RMSD cutoff, was computed by clustering over 0.7 µs of explicitly solvated all-atom MD simulations. For wild-type p53 and a sample of p53 cancer or rescue mutants, the number of clusters was a good predictor of in vivo p53 functional activity in cell-based assays. This number-of-clusters (NOC) metric was strongly correlated (r² = 0.77) with reported values of experimentally measured ΔΔG protein thermodynamic stability. Interpreting the number of clusters as a measure of protein flexibility: (i) p53 cancer mutants were more flexible than wild-type protein, (ii) second-site rescue mutations decreased the flexibility of cancer mutants, and (iii) negative controls of non-rescue second-site mutants did not. This new method reflects the overall stability of the p53 core domain and can discriminate which second-site mutations restore activity to p53 cancer mutants.
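
    A minimal sketch of the idea behind the NOC metric follows, using a simple leader-style clustering; the paper's actual clustering algorithm may differ, and the function names and toy trajectory are illustrative.

        import numpy as np

        def number_of_clusters(frames, cutoff):
            """Count clusters whose members lie within `cutoff` RMSD of a
            cluster representative. frames: (n_frames, n_atoms, 3), assumed
            already aligned to a common reference."""
            def rmsd(a, b):
                return np.sqrt(((a - b) ** 2).sum(axis=-1).mean())
            reps = []
            for x in frames:
                if all(rmsd(x, r) > cutoff for r in reps):
                    reps.append(x)   # x starts a new cluster
            return len(reps)         # more clusters -> more flexible

        # toy usage: a noisier (more flexible) trajectory yields more clusters
        rng = np.random.default_rng(1)
        traj = rng.normal(scale=1.0, size=(500, 100, 3))
        print(number_of_clusters(traj, cutoff=2.0))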

  16. The relationship between lung function impairment and quantitative computed tomography in chronic obstructive pulmonary disease

    Mets, O.M.; Murphy, K.; Zanen, P.; Lammers, J.W.; Gietema, H.A.; Jong, P.A. de; Ginneken, B. van; Prokop, M.

    2012-01-01

    To determine the relationship between lung function impairment and quantitative computed tomography (CT) measurements of air trapping and emphysema in a population of current and former heavy smokers with and without airflow limitation. In 248 subjects (50 normal smokers; 50 mild obstruction; 50 moderate obstruction; 50 severe obstruction; 48 very severe obstruction) CT emphysema and CT air trapping were quantified on paired inspiratory and end-expiratory CT examinations using several available quantification methods. CT measurements were related to lung function (FEV1, FEV1/FVC, RV/TLC, Kco) by univariate and multivariate linear regression analysis. Quantitative CT measurements of emphysema and air trapping were strongly correlated to airflow limitation (univariate r-squared up to 0.72, p < 0.001). In multivariate analysis, the combination of CT emphysema and CT air trapping explained 68-83% of the variability in airflow limitation in subjects covering the total range of airflow limitation (p < 0.001). The combination of quantitative CT air trapping and emphysema measurements is strongly associated with lung function impairment in current and former heavy smokers with a wide range of airflow limitation. (orig.)

  17. Chronic hypersensitivity pneumonitis: high resolution computed tomography patterns and pulmonary function indices as prognostic determinants

    Walsh, Simon L.F.; Devaraj, Anand; Hansell, David M.; Sverzellati, Nicola; Wells, Athol U.

    2012-01-01

    To investigate high resolution computed tomography (HRCT) and pulmonary function indices (PFTs) for determining prognosis in patients with chronic fibrotic hypersensitivity pneumonitis (CHP). Case records, PFTs (FEV1, FVC and DLco) and HRCTs of ninety-two patients with chronic hypersensitivity pneumonitis were evaluated. HRCT studies were scored by two observers for total disease extent, ground-glass opacification, fine and coarse reticulation, microcystic and macrocystic honeycombing, centrilobular emphysema and consolidation. Traction bronchiectasis within each pattern was graded. Using Cox proportional hazards regression models the prognostic strength of individual HRCT patterns and pulmonary function test variables were determined. There were forty-two deaths during the study period. Increasing severity of traction bronchiectasis was the strongest predictor of mortality (HR 1.10, P < 0.001, 95% CI 1.04-1.16). Increasing global interstitial disease extent (HR 1.02, P = 0.02, 95% CI 1.00-1.03), microcystic honeycombing (HR 1.09, P = 0.019, 95% CI 1.01-1.17) and macrocystic honeycombing (HR 1.06, P < 0.01, 95% CI 1.01-1.10) were also independent predictors of mortality. In contrast, no individual PFT variable was predictive of mortality once HRCT patterns were accounted for. HRCT patterns, in particular, severity of traction bronchiectasis and extent of honeycombing are superior to pulmonary function tests for predicting mortality in patients with CHP. (orig.)

  18. Comparison of measured and computed phase functions of individual tropospheric ice crystals

    Stegmann, Patrick G.; Tropea, Cameron; Järvinen, Emma; Schnaiter, Martin

    2016-01-01

    Airplanes passing through the anvil (Latin: incus) regions of tropical cumulonimbus clouds are at risk of suffering an engine power-loss event and engine damage due to ice ingestion (Mason et al., 2006 [1]). Research in this field relies on optical measurement methods to characterize ice crystals; however, the design and implementation of such methods presently suffer from the lack of reliable and efficient means of predicting the light scattering from ice crystals. The nascent discipline of direct measurement of phase functions of ice crystals, in conjunction with particle imaging and forward modelling through geometrical-optics-derived and transition-matrix (T-matrix) codes, for the first time allows us to obtain a deeper understanding of the optical properties of real tropospheric ice crystals. In this manuscript, a sample phase function obtained via the Particle Habit Imaging and Polar Scattering (PHIPS) probe during a measurement campaign in flight over Brazil is compared to three different light scattering codes. These include a newly developed first-order geometrical optics code taking into account the influence of the Gaussian beam illumination used in the PHIPS device, as well as the reference ray tracing code of Macke and the T-matrix code of Kahnert. - Highlights: • A GO code for shaped beams and non-spherical particles has been developed. • The code has been validated against exact Mie results. • Measured and computed phase functions for a single ice crystal have been compared. • The comparison highlights differences in the backscattering region.

  19. Application of FCT to Incompressible Flows

    Liu, Junhui; Kaplan, Carolyn R; Mott, David R; Oran, Elaine S

    2006-01-01

    .... Since an odd-even decoupling instability arises in standard algorithms that update a pressure correction in the Poisson equation, we have avoided this instability by using an intermediate velocity...

  20. Training Older Adults to Use Tablet Computers: Does It Enhance Cognitive Function?

    Chan, Micaela Y; Haber, Sara; Drew, Linda M; Park, Denise C

    2016-06-01

    Recent evidence shows that engaging in learning new skills improves episodic memory in older adults. In this study, older adults who were computer novices were trained to use a tablet computer and associated software applications. We hypothesize that sustained engagement in this mentally challenging training would yield a dual benefit of improved cognition and enhancement of everyday function by introducing useful skills. A total of 54 older adults (age 60-90) committed 15 hr/week for 3 months. Eighteen participants received extensive iPad training, learning a broad range of practical applications. The iPad group was compared with 2 separate controls: a Placebo group that engaged in passive tasks requiring little new learning; and a Social group that had regular social interaction, but no active skill acquisition. All participants completed the same cognitive battery pre- and post-engagement. Compared with both controls, the iPad group showed greater improvements in episodic memory and processing speed but did not differ in mental control or visuospatial processing. iPad training improved cognition relative to engaging in social or nonchallenging activities. Mastering relevant technological devices has the added advantage of providing older adults with technological skills useful in facilitating everyday activities (e.g., banking). This work informs the selection of targeted activities for future interventions and community programs. © The Author 2014. Published by Oxford University Press on behalf of The Gerontological Society of America.

  1. Technical Report: Toward a Scalable Algorithm to Compute High-Dimensional Integrals of Arbitrary Functions

    Snyder, Abigail C.; Jiao, Yu

    2010-01-01

    Neutron experiments at the Spallation Neutron Source (SNS) at Oak Ridge National Laboratory (ORNL) frequently generate large amounts of data (on the order of 10⁶-10¹² data points). Hence, traditional data analysis tools run on a single CPU take too long to be practical and scientists are unable to efficiently analyze all data generated by experiments. Our goal is to develop a scalable algorithm to efficiently compute high-dimensional integrals of arbitrary functions. This algorithm can then be used to integrate the four-dimensional integrals that arise as part of modeling intensity from the experiments at the SNS. Here, three different one-dimensional numerical integration solvers from the GNU Scientific Library were modified and implemented to solve four-dimensional integrals. The results of these solvers on a final integrand provided by scientists at the SNS can be compared to the results of other methods, such as quasi-Monte Carlo methods, computing the same integral. A parallelized version of the most efficient method can allow scientists the opportunity to more effectively analyze all experimental data.
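
    The report builds four-dimensional integrators out of one-dimensional GSL solvers; the same nesting idea can be sketched with SciPy's nquad, which recursively applies a one-dimensional adaptive quadrature. The Gaussian toy integrand below is an assumption standing in for the SNS intensity-model integrand.

        import numpy as np
        from scipy import integrate

        # nested 1-D quadrature over a 4-D box; each axis is handled by a
        # 1-D adaptive solver, mirroring the report's GSL-based construction
        def integrand(x, y, z, w):
            return np.exp(-(x**2 + y**2 + z**2 + w**2))

        val, err = integrate.nquad(integrand, [[-2, 2]] * 4)
        print(val)  # compare against the erf-based closed form or a quasi-Monte Carlo estimate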

  2. Computational modeling of heterogeneity and function of CD4+ T cells

    Adria eCarbo

    2014-07-01

    Full Text Available The immune system is composed of many different cell types and hundreds of intersecting molecular pathways and signals. This large biological complexity requires coordination between distinct pro-inflammatory and regulatory cell subsets to respond to infection while maintaining tissue homeostasis. CD4+ T cells play a central role in orchestrating immune responses and in maintaining a balance between pro- and anti-inflammatory responses. This tight balance between regulatory and effector reactions depends on the ability of CD4+ T cells to modulate distinct pathways within large molecular networks, since dysregulated CD4+ T cell responses may result in chronic inflammatory and autoimmune diseases. The CD4+ T cell differentiation process comprises an intricate interplay between cytokines, their receptors, adaptor molecules, signaling cascades and transcription factors that help delineate cell fate and function. Computational modeling can help to describe, simulate, analyze, and predict some of the behaviors in this complicated differentiation network. This review provides a comprehensive overview of existing computational immunology methods as well as novel strategies used to model immune responses with a particular focus on CD4+ T cell differentiation.

  3. Computer based training for NPP personnel (interactive communication systems and functional trainers)

    Martin, H.D.

    1987-01-01

    KWU as a manufacturer of thermal and nuclear power plants has extensive customer training obligations within its power plant contracts. In this respect KWU has gained large experience in training of personnel, in the production of training material including video tapes and in the design of simulators. KWU developed interactive communication systems (ICS) for training and retraining purposes with a personal computer operating a video disc player on which video instruction is stored. The training program is edited with the help of a self-developed editing system which enables the author to easily enter his instructions into the computer. ICS enables the plant management to better monitor the performance of its personnel through computerized training results and helps to save training manpower. German NPPs differ very much from other designs with respect to a more complex and integrated reactor control system and an additional reactor limitation system. Simulators for such plants therefore have also to simulate these systems. KWU developed a Functional Trainer (FT) which is a replica of the primary system, the auxiliary systems linked to it and the associated control, limitation and protection systems including the influences of the turbine operation and control

  4. Computational screening of functionalized zinc porphyrins for dye sensitized solar cells

    Ørnsø, Kristian Baruël; García Lastra, Juan Maria; Thygesen, Kristian Sommer

    2013-01-01

    An efficient dye sensitized solar cell (DSSC) is one possible solution to meet the world's rapidly increasing energy demands and associated climate challenges. This requires inexpensive and stable dyes with well-positioned frontier energy levels for maximal solar absorption, efficient charge separation, and high output voltage. Here we demonstrate an extensive computational screening of zinc porphyrins functionalized with electron donating side groups and electron accepting anchoring groups. The trends in frontier energy levels versus side groups are analyzed and a no-loss DSSC level alignment quality is estimated. Out of the initial 1029 molecules, we find around 50 candidates with level alignment qualities within 5% of the optimal limit. We show that the level alignment of five zinc porphyrin dyes which were recently used in DSSCs with high efficiencies can be further improved by simple side...

  5. Functional high-resolution computed tomography of pulmonary vascular and airway reactions

    Herold, C.J.; Johns Hopkins Medical Institutions, Baltimore, MD; Brown, R.H.; Johns Hopkins Medical Institutions, Baltimore, MD; Johns Hopkins Medical Institutions, Baltimore, MD; Wetzel, R.C.; Herold, S.M.; Zeerhouni, E.A.

    1993-01-01

    We describe the use of high-resolution computed tomography (HRCT) for assessment of the function of pulmonary vessels and airways. With its excellent spatial resolution, HRCT is able to demonstrate pulmonary structures as small as 300 μm and can be used to monitor changes following various stimuli. HRCT also provides information about structures smaller than 300 μm through measurement of parenchymal background density. To date, sequential, spiral and ultrafast HRCT techniques have been used in a variety of challenges to gather information about the anatomical correlates of traditional physiological measurements, thus making anatomical-physiological correlation possible. HRCT of bronchial reactivity can demonstrate the location and time course of aerosol-induced bronchoconstriction and may show changes not apparent on spirometry. HRCT of the pulmonary vascular system visualizes adaptations of vessels during hypoxia and intravascular volume loading and elucidates cardiorespiratory interactions. Experimental studies provide a basis for potential clinical applications of this method. (orig.) [de]

  6. Evaluation of the modulation transfer function for computer tomography by using American Association Physics Medicine Phantom

    Kim, Ki Won [Dept. of Radiology, Kyung Hee University Hospital at Gang-dong, Seoul (Korea, Republic of); Choi, Kwan Woo [Dept. of Radiology, Asan Medical Center, Seoul (Korea, Republic of); Jeong, Hoi Woun [Dept. of Radiological Technology, Baekseok Culture University, Cheonan (Korea, Republic of); Jang, Seo Goo [Dept. of Medical Science, Soonchunhyang University, Asan (Korea, Republic of); Kwon, Kyung Tae [Dept. of Radiological Technology, Dongnam Health University, Suwon (Korea, Republic of); Son, Soon Yong [Dept. of Radiological Technology, Wonkwang Health Science University, Iksan (Korea, Republic of); Son, Jin Hyun; Min, Jung Whan [Dept. of Radiological Technology, Shingu University, Sungnam (Korea, Republic of)

    2016-06-15

    In clinical computed tomography (CT), regular quality assurance (QA) is required. This study evaluates the MTF for analyzing spatial resolution using the AAPM phantom in CT examinations. The dual-source Somatom Definition Flash (Siemens Healthcare, Forchheim, Germany), the Brilliance 64 (Philips Medical Systems, Netherlands) and the Aquilion 64 (Toshiba Medical Systems, Japan) were used in this study. The quantitative evaluation was performed using ImageJ (Wayne Rasband, National Institutes of Health, USA) and the chart method, which measures the modulation transfer function (MTF). In the MTF evaluation, the spatial frequencies corresponding to the 50% MTF for the CT systems were 0.58, 0.28, and 0.59 mm⁻¹, respectively, and those corresponding to the 10% MTF were 1.63, 0.89, and 1.21 mm⁻¹, respectively. This study could evaluate the characteristic spatial resolution via the MTF using the chart method, suggesting a quantitative evaluation method based on these data.
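
    For context, an MTF measurement of this kind reduces to a Fourier transform of a line spread function extracted from the phantom image, with the 50% and 10% points then read off the curve. The sketch below is a generic illustration under that assumption, not the chart method's exact procedure; all names and the synthetic Gaussian LSF are illustrative.

        import numpy as np

        def mtf_from_lsf(lsf, pixel_mm):
            """MTF as the normalized FFT magnitude of a sampled line spread
            function; returns spatial frequencies (cycles/mm) and MTF values."""
            spectrum = np.abs(np.fft.rfft(lsf))
            mtf = spectrum / spectrum[0]                   # normalize to 1 at DC
            freqs = np.fft.rfftfreq(len(lsf), d=pixel_mm)  # cycles/mm
            return freqs, mtf

        def freq_at(level, freqs, mtf):
            """Spatial frequency where the MTF first falls to a given level
            (e.g., 0.5 or 0.1), by linear interpolation."""
            idx = np.argmax(mtf < level)
            f0, f1, m0, m1 = freqs[idx - 1], freqs[idx], mtf[idx - 1], mtf[idx]
            return f0 + (m0 - level) * (f1 - f0) / (m0 - m1)

        # toy usage with a Gaussian LSF sampled at 0.5 mm pixels
        x = np.arange(-32, 32)
        freqs, mtf = mtf_from_lsf(np.exp(-x**2 / 8), pixel_mm=0.5)
        print(freq_at(0.5, freqs, mtf), freq_at(0.1, freqs, mtf))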

  7. Computer Modelling of Functional Aspects of Noise in Endogenously Oscillating Neurons

    Huber, M. T.; Dewald, M.; Voigt, K.; Braun, H. A.; Moss, F.

    1998-03-01

    Membrane potential oscillations are a widespread feature of neuronal activity. When such oscillations operate close to the spike-triggering threshold, noise can become an essential property of spike-generation. Accordingly, we developed a minimal Hodgkin-Huxley-type computer model which includes a noise term. This model accounts for experimental data from quite different cells ranging from mammalian cortical neurons to fish electroreceptors. With slight modifications of the parameters, the model's behavior can be tuned to bursting activity, which additionally allows it to mimic temperature encoding in peripheral cold receptors including transitions to apparently chaotic dynamics as indicated by methods for the detection of unstable periodic orbits. Under all conditions, cooperative effects between noise and nonlinear dynamics can be shown which, beyond stochastic resonance, might be of functional significance for stimulus encoding and neuromodulation.
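
    The cooperative effect of noise and subthreshold dynamics can be illustrated with any minimal excitable model driven by a stochastic term. The sketch below uses a FitzHugh-Nagumo surrogate integrated by Euler-Maruyama rather than the authors' Hodgkin-Huxley-type model; all parameters are illustrative assumptions.

        import numpy as np

        def simulate(T=2000.0, dt=0.05, noise=0.08, I=0.32, seed=0):
            """Euler-Maruyama integration of a FitzHugh-Nagumo oscillator with
            additive voltage noise; returns times of upward threshold crossings."""
            rng = np.random.default_rng(seed)
            v, w, prev_above = -1.0, 0.0, False
            spike_times = []
            for k in range(int(T / dt)):
                dv = v - v**3 / 3.0 - w + I
                dw = 0.08 * (v + 0.7 - 0.8 * w)
                v += dt * dv + noise * np.sqrt(dt) * rng.normal()
                w += dt * dw
                above = v > 1.0
                if above and not prev_above:    # noise-assisted spike
                    spike_times.append(k * dt)
                prev_above = above
            return spike_times

        # near threshold, the spike count grows with the noise amplitude
        print(len(simulate(noise=0.02)), len(simulate(noise=0.15)))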

  8. Development of a computer-adaptive physical function instrument for Social Security Administration disability determination.

    Ni, Pengsheng; McDonough, Christine M; Jette, Alan M; Bogusz, Kara; Marfeo, Elizabeth E; Rasch, Elizabeth K; Brandt, Diane E; Meterko, Mark; Haley, Stephen M; Chan, Leighton

    2013-09-01

    To develop and test an instrument to assess physical function for Social Security Administration (SSA) disability programs, the SSA-Physical Function (SSA-PF) instrument. Item response theory (IRT) analyses were used to (1) create a calibrated item bank for each of the factors identified in prior factor analyses, (2) assess the fit of the items within each scale, (3) develop separate computer-adaptive testing (CAT) instruments for each scale, and (4) conduct initial psychometric testing. Cross-sectional data collection; IRT analyses; CAT simulation. Telephone and Internet survey. Two samples: SSA claimants (n=1017) and adults from the U.S. general population (n=999). None. Model fit statistics, correlation, and reliability coefficients. IRT analyses resulted in 5 unidimensional SSA-PF scales: Changing & Maintaining Body Position, Whole Body Mobility, Upper Body Function, Upper Extremity Fine Motor, and Wheelchair Mobility for a total of 102 items. High CAT accuracy was demonstrated by strong correlations between simulated CAT scores and those from the full item banks. On comparing the simulated CATs with the full item banks, very little loss of reliability or precision was noted, except at the lower and upper ranges of each scale. No difference in response patterns by age or sex was noted. The distributions of claimant scores were shifted to the lower end of each scale compared with those of a sample of U.S. adults. The SSA-PF instrument contributes important new methodology for measuring the physical function of adults applying to the SSA disability programs. Initial evaluation revealed that the SSA-PF instrument achieved considerable breadth of coverage in each content domain and demonstrated noteworthy psychometric properties. Copyright © 2013 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
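
    The CAT machinery referred to here can be illustrated in a few lines: under a logistic IRT model, each simulated step administers the item that is most informative at the current ability estimate. The following is a generic two-parameter-logistic sketch, not the SSA-PF instrument's actual model or selection rule; the toy item bank is an assumption.

        import numpy as np

        def p_correct(theta, a, b):
            """2PL IRT model: probability of a positive response to an item
            with discrimination a and difficulty b, at ability theta."""
            return 1.0 / (1.0 + np.exp(-a * (theta - b)))

        def next_item(theta, a, b, administered):
            """Generic CAT step: pick the not-yet-used item with the largest
            Fisher information at the current ability estimate."""
            p = p_correct(theta, a, b)
            info = a**2 * p * (1.0 - p)
            info[list(administered)] = -np.inf
            return int(np.argmax(info))

        # toy item bank: a discriminating item near theta = 0.3 is selected
        a = np.array([1.2, 0.8, 1.5, 1.0])
        b = np.array([-1.0, 0.0, 0.5, 1.5])
        print(next_item(theta=0.3, a=a, b=b, administered={0}))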

  9. Quantitative Analysis of Lateral Pinch Force in Quadriplegic Patients Using Functional Neuromuscular Stimulation with Computer Stimulation

    Ali Esteki

    2004-10-01

    Full Text Available Objective: In some applications of functional neuromuscular stimulation (FNS), the distal joint of the thumb (IP) in quadriplegic patients is sometimes surgically fused at zero degrees and the FPL is stimulated. This prevents hyperextension and extreme flexion of the IP joint during lateral pinch. However, IP joint fusion removes one degree of freedom from the thumb and may reduce the grip force. An alternative approach, preferably without surgical alterations, using sufficient electrical stimulation of selected muscles was investigated. A 3D model of prehensile lateral pinch was developed. Computer simulation of the model was used to find an approach providing the appropriate posture and adequate lateral grip force for quadriplegic patients using FNS. Materials & Methods: The model consists of a multi-rigid-body system connected by one- or two-degree-of-freedom joints acted upon by passive resistive moments, active muscle moments and moments of external contact forces. Passive resistive moments were measured at each joint, active muscle moments were computed using a simple muscle model, and moments of external force were computed based on a force-displacement relationship for finger pads. In addition to the current strategy, two possible alternatives were studied: increasing the fused joint angle and activation of multiple muscles without joint fusion. The normal component of the grip force and its angle with respect to the horizontal plane were computed and compared for the studied cases. Results: Results showed that, using the current FNS strategy, a convenient posture and a grip force of 10.1 N are achieved, which is comparable to what has been measured experimentally and reported in the literature. Increasing the joint fusion angle from 0 to 15 and 30 degrees in parallel with the activation of FPL increased the grip force from 10.1 to 10.7 and 11.2 N, respectively, but resulted in inconvenient posture. Among all different combinations of the muscles

  10. Computing many-body wave functions with guaranteed precision: the first-order Møller-Plesset wave function for the ground state of helium atom.

    Bischoff, Florian A; Harrison, Robert J; Valeev, Edward F

    2012-09-14

    We present an approach to compute accurate correlation energies for atoms and molecules using an adaptive discontinuous spectral-element multiresolution representation for the two-electron wave function. Because of the exponential storage complexity of the spectral-element representation with the number of dimensions, a brute-force computation of two-electron (six-dimensional) wave functions with high precision was not practical. To overcome the key storage bottlenecks we utilized (1) a low-rank tensor approximation (specifically, the singular value decomposition) to compress the wave function, and (2) explicitly correlated R12-type terms in the wave function to regularize the Coulomb electron-electron singularities of the Hamiltonian. All operations necessary to solve the Schrödinger equation were expressed so that the reconstruction of the full-rank form of the wave function is never necessary. Numerical performance of the method was highlighted by computing the first-order Møller-Plesset wave function of a helium atom. The computed second-order Møller-Plesset energy is precise to ~2 microhartrees, which is at the precision limit of the existing general atomic-orbital-based approaches. Our approach does not assume special geometric symmetries, hence application to molecules is straightforward.
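
    The first storage-reduction ingredient, low-rank compression via the singular value decomposition, can be demonstrated on a two-dimensional toy pair function; the real object is six-dimensional, and the function below is an illustrative assumption, not the paper's wave function.

        import numpy as np

        # build a smooth "pair function" on a product grid, with a mild
        # cusp along the diagonal mimicking electron-electron correlation
        x = np.linspace(0.1, 5.0, 200)
        F = np.exp(-np.add.outer(x, x)) * (1 + 0.5 * np.abs(np.subtract.outer(x, x)))

        U, s, Vt = np.linalg.svd(F)
        rank = int((s > 1e-6 * s[0]).sum())   # keep singular values above a tolerance
        F_lowrank = (U[:, :rank] * s[:rank]) @ Vt[:rank]

        # a handful of terms reproduce F to roughly the chosen tolerance,
        # which is the storage saving the SVD compression buys
        print(rank, np.abs(F - F_lowrank).max())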

  11. Functional requirements of computer systems for the U.S. Geological Survey, Water Resources Division, 1988-97

    Hathaway, R.M.; McNellis, J.M.

    1989-01-01

    Investigating the occurrence, quantity, quality, distribution, and movement of the Nation's water resources is the principal mission of the U.S. Geological Survey's Water Resources Division. Reports of these investigations are published and available to the public. To accomplish this mission, the Division requires substantial computer technology to process, store, and analyze data from more than 57,000 hydrologic sites. The Division's computer resources are organized through the Distributed Information System Program Office that manages the nationwide network of computers. The contract that provides the major computer components for the Water Resources Division's Distributed Information System expires in 1991. Five work groups were organized to collect the information needed to procure a new generation of computer systems for the U.S. Geological Survey, Water Resources Division. Each group was assigned a major Division activity and asked to describe its functional requirements of computer systems for the next decade. The work groups and major activities are: (1) hydrologic information; (2) hydrologic applications; (3) geographic information systems; (4) reports and electronic publishing; and (5) administrative. The work groups identified 42 functions and described their functional requirements for 1988, 1992, and 1997. A few new functions, such as Decision Support Systems and Executive Information Systems, were identified, but most are the same as performed today. Although the number of functions will remain about the same, steady growth in the size, complexity, and frequency of many functions is predicted for the next decade. No compensating increase in the Division's staff is anticipated during this period. To handle the increased workload and perform these functions, new approaches will be developed that use advanced computer technology. The advanced technology is required in a unified, tightly coupled system that will support all functions simultaneously.

  12. High-resolution computed tomography in silicosis: correlation with chest radiography and pulmonary function tests

    Lopes, Agnaldo Jose [Universidade do Estado do Rio de Janeiro (UERJ), RJ (Brazil). Pedro Ernesto Univ. Hospital. Dept. of Respiratory Function]. E-mail: phel.lop@uol.com.br; Mogami, Roberto; Capone, Domenico; Jansen, Jose Manoel [Universidade do Estado do Rio de Janeiro (UERJ), RJ (Brazil). School of Medical Sciences; Tessarollo, Bernardo [Universidade do Estado do Rio de Janeiro (UERJ), RJ (Brazil). Dept. of Radiology and Diagnostic Image; Melo, Pedro Lopes de [Universidade do Estado do Rio de Janeiro (UERJ), RJ (Brazil). Inst. of Biology

    2008-05-15

    Objective: To correlate tomographic findings with pulmonary function findings, as well as to compare chest X-ray findings with high-resolution computed tomography (HRCT) findings, in patients with silicosis. Methods: A cross-sectional study was conducted in 44 non-smoking patients without a history of tuberculosis. Chest X-ray findings were classified according to the International Labour Organization recommendations. Using a semiquantitative system, the following HRCT findings were measured: the full extent of pulmonary involvement; parenchymal opacities; and emphysema. Spirometry and forced oscillation were performed. Pulmonary volumes were evaluated using the helium dilution method, and diffusing capacity of the lung for carbon monoxide was assessed. Results: Of the 44 patients studied, 41 were male. The mean age was 48.4 years. There were 4 patients who were classified as category 0 based on X-ray findings and as category 1 based on HRCT findings. Using HRCT scans, we identified progressive massive fibrosis in 33 patients, compared with only 23 patients when X-rays were used. Opacity score was found to correlate most closely with airflow, DLCO and compliance. Emphysema score correlated inversely with volume, DLCO and airflow. In this sample of patients presenting a predominance of large opacities (75% of the individuals), the deterioration of pulmonary function was associated with the extent of structural changes. Conclusions: In the early detection of silicosis and the identification of progressive massive fibrosis, HRCT scans are superior to X-rays. (author)

  13. High-resolution computed tomography in silicosis: correlation with chest radiography and pulmonary function tests

    Lopes, Agnaldo Jose; Mogami, Roberto; Capone, Domenico; Jansen, Jose Manoel; Tessarollo, Bernardo; Melo, Pedro Lopes de

    2008-01-01

    Objective: To correlate tomographic findings with pulmonary function findings, as well as to compare chest X-ray findings with high-resolution computed tomography (HRCT) findings, in patients with silicosis. Methods: A cross-sectional study was conducted in 44 non-smoking patients without a history of tuberculosis. Chest X-ray findings were classified according to the International Labour Organization recommendations. Using a semiquantitative system, the following HRCT findings were measured: the full extent of pulmonary involvement; parenchymal opacities; and emphysema. Spirometry and forced oscillation were performed. Pulmonary volumes were evaluated using the helium dilution method, and diffusing capacity of the lung for carbon monoxide was assessed. Results: Of the 44 patients studied, 41 were male. The mean age was 48.4 years. There were 4 patients who were classified as category 0 based on X-ray findings and as category 1 based on HRCT findings. Using HRCT scans, we identified progressive massive fibrosis in 33 patients, compared with only 23 patients when X-rays were used. Opacity score was found to correlate most closely with airflow, DLCO and compliance. Emphysema score correlated inversely with volume, DLCO and airflow. In this sample of patients presenting a predominance of large opacities (75% of the individuals), the deterioration of pulmonary function was associated with the extent of structural changes. Conclusions: In the early detection of silicosis and the identification of progressive massive fibrosis, HRCT scans are superior to X-rays. (author)

  14. Computational-based structural, functional and phylogenetic analysis of Enterobacter phytases.

    Pramanik, Krishnendu; Kundu, Shreyasi; Banerjee, Sandipan; Ghosh, Pallab Kumar; Maiti, Tushar Kanti

    2018-06-01

    Myo-inositol hexakisphosphate phosphohydrolases (i.e., phytases) are known to be very important enzymes responsible for the solubilization of insoluble phosphates. In the present study, Enterobacter phytases have been characterized by different phylogenetic, structural and functional parameters using some standard bio-computational tools. Results showed that the majority of Enterobacter phytases are acidic in nature, as most of the isoelectric points were under 7.0. The aliphatic indices predicted for the selected proteins were below 40, indicating their thermostable nature. The average molecular weight of the proteins was 48 kDa. The low GRAVY values of the said proteins implied that they have better interactions with water. Secondary structure prediction revealed that alpha-helical content was highest among the other forms such as sheets, coils, etc. Moreover, the predicted 3D structure of Enterobacter phytases divulged that the proteins consisted of four monomeric polypeptide chains, i.e., it was a tetrameric protein. The predicted tertiary model of E. aerogenes (A0A0M3HCJ2) was deposited in the Protein Model Database (Acc. No.: PM0080561) for further utilization after a thorough quality check from the QMEAN and SAVES servers. Functional analysis supported their classification as histidine acid phosphatases. In addition, multiple sequence alignment revealed that "DG-DP-LG" were the most highly conserved residues within the Enterobacter phytases. Thus, the present study will be useful in selecting suitable phytase-producing microbes exclusively for use in the animal food industry as a food additive.
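
    Sequence-level parameters of this kind (molecular weight, isoelectric point, GRAVY) can be reproduced with standard tools; a sketch using Biopython's ProtParam module follows. The study's exact toolchain is not specified in this record, and the sequence below is a made-up fragment, not a real phytase.

        from Bio.SeqUtils.ProtParam import ProteinAnalysis

        seq = "MKAILVLSLLFAGHSTQAEEPELKLESVVIVSRHGVRAPTK"  # hypothetical fragment
        pa = ProteinAnalysis(seq)
        print("MW (kDa):       ", pa.molecular_weight() / 1000)
        print("Isoelectric pt.:", pa.isoelectric_point())  # acidic if < 7
        print("GRAVY:          ", pa.gravy())              # lower -> more hydrophilic
        print("Aromaticity:    ", pa.aromaticity())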

  15. Estimation Methods of the Point Spread Function Axial Position: A Comparative Computational Study

    Javier Eduardo Diaz Zamboni

    2017-01-01

    Full Text Available The precise knowledge of the point spread function is central for any imaging system characterization. In fluorescence microscopy, point spread function (PSF) determination has become a common and obligatory task for each new experimental device, mainly due to its strong dependence on acquisition conditions. During the last decade, algorithms have been developed for the precise calculation of the PSF, which fit model parameters that describe image formation on the microscope to experimental data. In order to contribute to this subject, a comparative study of three parameter estimation methods is reported, namely: I-divergence minimization (MIDIV), maximum likelihood (ML) and non-linear least squares (LSQR). They were applied to the estimation of the point source position on the optical axis, using a physical model. Methods’ performance was evaluated under different conditions and noise levels using synthetic images and considering success percentage, iteration number, computation time, accuracy and precision. The main results showed that the axial position estimation requires a high SNR to achieve an acceptable success level and higher still to be close to the estimation error lower bound. ML achieved a higher success percentage at lower SNR compared to MIDIV and LSQR with an intrinsic noise source. Only the ML and MIDIV methods achieved the error lower bound, but only with data belonging to the optical axis and high SNR. Extrinsic noise sources worsened the success percentage, but no difference was found between noise sources for the same method for all methods studied.
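
    As a toy version of the least-squares route, one can estimate the axial position of a point source by fitting a defocus-dependent PSF width model to widths measured at several focal planes. The model and parameters below are illustrative assumptions, far simpler than the physical PSF model used in the study.

        import numpy as np
        from scipy.optimize import least_squares

        def width(z, z0, w0, zr):
            # Gaussian-beam-style width growing with distance from focus z0
            return w0 * np.sqrt(1 + ((z - z0) / zr) ** 2)

        z = np.linspace(-2, 2, 21)   # stage positions (um), illustrative
        rng = np.random.default_rng(2)
        meas = width(z, 0.35, 1.0, 0.8) + rng.normal(0, 0.02, z.size)  # synthetic data

        fit = least_squares(lambda p: width(z, *p) - meas, x0=[0.0, 1.0, 1.0])
        print(fit.x[0])   # estimated axial position z0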

  16. What are the ideal properties for functional food peptides with antihypertensive effect? A computational peptidology approach.

    Zhou, Peng; Yang, Chao; Ren, Yanrong; Wang, Congcong; Tian, Feifei

    2013-12-01

    Peptides with antihypertensive potency have long been attractive to the medical and food communities. However, serving as food additives rather than therapeutic agents, peptides should have a good taste. In the present study, we explore the intrinsic relationship between the angiotensin I-converting enzyme (ACE) inhibition and bitterness of short peptides in the framework of computational peptidology, attempting to find the appropriate properties for functional food peptides with satisfactory bioactivities. As might be expected, quantitative structure-activity relationship modeling reveals a significant positive correlation between the ACE inhibition and bitterness of dipeptides, but this correlation is quite modest for tripeptides and, particularly, tetrapeptides. Moreover, quantum mechanics/molecular mechanics analysis of the structural basis and energetic profile involved in ACE-peptide complexes reveals that peptides of up to four amino acids are sufficient for efficient binding to ACE, and additional residues do not bring a substantial enhancement in ACE-binding affinity and, thus, antihypertensive capability. Taken together, these findings suggest that tripeptides and tetrapeptides could be considered ideal candidates in the search for potential functional food additives with both high antihypertensive activity and low bitterness. Copyright © 2013 Elsevier Ltd. All rights reserved.

  17. Correlation of pulmonary function and usual interstitial pneumonia computed tomography patterns in idiopathic pulmonary fibrosis.

    Arcadu, Antonella; Byrne, Suzanne C; Pirina, Pietro; Hartman, Thomas E; Bartholmai, Brian J; Moua, Teng

    2017-08-01

    Little is known about presenting 'inconsistent' or 'possible' usual interstitial pneumonia (UIP) computed tomography (CT) patterns advancing to 'consistent' UIP as disease progresses in idiopathic pulmonary fibrosis (IPF). We hypothesized that if 'consistent' UIP represented more advanced disease, such a pattern on presentation should also correlate with more severe pulmonary function test (PFT) abnormalities. Consecutive IPF patients (2005-2013) diagnosed by international criteria with baseline PFT and CT were included. Presenting CTs were assessed by three expert radiologists for consensus UIP pattern ('consistent', 'possible', and 'inconsistent'). Individual and combined interstitial abnormalities were also approximated, and both interstitial abnormalities and UIP CT pattern were correlated with PFT findings and survival. Three hundred and fifty patients (70% male) were included, with a mean age of 68.3 years. Mean percent predicted forced vital capacity (FVC%) and diffusion capacity (DLCO%) were 64% and 45.5%, respectively. Older age and male gender correlated more with a 'consistent' UIP CT pattern. FVC% was not associated with any UIP pattern but did correlate with the total volume of radiologist-assessed interstitial abnormalities. DLCO% was lower in those with a 'consistent' UIP pattern. A 'consistent' UIP CT pattern was also not independently predictive of survival after correction for age, gender, FVC%, and DLCO%. PFT findings appear to correlate with the extent of radiologic disease but not with specific morphologic patterns. Whether such UIP patterns represent different stages of disease severity or radiologic progression is not supported by coinciding pulmonary function decline. Copyright © 2017 Elsevier Ltd. All rights reserved.

  18. Development of a mobile gammacamera computer system for non invasive ventricular function determination

    Knopp, R.; Reske, S.N.; Winkler, C.

    1983-03-01

    As a reliable non-invasive method, dynamic ventricular volume determination by means of gamma camera computer scintigraphy is now generally accepted as highly useful in clinical cardiology. In view of the fact, however, that the required instrumentation is in general unwieldy and not mobile, sophisticated cardiac function studies could not be performed up to now in many intensive care units. In order to overcome this problem we developed a compact scintigraphic system consisting of a mobile gamma camera (Siemens Mobicon) with a built-in minicomputer (Siemens R 20: 16 bit, 128 kB). It makes possible a combined investigation of ventricular volume and pressure. The volume curve is acquired by sequential scintigraphy, whereas the pressure is simultaneously measured manometrically by means of a heart catheter. As a result of this comprehensive investigation a pressure-volume loop is plotted, the enclosed area of which represents the cardiac work performance. Additionally, functional parameters such as compliance (dV/dp) or stiffness (dp/dV) can be derived from the loop diagram. Besides the procedures mentioned, the mobile system can also be used for detection of acute infarctions as well as for myocardial scintigraphy in general. (orig.)

  19. Flash X-Ray Apparatus With Spectrum Control Functions For Medical Use And Fuji Computed Radiography

    Isobe, H.; Sato, E.; Hayasi, Y.; Suzuki, M.; Arima, H.; Hoshino, F.

    1985-02-01

    Flash radiographic bio-medical studies at sub-microsecond intervals were performed by using both a new type of flash X-ray (FX) apparatus with spectrum control functions and Fuji Computed Radiography (FCR). This single flasher tends to have a comparatively long exposure time, and the electric pulse width of the FX waveform is about 0.3 µs. The maximum FX dose is about 50 mR at 1 m per pulse, and the effective focal spot varies according to condenser charging voltage, A-C distance, etc., ranging from 1.0 to 3.0 mm in diameter, but in the low dose rate region it can be reduced to less than 1.0 mm in diameter. The FX dose is determined by the condenser charging voltage and the A-C distance, while the FX spectrum is determined by the average voltage of the FX tube and filters. Various clear FX images were obtained by controlling the spectrum and dose. FCR is a new storage medium for medical radiography developed by the Fuji Photo Film Co., Ltd., and this apparatus has various image forming functions: low dose radiography, film density control, image contrast control, subtraction management and others. We have used this new apparatus in conjunction with our FX radiography and have obtained some new and interesting biomedical radiograms: the edge enhancement image, the instantaneous enlarged image, and the single exposure energy subtraction image using the FX spectrum distribution.

  20. Use of time space Green's functions in the computation of transient eddy current fields

    Davey, K.; Turner, L.

    1988-01-01

    The utility of integral equations to solve eddy current problems has been borne out by numerous computations in the past few years, principally in sinusoidal steady-state problems. This paper examines the applicability of the integral approaches in both time and space for the more generic transient problem. The basic formulation for the time space Green's function approach is laid out. A technique employing Gauss-Laguerre integration is employed to realize the temporal solution, while Gauss-Legendre integration is used to resolve the spatial field character. The technique is then applied to the fusion electromagnetic induction experiments (FELIX) cylinder experiments in both two and three dimensions. It is found that quite accurate solutions can be obtained using rather coarse time steps and very few unknowns; the three-dimensional field solution worked out in this context used basically only four unknowns. The solution appears to be somewhat sensitive to the choice of time step, a consequence of a numerical instability embedded in the Green's function near the origin.
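    The quadrature split described above can be illustrated with the standard rules; the sketch below (placeholder integrands, not the paper's Green's-function kernels) shows how Gauss-Laguerre nodes handle the semi-infinite time integral while Gauss-Legendre nodes handle a bounded spatial integral.

        import numpy as np

        # Gauss-Laguerre approximates int_0^inf exp(-t) g(t) dt, matching the
        # exponentially decaying temporal behaviour; Gauss-Legendre approximates
        # bounded spatial integrals over [-1, 1].
        t_nodes, t_weights = np.polynomial.laguerre.laggauss(8)
        x_nodes, x_weights = np.polynomial.legendre.leggauss(16)

        g = lambda t: np.exp(-0.5 * t)             # stand-in temporal kernel
        temporal = np.sum(t_weights * g(t_nodes))  # ~ int_0^inf exp(-t) g(t) dt

        h = lambda x: 1.0 / (1.0 + x ** 2)         # stand-in spatial kernel
        spatial = np.sum(x_weights * h(x_nodes))   # ~ int_{-1}^{1} h(x) dx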

  1. Crown-ether functionalized carbon nanotubes for purification of lithium compounds: computational and experimental study

    Singha Deb, A.K.; Arora, S.K.; Joshi, J.M.; Ali, Sk. M.; Shenoy, K.T.; Goyal, Aiana

    2015-01-01

    Lithium compounds find several applications in nuclear science and technology, viz. lithium fluoride/hydroxide/alloys are used as dosimetric materials in luminescence devices, the molten-salt breeder reactor, the international thermonuclear experimental reactor, single-crystal-based neutron detectors, etc. The lithium compounds should be in a proper state of purity; in particular, they should not contain other alkali metal cations, which can degrade performance. Hence, there is a need to develop a process for purification of the lithium salt to achieve the desired quality. Therefore an attempt has been made to develop advanced nanomaterials for purification of lithium salts. In this work, benzo-15-crown-5 (B15C5) functionalized carbon nanotubes (CNTs), owing to the good adsorption properties of CNTs and the alkali metal encapsulation behaviour of B15C5, were shown to bind preferentially with sodium and potassium ions compared to lithium ions. DFT-based calculations have shown that the free energy of complexation of Na+ and K+ by B15C5-CNT is higher than that of Li+, implying that B15C5-CNT selectively binds Na+ and K+. Experimental batch solid-liquid extraction has also revealed the same trend as the calculations. The crown-ether functionalized CNTs thus have potential for use in purifying lithium compounds. (author)

  2. Brain-Computer Interface Controlled Functional Electrical Stimulation System for Ankle Movement

    King Christine E

    2011-08-01

    Background: Many neurological conditions, such as stroke, spinal cord injury, and traumatic brain injury, can cause chronic gait function impairment due to foot-drop. Current physiotherapy techniques provide only a limited degree of motor function recovery in these individuals, and therefore novel therapies are needed. Brain-computer interface (BCI) is a relatively novel technology with the potential to restore, substitute, or augment lost motor behaviors in patients with neurological injuries. Here, we describe the first successful integration of a noninvasive electroencephalogram (EEG)-based BCI with a noninvasive functional electrical stimulation (FES) system that enables the direct brain control of foot dorsiflexion in able-bodied individuals. Methods: A noninvasive EEG-based BCI system was integrated with a noninvasive FES system for foot dorsiflexion. Subjects underwent computer-cued epochs of repetitive foot dorsiflexion and idling while their EEG signals were recorded and stored for offline analysis. The analysis generated a prediction model that allowed EEG data to be analyzed and classified in real time during online BCI operation. The real-time online performance of the integrated BCI-FES system was tested in a group of five able-bodied subjects who used repetitive foot dorsiflexion to elicit BCI-FES mediated dorsiflexion of the contralateral foot. Results: Five able-bodied subjects performed 10 alternations of idling and repetitive foot dorsiflexion to trigger BCI-FES mediated dorsiflexion of the contralateral foot. The epochs of BCI-FES mediated foot dorsiflexion were highly correlated with the epochs of voluntary foot dorsiflexion (correlation coefficient ranged between 0.59 and 0.77) with latencies ranging from 1.4 sec to 3.1 sec. In addition, all subjects achieved a 100% BCI-FES response (no omissions), and one subject had a single false alarm. Conclusions: This study suggests that the integration of a noninvasive BCI with a lower

  3. Brain-computer interface controlled functional electrical stimulation system for ankle movement.

    Do, An H; Wang, Po T; King, Christine E; Abiri, Ahmad; Nenadic, Zoran

    2011-08-26

    Many neurological conditions, such as stroke, spinal cord injury, and traumatic brain injury, can cause chronic gait function impairment due to foot-drop. Current physiotherapy techniques provide only a limited degree of motor function recovery in these individuals, and therefore novel therapies are needed. Brain-computer interface (BCI) is a relatively novel technology with a potential to restore, substitute, or augment lost motor behaviors in patients with neurological injuries. Here, we describe the first successful integration of a noninvasive electroencephalogram (EEG)-based BCI with a noninvasive functional electrical stimulation (FES) system that enables the direct brain control of foot dorsiflexion in able-bodied individuals. A noninvasive EEG-based BCI system was integrated with a noninvasive FES system for foot dorsiflexion. Subjects underwent computer-cued epochs of repetitive foot dorsiflexion and idling while their EEG signals were recorded and stored for offline analysis. The analysis generated a prediction model that allowed EEG data to be analyzed and classified in real time during online BCI operation. The real-time online performance of the integrated BCI-FES system was tested in a group of five able-bodied subjects who used repetitive foot dorsiflexion to elicit BCI-FES mediated dorsiflexion of the contralateral foot. Five able-bodied subjects performed 10 alternations of idling and repetitive foot dorsiflexion to trigger BCI-FES mediated dorsiflexion of the contralateral foot. The epochs of BCI-FES mediated foot dorsiflexion were highly correlated with the epochs of voluntary foot dorsiflexion (correlation coefficient ranged between 0.59 and 0.77) with latencies ranging from 1.4 sec to 3.1 sec. In addition, all subjects achieved a 100% BCI-FES response (no omissions), and one subject had a single false alarm. This study suggests that the integration of a noninvasive BCI with a lower-extremity FES system is feasible. With additional modifications
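    To make the reported epoch analysis concrete, the following sketch (an assumed procedure, not the authors' published pipeline) cross-correlates a binary voluntary-dorsiflexion state vector with the BCI-FES output to recover a correlation coefficient and latency of the kind quoted above.

        import numpy as np

        def epoch_correlation(voluntary, bci_fes, dt):
            # Both inputs are 0/1 arrays sampled every dt seconds
            # (1 = dorsiflexion epoch, 0 = idling).
            n = len(voluntary)
            v = voluntary - voluntary.mean()
            b = bci_fes - bci_fes.mean()
            xcorr = np.correlate(b, v, mode="full")
            lag = int(np.argmax(xcorr)) - (n - 1)  # samples by which FES trails
            lag = max(lag, 0)                      # FES physically lags the movement
            r = np.corrcoef(voluntary[:n - lag], bci_fes[lag:])[0, 1]
            return r, lag * dt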

  4. Functional magnetic resonance maps obtained by personal computer

    Gomez, F. j.; Manjon, J. V.; Robles, M. [Universidad Politecnica de Valencia (Spain); Marti-Bonmati, L.; Dosda, R. [Cinica Quiron. Valencia (Spain); Molla, E. [Universidad de Valencia (Spain)

    2001-07-01

    Functional magnetic resonance (fMR) is of special relevance in the analysis of certain types of brain activation. The present report describes the development of a simple software program for use with personal computers (PCs) that analyzes these images and provides functional activation maps. Activation maps are based on the temporal differences in oxyhemoglobin in tomographic images. To detect these differences, intensities registered repeatedly during brain control and activation are compared. The experiments were performed with a 1.5-Tesla MR unit. To verify the reliability of the program, fMR studies were carried out in 4 healthy individuals (12 contiguous slices, 80 images per slice every 3.1 seconds, for a total of 960 images). All the images were transferred to a PC and were processed pixel by pixel within each sequence to obtain an intensity/time curve. The statistical study of the results (Student's test and cross-correlation analysis) made it possible to establish the activation of each pixel. The images were prepared using spatial filtering, temporal filtering, baseline correction, normalization and segmentation of the parenchyma. The postprocessing of the results involved the elimination of single pixels, superposition of an anatomical image of greater spatial resolution and anti-aliasing. The application (Xfun 1.0, Valencia, Spain) was developed in Microsoft Visual C++ 5.0 Developer Studio for Windows NT Workstation. As a representative example, the program took 8.2 seconds to calculate and present the results of the entire study (12 functional maps). In the motor and visual activation experiments, activation was observed in regions proximal to the central sulcus of the hemisphere contralateral to the hand that moved and in the occipital cortex. While programs that calculate activation maps are available, the development of software for PCs running Microsoft Windows ensures several key features for its use on a daily basis: it is

  5. The Effect of Neurocognitive Function on Math Computation in Pediatric ADHD: Moderating Influences of Anxious Perfectionism and Gender.

    Sturm, Alexandra; Rozenman, Michelle; Piacentini, John C; McGough, James J; Loo, Sandra K; McCracken, James T

    2018-03-20

    Predictors of math achievement in attention-deficit/hyperactivity disorder (ADHD) are not well known. To address this gap in the literature, we examined individual differences in neurocognitive functioning domains on math computation in a cross-sectional sample of youth with ADHD. Gender and anxiety symptoms were explored as potential moderators. The sample consisted of 281 youth (aged 8-15 years) diagnosed with ADHD. Neurocognitive tasks assessed auditory-verbal working memory, visuospatial working memory, and processing speed. Auditory-verbal working memory speed significantly predicted math computation. A three-way interaction revealed that at low levels of anxious perfectionism, slower processing speed predicted poorer math computation for boys compared to girls. These findings indicate the unique predictive value of auditory-verbal working memory and processing speed for math computation, and their differential moderation. These findings provide preliminary support that gender and anxious perfectionism may influence the relationship between neurocognitive functioning and academic achievement.

  6. Exact fast computation of band depth for large functional datasets: How quickly can one million curves be ranked?

    Sun, Ying

    2012-10-01

    Band depth is an important nonparametric measure that generalizes order statistics and makes univariate methods based on order statistics possible for functional data. However, the computational burden of band depth limits its applicability when large functional or image datasets are considered. This paper proposes an exact fast method to speed up the band depth computation when bands are defined by two curves. Remarkable computational gains are demonstrated through simulation studies comparing our proposal with the original computation and one existing approximate method. For example, we report an experiment where our method can rank one million curves, evaluated at fifty time points each, in 12.4 seconds with Matlab. © 2012 John Wiley & Sons, Ltd.
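    For bands defined by two curves, the exact fast computation reduces to pointwise ranking; below is a compact sketch of that rank-based idea for the modified band depth, assuming no ties among curve values (NumPy/SciPy rather than the paper's Matlab).

        import numpy as np
        from scipy.stats import rankdata

        def fast_modified_band_depth(curves):
            # curves: array of shape (n_curves, n_timepoints)
            n, _ = curves.shape
            ranks = np.apply_along_axis(rankdata, 0, curves)
            # pairs straddling curve i at each time point, plus the
            # n-1 pairs that contain curve i itself
            counts = (ranks - 1) * (n - ranks) + (n - 1)
            return counts.mean(axis=1) / (n * (n - 1) / 2)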

  7. Computer animations of color markings reveal the function of visual threat signals in Neolamprologus pulcher.

    Balzarini, Valentina; Taborsky, Michael; Villa, Fabienne; Frommen, Joachim G

    2017-02-01

    Visual signals, including changes in coloration and color patterns, are frequently used by animals to convey information. During contests, body coloration and its changes can be used to assess an opponent's state or motivation. Communication of aggressive propensity is particularly important in group-living animals with a stable dominance hierarchy, as the outcome of aggressive interactions determines the social rank of group members. Neolamprologus pulcher is a cooperatively breeding cichlid showing frequent within-group aggression. Both sexes exhibit two vertical black stripes on the operculum that vary naturally in shape and darkness. During frontal threat displays these patterns are actively exposed to the opponent, suggesting a signaling function. To investigate the role of operculum stripes during contests we manipulated their darkness in computer animated pictures of the fish. We recorded the responses in behavior and stripe darkness of test subjects to which these animated pictures were presented. Individuals with initially darker stripes were more aggressive against the animations and showed more operculum threat displays. Operculum stripes of test subjects became darker after exposure to an animation exhibiting a pale operculum than after exposure to a dark operculum animation, highlighting the role of the darkness of this color pattern in opponent assessment. We conclude that (i) the black stripes on the operculum of N. pulcher are a reliable signal of aggression and dominance, (ii) these markings play an important role in opponent assessment, and (iii) 2D computer animations are well suited to elicit biologically meaningful short-term aggressive responses in this widely used model system of social evolution.

  8. Computational engineering of cellulase Cel9A-68 functional motions through mutations in its linker region.

    Costa, M G S; Silva, Y F; Batista, P R

    2018-03-14

    Microbial cellulosic degradation by cellulases has become a complementary approach for biofuel production. However, its efficiency is hindered by the recalcitrance of cellulose fibres. In this context, computational protein design methods may offer an efficient way to obtain variants with improved enzymatic activity. Cel9A-68 is a cellulase from Thermobifida fusca that is still active at high temperatures. In a previous work, we described a collective bending motion, which governs the overall cellulase dynamics. This movement promotes the approximation of its CBM and CD structural domains (that are connected by a flexible linker). We have identified two residues (G460 and P461) located at the linker that act as a hinge point. Herein, we applied a new level of protein design, focusing on the modulation of this collective motion to obtain cellulase variants with enhanced functional dynamics. We probed whether specific linker mutations would affect Cel9A-68 dynamics through computational simulations. We assumed that P461G and G460+ (with an extra glycine) constructs would present enhanced interdomain motions, while the G460P mutant would be rigid. From our results, the P461G mutation resulted in a broader exploration of the conformational space, as confirmed by clustering and free energy analyses. The WT enzyme was the most rigid system. However, G460P and G460+ explored distinct conformational states described by opposite directions of low-frequency normal modes; they sampled preferentially closed and open conformations, respectively. Overall, we highlight two significant findings: (i) all mutants explored larger conformational spaces than the WT; (ii) the selection of distinct conformational populations was intimately associated with the mutation considered. Thus, the engineering of Cel9A-68 motions through linker mutations may constitute an efficient way to improve cellulase activity, facilitating the disruption of cellulose fibres.

  9. On The Effectiveness Of And Preference For Punishment And Extinction Components Of Function-Based Interventions

    Hanley, Gregory P; Piazza, Cathleen C; Fisher, Wayne W; Maglieri, Kristen A

    2005-01-01

    The current study describes an assessment sequence that may be used to identify individualized, effective, and preferred interventions for severe problem behavior in lieu of relying on a restricted set of treatment options that are assumed to be in the best interest of consumers. The relative effectiveness of functional communication training (FCT) with and without a punishment component was evaluated with 2 children for whom functional analyses demonstrated behavioral maintenance via social ...

  10. Response functions for computing absorbed dose to skeletal tissues from neutron irradiation

    Bahadori, Amir A.; Johnson, Perry; Jokisch, Derek W.; Eckerman, Keith F.; Bolch, Wesley E.

    2011-11-01

    Spongiosa in the adult human skeleton consists of three tissues: active marrow (AM), inactive marrow (IM) and trabecularized mineral bone (TB). AM is considered to be the target tissue for assessment of both long-term leukemia risk and acute marrow toxicity following radiation exposure. The total shallow marrow (TM50), defined as all tissues lying within the first 50 µm of the bone surfaces, is considered to be the radiation target tissue of relevance for radiogenic bone cancer induction. For irradiation by sources external to the body, kerma to homogeneous spongiosa has been used as a surrogate for absorbed dose to both of these tissues, as direct dose calculations are not possible using computational phantoms with homogenized spongiosa. Recent micro-CT imaging of a 40 year old male cadaver has allowed for the accurate modeling of the fine microscopic structure of spongiosa in many regions of the adult skeleton (Hough et al 2011 Phys. Med. Biol. 56 2309-46). This microstructure, along with associated masses and tissue compositions, was used to compute specific absorbed fraction (SAF) values for protons originating in axial and appendicular bone sites (Jokisch et al 2011 Phys. Med. Biol. 56 6857-72). These proton SAFs, bone masses, tissue compositions and proton production cross sections were subsequently used to construct neutron dose-response functions (DRFs) for both AM and TM50 targets in each bone of the reference adult male. Kerma conditions were assumed for other resultant charged particles. For comparison, AM, TM50 and spongiosa kerma coefficients were also calculated. At low incident neutron energies, AM kerma coefficients for neutrons correlate well with values of the AM DRF, while total marrow (TM) kerma coefficients correlate well with values of the TM50 DRF. At high incident neutron energies, all kerma coefficients and DRFs tend to converge as charged-particle equilibrium is established across the bone site. In the range of 10 eV to 100 MeV
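    In use, a DRF is folded with the incident fluence spectrum to give absorbed dose to the target tissue; the schematic sketch below uses toy tabulations (none of the numbers come from the study).

        import numpy as np

        E = np.logspace(1, 8, 300)                      # neutron energies, eV
        fluence = np.exp(-(np.log(E) - 14.0) ** 2 / 8)  # toy spectrum, n cm^-2 eV^-1
        drf = 1e-14 * np.sqrt(E / 1e6)                  # toy DRF, Gy cm^2
        dose = np.trapz(drf * fluence, E)               # absorbed dose to target, Gy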

  11. A computational approach to discovering the functions of bacterial phytochromes by analysis of homolog distributions

    Lamparter Tilman

    2006-03-01

    bacterial phytochromes in ammonium assimilation and amino acid metabolism. Conclusion: It was possible to identify several proteins that might share common functions with bacterial phytochromes by the co-distribution approach. This computational approach might also be helpful in other cases.

  12. Phasic firing in vasopressin cells: understanding its functional significance through computational models.

    Duncan J MacGregor

    Vasopressin neurons, responding to input generated by osmotic pressure, use an intrinsic mechanism to shift from slow irregular firing to a distinct phasic pattern, consisting of long bursts and silences lasting tens of seconds. With increased input, bursts lengthen, eventually shifting to continuous firing. The phasic activity remains asynchronous across the cells and is not reflected in the population output signal. Here we have used a computational vasopressin neuron model to investigate the functional significance of the phasic firing pattern. We generated a concise model of the synaptic input driven spike firing mechanism that gives a close quantitative match to vasopressin neuron spike activity recorded in vivo, tested against endogenous activity and experimental interventions. The integrate-and-fire based model provides a simple physiological explanation of the phasic firing mechanism involving an activity-dependent slow depolarising afterpotential (DAP) generated by a calcium-inactivated potassium leak current. This is modulated by the slower, opposing action of activity-dependent dendritic dynorphin release, which inactivates the DAP, the opposing effects generating successive periods of bursting and silence. Model cells are not spontaneously active, but fire when perturbed by random perturbations mimicking synaptic input. We constructed one population of such phasic neurons, and another population of similar cells which lacked the ability to fire phasically. We then studied how these two populations differed in the way that they encoded changes in afferent inputs. By comparison with the non-phasic population, the phasic population responds linearly to increases in tonic synaptic input. Non-phasic cells respond to transient elevations in synaptic input in a way that strongly depends on background activity levels, phasic cells in a way that is independent of background levels, and show a similar strong linearization of the response.
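    The mechanism described, a spike-driven DAP opposed by slowly accumulating dendritic dynorphin on top of an integrate-and-fire cell, can be caricatured in a few lines; every constant below is illustrative rather than a fitted value from the paper.

        import numpy as np

        def simulate(duration=100.0, dt=0.001, seed=0):
            rng = np.random.default_rng(seed)
            v, dap, dyn, spikes = -60.0, 0.0, 0.0, []
            for i in range(int(duration / dt)):
                epsp = 1.0 * rng.poisson(100.0 * dt)       # random synaptic input, mV
                drive = 200.0 * dap * max(0.0, 1.0 - dyn)  # dynorphin gates the DAP
                v += dt * (-(v + 60.0) / 0.05 + drive) + epsp
                dap -= dt * dap / 1.5                      # DAP decays over seconds
                dyn -= dt * dyn / 30.0                     # dynorphin decays far slower
                if v > -50.0:                              # threshold crossing: spike
                    spikes.append(i * dt)
                    v = -60.0
                    dap += 0.5                             # each spike boosts the DAP
                    dyn += 0.01                            # and increments dynorphin
            return spikes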

  13. Functional analysis of metabolic channeling and regulation in lignin biosynthesis: a computational approach.

    Yun Lee

    Lignin is a polymer in secondary cell walls of plants that is known to have negative impacts on forage digestibility, pulping efficiency, and sugar release from cellulosic biomass. While targeted modifications of different lignin biosynthetic enzymes have permitted the generation of transgenic plants with desirable traits, such as improved digestibility or reduced recalcitrance to saccharification, some of the engineered plants exhibit monomer compositions that are clearly at odds with the expected outcomes when the biosynthetic pathway is perturbed. In Medicago, such discrepancies were partly reconciled by the recent finding that certain biosynthetic enzymes may be spatially organized into two independent channels for the synthesis of guaiacyl (G) and syringyl (S) lignin monomers. Nevertheless, the mechanistic details, as well as the biological function of these interactions, remain unclear. To decipher the working principles of this and similar control mechanisms, we propose and employ here a novel computational approach that permits an expedient and exhaustive assessment of hundreds of minimal designs that could arise in vivo. Interestingly, this comparative analysis not only helps distinguish two most parsimonious mechanisms of crosstalk between the two channels by formulating a targeted and readily testable hypothesis, but also suggests that the G lignin-specific channel is more important for proper functioning than the S lignin-specific channel. While the proposed strategy of analysis in this article is tightly focused on lignin synthesis, it is likely to be of similar utility in extracting unbiased information in a variety of situations, where the spatial organization of molecular components is critical for coordinating the flow of cellular information, and where initially various control designs seem equally valid.

  14. Functional safeguards for computers for protection systems for Savannah River reactors

    Kritz, W.R.

    1977-06-01

    Reactors at the Savannah River Plant have recently been equipped with a "safety computer" system. This system utilizes dual digital computers in a primary protection system that monitors individual fuel assembly coolant flow and temperature. The design basis for the SRP safety computer systems allowed for eventual failure of any input sensor or any computer component. These systems are routinely used by reactor operators with a minimum of training in computer technology. The hardware configuration and software design therefore contain safeguards so that both hardware and human failures do not cause significant loss of reactor protection. The performance of the system to date is described.

  15. Practical Steps toward Computational Unification: Helpful Perspectives for New Systems, Adding Functionality to Existing Ones

    Troy, R. M.

    2005-12-01

    and functions may be integrated into a system efficiently, with minimal effort, and with an eye toward an eventual Computational Unification of the Earth Sciences. Fundamental to such systems is meta-data, which describe not only the content of data but also how intricate relationships are represented and used to good advantage. Retrieval techniques will be discussed, including trade-offs in using externally managed meta-data versus embedded meta-data, how the two may be integrated, and how "simplifying assumptions" may or may not actually be helpful. The perspectives presented in this talk or poster session are based upon the experience of the Sequoia 2000 and BigSur research projects at the University of California, Berkeley, which sought to unify NASA's Mission To Planet Earth's EOS-DIS, and on-going experience developed by Science Tools corporation, of which the author is a principal. NOTE: These ideas are most easily shared in the form of a talk, and we suspect that this session will generate a lot of interest. We would therefore prefer to have this session accepted as a talk as opposed to a poster session.

  16. Probabilistic performance estimators for computational chemistry methods: The empirical cumulative distribution function of absolute errors

    Pernot, Pascal; Savin, Andreas

    2018-06-01

    Benchmarking studies in computational chemistry use reference datasets to assess the accuracy of a method through error statistics. The commonly used error statistics, such as the mean signed and mean unsigned errors, do not inform end-users on the expected amplitude of prediction errors attached to these methods. We show that, the distributions of model errors being neither normal nor zero-centered, these error statistics cannot be used to infer prediction error probabilities. To overcome this limitation, we advocate for the use of more informative statistics, based on the empirical cumulative distribution function of unsigned errors, namely, (1) the probability for a new calculation to have an absolute error below a chosen threshold and (2) the maximal amplitude of errors one can expect with a chosen high confidence level. Those statistics are also shown to be well suited for benchmarking and ranking studies. Moreover, the standard error on all benchmarking statistics depends on the size of the reference dataset. Systematic publication of these standard errors would be very helpful to assess the statistical reliability of benchmarking conclusions.
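    Both advocated statistics follow directly from the empirical cumulative distribution of a sample of benchmark errors; a minimal sketch (function and variable names are ours, not the authors'):

        import numpy as np

        def ecdf_statistics(errors, threshold=1.0, confidence=0.95):
            # (1) probability that a new calculation has absolute error
            # below `threshold`; (2) error amplitude not exceeded at the
            # chosen confidence level.
            abs_err = np.abs(np.asarray(errors, dtype=float))
            p_below = np.mean(abs_err <= threshold)
            q_high = np.quantile(abs_err, confidence)
            return p_below, q_high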

  17. Revealing Soil Structure and Functional Macroporosity along a Clay Gradient Using X-ray Computed Tomography

    Naveed, Muhammad; Møldrup, Per; Arthur, Emmanuel

    2013-01-01

    The influence of clay content on soil-pore structure development and the relative importance of macroporosity in governing convective fluid flow are two key challenges toward better understanding and quantifying soil ecosystem functions. In this study, soil physical measurements (soil-water retention and air permeability) and x-ray computed tomography (CT) scanning were combined and used at two scales on intact soil columns (100 and 580 cm3). The columns were sampled along a natural clay gradient at six locations (L1, L2, L3, L4, L5 and L6 with 0.11, 0.16, 0.21, 0.32, 0.38 and 0.46 kg kg−1 clay content, respectively) at a field site in Lerbjerg, Denmark. The water-holding capacity of soils markedly increased with increasing soil clay content, while significantly higher air permeability was observed for the L1 to L3 soils than for the L4 to L6 soils. Higher air permeability values

  18. Contrast computation methods for interferometric measurement of sensor modulation transfer function

    Battula, Tharun; Georgiev, Todor; Gille, Jennifer; Goma, Sergio

    2018-01-01

    Accurate measurement of image-sensor frequency response over a wide range of spatial frequencies is very important for analyzing pixel array characteristics, such as modulation transfer function (MTF), crosstalk, and active pixel shape. Such analysis is especially significant in computational photography for the purposes of deconvolution, multi-image superresolution, and improved light-field capture. We use a lensless interferometric setup that produces high-quality fringes for measuring MTF over a wide range of frequencies (here, 37 to 434 line pairs per mm). We discuss the theoretical framework, involving Michelson and Fourier contrast measurement of the MTF, addressing phase alignment problems using a moiré pattern. We solidify the definition of Fourier contrast mathematically and compare it to Michelson contrast. Our interferometric measurement method shows high detail in the MTF, especially at high frequencies (above Nyquist frequency). We are able to estimate active pixel size and pixel pitch from measurements. We compare both simulation and experimental MTF results to a lens-free slanted-edge implementation using commercial software.
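    Michelson contrast has its textbook form; for the spectral alternative, the ratio below is one plausible reading of a Fourier contrast (an assumption for illustration, not the paper's exact definition). For a pure sinusoidal fringe the two coincide.

        import numpy as np

        def michelson_contrast(fringe):
            return (fringe.max() - fringe.min()) / (fringe.max() + fringe.min())

        def fourier_contrast(fringe):
            # ratio of the dominant fringe harmonic to the DC component
            spectrum = np.abs(np.fft.rfft(fringe))
            return 2.0 * spectrum[1:].max() / spectrum[0]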

  19. Computer-mediated communication preferences predict biobehavioral measures of social-emotional functioning.

    Babkirk, Sarah; Luehring-Jones, Peter; Dennis-Tiwary, Tracy A

    2016-12-01

    The use of computer-mediated communication (CMC) as a form of social interaction has become increasingly prevalent, yet few studies examine individual differences that may shed light on implications of CMC for adjustment. The current study examined neurocognitive individual differences associated with preferences to use technology in relation to social-emotional outcomes. In Study 1 (N = 91), a self-report measure, the Social Media Communication Questionnaire (SMCQ), was evaluated as an assessment of preferences for communicating positive and negative emotions on a scale ranging from purely via CMC to purely face-to-face. In Study 2, SMCQ preferences were examined in relation to event-related potentials (ERPs) associated with early emotional attention capture and reactivity (the frontal N1) and later sustained emotional processing and regulation (the late positive potential (LPP)). Electroencephalography (EEG) was recorded while 22 participants passively viewed emotional and neutral pictures and completed an emotion regulation task with instructions to increase, decrease, or maintain their emotional responses. A greater preference for CMC was associated with reduced size of and satisfaction with social support, greater early (N1) attention capture by emotional stimuli, and reduced LPP amplitudes to unpleasant stimuli in the increase emotion regulatory task. These findings are discussed in the context of possible emotion- and social-regulatory functions of CMC.

  20. Computational Modeling and Theoretical Calculations on the Interactions between Spermidine and Functional Monomer (Methacrylic Acid in a Molecularly Imprinted Polymer

    Yujie Huang

    2015-01-01

    This paper theoretically investigates interactions between a template and functional monomer required for synthesizing an efficient molecularly imprinted polymer (MIP). We employed density functional theory (DFT) to compute the geometry, single-point energy, and binding energy (ΔE) of an MIP system, where spermidine (SPD) and methacrylic acid (MAA) were selected as template and functional monomer, respectively. The geometry was calculated using the B3LYP method with the 6-31+G(d) basis set. Furthermore, the 6-311++G(d,p) basis set was used to compute the single-point energy of the above geometry. The optimized geometries at different template-to-functional-monomer molar ratios, the mode of bonding between template and functional monomer, changes in charge on natural bond orbital (NBO) analysis, and binding energy were analyzed. The simulation results show that SPD and MAA form a stable complex via hydrogen bonding. At a 1:5 SPD to MAA ratio, the binding energy is minimal, while the amount of charge transferred between the molecules is maximal; SPD and MAA form a stable complex at a 1:5 molar ratio through six hydrogen bonds. Optimizing the structure of the template-functional monomer complex through computational modeling prior to synthesis significantly contributes towards choosing a suitable template-functional monomer pair that yields an efficient MIP with high specificity and selectivity.
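    The binding energy here is the usual supermolecular difference; a one-line sketch of how ΔE at each molar ratio would be assembled from the computed single-point energies (counterpoise/BSSE corrections omitted for brevity):

        def binding_energy(e_complex, e_spd, e_maa, n):
            # dE = E(SPD·(MAA)_n) - E(SPD) - n * E(MAA); more negative = more stable
            return e_complex - e_spd - n * e_maa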

  1. Studies on the zeros of Bessel functions and methods for their computation: 2. Monotonicity, convexity, concavity, and other properties

    Kerimov, M. K.

    2016-07-01

    This work continues the study of the real zeros of Bessel functions of the first and second kind and general Bessel functions with real variables and orders begun in the first part of this paper (see M.K. Kerimov, Comput. Math. Math. Phys. 54 (9), 1337-1388 (2014)). Some new results concerning such zeros are described and analyzed. Special attention is given to the monotonicity, convexity, and concavity of zeros with respect to their ranks and other parameters.

  2. Functional needs which led to the use of digital computing devices in the protection system of 1300 MW units

    Dalle, H.

    1986-01-01

    After a review of the classical protection functions used in 900 MW power plants, it is concluded that, in order to have operating margins, it is useful to calculate the monitored parameters more finely. These computing needs led to the use of digital computing devices. Taking advantage of the new possibilities, the general performance of the protection system can be improved with regard to availability, safety and maintenance. These options, in the case of PALUEL, led to the realization of SPIN, described here.

  3. Structure of BRS-invariant local functionals

    Brandt, F.

    1993-01-01

    For a large class of gauge theories a nilpotent BRS-operator s is constructed and its cohomology in the space of local functionals of the off-shell fields is shown to be isomorphic to the cohomology of s̃ = s + d on functions f(C,T) of tensor fields T and of variables C which are constructed out of the ghosts and the connection forms. The result allows general statements about the structure of invariant classical actions and anomaly candidates whose BRS-variation vanishes off-shell. The assumptions under which the result holds are thoroughly discussed. (orig.)

  4. Computation of the modified Bessel function of the third kind of imaginary orders: uniform Airy-type asymptotic expansion

    A. Gil (Amparo); J. Segura (Javier); N.M. Temme (Nico)

    2002-01-01

    The use of a uniform Airy-type asymptotic expansion for the computation of the modified Bessel functions of the third kind of imaginary orders ($K_{ia}(x)$) near the transition point $x=a$ is discussed. In [2], an algorithm for the evaluation of $K_{ia}(x)$ was presented, which made use

  5. The EORTC computer-adaptive tests measuring physical functioning and fatigue exhibited high levels of measurement precision and efficiency

    Petersen, Morten Aa; Aaronson, Neil K; Arraras, Juan I

    2013-01-01

    The European Organisation for Research and Treatment of Cancer (EORTC) Quality of Life Group is developing a computer-adaptive test (CAT) version of the EORTC Quality of Life Questionnaire (QLQ-C30). We evaluated the measurement properties of the CAT versions of physical functioning (PF...

  6. The EORTC computer-adaptive tests measuring physical functioning and fatigue exhibited high levels of measurement precision and efficiency

    Petersen, M.A.; Aaronson, N.K.; Arraras, J.I.; Chie, W.C.; Conroy, T.; Constantini, A.; Giesinger, J.M.; Holzner, B.; King, M.T.; Singer, S.; Velikova, G.; Verdonck-de Leeuw, I.M.; Young, T.; Groenvold, M.

    2013-01-01

    Objectives: The European Organisation for Research and Treatment of Cancer (EORTC) Quality of Life Group is developing a computer-adaptive test (CAT) version of the EORTC Quality of Life Questionnaire (QLQ-C30). We evaluated the measurement properties of the CAT versions of physical functioning (PF)

  7. The EORTC computer-adaptive tests measuring physical functioning and fatigue exhibited high levels of measurement precision and efficiency

    Petersen, M.A.; Aaronson, N.K.; Arraras, J.I.; Chie, W.C.; Conroy, T.; Costantini, A.; Giesinger, J.M.; Holzner, B.; King, M.T.; Singer, S.; Velikova, G.; de Leeuw, I.M.; Young, T.; Groenvold, M.

    2013-01-01

    Objectives: The European Organisation for Research and Treatment of Cancer (EORTC) Quality of Life Group is developing a computer-adaptive test (CAT) version of the EORTC Quality of Life Questionnaire (QLQ-C30). We evaluated the measurement properties of the CAT versions of physical functioning (PF)

  8. Density functional theory based screening of ternary alkali-transition metal borohydrides: A computational material design project

    Hummelshøj, Jens Strabo; Landis, David; Voss, Johannes

    2009-01-01

    We present a computational screening study of ternary metal borohydrides for reversible hydrogen storage based on density functional theory. We investigate the stability and decomposition of alloys containing 1 alkali metal atom, Li, Na, or K (M1); and 1 alkali, alkaline earth or 3d/4d transition...

  9. Complex functionality with minimal computation: Promise and pitfalls of reduced-tracer ocean biogeochemistry models

    Galbraith, Eric D.; Dunne, John P.; Gnanadesikan, Anand; Slater, Richard D.; Sarmiento, Jorge L.; Dufour, Carolina O.; de Souza, Gregory F.; Bianchi, Daniele; Claret, Mariona; Rodgers, Keith B.; Marvasti, Seyedehsafoura Sedigh

    2015-12-01

    Earth System Models increasingly include ocean biogeochemistry models in order to predict changes in ocean carbon storage, hypoxia, and biological productivity under climate change. However, state-of-the-art ocean biogeochemical models include many advected tracers that significantly increase the computational resources required, forcing a trade-off with spatial resolution. Here, we compare a state-of-the-art model with 30 prognostic tracers (TOPAZ) with two reduced-tracer models, one with 6 tracers (BLING), and the other with 3 tracers (miniBLING). The reduced-tracer models employ parameterized, implicit biological functions, which nonetheless capture many of the most important processes resolved by TOPAZ. All three are embedded in the same coupled climate model. Despite the large difference in tracer number, the absence of tracers for living organic matter is shown to have a minimal impact on the transport of nutrient elements, and the three models produce similar mean annual preindustrial distributions of macronutrients, oxygen, and carbon. Significant differences do exist among the models, in particular the seasonal cycle of biomass and export production, but it does not appear that these are necessary consequences of the reduced tracer number. With increasing CO2, changes in dissolved oxygen and anthropogenic carbon uptake are very similar across the different models. Thus, while the reduced-tracer models do not explicitly resolve the diversity and internal dynamics of marine ecosystems, we demonstrate that such models are applicable to a broad suite of major biogeochemical concerns, including anthropogenic change. These results are very promising for the further development and application of reduced-tracer biogeochemical models that incorporate "sub-ecosystem-scale" parameterizations.

  10. Single-photon emission computed tomography for the assessment of ventricular perfusion and function

    Gonzalez, Patricio; Dussaillant, Gaston; Gutierrez, Daniela; Berrocal, Isabel; Alay, Rita; Otarola, Sonia

    2013-01-01

    Background: Single-photon emission computed tomography (SPECT) can be used as a non-invasive tool for the assessment of coronary perfusion. Aim: To assess ventricular perfusion and function by SPECT in patients with single vessel coronary artery disease. Material and Methods: Among patients with indications for a coronary artery angiography, those with significant lesions in one vessel were selected for the study. Within 24 hours, cardiac SPECT examinations under basal conditions and after high doses of dipyridamole were performed. SPECT data from 38 patients with a low probability of coronary artery disease were used for comparisons. Results: Ten patients aged 61 ± 8 years (seven men) were studied. Visual analysis of SPECT revealed signs suggestive of ischemia in eight patients. The remaining two patients did not have perfusion disturbances. SPECT detected eight of ten abnormal vessels reported in the coronary artery angiography. There were two false negative results. Summed stress, summed rest and summed difference scores were 9.78 ± 6.51, 3.22 ± 5.07 and 6.33 ± 4.97, respectively. The ejection fractions under stress and at rest were 53 ± 11.7% and 61 ± 15.7%, respectively (p ≤ 0.01). The figures for the control group were 69.1 ± 13.5% and 75.2 ± 12.04%, respectively (significantly different from patients). Two patients had a summed motion score above 14.9. Likewise, two patients had a summed thickening score above 10.9. Conclusions: SPECT detected 80% of coronary lesions found during coronary artery angiography. Visual analysis of perfusion is highly reliable for diagnosis. Quantitative parameters must be considered only as reference parameters

  11. WE-FG-207B-02: Material Reconstruction for Spectral Computed Tomography with Detector Response Function

    Liu, J; Gao, H

    2016-01-01

    Purpose: Different from conventional computed tomography (CT), spectral CT based on energy-resolved photon-counting detectors is able to provide unprecedented material composition information. However, an important missing piece for accurate spectral CT is to incorporate the detector response function (DRF), which is distorted by factors such as pulse pileup and charge-sharing. In this work, we propose material reconstruction methods for spectral CT with DRF. Methods: The polyenergetic X-ray forward model takes the DRF into account for accurate material reconstruction. Two image reconstruction methods are proposed: a direct method based on the nonlinear data fidelity from the DRF-based forward model, and a linear-data-fidelity based method that relies on spectral rebinning so that the corresponding DRF matrix is invertible. The image reconstruction problem is then regularized with an isotropic TV term and solved by the alternating direction method of multipliers. Results: The simulation results suggest that the proposed methods provided more accurate material compositions than the standard method without DRF. Moreover, the proposed method with linear data fidelity had improved reconstruction quality over the proposed method with nonlinear data fidelity. Conclusion: We have proposed material reconstruction methods for spectral CT with DRF, which provided more accurate material compositions than the standard methods without DRF. Moreover, the proposed method with linear data fidelity had improved reconstruction quality over the proposed method with nonlinear data fidelity. Jiulong Liu and Hao Gao were partially supported by the NSFC (#11405105), the 973 Program (#2015CB856000), and the Shanghai Pujiang Talent Program (#14PJ1404500).
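    The DRF enters the polyenergetic forward model as a per-bin response folded over the transmitted spectrum; the sketch below models a single ray (shapes and names are illustrative, not the authors' code).

        import numpy as np

        def forward_counts(path_lengths, spectrum, mu, drf):
            # path_lengths: (M,) material thicknesses along the ray
            # spectrum:     (K,) incident photons per energy bin
            # mu:           (M, K) linear attenuation per material and energy
            # drf:          (B, K) detector response R_b(E) per counter bin
            transmitted = spectrum * np.exp(-path_lengths @ mu)
            return drf @ transmitted  # expected counts per bin, shape (B,)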

  12. Nephrocalcinosis in rabbits - correlation of ultrasound, computed tomography, pathology and renal function

    Cramer, B.; Pushpanathan, C.

    1998-01-01

    Objective. The purpose of this study was to induce nephrocalcinosis (NC) in rabbits with phosphate, vitamin D, oxalate and furosemide, to determine the effect on renal function and to correlate detection by ultrasound (US) and computed tomography (CT) with pathology. Materials and methods. Seventy-five immature New Zealand white rabbits were divided into five groups of 15. In each group, 5 animals were controls and 10 were given oral phosphate, furosemide, vitamin D or oxalate. Unilateral nephrectomy was performed at 3-6 weeks, and 5 rabbits of each test group were withdrawn from the substance. Weekly US was performed, as well as US, CT and measurement of serum creatinine at the time of nephrectomy and prior to planned demise. Results. A total of 140 kidneys in 75 rabbits had both pathological and US correlation, with CT correlation in 126. Forty rabbits developed nephrocalcinosis with early (post nephrectomy at 3-6 weeks) or late (post demise at 10-20 weeks) pathological correlation obtained in 53 kidneys. Forty-one of these kidneys were from test animals: 23 developed NC early, 18 late. Twelve controls developed NC; 4 early, 8 late. Comparing US and CT to pathology, the sensitivity was 96% for US, 64% for CT. Specificity was 85% for US and 96% for CT. In 109 kidneys, information on serum creatinine level was available to correlate with pathology. The mean creatinine level was 138 mmol/l for those with NC and 118 mmol/l for those without NC (P<0.001).

  13. Nephrocalcinosis in rabbits - correlation of ultrasound, computed tomography, pathology and renal function

    Cramer, B.; Pushpanathan, C. [Janeway Child Health Centre, St. John's (Canada). Radiology Dept.; Husa, L. [Memorial Univ. of Newfoundland, St. John's (Canada)

    1998-01-01

    Objective. The purpose of this study was to induce nephrocalcinosis (NC) in rabbits with phosphate, vitamin D, oxalate and furosemide, to determine the effect on renal function and to correlate detection by ultrasound (US) and computed tomography (CT) with pathology. Materials and methods. Seventy-five immature New Zealand white rabbits were divided into five groups of 15. In each group, 5 animals were controls and 10 were given oral phosphate, furosemide, vitamin D or oxalate. Unilateral nephrectomy was performed at 3-6 weeks, and 5 rabbits of each test group were withdrawn from the substance. Weekly US was performed, as well as US, CT and measurement of serum creatinine at the time of nephrectomy and prior to planned demise. Results. A total of 140 kidneys in 75 rabbits had both pathological and US correlation, with CT correlation in 126. Forty rabbits developed nephrocalcinosis with early (post nephrectomy at 3-6 weeks) or late (post demise at 10-20 weeks) pathological correlation obtained in 53 kidneys. Forty-one of these kidneys were from test animals: 23 developed NC early, 18 late. Twelve controls developed NC; 4 early, 8 late. Comparing US and CT to pathology, the sensitivity was 96% for US, 64% for CT. Specificity was 85% for US and 96% for CT. In 109 kidneys, information on serum creatinine level was available to correlate with pathology. The mean creatinine level was 138 mmol/l for those with NC and 118 mmol/l for those without NC (P<0.001).

  14. Multidetector computed tomography predictors of late ventricular remodeling and function after acute myocardial infarction

    Lessick, Jonathan; Abadi, Sobhi; Agmon, Yoram; Keidar, Zohar; Carasso, Shemi; Aronson, Doron; Ghersin, Eduard; Rispler, Shmuel; Sebbag, Anat; Israel, Ora; Hammerman, Haim; Roguin, Ariel

    2012-01-01

    Background: Despite the advent of rapid arterial revascularization as first-line treatment for acute myocardial infarction (AMI), incomplete restoration of flow at the microvascular level remains a problem and is associated with adverse prognosis, including pathological ventricular remodeling. We aimed to study the association between multidetector row computed tomography (MDCT) perfusion defects and ventricular remodeling post-AMI. Methods: In a prospective study, 20 patients with ST-elevation AMI, treated by primary angioplasty, underwent arterial and late phase MDCT as well as radionuclide scans to study the presence, size and severity of myocardial perfusion defects. Contrast echocardiography was performed at baseline and at 4 months follow-up to evaluate changes in myocardial function and remodeling. Results: Early defects (ED), late defects (LD) and late enhancement (LE) were detected in 15, 7 and 16 patients, respectively, and radionuclide defects in 15 patients. The ED area (r = 0.74) and LD area (r = 0.72), and to a lesser extent the LE area (r = 0.62), correlated moderately well with the SPECT summed rest score. By univariate analysis, follow-up end-systolic volume index and ejection fraction were both significantly related to ED and LD size and severity, but not to LE size or severity. By multivariate analysis, end-systolic volume index was best predicted by LD area (p < 0.05) and ejection fraction by the LD enhancement ratio. Conclusions: LD size and severity on MDCT are most closely associated with pathological ventricular remodeling after AMI and may thus play a role in early identification and treatment of this condition.

  15. Assessment of left ventricular function and mass in dual-source computed tomography coronary angiography

    Jensen, Christoph J., E-mail: c.jensen@contilia.d [Department of Cardiology and Angiology, Elisabeth Hospital, Essen (Germany); Jochims, Markus [Department of Cardiology and Angiology, Elisabeth Hospital, Essen (Germany); Hunold, Peter; Forsting, Michael; Barkhausen, Joerg [Department of Diagnostic and Interventional Radiology and Neuroradiology, University of Essen (Germany); Sabin, Georg V.; Bruder, Oliver [Department of Cardiology and Angiology, Elisabeth Hospital, Essen (Germany); Schlosser, Thomas [Department of Diagnostic and Interventional Radiology and Neuroradiology, University of Essen (Germany)

    2010-06-15

    Purpose: To quantify left ventricular (LV) function and mass (LVM) derived from dual-source computed tomography (DSCT) and the influence of beta-blocker administration compared to cardiac magnetic resonance imaging (CMR). Methods: Thirty-two patients undergoing cardiac DSCT and CMR were included, of whom fifteen received metoprolol intravenously before DSCT. LV parameters were calculated by the disc-summation method (DSM) and by a segmented region-growing algorithm (RGA). All data sets were analyzed by two blinded observers. Interobserver agreement was tested by the intraclass correlation coefficient. Results: 1. Using DSM, LV parameters were not statistically different between DSCT and CMR in all patients (DSCT vs. CMR: EF 63 ± 8% vs. 64 ± 8%, p = 0.47; EDV 136 ± 36 ml vs. 138 ± 35 ml, p = 0.66; ESV 52 ± 21 ml vs. 52 ± 22 ml, p = 0.61; SV 83 ± 22 ml vs. 87 ± 19 ml, p = 0.22; CO 5.4 ± 0.9 l/min vs. 5.7 ± 1.2 l/min, p = 0.09; LVM 132 ± 33 g vs. 132 ± 33 g, p = 0.99). 2. In a subgroup of 15 patients, beta-blockade prior to DSCT resulted in a lower ejection fraction (EF), stroke volume (SV) and cardiac output (CO) and an increase in end-systolic volume (ESV) in DSCT (EF 59 ± 8% vs. 62 ± 9%; SV 73 ± 17 ml vs. 81 ± 15 ml; CO 5.7 ± 1.2 l/min vs. 5.0 ± 0.8 l/min; ESV 52 ± 27 ml vs. 57 ± 24 ml, all p < 0.05). 3. Analyzing the RGA parameters, LV volumes were not significantly different compared to DSM, whereas LVM was higher using RGA (177 ± 31 g vs. 132 ± 33 g, p < 0.05). Interobserver agreement was excellent when comparing DSM values, with the best agreement between RGA calculations. Conclusion: Left ventricular volumes and mass can reliably be assessed by DSCT compared to CMR. However, beta-blocker administration leads to statistically significant reductions in EF, SV and CO, whereas ESV significantly increases. DSCT RGA reliably analyzes LV function, whereas LVM is overestimated compared to DSM.

  16. Assessment of left ventricular function and mass in dual-source computed tomography coronary angiography

    Jensen, Christoph J.; Jochims, Markus; Hunold, Peter; Forsting, Michael; Barkhausen, Joerg; Sabin, Georg V.; Bruder, Oliver; Schlosser, Thomas

    2010-01-01

    Purpose: To quantify left ventricular (LV) function and mass (LVM) derived from dual-source computed tomography (DSCT) and the influence of beta-blocker administration compared to cardiac magnetic resonance imaging (CMR). Methods: Thirty-two patients undergoing cardiac DSCT and CMR were included, of whom fifteen received metoprolol intravenously before DSCT. LV parameters were calculated by the disc-summation method (DSM) and by a segmented region-growing algorithm (RGA). All data sets were analyzed by two blinded observers. Interobserver agreement was tested by the intraclass correlation coefficient. Results: 1. Using DSM, LV parameters were not statistically different between DSCT and CMR in all patients (DSCT vs. CMR: EF 63 ± 8% vs. 64 ± 8%, p = 0.47; EDV 136 ± 36 ml vs. 138 ± 35 ml, p = 0.66; ESV 52 ± 21 ml vs. 52 ± 22 ml, p = 0.61; SV 83 ± 22 ml vs. 87 ± 19 ml, p = 0.22; CO 5.4 ± 0.9 l/min vs. 5.7 ± 1.2 l/min, p = 0.09; LVM 132 ± 33 g vs. 132 ± 33 g, p = 0.99). 2. In a subgroup of 15 patients, beta-blockade prior to DSCT resulted in a lower ejection fraction (EF), stroke volume (SV) and cardiac output (CO) and an increase in end-systolic volume (ESV) in DSCT (EF 59 ± 8% vs. 62 ± 9%; SV 73 ± 17 ml vs. 81 ± 15 ml; CO 5.7 ± 1.2 l/min vs. 5.0 ± 0.8 l/min; ESV 52 ± 27 ml vs. 57 ± 24 ml, all p < 0.05). 3. Analyzing the RGA parameters, LV volumes were not significantly different compared to DSM, whereas LVM was higher using RGA (177 ± 31 g vs. 132 ± 33 g, p < 0.05). Interobserver agreement was excellent when comparing DSM values, with the best agreement between RGA calculations. Conclusion: Left ventricular volumes and mass can reliably be assessed by DSCT compared to CMR. However, beta-blocker administration leads to statistically significant reductions in EF, SV and CO, whereas ESV significantly increases. DSCT RGA reliably analyzes LV function, whereas LVM is overestimated compared to DSM.

  17. Using multiple schedules during functional communication training to promote rapid transfer of treatment effects.

    Fisher, Wayne W; Greer, Brian D; Fuhrman, Ashley M; Querim, Angie C

    2015-12-01

    Multiple schedules with signaled periods of reinforcement and extinction have been used to thin reinforcement schedules during functional communication training (FCT) to make the intervention more practical for parents and teachers. We evaluated whether these signals would also facilitate rapid transfer of treatment effects across settings and therapists. With 2 children, we conducted FCT in the context of mixed (baseline) and multiple (treatment) schedules introduced across settings or therapists using a multiple baseline design. Results indicated that when the multiple schedules were introduced, the functional communication response came under rapid discriminative control, and problem behavior remained at near-zero rates. We extended these findings with another individual by using a more traditional baseline in which problem behavior produced reinforcement. Results replicated those of the previous participants and showed rapid reductions in problem behavior when multiple schedules were implemented across settings.

  18. Functional computed tomography imaging of tumor-induced angiogenesis. Preliminary results of new tracer kinetic modeling using a computer discretization approach

    Kaneoya, Katsuhiko; Ueda, Takuya; Suito, Hiroshi

    2008-01-01

    The aim of this study was to establish functional computed tomography (CT) imaging as a method for assessing tumor-induced angiogenesis. Functional CT imaging was mathematically analyzed for 14 renal cell carcinomas by means of two-compartment modeling using a computer-discretization approach. The model incorporated diffusible kinetics of contrast medium including leakage from the capillary to the extravascular compartment and back-flux to the capillary compartment. The correlations between functional CT parameters [relative blood volume (rbv), permeability 1 (Pm1), and permeability 2 (Pm2)] and histopathological markers of angiogenesis [microvessel density (MVD) and vascular endothelial growth factor (VEGF)] were statistically analyzed. The modeling was successfully performed, showing similarity between the mathematically simulated curve and the measured time-density curve. There were significant linear correlations between MVD grade and Pm1 (r=0.841, P=0.001) and between VEGF grade and Pm2 (r=0.804, P=0.005) by Pearson's correlation coefficient. This method may be a useful tool for the assessment of tumor-induced angiogenesis. (author)
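
    To make the two-compartment idea concrete, the sketch below fits a generic leakage/back-flux model to a simulated time-density curve with SciPy. This is not the authors' computer-discretization scheme; the gamma-variate arterial input, rate constants, and noise level are all assumed values.

        import numpy as np
        from scipy.optimize import curve_fit

        t = np.linspace(0.0, 60.0, 121)                 # s, hypothetical sampling
        ca = 5.0 * (t / 8.0) * np.exp(1.0 - t / 8.0)    # gamma-variate arterial input (a.u.)

        def tissue_curve(t, rbv, k_leak, k_back):
            """Intravascular term plus capillary leakage with back-flux, modeled
            as a convolution of the arterial input with a decaying exponential."""
            dt = t[1] - t[0]
            leak = k_leak * np.convolve(ca, np.exp(-k_back * t))[: len(t)] * dt
            return rbv * ca + leak

        rng = np.random.default_rng(0)
        measured = tissue_curve(t, 0.15, 0.02, 0.05) + rng.normal(0.0, 0.01, t.size)
        popt, _ = curve_fit(tissue_curve, t, measured, p0=(0.1, 0.01, 0.01))
        print("fitted rbv, leakage, back-flux:", np.round(popt, 3))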

  19. Evaluation of left ventricular function and volume with multidetector-row computed tomography. Comparison with electrocardiogram-gated single photon emission computed tomography

    Suzuki, Takeya; Yamashina, Shohei; Nanjou, Shuji; Yamazaki, Junichi

    2007-01-01

    This study compared left ventricular systolic function and volume determined by multidetector-row computed tomography (MDCT) and electrocardiogram-gated single photon emission computed tomography (G-SPECT). Thirty-seven patients with coronary artery disease and non-cardiovascular disease underwent MDCT. In this study, left ventricular ejection fraction (EF), left ventricular end-diastolic volume (EDV) and left ventricular end-systolic volume (ESV) were calculated using only two-phase imaging with MDCT. Left ventricular function and volume were compared using measurements from G-SPECT. We conducted MDCT and G-SPECT virtually simultaneously. Both the EF and ESV evaluated by MDCT closely correlated with G-SPECT (r=0.763, P<0.05). However, a high heart rate (>65 bpm) during MDCT significantly influenced the difference in EF calculated from MDCT and G-SPECT (P<0.05). Left ventricular function can be measured with MDCT as well as G-SPECT. However, a heart rate over 65 bpm during MDCT negatively affects the EF correlation between MDCT and G-SPECT. (author)

  20. Computing wave functions in multichannel collisions with non-local potentials using the R-matrix method

    Bonitati, Joey; Slimmer, Ben; Li, Weichuan; Potel, Gregory; Nunes, Filomena

    2017-09-01

    The calculable form of the R-matrix method has been previously shown to be a useful tool in approximately solving the Schrodinger equation in nuclear scattering problems. We use this technique combined with the Gauss quadrature for the Lagrange-mesh method to efficiently solve for the wave functions of projectile nuclei in low energy collisions (1-100 MeV) involving an arbitrary number of channels. We include the local Woods-Saxon potential, the non-local potential of Perey and Buck, a Coulomb potential, and a coupling potential to computationally solve for the wave function of two nuclei at short distances. Object oriented programming is used to increase modularity, and parallel programming techniques are introduced to reduce computation time. We conclude that the R-matrix method is an effective method to predict the wave functions of nuclei in scattering problems involving both multiple channels and non-local potentials. Michigan State University iCER ACRES REU.
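
    As a sketch of how a non-local potential enters such a calculation: on a Gauss-Legendre mesh the Perey-Buck-type kernel V(r, r') = U((r + r')/2) H(|r - r'|), with a Gaussian form factor H, turns matrix elements into weighted sums. The well depth, geometry, and non-locality range below are illustrative values, not the paper's:

        import numpy as np

        # Gauss-Legendre nodes/weights mapped from [-1, 1] to r in [0, rmax] (fm)
        n, rmax = 40, 15.0
        x, w = np.polynomial.legendre.leggauss(n)
        r = 0.5 * rmax * (x + 1.0)
        wr = 0.5 * rmax * w

        def woods_saxon(s, V0=45.0, R0=4.0, a=0.65):
            """Local Woods-Saxon well (MeV); depth and geometry are illustrative."""
            return -V0 / (1.0 + np.exp((s - R0) / a))

        beta = 0.85  # non-locality range (fm), a typical Perey-Buck value
        Hff = np.exp(-((r[:, None] - r[None, :]) ** 2) / beta**2) / (np.pi**1.5 * beta**3)
        V = woods_saxon(0.5 * (r[:, None] + r[None, :])) * Hff  # kernel V(r, r')

        # <f|V|g> = double integral of f(r) V(r, r') g(r') over r, r',
        # evaluated here as a double quadrature sum over the mesh
        f = r * np.exp(-0.5 * r)   # placeholder radial functions
        g = r * np.exp(-0.3 * r)
        print("matrix element:", (f * wr) @ V @ (g * wr))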

  1. Computation of Green function of the Schroedinger-like partial differential equations by the numerical functional integration

    Lobanov, Yu.Yu.; Shahbagian, R.R.; Zhidkov, E.P.

    1991-01-01

    A new method for numerical solution of the boundary problem for Schroedinger-like partial differential equations in R^n is elaborated. The method is based on representation of the multidimensional Green function in the form of a multiple functional integral and on the use of approximation formulas which are constructed for such integrals. The convergence of the approximations to the exact value is proved, and the remainder of the formulas is estimated. The method reduces the initial differential problem to quadratures. 16 refs.; 7 tabs
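
    The functional-integral representation referred to here can be demonstrated with a crude Monte Carlo version of the Feynman-Kac formula. This toy sketch uses simple Euler time stepping rather than the authors' approximation formulas, and the potential and terminal condition are arbitrary choices:

        import numpy as np

        # u(x0, T) = E[ f(W_T) * exp(-integral_0^T V(W_t) dt) ] solves the
        # Schroedinger-like (imaginary-time) equation u_t = (1/2) u_xx - V u.
        rng = np.random.default_rng(1)
        V = lambda x: 0.5 * x**2            # harmonic potential (illustrative)
        f = lambda x: np.ones_like(x)       # terminal condition u = 1
        x0, T, steps, paths = 0.0, 1.0, 200, 20000
        dt = T / steps

        x = np.full(paths, x0)
        log_weight = np.zeros(paths)        # accumulates -integral V dt per path
        for _ in range(steps):
            log_weight -= V(x) * dt
            x += np.sqrt(dt) * rng.standard_normal(paths)
        print("Monte Carlo estimate of u(0, 1):", np.mean(f(x) * np.exp(log_weight)))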

  2. A Dynamic Connectome Supports the Emergence of Stable Computational Function of Neural Circuits through Reward-Based Learning.

    Kappel, David; Legenstein, Robert; Habenschuss, Stefan; Hsieh, Michael; Maass, Wolfgang

    2018-01-01

    Synaptic connections between neurons in the brain are dynamic because of continuously ongoing spine dynamics, axonal sprouting, and other processes. In fact, it was recently shown that the spontaneous synapse-autonomous component of spine dynamics is at least as large as the component that depends on the history of pre- and postsynaptic neural activity. These data are inconsistent with common models for network plasticity and raise the following questions: how can neural circuits maintain a stable computational function in spite of these continuously ongoing processes, and what could be functional uses of these ongoing processes? Here, we present a rigorous theoretical framework for these seemingly stochastic spine dynamics and rewiring processes in the context of reward-based learning tasks. We show that spontaneous synapse-autonomous processes, in combination with reward signals such as dopamine, can explain the capability of networks of neurons in the brain to configure themselves for specific computational tasks, and to compensate automatically for later changes in the network or task. Furthermore, we show theoretically and through computer simulations that stable computational performance is compatible with continuously ongoing synapse-autonomous changes. After good computational performance has been reached, these changes cause primarily a slow drift of network architecture and dynamics in task-irrelevant dimensions, as observed for neural activity in motor cortex and other areas. On the more abstract level of reinforcement learning, the resulting model gives rise to an understanding of reward-driven network plasticity as continuous sampling of network configurations.

  3. FIT: Computer Program that Interactively Determines Polynomial Equations for Data which are a Function of Two Independent Variables

    Arbuckle, P. D.; Sliwa, S. M.; Roy, M. L.; Tiffany, S. H.

    1985-01-01

    A computer program for interactively developing least-squares polynomial equations to fit user-supplied data is described. The program is characterized by the ability to compute the polynomial equations of a surface fit through data that are a function of two independent variables. The program utilizes the Langley Research Center graphics packages to display polynomial equation curves and data points, facilitating a qualitative evaluation of the effectiveness of the fit. An explanation of the fundamental principles and features of the program, as well as sample input and corresponding output, are included.
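
    A minimal modern analogue of what FIT computes, a least-squares polynomial surface in two independent variables, can be written with NumPy. The degree, data, and coefficients below are invented for illustration:

        import numpy as np

        # Least-squares bivariate polynomial fit z = f(x, y)
        rng = np.random.default_rng(2)
        x, y = rng.uniform(-1, 1, 200), rng.uniform(-1, 1, 200)
        z = 1.0 + 2.0 * x - 0.5 * y + 0.8 * x * y + 0.3 * y**2 + rng.normal(0, 0.01, 200)

        deg = 2
        terms = [(i, j) for i in range(deg + 1) for j in range(deg + 1) if i + j <= deg]
        A = np.column_stack([x**i * y**j for i, j in terms])   # design matrix
        coef, *_ = np.linalg.lstsq(A, z, rcond=None)
        for (i, j), c in zip(terms, coef):
            print(f"x^{i} y^{j}: {c:+.3f}")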

  4. Cardiovascular measurement and cardiac function analysis with electron beam computed tomography in healthy Chinese people (50 cases report)

    Lu Bin; Dai Ruping; Zhang Shaoxiong; Bai Hua; Jing Baolian; Cao Cheng; He Sha; Ren Li

    1998-01-01

    Purpose: To quantitatively measure cardiovascular diameters and function parameters by using electron beam computed tomography (EBCT). Methods: Fifty healthy Chinese subjects (27 men, 23 women; average age 47.7 years) underwent conventional transverse EBCT and short-axis enhanced cine scans. The transverse scan was used to measure the diameters of the ascending aorta, descending aorta, pulmonary artery and left atrium. The cine study was used to measure left ventricular myocardial thickness and to analyze global, sectional and segmental function of the right and left ventricles. Results: The cardiovascular diameters and cardiac functional parameters were calculated. The diameters and most functional parameters (end-systolic volume, stroke volume, ejection fraction, cardiac output, cardiac index) of normal Chinese men were greater than those of women, although not significantly so (P>0.05). However, the differences in EDV and myocardial mass (MyM) of both ventricles were significant (P<0.01). Conclusion: EBCT is a minimally invasive method for cardiovascular measurement and cardiac function evaluation

  5. Application node system image manager subsystem within a distributed function laboratory computer system

    Stubblefield, F.W.; Beck, R.D.

    1978-10-01

    A computer system to control and acquire data from one x-ray diffraction, five neutron scattering, and four neutron diffraction experiments located at the Brookhaven National Laboratory High Flux Beam Reactor has operated in a routine manner for over three years. The computer system is configured as a network of computer processors with the processor interconnections assuming a star-like structure. At the points of the star are the ten experiment control-data acquisition computers, referred to as application nodes. At the center of the star is a shared service node which supplies a set of shared services utilized by all of the application nodes. A program development node occupies one additional point of the star. The design and implementation of a network subsystem to support development and execution of operating systems for the application nodes are described. 6 figures, 1 table

  6. A Functional Correspondence between Monadic Evaluators and Abstract Machines for Languages with Computational Effects

    Ager, Mads Sig; Danvy, Olivier; Midtgaard, Jan

    2005-01-01

    We extend our correspondence between evaluators and abstract machines from the pure setting of the lambda-calculus to the impure setting of the computational lambda-calculus. We show how to derive new abstract machines from monadic evaluators for the computational lambda-calculus. Starting from (1) a generic evaluator parameterized by a monad and (2) a monad specifying a computational effect, we inline the components of the monad in the generic evaluator to obtain an evaluator written in a style that is specific to this computational effect. We then derive the corresponding abstract machine by closure-converting, CPS-transforming, and defunctionalizing this specific evaluator. We illustrate the construction first with the identity monad, obtaining the CEK machine, and then with a lifting monad, a state monad, and with a lifted state monad, obtaining variants of the CEK machine with error handling, state...

  7. Computational Methods for Large Spatio-temporal Datasets and Functional Data Ranking

    Huang, Huang

    2017-01-01

    that are both computationally and statistically efficient. We explore the improvement of the approximation theoretically and investigate the performance by simulations. For real applications, we analyze a soil moisture dataset with 2 million measurements

  8. Assessment of tumor vascularization with functional computed tomography perfusion imaging in patients with cirrhotic liver disease.

    Li, Jin-Ping; Zhao, De-Li; Jiang, Hui-Jie; Huang, Ya-Hua; Li, Da-Qing; Wan, Yong; Liu, Xin-Ding; Wang, Jin-E

    2011-02-01

    Hepatocellular carcinoma (HCC) is a common malignant tumor in China, and early diagnosis is critical for patient outcome. HCC mostly arises on a background of liver cirrhosis, developing from benign regenerative nodules and dysplastic nodules to HCC lesions, and a better understanding of its vascular supply and the associated hemodynamic changes may lead to earlier tumor detection. Angiogenesis is essential for the growth of primary and metastatic tumors due to changes in vascular perfusion, blood volume and permeability. These hemodynamic and physiological properties can be measured serially using functional computed tomography perfusion (CTP) imaging and can be used to assess the growth of HCC. This study aimed to clarify the physiological characteristics of tumor angiogenesis in cirrhotic liver disease by this fast imaging method. CTP was performed in 30 volunteers without liver disease (control subjects) and 49 patients with liver disease (experimental subjects: 27 with HCC and 22 with cirrhosis). All subjects were also evaluated by physical examination, laboratory screening and Doppler ultrasonography of the liver. The diagnosis of HCC was made according to the EASL criteria. All patients underwent contrast-enhanced ultrasonography, pre- and post-contrast triple-phase CT and a CTP study. A mathematical deconvolution model was applied to provide hepatic blood flow (HBF), hepatic blood volume (HBV), mean transit time (MTT), permeability of the capillary vessel surface (PS), hepatic arterial index (HAI), hepatic arterial perfusion (HAP) and hepatic portal perfusion (HPP) data. The Mann-Whitney U test was used to determine differences in perfusion parameters between the background cirrhotic liver parenchyma and HCC and between the cirrhotic liver parenchyma with HCC and that without HCC. In normal liver, the HAP/HPP ratio was about 1/4. HCC had significantly higher HAP and HAI and lower HPP than the background liver parenchyma adjacent to the HCC. The value of HBF at the tumor

  9. Density functionalized [RuII(NO)(Salen)(Cl)] complex: Computational photodynamics and in vitro anticancer facets.

    Mir, Jan Mohammad; Jain, N; Jaget, P S; Maurya, R C

    2017-09-01

    Photodynamic therapy (PDT) is a treatment that uses photosensitizing agents to kill cancer cells. The scientific community has been eager for decades to design an efficient PDT drug. In this context, the current report deals with the computational photodynamic behavior of a ruthenium(II) nitrosyl complex containing N,N'-salicylaldehyde-ethylenediimine (SalenH2), the synthesis and X-ray crystallography of which are already known [Ref. 38,39]. The Gaussian 09W software package was employed to carry out the density functional theory (DFT) studies. DFT calculations with Becke-3-Lee-Yang-Parr (B3LYP)/Los Alamos National Laboratory 2 Double Z (LanL2DZ) specified for the Ru atom and the B3LYP/6-31G(d,p) combination for all other atoms were performed using the effective core potential method. Both the ground and excited states of the complex were evaluated. Some known photosensitizers were compared with the target complex; phthalocyanine and porphyrin derivatives were the compounds selected for this comparative study. It is suggested that the effective photoactivity found is due to the presence of the ruthenium core in the model complex. In addition to the evaluation of theoretical aspects, in vitro anticancer activity against COLO-205 human cancer cells has also been assessed for the complex. Emphasis was laid on extrapolating DFT to depict the ability of the target compound to release nitric oxide; a promising visible-light-triggered nitric oxide releasing power of the compound has been inferred. In vitro antiproliferative studies of [RuCl3(PPh3)3] and [Ru(NO)(Salen)(Cl)] have revealed the model complex as an excellent anticancer agent. From IC50 values of 40.031 mg/mL for the former and 9.74 mg/mL for the latter, it is established that the latter bears more anticancer potential. From the overall study, the DFT-based structural elucidation and the efficiency of the NO, Ru and Salen co-ligands have shown promising drug delivery properties and a good candidacy for both chemotherapy as well as

  10. Motion estimation for cardiac functional analysis using two x-ray computed tomography scans.

    Fung, George S K; Ciuffo, Luisa; Ashikaga, Hiroshi; Taguchi, Katsuyuki

    2017-09-01

    This work concerns computed tomography (CT)-based cardiac functional analysis (CFA) with a reduced radiation dose. As CT-CFA requires images over the entire heartbeat, the scans are often performed at 10-20% of the tube current settings that are typically used for coronary CT angiography. A large image noise then degrades the accuracy of motion estimation. Moreover, even if the scan was performed during sinus rhythm, the cardiac motion observed in CT images may not be cyclic in patients with atrial fibrillation. In this study, we propose to use data from two CT scans, one for CT angiography at a quiescent phase at a standard dose and the other for CFA over the entire heartbeat at a lower dose. We have made the following four modifications to an image-based cardiac motion estimation method we have previously developed for full-dose retrospectively gated coronary CT angiography: (a) a full-dose prospectively gated coronary CT angiography image acquired at the least-motion phase was used as the reference image; (b) a three-dimensional median filter was applied to lower-dose retrospectively gated cardiac images acquired at 20 phases over one heartbeat in order to reduce image noise; (c) the strength of the temporal regularization term was made adaptive; and (d) a one-dimensional temporal filter was applied to the estimated motion vector field in order to decrease jaggy motion patterns. We describe the conventional method iME1 and the proposed method iME2 in this article. Five observers assessed the accuracy of the estimated motion vector field of iME2 and iME1 using a 4-point scale. The observers repeated the assessment with data presented in a new random order 1 week after the first assessment session. The study confirmed that the proposed iME2 was robust against mismatches of noise levels, contrast enhancement levels, and shapes of the chambers. There was a statistically significant difference between iME2 and iME1 (accuracy score, 2.08 ± 0.81 versus 2.77
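
    Modifications (b) and (d) above are standard filtering steps and can be sketched with SciPy. The filter sizes are assumptions, and the moving average below merely stands in for the authors' unspecified 1D temporal filter:

        import numpy as np
        from scipy.ndimage import median_filter, uniform_filter1d

        rng = np.random.default_rng(7)
        phases, nz, ny, nx = 20, 8, 32, 32

        # (b) 3D median filter on each low-dose cardiac phase to reduce noise
        images = rng.normal(0.0, 1.0, (phases, nz, ny, nx))   # stand-in CT phases
        denoised = np.stack([median_filter(im, size=3) for im in images])

        # (d) 1D temporal smoothing of the motion vector field across phases
        # (mode="wrap" treats the heartbeat as cyclic)
        mvf = rng.normal(0.0, 1.0, (phases, nz, ny, nx, 3))   # stand-in MVF (mm)
        mvf_smooth = uniform_filter1d(mvf, size=5, axis=0, mode="wrap")
        print(denoised.shape, mvf_smooth.shape)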

  11. Correlation of chest computed tomography findings with dyspnea and lung functions in post-tubercular sequelae

    Ananya Panda

    2016-01-01

    Aims: To study the correlation between dyspnea, radiological findings, and pulmonary function tests (PFTs) in patients with sequelae of pulmonary tuberculosis (TB). Materials and Methods: Clinical history, chest computed tomography (CT), and PFT of patients with post-TB sequelae were recorded. Dyspnea was graded according to the Modified Medical Research Council (mMRC) scale. CT scans were analyzed for fibrosis, cavitation, bronchiectasis, consolidation, nodules, and aspergilloma. Semi-quantitative analysis was done for these abnormalities. Scores were added to obtain a total morphological score (TMS). The lungs were also divided into three zones and scores added to obtain the total lung score (TLS). Spirometry was done for forced vital capacity (FVC), forced expiratory volume in 1 s (FEV1), and FEV1/FVC. Results: Dyspnea was present in 58/101 patients. A total of 22/58 patients had mMRC Grade 1, and 17/58 patients had Grades 2 and 3 dyspnea each. There was a significant difference in median fibrosis, bronchiectasis, and nodule scores (P < 0.01), TMS, and TLS (P < 0.0001) between the dyspnea and nondyspnea groups. Significant correlations were obtained between grades of dyspnea and fibrosis (r = 0.34, P = 0.006), bronchiectasis (r = 0.35, P = 0.004), and nodule (r = 0.24, P = 0.016) scores, TMS (r = 0.398, P = 0.000), and TLS (r = 0.35, P = 0.0003). PFTs were impaired in 78/101 (77.2%) patients. A restrictive defect was most common (39.6%), followed by a mixed defect (34.7%). There was a negative but statistically insignificant trend between PFT and fibrosis, bronchiectasis, and nodule scores, TMS, and TLS. However, there were significant differences in median fibrosis, cavitation, and bronchiectasis scores in patients with normal, mild to moderate, and severe respiratory defects. No difference was seen in TMS and TLS according to the severity of the respiratory defect. Conclusion: Both fibrosis and bronchiectasis correlated with dyspnea and with PFT. However, this correlation was not

  12. Stream function method for computing steady rotational transonic flows with application to solar wind-type problems

    Kopriva, D.A.

    1982-01-01

    A numerical scheme has been developed to solve the quasilinear form of the transonic stream function equation. The method is applied to compute steady two-dimensional axisymmetric solar wind-type problems. A single, perfect, non-dissipative, homentropic, polytropic gas is assumed. The four equations governing mass and momentum conservation are reduced to a single nonlinear second order partial differential equation for the stream function. Bernoulli's equation is used to obtain a nonlinear algebraic relation for the density in terms of stream function derivatives. The vorticity includes the effects of azimuthal rotation and Bernoulli's function and is determined from quantities specified on boundaries. The approach is efficient. The number of equations and independent variables has been reduced, and a rapid relaxation technique developed for the transonic full potential equation is used. Second order accurate central differences are used in elliptic regions. In hyperbolic regions a dissipation term motivated by the rotated differencing scheme of Jameson is added for stability. A successive line over-relaxation technique, also introduced by Jameson, is used to solve the equations. The nonlinear equation for the density is a double-valued function of the stream function derivatives. The velocities are extrapolated from upwind points to determine the proper branch, and Newton's method is used to iteratively compute the density. This allows accurate solutions with few grid points
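
    The double-valued density relation and its Newton solve are easy to demonstrate. The sketch below uses our own nondimensionalization and constants, not the paper's; the initial guess selects the subsonic or supersonic branch:

        import numpy as np

        def density_from_flux(m, H=3.2, gamma=1.4, rho0=1.0, tol=1e-12):
            """Newton solve of a nondimensional Bernoulli relation
                rho**(gamma-1)/(gamma-1) + m**2/(2*rho**2) = H
            for the density rho, given the stream-function mass flux m = |grad psi|.
            The relation is double-valued: the initial guess rho0 selects the
            subsonic (large rho) or supersonic (small rho) branch."""
            rho = rho0
            for _ in range(100):
                F = rho**(gamma - 1) / (gamma - 1) + m**2 / (2 * rho**2) - H
                dF = rho**(gamma - 2) - m**2 / rho**3
                step = F / dF
                rho -= step
                if abs(step) < tol:
                    return rho
            raise RuntimeError("Newton iteration did not converge")

        m = 0.9
        print("subsonic branch  :", density_from_flux(m, rho0=1.5))
        print("supersonic branch:", density_from_flux(m, rho0=0.2))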

  13. Assessment of left ventricular function by electrocardiogram-gated myocardial single photon emission computed tomography using quantitative gated single photon emission computed tomography software

    Morita, Koichi; Adachi, Itaru; Konno, Masanori

    1999-01-01

    Electrocardiogram (ECG)-gated myocardial single photon emission computed tomography (SPECT) can easily assess left ventricular (LV) perfusion and function using quantitative gated SPECT (QGS) software. ECG-gated SPECT was performed in 44 patients with coronary artery disease under post-stress and resting conditions to assess the values of LV functional parameters, by comparison to the LV ejection fraction derived from gated blood pool scans and myocardial characteristics. A good correlation was obtained between the ejection fraction using QGS and that using the cardiac blood pool scan (r=0.812). Some patients with myocardial ischemia had a lower ejection fraction under post-stress compared to resting conditions, indicating post-stress LV dysfunction. LV wall motion and wall thickening were significantly impaired in ischemic and infarcted myocardium, and the degree of abnormality in the infarcted areas was greater than in the ischemic areas. LV functional parameters derived using QGS were useful to assess post-stress LV dysfunction and myocardial viability. In conclusion, ECG-gated myocardial SPECT permits simultaneous quantitative assessment of myocardial perfusion and function. (author)

  14. Application of modified analytical function for approximation and computer simulation of diffraction profile

    Marrero, S. I.; Turibus, S. N.; Assis, J. T. De; Monin, V. I.

    2011-01-01

    Data processing of most diffraction experiments is based on determination of the diffraction line position and measurement of the broadening of the diffraction profile. High precision and digitalization of these procedures can be achieved by approximation of experimental diffraction profiles by analytical functions. There are various functions for these purposes: some simple, like the Gauss function, but not suitable for a wide range of experimental profiles, and others that approximate well but are complicated for practical use, like the Voigt or Pearson VII functions. The proposed analytical function is a modified Cauchy function with two variable parameters, allowing description of a wide range of experimental diffraction profiles. In the present paper the modified function was applied for approximation of diffraction lines of steels after various physical and mechanical treatments and for simulation of diffraction profiles used in the study of stress gradients and distortions of crystal structure. (Author)
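
    A two-parameter Cauchy-type peak of this kind (our reading of the form; it coincides with the Pearson VII shape, with m = 1 recovering the plain Cauchy and large m approaching a Gaussian) can be fitted to a measured profile with SciPy. The diffraction data below are synthetic:

        import numpy as np
        from scipy.optimize import curve_fit

        def modified_cauchy(x, i0, x0, w, m):
            """Peak with position x0, width w, and two shape parameters (w, m)."""
            return i0 / (1.0 + ((x - x0) / w) ** 2) ** m

        two_theta = np.linspace(40.0, 44.0, 200)          # degrees, synthetic scan
        rng = np.random.default_rng(3)
        counts = modified_cauchy(two_theta, 1000.0, 42.0, 0.25, 1.6) \
                 + rng.normal(0.0, 5.0, two_theta.size)

        popt, _ = curve_fit(modified_cauchy, two_theta, counts, p0=(800.0, 41.8, 0.3, 1.0))
        print("peak position, width, shape:", np.round(popt[1:], 3))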

  15. Development of an item bank and computer adaptive test for role functioning

    Anatchkova, Milena D; Rose, Matthias; Ware, John E

    2012-01-01

    Role functioning (RF) is a key component of health and well-being and an important outcome in health research. The aim of this study was to develop an item bank to measure the impact of health on role functioning.

  16. Evaluating the Appropriateness of a New Computer-Administered Measure of Adaptive Function for Children and Youth with Autism Spectrum Disorders

    Coster, Wendy J.; Kramer, Jessica M.; Tian, Feng; Dooley, Meghan; Liljenquist, Kendra; Kao, Ying-Chia; Ni, Pengsheng

    2016-01-01

    The Pediatric Evaluation of Disability Inventory-Computer Adaptive Test is an alternative method for describing the adaptive function of children and youth with disabilities using a computer-administered assessment. This study evaluated the performance of the Pediatric Evaluation of Disability Inventory-Computer Adaptive Test with a national…

  17. A Karaoke System with Real-Time Media Merging and Sharing Functions for a Cloud-Computing-Integrated Mobile Device

    Her-Tyan Yeh

    2013-01-01

    Mobile devices such as personal digital assistants (PDAs), smartphones, and tablets have increased in popularity and are extremely efficient for work-related, social, and entertainment uses. Popular entertainment services have also attracted substantial attention. Thus, relevant industries have exerted considerable efforts in establishing a method by which mobile devices can be used to develop excellent and convenient entertainment services. Because cloud-computing technology is mature and possesses a strong computing processing capacity, integrating this technology into the entertainment service function in mobile devices can reduce the data load on a system and maintain mobile device performance. This study combines cloud computing with a mobile device to design a karaoke system that contains real-time media merging and sharing functions. This system enables users to download music videos (MVs) to their mobile device and sing and record their singing by using the device. They can upload the recorded song to the cloud server, where it is merged with real-time media. Subsequently, by employing a media streaming technology, users can store their personal MVs on their mobile device or computer and instantaneously share these videos with others on the Internet. Through this process, people can instantly watch shared videos, enjoy the leisure and entertainment effects of mobile devices, and satisfy their desire for singing.

  18. Fast Computation of Solvation Free Energies with Molecular Density Functional Theory: Thermodynamic-Ensemble Partial Molar Volume Corrections.

    Sergiievskyi, Volodymyr P; Jeanmairet, Guillaume; Levesque, Maximilien; Borgis, Daniel

    2014-06-05

    Molecular density functional theory (MDFT) offers an efficient implicit-solvent method to estimate molecular solvation free energies while conserving a fully molecular representation of the solvent. Even within a second-order approximation for the free-energy functional, the so-called homogeneous reference fluid approximation, we show that the hydration free energies computed for a data set of 500 organic compounds are of similar quality as those obtained from molecular dynamics free-energy perturbation simulations, with a computational cost reduced by 2-3 orders of magnitude. This requires introducing the proper partial molar volume correction to transform the results from the grand canonical to the isobaric-isothermal ensemble that is pertinent to experiments. We show that this correction can be extended to 3D-RISM calculations, giving a sound theoretical justification to empirical partial molar volume corrections that have been proposed recently.

  19. Symbolic computation of exact solutions expressible in rational formal hyperbolic and elliptic functions for nonlinear partial differential equations

    Wang Qi; Chen Yong

    2007-01-01

    With the aid of symbolic computation, some algorithms are presented for the rational expansion methods, which lead to closed-form solutions of nonlinear partial differential equations (PDEs). The new algorithms are given to find exact rational formal polynomial solutions of PDEs in terms of Jacobi elliptic functions, solutions of the Riccati equation and solutions of the generalized Riccati equation. They can be implemented in symbolic computation system Maple. As applications of the methods, we choose some nonlinear PDEs to illustrate the methods. As a result, we not only can successfully obtain the solutions found by most existing Jacobi elliptic function methods and Tanh-methods, but also find other new and more general solutions at the same time

  20. Hypertensive disease and renal hypertensions: renal structural and functional studies by using dynamic computed tomography

    Arabidze, G.G.; Pogrebnaya, G.N.; Todua, F.I.; Sokolova, R.I.; Kozdoba, O.A.

    1989-01-01

    Dynamic computed tomography was conducted using an original method; the findings were analyzed by taking into account time-density curves, which made it possible to gain insight into the status of blood flow and filtration in each individual kidney. Computed tomography and dynamic computed tomography revealed that hypertensive disease was characterized by normal volume and thickness of the renal cortical layer and symmetric time-density curves, whereas the hypertensive type of chronic glomerulonephritis featured lower renal cortical layer thickness, reduced renal volume, and symmetrically decreased amplitudes of the first and second peaks of the time-density curve; chronic pyelonephritis showed asymmetric time-density diagrams due to lower-density areas in the afflicted kidney

  1. A COMPUTATIONAL WORKBENCH ENVIRONMENT FOR VIRTUAL POWER PLANT SIMULATION

    Mike Bockelie; Dave Swensen; Martin Denison; Adel Sarofim; Connie Senior

    2004-12-22

    This report describes the effort to develop and demonstrate a software framework to support advanced process simulations to evaluate the performance of advanced power systems. Integrated into the framework are a broad range of models, analysis tools, and visualization methods that can be used for the plant evaluation. The framework provides a tightly integrated problem-solving environment, with plug-and-play functionality, and includes a hierarchy of models, ranging from fast running process models to detailed reacting CFD models. The framework places no inherent limitations on the type of physics that can be modeled, the numerical techniques or programming languages used to implement the equipment models, or the type or amount of data that can be exchanged between models. Tools are provided to analyze simulation results at multiple levels of detail, ranging from simple tabular outputs to advanced solution visualization methods. All models and tools communicate in a seamless manner. The framework can be coupled to other software frameworks that provide different modeling capabilities. Three software frameworks were developed during the course of the project. The first framework focused on simulating the performance of the DOE Low Emissions Boiler System Proof of Concept facility, an advanced pulverized-coal combustion-based power plant. The second framework targeted simulating the performance of an Integrated coal Gasification Combined Cycle - Fuel Cell Turbine (IGCC-FCT) plant configuration. The coal gasifier models included both CFD and process models for the commercially dominant systems. Interfacing models to the framework was performed using VES-Open, and tests were performed to demonstrate interfacing CAPE-Open compliant models to the framework. The IGCC-FCT framework was subsequently extended to support Virtual Engineering concepts in which plant configurations can be constructed and interrogated in a three-dimensional, user-centered, interactive

  2. Computer analysis of protein functional sites projection on exon structure of genes in Metazoa.

    Medvedeva, Irina V; Demenkov, Pavel S; Ivanisenko, Vladimir A

    2015-01-01

    Study of the relationship between the structural and functional organization of proteins and their coding genes is necessary for an understanding of the evolution of molecular systems and can provide new knowledge for many applications aimed at designing proteins with improved medical and biological properties. It is well known that the functional properties of proteins are determined by their functional sites. Functional sites are usually represented by a small number of amino acid residues that are distantly located from each other in the amino acid sequence. They are highly conserved within their functional group and vary significantly in structure between such groups. Given these facts, analysis of the general properties of the structural organization of functional sites at the protein level and at the level of the exon-intron structure of the coding gene remains an open problem. One approach to this analysis is the projection of the amino acid residue positions of the functional sites, along with the exon boundaries, onto the gene structure. In this paper, we examined the discontinuity of the functional sites in the exon-intron structure of genes and the distribution of lengths and phases of the functional-site-encoding exons in vertebrate genes. We have shown that the DNA fragments coding the functional sites were in the same exons or in close exons. The observed tendency of the exons that code functional sites to cluster could be considered a unit of protein evolution. We studied the characteristics of the structure of the exon boundaries that code, and do not code, functional sites in 11 Metazoa species. This is accompanied by a reduced frequency of intercodon gaps (phase 0) in exons encoding functional-site amino acid residues, which may be evidence of the existence of evolutionary limitations on exon shuffling. These results characterize the features of the coding exon-intron structure that affect the functionality of the encoded protein and

  3. Computer Simulation Tests of Feedback Error Learning Controller with IDM and ISM for Functional Electrical Stimulation in Wrist Joint Control

    Watanabe, Takashi; Sugi, Yoshihiro

    2010-01-01

    A feedforward controller would be useful for hybrid Functional Electrical Stimulation (FES) systems using powered orthotic devices. In this paper, a Feedback Error Learning (FEL) controller for FES (FEL-FES controller) was examined using an inverse statics model (ISM) together with an inverse dynamics model (IDM) to realize a feedforward FES controller. For FES application, the ISM was trained offline using training data obtained by PID control of very slow movements. Computer simulation tests ...

  4. Computationally simple, analytic, closed form solution of the Coulomb self-interaction problem in Kohn Sham density functional theory

    Gonis, Antonios; Daene, Markus W.; Nicholson, Don M.; Stocks, George Malcolm

    2012-01-01

    We have developed and tested, in terms of atomic calculations, an exact, analytic and computationally simple procedure for determining the functional derivative of the exchange energy with respect to the density in the implementation of the Kohn-Sham formulation of density functional theory (KS-DFT), providing an analytic, closed-form solution of the self-interaction problem in KS-DFT. We demonstrate the efficacy of our method through ground-state calculations of the exchange potential and energy for atomic He and Be, and comparisons with experiment and with the results obtained within the optimized effective potential (OEP) method.

  5. Bessel function expansion to reduce the calculation time and memory usage for cylindrical computer-generated holograms.

    Sando, Yusuke; Barada, Daisuke; Jackin, Boaz Jessie; Yatagai, Toyohiko

    2017-07-10

    This study proposes a method to reduce the calculation time and memory usage required for calculating cylindrical computer-generated holograms. The wavefront on the cylindrical observation surface is represented as a convolution integral in the 3D Fourier domain. The Fourier transformation of the kernel function involving this convolution integral is analytically performed using a Bessel function expansion. The analytical solution can drastically reduce the calculation time and the memory usage without any cost, compared with the numerical method using fast Fourier transform to Fourier transform the kernel function. In this study, we present the analytical derivation, the efficient calculation of Bessel function series, and a numerical simulation. Furthermore, we demonstrate the effectiveness of the analytical solution through comparisons of calculation time and memory usage.
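
    The abstract does not give its expansion explicitly, but the flavor of replacing a numerical Fourier transform by a Bessel series can be checked with the classical Jacobi-Anger identity:

        import numpy as np
        from scipy.special import jv

        # Jacobi-Anger expansion: exp(i z sin(theta)) = sum_n J_n(z) exp(i n theta).
        # Truncating once |n| exceeds |z| by a margin reproduces the kernel to
        # machine precision, which is why a Bessel series can replace an FFT.
        z = 7.3
        theta = np.linspace(0.0, 2.0 * np.pi, 64, endpoint=False)
        lhs = np.exp(1j * z * np.sin(theta))
        n = np.arange(-30, 31)
        rhs = (jv(n[:, None], z) * np.exp(1j * n[:, None] * theta)).sum(axis=0)
        print("max truncation error:", np.abs(lhs - rhs).max())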

  6. Maximize Minimum Utility Function of Fractional Cloud Computing System Based on Search Algorithm Utilizing the Mittag-Leffler Sum

    Rabha W. Ibrahim

    2018-01-01

    The maximum min utility function (MMUF) problem is an important representative of a large class of cloud computing systems (CCS), with numerous applications in practice, especially in economics and industry. This paper introduces an effective solution-based search (SBS) algorithm for solving the MMUF problem. First, we suggest a new formula for the utility function in terms of the capacity of the cloud. We formulate the capacity in CCS by using a fractional differintegral equation, which usually describes the flow of CCS. The new formula for the utility function modifies recently proposed utility functions. The suggested technique first creates a high-quality initial solution by eliminating the less promising components, and then improves the quality of the achieved solution by a summation search solution (SSS). The method employs the Mittag-Leffler sum as a hash function to determine the position of the agent. Experimental results on instances commonly utilized in the literature demonstrate that the proposed algorithm competes favorably with the state-of-the-art algorithms, both in terms of solution quality and computational efficiency.
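
    The Mittag-Leffler sum mentioned here has a simple series form that can be evaluated directly for moderate arguments. The truncation length below is an assumption; large arguments need asymptotic or integral representations instead:

        import numpy as np
        from scipy.special import gamma

        def mittag_leffler(z, alpha, beta=1.0, terms=80):
            """Truncated series E_{alpha,beta}(z) = sum_k z**k / Gamma(alpha*k + beta)."""
            k = np.arange(terms)
            return np.sum(z**k / gamma(alpha * k + beta))

        # Sanity checks against closed forms: E_1(z) = exp(z), E_2(z**2) = cosh(z)
        print(mittag_leffler(1.5, 1.0), np.exp(1.5))
        print(mittag_leffler(1.5**2, 2.0), np.cosh(1.5))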

  7. Can people with Alzheimer's disease improve their day-to-day functioning with a tablet computer?

    Imbeault, Hélène; Langlois, Francis; Bocti, Christian; Gagnon, Lise; Bier, Nathalie

    2018-07-01

    New technologies, such as tablet computers, present great potential to support the day-to-day living of persons with Alzheimer's disease (AD). However, whether people with AD can learn how to use a tablet properly in daily life remains to be demonstrated. A single case study was conducted with a 65-year-old woman with AD. A specific and structured intervention tailored to her needs was conceptualised for the use of a calendar application on a tablet computer according to the following learning stages: Acquisition, Application and Adaptation. In spite of her severe episodic memory deficit, she showed progressive learning of the tablet application during the intervention phase. Furthermore, data compiled over 12 months post-use show that she used the tablet successfully in her day-to-day life. She was even able to transfer her newly acquired ability to other available applications designed to monitor regular purchases, consult various recipes and play games. Tablet computers thereby offer a promising avenue for cognitive rehabilitation for persons with AD. This success was mainly achieved through a one-on-one individual programme tailored to this person. The limits and constraints of utilising tablet computers for persons with AD are discussed.

  8. Computers, Mass Media, and Schooling: Functional Equivalence in Uses of New Media.

    Lieberman, Debra A.; And Others

    1988-01-01

    Presents a study of 156 California eighth grade students which contrasted their recreational and intellectual computer use in terms of academic performance and use of other media. Among the conclusions were that recreational users watched television heavily and performed poorly in school, whereas intellectual users watched less television,…

  9. The Support Reduction Algorithm for Computing Non-Parametric Function Estimates in Mixture Models

    GROENEBOOM, PIET; JONGBLOED, GEURT; WELLNER, JON A.

    2008-01-01

    In this paper, we study an algorithm (which we call the support reduction algorithm) that can be used to compute non-parametric M-estimators in mixture models. The algorithm is compared with natural competitors in the context of convex regression and the ‘Aspect problem’ in quantum physics.

  10. Computational models for interpretation of wave function imaging in cross-sectional STM of quantum dots

    Maksym, P.A.; Roy, M.; Wijnheijmer, A.P.; Koenraad, P.M.

    2008-01-01

    Computational models are used to investigate the role of electron-electron interactions in cross-sectional STM of cleaved quantum dots. If correlation effects are weak, the tunnelling current reflects the nodal structure of the non-interacting dot states. If correlation is strong, peaks in the

  11. Attention and executive functions computer training for attention-deficit/hyperactivity disorder (ADHD)

    Bikic, Aida; Leckman, James F; Christensen, Torben Ø

    2018-01-01

    and both groups received treatment as usual and were assessed in regard to cognitive functions, symptoms, behavioral and functional outcome measures after 8, 12 and 24 weeks. There was no significant effect on the primary outcome, sustained attention (β = -0.047; CI -0.247 to 0.153) or the secondary...

  12. A hybrid method for the parallel computation of Green’s functions

    Petersen, Dan Erik; Li, Song; Stokbro, Kurt

    2009-01-01

    Quantum transport models for nanodevices using the non-equilibrium Green’s function method require the repeated calculation of the block tridiagonal part of the Green’s and lesser Green’s function matrices. This problem is related to the calculation of the inverse of a sparse matrix. Because of t...
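
    The repeated block-tridiagonal inversion mentioned here is usually handled with recursive (RGF-style) sweeps. The sketch below is a generic serial version, not the authors' parallel hybrid; it recovers the diagonal blocks of G = (E*I - H)^{-1} and verifies them against a dense inverse on an illustrative random H:

        import numpy as np

        def rgf_diagonal_blocks(E, H, nb, bs):
            """Diagonal blocks of (E*I - H)^{-1} for block-tridiagonal H
            via left/right recursive sweeps; nb blocks of size bs."""
            I = np.eye(bs)
            blk = lambda i, j: H[i*bs:(i+1)*bs, j*bs:(j+1)*bs]
            gL = [None] * nb          # left-connected Green's functions
            gL[0] = np.linalg.inv(E * I - blk(0, 0))
            for i in range(1, nb):
                gL[i] = np.linalg.inv(E * I - blk(i, i)
                                      - blk(i, i-1) @ gL[i-1] @ blk(i-1, i))
            gR = [None] * nb          # right-connected Green's functions
            gR[-1] = np.linalg.inv(E * I - blk(nb-1, nb-1))
            for i in range(nb - 2, -1, -1):
                gR[i] = np.linalg.inv(E * I - blk(i, i)
                                      - blk(i, i+1) @ gR[i+1] @ blk(i+1, i))
            G = []
            for i in range(nb):       # combine both sweeps into full G_ii
                sigma = np.zeros((bs, bs), dtype=H.dtype)
                if i > 0:
                    sigma += blk(i, i-1) @ gL[i-1] @ blk(i-1, i)
                if i < nb - 1:
                    sigma += blk(i, i+1) @ gR[i+1] @ blk(i+1, i)
                G.append(np.linalg.inv(E * I - blk(i, i) - sigma))
            return G

        rng = np.random.default_rng(4)
        nb, bs = 6, 3
        H = np.zeros((nb*bs, nb*bs), dtype=complex)
        for i in range(nb):
            d = rng.normal(size=(bs, bs))
            H[i*bs:(i+1)*bs, i*bs:(i+1)*bs] = d + d.T
        for i in range(nb - 1):
            o = rng.normal(size=(bs, bs))
            H[i*bs:(i+1)*bs, (i+1)*bs:(i+2)*bs] = o
            H[(i+1)*bs:(i+2)*bs, i*bs:(i+1)*bs] = o.T
        E = 0.5 + 1e-3j                     # energy with small imaginary part
        G_dense = np.linalg.inv(E * np.eye(nb*bs) - H)
        G_rgf = rgf_diagonal_blocks(E, H, nb, bs)
        err = max(np.abs(G_rgf[i] - G_dense[i*bs:(i+1)*bs, i*bs:(i+1)*bs]).max()
                  for i in range(nb))
        print("max |RGF - dense| on diagonal blocks:", err)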

  13. Computation of piecewise affine terminal cost functions for model predictive control

    Brunner, F.D.; Lazar, M.; Allgöwer, F.; Fränzle, Martin; Lygeros, John

    2014-01-01

    This paper proposes a method for the construction of piecewise affine terminal cost functions for model predictive control (MPC). The terminal cost function is constructed on a predefined partition by solving a linear program for a given piecewise affine system, a stabilizing piecewise affine

  14. Applying Computational Scoring Functions to Assess Biomolecular Interactions in Food Science: Applications to the Estrogen Receptors

    Francesca Spyrakis

    2016-10-01

    Thus, key computational medicinal chemistry methods like molecular dynamics can be used to decipher protein flexibility and to obtain stable models for docking and scoring in food-related studies, and virtual screening is increasingly being applied to identify molecules with potential to act as endocrine disruptors, food mycotoxins, and new nutraceuticals [3,4,5]. All of these methods and simulations are based on protein-ligand interaction phenomena, and represent the basis for any subsequent modification of the targeted receptor's or enzyme's physiological activity. We describe here the energetics of binding of biological complexes, providing a survey of the most common and successful algorithms used in evaluating these energetics, and we report case studies in which computational techniques have been applied to food science issues. In particular, we explore a handful of studies involving the estrogen receptors for which we have a long-term interest.

  15. How to maintain hundreds of computers offering different functionalities with only 2 system administrators

    Krempaska, R.; Bertrand, A.; Higgs, C.; Kapeller, R.; Lutz, H.; Provenzano, M.

    2012-01-01

    At the Paul Scherrer Institute, the control systems of our large research facilities are maintained by the Controls section. These facilities include two proton accelerators (HIPA and PROSCAN) and two electron accelerators (SLS and the Injector Test Facility of the future SwissFEL), as well as the control systems of all their related beamlines and test facilities. The control system configuration and applications for each facility are stored on independent NFS file servers. The total number of Linux computers and servers is about 500. Since only two system administrators are responsible for their installation, configuration and maintenance, we have adopted a well-defined solution that relies on three ideas: virtualization; a unified operating system installation and update mechanism; and automatic configuration by a common tool (Puppet). This paper describes the methods and tools which are used to develop and maintain the challenging computing infrastructure deployed by the Controls section

  16. Using speech recognition to enhance the Tongue Drive System functionality in computer access.

    Huo, Xueliang; Ghovanloo, Maysam

    2011-01-01

    Tongue Drive System (TDS) is a wireless tongue operated assistive technology (AT), which can enable people with severe physical disabilities to access computers and drive powered wheelchairs using their volitional tongue movements. TDS offers six discrete commands, simultaneously available to the users, for pointing and typing as a substitute for mouse and keyboard in computer access, respectively. To enhance the TDS performance in typing, we have added a microphone, an audio codec, and a wireless audio link to its readily available 3-axial magnetic sensor array, and combined it with a commercially available speech recognition software, the Dragon Naturally Speaking, which is regarded as one of the most efficient ways for text entry. Our preliminary evaluations indicate that the combined TDS and speech recognition technologies can provide end users with significantly higher performance than using each technology alone, particularly in completing tasks that require both pointing and text entry, such as web surfing.

  17. Metaanalysis of Diagnostic Performance of Computed Coronary Tomography Angiography, Computed Tomography Perfusion and Computed Tomography-Fractional Flow Reserve in Functional Myocardial Ischemia Assessment versus Invasive Fractional Flow Reserve

    Gonzalez, Jorge A.; Lipinski, Michael J.; Flors, Lucia F.; Shaw, Peter; Kramer, Christopher M.; Salerno, Michael

    2015-01-01

    We sought to compare the diagnostic performance of coronary computed tomography angiography (CCTA), computed tomography perfusion (CTP) and computed tomography fractional flow reserve (CT-FFR) for assessing the functional significance of coronary stenosis as defined by invasive fractional flow reserve (FFR) in patients with known or suspected coronary artery disease. CCTA has proven clinically useful for excluding obstructive CAD due to its high sensitivity and negative predictive value (NPV); however, the ability of CCTA to identify functionally significant CAD has remained challenging. We searched PubMed/Medline for studies evaluating CCTA, CTP or CT-FFR for the non-invasive detection of obstructive CAD as compared to catheter-derived FFR as the reference standard. Pooled sensitivity, specificity, PPV, NPV, likelihood ratios (LR) and odds ratio (OR) of all diagnostic tests were assessed. Eighteen studies involving a total of 1535 patients were included. CCTA demonstrated a pooled sensitivity of 0.92, specificity of 0.43, PPV of 0.56 and NPV of 0.87 on a per-patient level. CT-FFR and CTP increased the specificity to 0.72 and 0.77, respectively (P=0.004 and P=0.0009), resulting in higher point estimates for PPV of 0.70 and 0.83, respectively. There was no improvement in sensitivity. The CTP protocol involved more radiation (3.5 mSv for CCTA vs. 9.6 mSv for CTP) and a higher volume of iodinated contrast (145 mL). In conclusion, CTP and CT-FFR improve the specificity of CCTA for detecting functionally significant stenosis as defined by invasive FFR on a per-patient level; both techniques could advance the ability to non-invasively detect the functional significance of coronary lesions. PMID:26347004
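
    The per-patient trade-off in these pooled numbers can be made concrete with Bayes' rule. In the sketch below the 50% disease prevalence is an assumption, and the CT-FFR/CTP sensitivities are set equal to the pooled CCTA value of 0.92 because the abstract reports no sensitivity improvement:

        # Predictive values from sensitivity/specificity via Bayes' rule
        def predictive_values(sens, spec, prev):
            ppv = sens * prev / (sens * prev + (1 - spec) * (1 - prev))
            npv = spec * (1 - prev) / (spec * (1 - prev) + (1 - sens) * prev)
            return ppv, npv

        for name, sens, spec in [("CCTA", 0.92, 0.43), ("CT-FFR", 0.92, 0.72), ("CTP", 0.92, 0.77)]:
            ppv, npv = predictive_values(sens, spec, prev=0.5)
            print(f"{name}: PPV {ppv:.2f}, NPV {npv:.2f}")

    Raising specificity at fixed sensitivity lifts PPV while leaving NPV nearly unchanged, which is exactly the pattern the meta-analysis reports.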

  18. Reexamining Computational Support for Intelligence Analysis: A Functional Design for a Future Capability

    2016-07-14

    Approaches to Computational Support. 6.1 Paradigms and Methods: In today's open-world environment, historical paradigms and methods that rely on deep ... focused on methods of this type, with a deep basis in argumentation-based principles. One clear example of these remarks is shown in the writing of ... Proceedings of the 5th 2013 Forum on Information Retrieval Evaluation. [33] Dan Roth and Wen-tau Yih. Integer linear programming inference for...

  19. Transaction processing in the common node of a distributed function laboratory computer system

    Stubblefield, F.W.; Dimmler, D.G.

    1975-01-01

    A computer network architecture consisting of a common node processor for managing peripherals and files and a number of private node processors for laboratory experiment control is briefly reviewed. Central to the problem of private node-common node communication is the concept of a transaction. The collection of procedures and the data structure associated with a transaction are described. The common node properties assigned to a transaction and procedures required for its complete processing are discussed. (U.S.)

  20. File management for experiment control parameters within a distributed function computer network

    Stubblefield, F.W.

    1976-10-01

    An attempt to design and implement a computer system for control of and data collection from a set of laboratory experiments reveals that many of the experiments in the set require an extensive collection of parameters for their control. The operation of the experiments can be greatly simplified if a means can be found for storing these parameters between experiments and automatically accessing them as they are required. A subsystem for managing files of such experiment control parameters is discussed. 3 figures

  1. Use of 4-Dimensional Computed Tomography-Based Ventilation Imaging to Correlate Lung Dose and Function With Clinical Outcomes

    Vinogradskiy, Yevgeniy; Castillo, Richard; Castillo, Edward; Tucker, Susan L.; Liao, Zhongxing; Guerrero, Thomas; Martel, Mary K.

    2013-01-01

    Purpose: Four-dimensional computed tomography (4DCT)-based ventilation is an emerging imaging modality that can be used in the thoracic treatment planning process. The clinical benefit of using ventilation images in radiation treatment plans remains to be tested. The purpose of the current work was to test the potential benefit of using ventilation in treatment planning by evaluating whether dose to highly ventilated regions of the lung resulted in increased incidence of clinical toxicity. Methods and Materials: Pretreatment 4DCT data were used to compute pretreatment ventilation images for 96 lung cancer patients. Ventilation images were calculated using 4DCT data, deformable image registration, and a density-change based algorithm. Dose–volume and ventilation-based dose function metrics were computed for each patient. The ability of the dose–volume and ventilation-based dose–function metrics to predict for severe (grade 3+) radiation pneumonitis was assessed using logistic regression analysis, area under the curve (AUC) metrics, and bootstrap methods. Results: A specific patient example is presented that demonstrates how incorporating ventilation-based functional information can help separate patients with and without toxicity. The logistic regression significance values were all lower for the dose–function metrics (range P=.093-.250) than for their dose–volume equivalents (range, P=.331-.580). The AUC values were all greater for the dose–function metrics (range, 0.569-0.620) than for their dose–volume equivalents (range, 0.500-0.544). Bootstrap results revealed an improvement in model fit using dose–function metrics compared to dose–volume metrics that approached significance (range, P=.118-.155). Conclusions: To our knowledge, this is the first study that attempts to correlate lung dose and 4DCT ventilation-based function to thoracic toxicity after radiation therapy. Although the results were not significant at the .05 level, our data suggests
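
    A toy version of the statistical comparison above, with simulated data standing in for the dose-volume and ventilation-weighted dose-function metrics, and scikit-learn replacing the authors' bootstrap machinery:

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(5)
        n = 96
        mean_dose = rng.uniform(5.0, 25.0, n)          # Gy, dose-volume style metric
        vent = rng.uniform(0.5, 1.5, n)                # relative ventilation weighting
        dose_function = mean_dose * vent               # ventilation-weighted dose metric
        p_tox = 1.0 / (1.0 + np.exp(-(dose_function - 20.0) / 3.0))
        pneumonitis = rng.random(n) < p_tox            # simulated grade 3+ events

        for name, metric in [("dose-volume", mean_dose), ("dose-function", dose_function)]:
            X = metric.reshape(-1, 1)
            model = LogisticRegression().fit(X, pneumonitis)
            auc = roc_auc_score(pneumonitis, model.predict_proba(X)[:, 1])
            print(f"{name}: AUC = {auc:.3f}")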

  2. Development of microgravity, full body functional reach envelope using 3-D computer graphic models and virtual reality technology

    Lindsey, Patricia F.

    1994-01-01

    In microgravity conditions mobility is greatly enhanced and body stability is difficult to achieve. Because of these difficulties, optimum placement and accessibility of objects and controls can be critical to required tasks on board shuttle flights or on the proposed space station. Anthropometric measurement of the maximum reach of occupants of a microgravity environment provides knowledge about maximum functional placement for tasking situations. Calculations for a full-body functional reach envelope for microgravity environments are therefore imperative. To this end, three-dimensional computer-modeled human figures, providing a method of anthropometric measurement, were used to locate the data points that define the full-body functional reach envelope. Virtual reality technology was utilized to enable an occupant of the microgravity environment to experience movement within the reach envelope while immersed in a simulated microgravity environment.

  3. When can Empirical Green Functions be computed from Noise Cross-Correlations? Hints from different Geographical and Tectonic environments

    Matos, Catarina; Silveira, Graça; Custódio, Susana; Domingues, Ana; Dias, Nuno; Fonseca, João F. B.; Matias, Luís; Krueger, Frank; Carrilho, Fernando

    2014-05-01

    Noise cross-correlations are now widely used to extract Green functions between station pairs. But, do all the cross-correlations routinely computed produce successful Green Functions? What is the relationship between noise recorded in a couple of stations and the cross-correlation between them? During the last decade, we have been involved in the deployment of several temporary dense broadband (BB) networks within the scope of both national projects and international collaborations. From 2000 to 2002, a pool of 8 BB stations continuously operated in the Azores in the scope of the Memorandum of Understanding COSEA (COordinated Seismic Experiment in the Azores). Thanks to the Project WILAS (West Iberia Lithosphere and Astenosphere Structure, PTDC/CTE-GIX/097946/2008) we temporarily increased the number of BB deployed in mainland Portugal to more than 50 (permanent + temporary) during the period 2010 - 2012. In 2011/12 a temporary pool of 12 seismometers continuously recorded BB data in the Madeira archipelago, as part of the DOCTAR (Deep Ocean Test Array Experiment) project. Project CV-PLUME (Investigation on the geometry and deep signature of the Cape Verde mantle plume, PTDC/CTE-GIN/64330/2006) covered the archipelago of Cape Verde, North Atlantic, with 40 temporary BB stations in 2007/08. Project MOZART (Mozambique African Rift Tomography, PTDC/CTE-GIX/103249/2008), covered Mozambique, East Africa, with 30 temporary BB stations in the period 2011 - 2013. These networks, located in very distinct geographical and tectonic environments, offer an interesting opportunity to study seasonal and spatial variations of noise sources and their impact on Empirical Green functions computed from noise cross-correlation. Seismic noise recorded at different seismic stations is evaluated by computation of the probability density functions of power spectral density (PSD) of continuous data. To assess seasonal variations of ambient noise sources in frequency content, time-series of
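
    The principle behind extracting Green functions from noise can be sketched in a few lines: two receivers recording a common random source with a propagation delay yield, after stacking cross-correlations over many windows, a peak at the inter-station travel time. All numbers below are synthetic:

        import numpy as np
        from scipy.signal import correlate, correlation_lags

        rng = np.random.default_rng(6)
        fs, delay = 100.0, 37                 # sampling rate (Hz), delay (samples)
        nwin, nlen = 200, 2048                # number of windows, window length
        stack = None
        for _ in range(nwin):
            src = rng.standard_normal(nlen + delay)              # common noise source
            a = src[delay:] + 0.5 * rng.standard_normal(nlen)    # station A
            b = src[:nlen] + 0.5 * rng.standard_normal(nlen)     # station B, delayed
            cc = correlate(b, a, mode="full")
            stack = cc if stack is None else stack + cc
        lags = correlation_lags(nlen, nlen, mode="full")
        print("recovered travel time (s):", lags[np.argmax(stack)] / fs)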

  4. RNAdualPF: software to compute the dual partition function with sample applications in molecular evolution theory.

    Garcia-Martin, Juan Antonio; Bayegan, Amir H; Dotu, Ivan; Clote, Peter

    2016-10-19

    RNA inverse folding is the problem of finding one or more sequences that fold into a user-specified target structure s0, i.e. whose minimum free energy secondary structure is identical to the target s0. Here we consider the ensemble of all RNA sequences that have low free energy with respect to a given target s0. We introduce the program RNAdualPF, which computes the dual partition function Z*, defined as the sum of Boltzmann factors exp(-E(a,s0)/RT) of all RNA nucleotide sequences a compatible with target structure s0. Using RNAdualPF, we efficiently sample RNA sequences that approximately fold into s0, where additionally the user can specify IUPAC sequence constraints at certain positions, and whether to include dangles (energy terms for stacked, single-stranded nucleotides). Moreover, since we also compute the dual partition function Z*(k) over all sequences having GC-content k, the user can require that all sampled sequences have a precise, specified GC-content. Using Z*, we compute the dual expected energy ⟨E*⟩, and use it to show that natural RNAs from the Rfam 12.0 database have higher minimum free energy than expected, thus suggesting that functional RNAs are under evolutionary pressure to be only marginally thermodynamically stable. We show that C. elegans precursor microRNA (pre-miRNA) is significantly non-robust with respect to mutations, by comparing the robustness of each wild type pre-miRNA sequence with 2000 [resp. 500] sequences of the same GC-content generated by RNAdualPF, which approximately [resp. exactly] fold into the wild type target structure. We confirm and strengthen earlier findings that precursor microRNAs and bacterial small noncoding RNAs display plasticity, a measure of structural diversity. We describe RNAdualPF, which rapidly computes the dual partition function Z* and samples sequences having low energy with respect to a target structure, allowing sequence constraints and a specified GC-content.
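
    For very short targets the dual partition function can be checked by brute force, which makes the definition above concrete. The sketch below enumerates all sequences for a 7-nt toy target and accumulates exp(-E(a,s0)/RT), also binned by GC-content; the base-pair energy model is a crude stand-in (RNAdualPF uses the full nearest-neighbor energy model), so the numbers are purely illustrative.

        import itertools
        import math

        RT = 0.616  # kcal/mol near 37 C
        PAIR_E = {("G", "C"): -3.0, ("C", "G"): -3.0, ("A", "U"): -2.0,
                  ("U", "A"): -2.0, ("G", "U"): -1.0, ("U", "G"): -1.0}

        def dual_partition(pairs, n):
            """Z* = sum over all length-n sequences of exp(-E(a, s0)/RT),
            with a toy energy: one term per base pair of the fixed target s0."""
            z, z_by_gc = 0.0, {}
            for seq in itertools.product("ACGU", repeat=n):
                e = sum(PAIR_E.get((seq[i], seq[j]), 8.0) for i, j in pairs)
                w = math.exp(-e / RT)
                z += w
                gc = sum(c in "GC" for c in seq)
                z_by_gc[gc] = z_by_gc.get(gc, 0.0) + w
            return z, z_by_gc

        # Target "((...))": pairs (0,6) and (1,5) on a 7-nt sequence.
        z, z_by_gc = dual_partition([(0, 6), (1, 5)], 7)
        print(f"Z* = {z:.3g}; restricted to GC-content 4: Z*(4) = {z_by_gc[4]:.3g}")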

  5. Computational studies at the density functional theory (DFT) level about the surface functionalization of hexagonal monolayers by chitosan monomer

    Ebrahimi, Javad; Ahangari, Morteza Ghorbanzadeh; Jahanshahi, Mohsen

    2018-05-01

    Theoretical investigations based on density functional theory have been carried out to understand the underlying interactions between the chitosan monomer and several types of hexagonal monolayers consisting of pristine and defected graphene and boron-nitride nanosheets. Based on the obtained results, it was found that the interaction for all the systems is of non-covalent nature and the chitosan monomer physically interacts with the surface of the mentioned nanostructures. The interaction strength was evaluated by calculating the adsorption energies for the considered systems; the adsorption of the chitosan monomer is accompanied by adsorption energies of about -0.67 and -0.66 eV for pristine graphene and the h-BN monolayer, respectively. The role of structural defects has also been considered by embedding a Stone-Wales defect within the structure of the mentioned monolayers, and it was found that the introduced defect enhances the interactions between the chitosan monomer and the nanostructures. The role of dispersion interactions has also been taken into account: these long-range interactions play the dominating role in the attachment of the chitosan monomer onto the graphene sheet, while contributing strongly, together with electrostatic interactions, to the stabilization of chitosan onto the surface of the h-BN monolayer. For all the cases, the adsorption of the chitosan monomer did not change the inherent electronic properties of the nanostructures, based on the results of charge-transfer analysis and energy-gap calculations. The findings of the present work should be useful in future investigations exploring the potential applications of these hybrid materials in materials science and bio-related fields.
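
    The adsorption energies quoted above follow the standard supermolecular definition, E_ads = E(surface+molecule) - E(surface) - E(molecule), where a negative value indicates favorable binding. A one-line sketch with hypothetical DFT total energies (the numbers are placeholders, not values from this study):

        def adsorption_energy(e_complex: float, e_surface: float, e_molecule: float) -> float:
            """E_ads = E(surface+molecule) - E(surface) - E(molecule), in eV."""
            return e_complex - e_surface - e_molecule

        # Hypothetical totals chosen so that E_ads = -0.67 eV (physisorption range).
        print(adsorption_energy(-1250.67, -1100.00, -150.00))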

  6. An atomic orbital based real-time time-dependent density functional theory for computing electronic circular dichroism band spectra

    Goings, Joshua J.; Li, Xiaosong, E-mail: xsli@uw.edu [Department of Chemistry, University of Washington, Seattle, Washington 98195 (United States)

    2016-06-21

    One of the challenges of interpreting electronic circular dichroism (ECD) band spectra is that different states may have different rotatory strength signs, determined by their absolute configuration. If the states are closely spaced and opposite in sign, observed transitions may be washed out by nearby states, unlike absorption spectra, where transitions are always positive and additive. To accurately compute ECD bands, it is necessary to compute a large number of excited states, which may be prohibitively costly if one uses the linear-response time-dependent density functional theory (TDDFT) framework. Here we implement a real-time, atomic-orbital based TDDFT method for computing the entire ECD spectrum simultaneously. The method is advantageous for large systems with a high density of states. In contrast to previous implementations based on real-space grids, the method is variational, independent of nuclear orientation, and does not rely on pseudopotential approximations, making it suitable for computation of chiroptical properties well into the X-ray regime.

  7. Effects of surface functionalization on the electronic and structural properties of carbon nanotubes: A computational approach

    Ribeiro, M. S.; Pascoini, A. L.; Knupp, W. G.; Camps, I.

    2017-12-01

    Carbon nanotubes (CNTs) have important electronic, mechanical and optical properties. These properties can differ between a pristine nanotube and one whose surface has been functionalized, and the changes can be exploited in areas of research and application such as the construction of nanodevices that act as sensors and filters. Following this idea, in the current work we present the results of a systematic study of CNTs surface-functionalized with hydroxyl and carboxyl groups. Using entropy as the selection criterion, we filtered a library of 10k stochastically generated complexes for each functional-group concentration (5, 10, 15, 20 and 25%). The structurally related parameters (root-mean-square deviation, entropy, and volume/area) have a monotonic relationship with functionalization concentration. In contrast, the electronic parameters (frontier molecular orbital energies, electronic gap, molecular hardness, and electrophilicity index) present an oscillatory behavior. For a set of concentrations, the nanotubes present spin-polarized properties that can be used in spintronics.

  8. INTEGRATION OF INFORMATIONAL COMPUTER TECHNOLOGIES SMK: AUTOMATION OF THE MAIN FUNCTIONS OF THE TECHNICAL CONTROL DEPARTMENT

    S. A. Pavlenko

    2010-01-01

    It is shown that automation of some functions of the technical control department makes it possible to record defects, reclamations and technology failures, and to produce the necessary reporting forms and quality certificates for production.

  9. Motor circuit computer model based on studies of functional Nuclear Magnetic Resonance

    Garcia Ramo, Karla Batista; Rodriguez Rojas, Rafael; Carballo Barreda, Maylen

    2012-01-01

    The basal ganglia are a complex network of subcortical nuclei involved in motor control, sensorimotor integration, and cognitive processes. Their functioning and interaction with other cerebral structures remains a subject of debate. The aim of the present work was to simulate the interaction of the basal ganglia-thalamus-cortex circuitry in motor program selection, supported by the functional connectivity pattern obtained by functional nuclear magnetic resonance imaging. Determining the connection weights between neural populations from functional magnetic resonance imaging contributed to a more realistic formulation of the model, and consequently to results similar to clinical and experimental data. The network made it possible to describe the participation of the basal ganglia in motor program selection and the changes seen in Parkinson disease. The simulation demonstrated that dopamine depletion above 40% leads to a loss of action selection capability, and reflected the system's ability to adapt and compensate for dysfunction in Parkinson disease, consistent with experimental and clinical studies

  10. A method for computing the stationary points of a function subject to linear equality constraints

    Uko, U.L.

    1989-09-01

    We give a new method for the numerical calculation of stationary points of a function when it is subject to equality constraints. An application to the solution of linear equations is given, together with a numerical example. (author). 5 refs
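
    For the common special case of a quadratic objective, the stationarity conditions with Lagrange multipliers reduce to one saddle-point (KKT) linear system, which provides a compact baseline against which such methods can be compared. The numpy sketch below is this textbook baseline, not the author's method.

        import numpy as np

        def constrained_stationary_point(H, g, A, b):
            """Stationary point of f(x) = 0.5 x^T H x + g^T x  subject to  A x = b.

            Setting the gradient of the Lagrangian to zero gives
                [H  A^T] [x  ]   [-g]
                [A   0 ] [lam] = [ b]
            """
            n, m = H.shape[0], A.shape[0]
            kkt = np.block([[H, A.T], [A, np.zeros((m, m))]])
            sol = np.linalg.solve(kkt, np.concatenate([-g, b]))
            return sol[:n], sol[n:]

        H = np.array([[4.0, 1.0], [1.0, 3.0]])
        g = np.array([-1.0, -2.0])
        A = np.array([[1.0, 1.0]])          # constraint: x0 + x1 = 1
        b = np.array([1.0])
        x, lam = constrained_stationary_point(H, g, A, b)
        print(x, lam)                       # x = [0.2, 0.8], lam = [-0.6]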

  11. BrEPS: a flexible and automatic protocol to compute enzyme-specific sequence profiles for functional annotation

    Schomburg D

    2010-12-01

    Background: Models for the simulation of metabolic networks require the accurate prediction of enzyme function. Based on a genomic sequence, enzymatic functions of gene products are today mainly predicted by sequence database searching and operon analysis. Other methods can support these techniques: we have developed an automatic method "BrEPS" that creates highly specific sequence patterns for the functional annotation of enzymes. Results: The enzymes in the UniprotKB are identified and their sequences compared against each other with BLAST. The enzymes are then clustered into a number of trees, where each tree node is associated with a set of EC-numbers. The enzyme sequences in the tree nodes are aligned with ClustalW. The conserved columns of the resulting multiple alignments are used to construct sequence patterns. In the last step, we verify the quality of the patterns by computing their specificity. Patterns with low specificity are omitted and recomputed further down in the tree. The final high-quality patterns can be used for functional annotation. We ran our protocol on a recent Swiss-Prot release and show statistics, as well as a comparison to PRIAM, a probabilistic method that is also specialized in the functional annotation of enzymes. We determine the amount of true positive annotations for five common microorganisms with data from BRENDA and AMENDA serving as the standard of truth. BrEPS is almost on par with PRIAM, a fact which we discuss in the context of five manually investigated cases. Conclusions: Our protocol computes highly specific sequence patterns that can be used to support the functional annotation of enzymes. The main advantages of our method are that it is automatic and unsupervised, and quite fast once the patterns are evaluated. The results show that BrEPS can be a valuable addition to the reconstruction of metabolic networks.
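
    The conserved-column step of such a pipeline can be illustrated in a few lines: a toy multiple alignment is converted into a PROSITE-like regular expression, with gapped or highly variable columns mapped to wildcards. The alignment, threshold and motif below are invented for illustration; this is not the BrEPS code.

        import re

        def alignment_to_pattern(alignment, max_variants=3):
            """Regex from alignment columns: literal or character class for
            conserved, gap-free columns; '.' otherwise."""
            parts = []
            for col in zip(*alignment):
                residues = set(col)
                if "-" in residues or len(residues) > max_variants:
                    parts.append(".")
                elif len(residues) == 1:
                    parts.append(col[0])
                else:
                    parts.append("[" + "".join(sorted(residues)) + "]")
            return "".join(parts)

        aln = ["GHSLGAVA", "GHSMGGVA", "GHS-GALA"]   # toy alignment of a motif
        pattern = alignment_to_pattern(aln)          # -> 'GHS.G[AG][LV]A'
        print(pattern, bool(re.search(pattern, "MKGHSQGAVAW")))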

  12. Computational structural and functional analysis of hypothetical proteins of Staphylococcus aureus

    Mohan, Ramadevi; Venugopal, Subhashree

    2012-01-01

    Genome sequencing projects have led to an explosion of gene products, many of which are hypothetical proteins with unknown function. Analyzing and annotating the functions of hypothetical proteins is important in Staphylococcus aureus, a pathogenic bacterium that causes multiple types of disease by infecting various sites in humans and animals. In this study, ten hypothetical proteins of Staphylococcus aureus were retrieved from NCBI and analyzed for their structural ...

  13. Computing the real-time Green's Functions of large Hamiltonian matrices

    Iitaka, Toshiaki

    1998-01-01

    A numerical method is developed for calculating the real-time Green's functions of very large sparse Hamiltonian matrices, which exploits the numerical solution of the inhomogeneous time-dependent Schroedinger equation. The method has a clear-cut structure reflecting the most naive definition of the Green's functions, and is very suitable for parallel and vector supercomputers. The effectiveness of the method is illustrated by applying it to simple lattice models. An application of this method...
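
    The "most naive definition" referred to above is G_jk(t) = -i θ(t) ⟨j| exp(-iHt) |k⟩, which can be evaluated directly by propagating a basis state under the time-dependent Schroedinger equation and projecting at each step. A scipy sketch on a sparse tight-binding chain (the model, sizes and time step are arbitrary illustrative choices, not those of the paper):

        import numpy as np
        from scipy.sparse import diags
        from scipy.sparse.linalg import expm_multiply

        # Tight-binding chain, hbar = 1: H = -t (|n><n+1| + h.c.).
        n_sites, hop, dt, n_steps = 200, 1.0, 0.05, 400
        H = diags([-hop, -hop], [-1, 1], shape=(n_sites, n_sites), format="csc")

        j, k = 100, 100
        psi = np.zeros(n_sites, dtype=complex)
        psi[k] = 1.0                                 # start from |k>
        green = np.empty(n_steps, dtype=complex)
        for step in range(n_steps):
            psi = expm_multiply(-1j * dt * H, psi)   # short-time propagation
            green[step] = -1j * psi[j]               # G_jk(t) at t = (step+1)*dt
        # A Fourier transform of `green` approximates the local density of states at j.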

  14. Density functional computational studies on the glucose and glycine Maillard reaction: Formation of the Amadori rearrangement products

    Jalbout, Abraham F.; Roy, Amlan K.; Shipar, Abul Haider; Ahmed, M. Samsuddin

    Theoretical energy changes of various intermediates leading to the formation of the Amadori rearrangement products (ARPs) under different mechanistic assumptions have been calculated by using open-chain glucose (O-Glu)/closed-chain glucose (A-Glu and B-Glu) and glycine (Gly) as a model for the Maillard reaction. Density functional theory (DFT) computations have been applied to the proposed mechanisms under different pH conditions. Thus, the possibility of the formation of different compounds and the electronic energy changes for different steps in the proposed mechanisms have been evaluated. B-Glu has been found to be more efficient than A-Glu, and A-Glu more efficient than O-Glu in the reaction. The reaction under basic conditions is the most favorable for the formation of ARPs. Other reaction pathways have been computed and discussed in this work.

  15. Computer versus Compensatory Calendar Training in Individuals with Mild Cognitive Impairment: Functional Impact in a Pilot Study.

    Chandler, Melanie J; Locke, Dona E C; Duncan, Noah L; Hanna, Sherrie M; Cuc, Andrea V; Fields, Julie A; Hoffman Snyder, Charlene R; Lunde, Angela M; Smith, Glenn E

    2017-09-06

    This pilot study examined the functional impact of computerized versus compensatory calendar training in cognitive rehabilitation participants with mild cognitive impairment (MCI). Fifty-seven participants with amnestic MCI completed randomly assigned calendar or computer training. A standard care control group was used for comparison. Measures of adherence, memory-based activities of daily living (mADLs), and self-efficacy were completed. The calendar training group demonstrated significant improvement in mADLs compared to controls, while the computer training group did not. Calendar training may be more effective in improving mADLs than computerized intervention. However, this study highlights how behavioral trials with fewer than 30-50 participants per arm are likely underpowered, resulting in seemingly null findings.
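
    The closing remark about sample size follows from the standard normal-approximation power formula for a two-arm comparison, n per arm ≈ 2 (z_{1-α/2} + z_{1-β})² / d² for effect size d. A back-of-envelope sketch (the effect sizes are generic benchmarks, not estimates from this trial):

        from scipy.stats import norm

        def n_per_arm(d, alpha=0.05, power=0.80):
            """Approximate n per arm to detect Cohen's d in a two-sample test."""
            z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
            return 2 * (z / d) ** 2

        print(round(n_per_arm(0.5)))    # medium effect: ~63 per arm
        print(round(n_per_arm(0.72)))   # ~30 per arm only suffices for large effects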

  16. Computed functional analysis of 99mTc EHIDA kinetics in patients

    Blaha, V.; Cihak, I.; Nicek, F.; Horak, J.

    1987-01-01

    A method is presented for analysing the kinetics of EHIDA (diethyl-imino-acetanilido-diacetic acid) in patients, particularly its kinetics in the hepatic parenchyma. A group of 367 patients with different hepatobiliary or other gastrointestinal diseases was examined, and each study was quantified either in full or at least partially. The scintigraphy is performed with several small modifications of the commonly known methods. The hepatic curve is analysed by a computer programme. The results obtained in the whole group of patients were submitted to a statistical evaluation to obtain general conclusions. (M.E.L.)

  17. Computer-aided training sensorimotor cortex functions in humans before the upper limb transplantation using virtual reality and sensory feedback.

    Kurzynski, Marek; Jaskolska, Anna; Marusiak, Jaroslaw; Wolczowski, Andrzej; Bierut, Przemyslaw; Szumowski, Lukasz; Witkowski, Jerzy; Kisiel-Sajewicz, Katarzyna

    2017-08-01

    One of the biggest problems of upper limb transplantation is the lack of certainty as to whether a patient will be able to control voluntary movements of the transplanted hands. Based on findings of recent research on brain cortex plasticity, a premise can be drawn that mental training supported with visual and sensory feedback can cause structural and functional reorganization of the sensorimotor cortex, which leads to recovery of the function associated with the control of movements performed by the upper limbs. In this study the authors, based on the above observations, propose a computer-aided training (CAT) system which, by generating visual and sensory stimuli, should enhance the effectiveness of mental training applied to humans before upper limb transplantation. The concept rests on a virtual hand whose reaching and grasping movements the trained patient can observe on a VR headset screen (visual feedback) and whose contact with virtual objects the patient can feel as touch (sensory feedback). The computer training system is composed of three main components: (1) a system generating the 3D virtual world in which the patient sees the virtual limb from the perspective as if it were his/her own hand; (2) sensory feedback transforming information about the interaction of the virtual hand with the grasped object into mechanical vibration; (3) the therapist's panel for controlling the training course. Results of the case study demonstrate that mental training supported with visual and sensory stimuli generated by the computer system leads to a beneficial change of the brain activity related to motor control of reaching in a patient with bilateral upper limb congenital transverse deficiency. Copyright © 2017 Elsevier Ltd. All rights reserved.

  18. Computational screening of functional groups for capture of toxic industrial chemicals in porous materials.

    Kim, Ki Chul; Fairen-Jimenez, David; Snurr, Randall Q

    2017-12-06

    A thermodynamic analysis using quantum chemical methods was carried out to identify optimal functional group candidates that can be included in metal-organic frameworks and activated carbons for the selective capture of toxic industrial chemicals (TICs) in humid air. We calculated the binding energies of 14 critical TICs plus water with a series of 10 functional groups attached to a naphthalene ring model. Using vibrational calculations, the free energies of adsorption were calculated in addition to the binding energies. Our results show that, in these systems, the binding energies and free energies follow similar trends. We identified copper(I) carboxylate as the optimal functional group (among those studied) for the selective binding of the majority of the TICs in humid air, and this functional group exhibits especially strong binding for sulfuric acid. Further thermodynamic analysis shows that the presence of water weakens the binding strength of sulfuric acid with the copper carboxylate group. Our calculations predict that functionalization of aromatic rings would be detrimental to selective capture of COCl2, CO2, and Cl2 under humid conditions. Finally, we found that forming an ionic complex, H3O+/HSO4-, between H2SO4 and H2O via proton transfer is not favorable on copper carboxylate.

  19. Partial covariance based functional connectivity computation using Ledoit-Wolf covariance regularization.

    Brier, Matthew R; Mitra, Anish; McCarthy, John E; Ances, Beau M; Snyder, Abraham Z

    2015-11-01

    Functional connectivity refers to shared signals among brain regions and is typically assessed in a task free state. Functional connectivity commonly is quantified between signal pairs using Pearson correlation. However, resting-state fMRI is a multivariate process exhibiting a complicated covariance structure. Partial covariance assesses the unique variance shared between two brain regions excluding any widely shared variance, hence is appropriate for the analysis of multivariate fMRI datasets. However, calculation of partial covariance requires inversion of the covariance matrix, which, in most functional connectivity studies, is not invertible owing to rank deficiency. Here we apply Ledoit-Wolf shrinkage (L2 regularization) to invert the high dimensional BOLD covariance matrix. We investigate the network organization and brain-state dependence of partial covariance-based functional connectivity. Although RSNs are conventionally defined in terms of shared variance, removal of widely shared variance, surprisingly, improved the separation of RSNs in a spring embedded graphical model. This result suggests that pair-wise unique shared variance plays a heretofore unrecognized role in RSN covariance organization. In addition, application of partial correlation to fMRI data acquired in the eyes open vs. eyes closed states revealed focal changes in uniquely shared variance between the thalamus and visual cortices. This result suggests that partial correlation of resting state BOLD time series reflect functional processes in addition to structural connectivity. Copyright © 2015 Elsevier Inc. All rights reserved.
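
    The approach described above is straightforward to reproduce in outline: Ledoit-Wolf shrinkage yields an invertible covariance estimate even when time points are fewer than regions, and partial correlations follow from the precision matrix P as -P_ij / sqrt(P_ii P_jj). A sketch on synthetic BOLD-like data (the dimensions and data are illustrative; this is not the authors' code):

        import numpy as np
        from sklearn.covariance import LedoitWolf

        def partial_correlation(timeseries):
            """timeseries: (n_timepoints, n_regions). Partial correlations via
            the Ledoit-Wolf regularized precision matrix."""
            precision = LedoitWolf().fit(timeseries).precision_
            d = np.sqrt(np.diag(precision))
            pcorr = -precision / np.outer(d, d)
            np.fill_diagonal(pcorr, 1.0)
            return pcorr

        # Rank-deficient case: 120 time points, 300 regions, where the sample
        # covariance itself would not be invertible.
        rng = np.random.default_rng(2)
        pc = partial_correlation(rng.normal(size=(120, 300)))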

  20. Assessment of an extended Nijboer-Zernike approach for the computation of optical point-spread functions.

    Braat, Joseph; Dirksen, Peter; Janssen, Augustus J E M

    2002-05-01

    We assess the validity of an extended Nijboer-Zernike approach [J. Opt. Soc. Am. A 19, 849 (2002)], based on recently found Bessel-series representations of diffraction integrals comprising an arbitrary aberration and a defocus part, for the computation of optical point-spread functions of circular, aberrated optical systems. These new series representations yield a flexible means to compute optical point-spread functions, both accurately and efficiently, under defocus and aberration conditions that seem to cover almost all cases of practical interest. Because of the analytical nature of the formulas, there are no discretization effects limiting the accuracy, as opposed to the more commonly used numerical packages based on strictly numerical integration methods. Instead, we have an easily managed criterion, expressed in the number of terms to be included in the Bessel-series representations, guaranteeing the desired accuracy. For this reason, the analytical method can also serve as a calibration tool for the numerically based methods. The analysis is not limited to pointlike objects but can also be used for extended objects under various illumination conditions. The calculation schemes are simple and permit one to trace the relative strength of the various interfering complex-amplitude terms that contribute to the final image intensity function.

  1. Novel prediction model of renal function after nephrectomy from automated renal volumetry with preoperative multidetector computed tomography (MDCT).

    Isotani, Shuji; Shimoyama, Hirofumi; Yokota, Isao; Noma, Yasuhiro; Kitamura, Kousuke; China, Toshiyuki; Saito, Keisuke; Hisasue, Shin-ichi; Ide, Hisamitsu; Muto, Satoru; Yamaguchi, Raizo; Ukimura, Osamu; Gill, Inderbir S; Horie, Shigeo

    2015-10-01

    The predictive model of postoperative renal function may impact the planning of nephrectomy. Our aims were to develop a novel predictive model combining clinical indices with computer volumetry to measure the preserved renal cortex volume (RCV) using multidetector computed tomography (MDCT), and to prospectively validate the performance of the model. In total, 60 patients undergoing radical nephrectomy from 2011 to 2013 participated, including a development cohort of 39 patients and an external validation cohort of 21 patients. RCV was calculated by voxel count using software (Vincent, FUJIFILM). Renal function before and after radical nephrectomy was assessed via the estimated glomerular filtration rate (eGFR). Factors affecting postoperative eGFR were examined by regression analysis to develop the novel model for predicting postoperative eGFR with a backward elimination method. The predictive model was externally validated and its performance was compared with that of previously reported models. The postoperative eGFR value was associated with age, preoperative eGFR, preserved renal parenchymal volume (RPV), preserved RCV, % RPV alteration, and % RCV alteration. The combination of computer volumetry and clinical indices might yield an important tool for predicting postoperative renal function.

  2. Simulation on a computer the cascade probabilistic functions and theirs relation with Markov's processes

    Kupchishin, A.A.; Kupchishin, A.I.; Shmygaleva, T.A.

    2002-01-01

    Within the framework of the cascade-probabilistic (CP) method, radiation and physical processes are studied and their relation to Markov processes is established. The conclusion is drawn that the CP-functions for electrons, protons, alpha-particles and ions are described by an inhomogeneous Markov chain. Algorithms are developed, and CP-function calculations for charged particles and for the concentration of radiation defects in solids under ion irradiation are carried out as well. Tables are given for different CPF parameters and for radiation-defect concentrations arising from charged-particle interaction with solids. The book consists of an introduction and two chapters: (1) Cascade probabilistic functions and Markov processes; (2) Radiation defect formation in solids as a part of Markov processes. The book is intended for specialists in the mathematical simulation of radiation defects, solid state physics, elementary particle physics and applied mathematics

  3. Clinical studies of brain functional images by motor activation using single photon emission computed tomography

    Kawaguchi, Masahiro [Gifu Univ. (Japan). School of Medicine

    1998-09-01

    Thirty participants (10 normal controls; group A, 5 patients with brain tumors located near central sulcus without hemiparesis; group B, 10 patients with brain tumors located near central sulcus with hemiparesis; group C, and 5 patients with brain tumors besides the central regions with hemiparesis; group D) were enrolled. The images were performed by means of split-dose method with {sup 99m}Tc-ECD at rest condition (SPECT 1) and during hand grasping (SPECT 2). The activation SPECT were obtained by subtracting SPECT 1 from SPECT 2, and the functional mapping was made by the strict registration of the activation SPECT with 3D MRI. To evaluate the changes of CBF (%{Delta}CBF) of the sensorimotor and supplementary motor areas on the functional mapping, ratio of the average counts of SPECT 1 and SPECT 2 was calculated and statistically compared. The functional activation paradigms caused a significant increase of CBF in the sensorimotor area contra-lateral to the stimulated hand, although the sensorimotor area and the central sulcus in groups B and C were dislocated, compared with hemisphere of non-tumor side. The sensorimotor area ipsi-lateral to the stimulated hand could be detected in almost of all subjects. The supplementary motor area could be detected in all subjects. In group A, the average %{Delta}CBF were up 24.1{+-}4.3% in the contra-lateral sensorimotor area, and 22.3{+-}3.6% in the supplementary motor area, respectively. The average %{Delta}CBF in the contra-lateral sensorimotor area of group D was significantly higher than that of group A. The brain functional mapping by motor activation using SPECT could localize the area of cortical motor function in normal volunteers and patients with brain tumors. The changes of regional CBF by activation SPECT precisely assess the cortical motor function even in patients with brain tumors located near central sulcus. (K.H.)

  5. Computational modeling to predict mechanical function of joints: application to the lower leg with simulation of two cadaver studies.

    Liacouras, Peter C; Wayne, Jennifer S

    2007-12-01

    Computational models of musculoskeletal joints and limbs can provide useful information about joint mechanics. Validated models can be used as predictive devices for understanding joint function and serve as clinical tools for predicting the outcome of surgical procedures. A new computational modeling approach was developed for simulating joint kinematics that are dictated by bone/joint anatomy, ligamentous constraints, and applied loading. Three-dimensional computational models of the lower leg were created to illustrate the application of this new approach. Model development began with generating three-dimensional surfaces of each bone from CT images, which were then imported into the three-dimensional solid modeling software SOLIDWORKS and the motion simulation package COSMOSMOTION. Through SOLIDWORKS and COSMOSMOTION, each bone surface file was filled to create a solid object and positioned, necessary components were added, and simulations were executed. Three-dimensional contacts were added to inhibit intersection of the bones during motion. Ligaments were represented as linear springs. Model predictions were then validated by comparison to two different cadaver studies, syndesmotic injury and repair and ankle inversion following ligament transection. The syndesmotic injury model was able to predict tibial rotation, fibular rotation, and anterior/posterior displacement. In the inversion simulation, calcaneofibular ligament extension and angles of inversion compared well. Some experimental data proved harder to simulate accurately, due to certain software limitations and lack of complete experimental data. Other parameters that could not be easily obtained experimentally can be predicted and analyzed by the computational simulations. In the syndesmotic injury study, the force generated in the tibionavicular and calcaneofibular ligaments reduced with the insertion of the staple, indicating how this repair technique changes joint function. After transection of the calcaneofibular ligament

  6. Computational Approaches Reveal New Insights into Regulation and Function of Non; coding RNAs and their Targets

    Alam, Tanvir

    2016-01-01

    Regulation and function of protein-coding genes are increasingly well-understood, but no comparable evidence exists for non-coding RNA (ncRNA) genes, which appear to be more numerous than protein-coding genes. We developed a novel machine

  7. Syntactic Complexity Metrics and the Readability of Programs in a Functional Computer Language

    van den Berg, Klaas; Engel, F.L.; Bouwhuis, D.G.; Bosser, T.; d'Ydewalle, G.

    This article reports on the definition and the measurement of the software complexity metrics of Halstead and McCabe for programs written in the functional programming language Miranda. An automated measurement of these metrics is described. In a case study, the correlation is established between the

  8. Computation of convex bounds for present value functions with random payments

    Ahcan, A.; Darkiewicz, G.; Goovaerts, M.J.; Hoedemakers, T.

    2006-01-01

    In this contribution we study the distribution of the present value function of a series of random payments in a stochastic financial environment. Such distributions occur naturally in a wide range of applications within fields of insurance and finance. We obtain accurate approximations by

  9. Simulation-based computation of the workload correlation function in a Lévy-driven queue

    Glynn, P.W.; Mandjes, M.

    2011-01-01

    In this paper we consider a single-server queue with Lévy input, and, in particular, its workload process (Q_t), t ≥ 0, focusing on its correlation structure. With the correlation function defined as r(t) := cov(Q_0, Q_t)/var(Q_0) (assuming that the workload process is in stationarity at time 0), we first
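
    The correlation function r(t) defined above can also be estimated by brute-force simulation, which is what such simulation-based computations benchmark against. The sketch below discretizes a queue with compound-Poisson input drained at unit rate (a simple Lévy-driven example) and estimates r(t) from one long, approximately stationary path; the parameters and the crude lag normalization are illustrative choices, not the paper's estimator.

        import numpy as np

        rng = np.random.default_rng(4)
        lam, mu, dt, n_steps = 0.8, 1.0, 0.05, 40000   # utilization rho = lam/mu = 0.8

        # Workload recursion on a grid: Q_i = max(Q_{i-1} + arriving work - dt, 0).
        arrivals = rng.poisson(lam * dt, size=n_steps)
        work = np.array([rng.exponential(1 / mu, k).sum() for k in arrivals])
        q = np.zeros(n_steps)
        for i in range(1, n_steps):
            q[i] = max(q[i - 1] + work[i] - dt, 0.0)

        # Drop a burn-in, then estimate r(t) = cov(Q_0, Q_t)/var(Q_0).
        qs = q[n_steps // 4:] - q[n_steps // 4:].mean()
        r = np.array([np.dot(qs[: -l or None], qs[l:]) for l in range(200)])
        r /= r[0]   # r[0] ~ var(Q_0); differing per-lag sample counts are ignored here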

  10. Simulation-based computation of the workload correlation function in a Levy-driven queue

    P. Glynn; M.R.H. Mandjes (Michel)

    2009-01-01

    In this paper we consider a single-server queue with Levy input, and in particular its workload process (Q_t), focusing on its correlation structure. With the correlation function defined as r(t) := Cov(Q_0, Q_t)/Var Q_0 (assuming the workload process is in stationarity at time 0), we

  11. Simulation-based computation of the workload correlation function in a Lévy-driven queue

    P. Glynn; M.R.H. Mandjes (Michel)

    2010-01-01

    In this paper we consider a single-server queue with Levy input, and in particular its workload process (Q_t), focusing on its correlation structure. With the correlation function defined as r(t) := Cov(Q_0,Q_t)/Var(Q_0) (assuming the workload process is in stationarity at time 0), we

  12. Computed Tomography Volumetry in Preoperative Living Kidney Donor Assessment for Prediction of Split Renal Function.

    Wahba, Roger; Franke, Mareike; Hellmich, Martin; Kleinert, Robert; Cingöz, Tülay; Schmidt, Matthias C; Stippel, Dirk L; Bangard, Christopher

    2016-06-01

    Transplant centers commonly evaluate split renal function (SRF) with Tc-99m-mercapto-acetyltriglycin (MAG3) scintigraphy in living kidney donation. Alternatively, the kidney volume can be measured based on predonation CT scans. The aim of this study was to identify the most accurate CT volumetry technique for SRF and the prediction of postdonation kidney function (PDKF). Three CT volumetry techniques (modified ellipsoid volume [MELV], smart region of interest [ROI] volume, renal cortex volume [RCV]) were performed in 101 living kidney donors. Preoperative CT volumetric SRF was determined and compared with MAG3-SRF, postoperative donor kidney function, and graft function. The correlation between donors' predonation total kidney volume and predonation kidney function was highest for RCV (0.58 with creatinine clearance, 0.54 with estimated glomerular filtration rate by Cockcroft-Gault). The predonation volume of the preserved kidney was 148.0 ± 29.1 cm³ (ROI), 151.2 ± 35.4 cm³ (MELV) and 93.9 ± 25.2 cm³ (RCV). CT volumetry SRF agreed well with MAG3-SRF (bias, 95% limits of agreement: ROI vs MAG3 0.4%, -7.7% to 8.6%; MELV vs MAG3 0.4%, -8.9% to 9.7%; RCV vs MAG3 0.8%, -9.1% to 10.7%). The correlation between predonation CT volumetric SRF of the preserved kidney and PDKF at day 3 was r = 0.85 to 0.88; between MAG3-SRF and PDKF it was r = 0.84. The difference in predonation SRF between the preserved and donated kidney was lowest for ROI and RCV (median, 3% and 4%; 95th percentile, 9% and 13%). Overall, renal cortex volumetry seems to be the most accurate technique for the evaluation of predonation SRF and allows a reliable prediction of the donor's PDKF.

  13. Quantitative computed tomography for the prediction of pulmonary function after lung cancer surgery: a simple method using simulation software.

    Ueda, Kazuhiro; Tanaka, Toshiki; Li, Tao-Sheng; Tanaka, Nobuyuki; Hamano, Kimikazu

    2009-03-01

    The prediction of pulmonary functional reserve is mandatory in therapeutic decision-making for patients with resectable lung cancer, especially those with underlying lung disease. Volumetric analysis in combination with densitometric analysis of the affected lung lobe or segment with quantitative computed tomography (CT) helps to identify residual pulmonary function, although the utility of this modality needs investigation. The subjects of this prospective study were 30 patients with resectable lung cancer. A three-dimensional CT lung model was created with voxels representing normal lung attenuation (-600 to -910 Hounsfield units). Residual pulmonary function was predicted by drawing a boundary line between the lung to be preserved and that to be resected, directly on the lung model. The predicted values were correlated with the postoperatively measured values. The predicted and measured values corresponded well (r=0.89). Quantitative CT thus predicted residual pulmonary function after lung cancer surgery and helped to identify patients whose functional reserves are likely to be underestimated. Hence, this modality should be utilized for patients with marginal pulmonary function.
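
    The quantitative-CT step common to this and the other volumetry abstracts above reduces to counting voxels inside an attenuation window and multiplying by the voxel volume. A numpy sketch using the -910 to -600 HU "normal lung" window quoted here (the synthetic volume, spacing and function name are placeholders):

        import numpy as np

        def masked_volume_ml(hu_volume, spacing_mm, lo=-910, hi=-600):
            """Volume (mL) of voxels with attenuation in [lo, hi] HU.
            hu_volume: 3-D array of Hounsfield units; spacing_mm: (dz, dy, dx)."""
            voxel_ml = np.prod(spacing_mm) / 1000.0        # mm^3 per voxel -> mL
            mask = (hu_volume >= lo) & (hu_volume <= hi)
            return mask.sum() * voxel_ml

        rng = np.random.default_rng(3)
        ct = rng.normal(-750.0, 150.0, size=(120, 256, 256))   # synthetic CT volume
        print(f"{masked_volume_ml(ct, (2.5, 0.7, 0.7)):.0f} mL in the lung window")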

  14. On the limits of computational functional genomics for bacterial lifestyle prediction

    Barbosa, Eudes; Röttger, Richard; Hauschild, Anne-Christin

    2014-01-01

    We review the level of genomic specificity regarding actinobacterial pathogenicity. As they occupy various niches in diverse habitats, one may assume the existence of lifestyle-specific genomic features. We include 240 actinobacteria classified into four pathogenicity classes: human pathogens (HPs...... of an observation bias, i.e. many HPs might yet be unclassified BPs. (H4) There is no intrinsic genomic characteristic of OPs compared with pathogens, as small mutations are likely to play a more dominant role in surviving the immune system. To study these hypotheses, we implemented a bioinformatics pipeline...... that combines evolutionary sequence analysis with statistical learning methods (Random Forest with feature selection, model tuning and robustness analysis). Essentially, we present orthologous gene sets that computationally distinguish pathogens from NPs (H1). We further show a clear limit in differentiating...

  15. Tongue controlled computer game: A new approach for rehabilitation of tongue motor function

    Kothari, Mohit; Svensson, Peter; Jensen, Jim

    2014-01-01

    Objective: To investigate the influence of tongue disability, age and gender on motor performance for a tongue-training paradigm involving playing a computer game using the Tongue Drive System (TDS). Design: Two controlled observational studies. Setting: A neurorehabilitation center and a dental school. Participants: In Study 1, eleven tongue-disabled patients with symptoms of dysphagia and dysarthria and 11 age- and sex-matched controls participated in tongue training. In Study 2, 16 healthy elderly and 16 healthy young participants volunteered. Intervention: In study 1 and study 2, the tongue training lasted 30 and 40 minutes, respectively, with participants instructed to play a computer game with the tongue using TDS. Subject-based reports of motivation, fun, pain and fatigue evaluated on 0-10 numerical rating scales (NRS) were compared between groups. Results: In study 1, tongue-disabled patients performed poorer than healthy controls (P=0.005) and with a trend of a gender difference (P=0.046). In study 2, healthy young participants performed better than healthy elderly participants.

  16. Tongue-controlled computer game: a new approach for rehabilitation of tongue motor function.

    Kothari, Mohit; Svensson, Peter; Jensen, Jim; Holm, Trine Davidsen; Nielsen, Mathilde Skorstengaard; Mosegaard, Trine; Nielsen, Jørgen Feldbæk; Ghovanloo, Maysam; Baad-Hansen, Lene

    2014-03-01

    To investigate the influence of tongue disability, age, and sex on motor performance for a tongue-training paradigm involving playing a computer game using the Tongue Drive System (TDS). Two controlled observational studies. A neurorehabilitation center and a dental school. In study 1, tongue-disabled patients with symptoms of dysphagia and dysarthria (n=11) and age- and sex-matched controls (n=11) participated in tongue training. In study 2, healthy elderly persons (n=16) and healthy young persons (n=16) volunteered. In study 1 and study 2, the tongue training lasted 30 and 40 minutes, respectively. Participants were instructed to play a computer game with the tongue using TDS. Motor performance was compared between groups in both studies. Correlation analyses were performed between age and relative improvement in performance. Subject-based reports of motivation, fun, pain, and fatigue evaluated on 0-to-10 numeric rating scales were compared between groups. In study 1, tongue-disabled patients performed poorer than healthy controls (P=.005) and with a trend of a sex difference (P=.046). In study 2, healthy young participants performed better than healthy elderly participants, with no significant effect of sex (P=.140). There was a significant negative correlation between age and relative improvement in performance (ρ=-.450; P=.009). There were no significant differences in subject-based reports of motivation, fun, pain, and fatigue between groups in any of the studies (P>.094). The present study provides evidence that tongue disability and age can influence behavioral measures of tongue motor performance. TDS may be a new adjunctive neurorehabilitation regimen in treating tongue-disabled patients. Copyright © 2014 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.

  17. Computation of the Spitzer function in stellarators and tokamaks with finite collisionality

    Kernbichler Winfried

    2015-01-01

    The generalized Spitzer function, which determines the current drive efficiency in tokamaks and stellarators, is modelled for finite plasma collisionality with the help of the drift kinetic equation solver NEO-2 [1]. The effect of finite collisionality on the global ECCD efficiency in a tokamak is studied using results of the code NEO-2 as input to the ray tracing code TRAVIS [2]. As is known [3], specific features of the generalized Spitzer function, which are absent in the asymptotic (collisionless or highly collisional) regimes, result in current drive from a microwave spectrum that is symmetric with respect to parallel wave numbers. Due to this effect the direction of the current may become independent of the microwave beam launch angle in advanced ECCD scenarios (O2 and X3), where, due to the relatively low optical depth, a significant amount of power is absorbed by trapped particles.

  18. Computed tomography assessment of intestinal gas volumes in functional gastrointestinal disorders.

    Mc Williams, Sebastian R; Mc Laughlin, Patrick D; O'Connor, Owen J; Desmond, Alan N; Ní Laoíre, Aine; Shanahan, Fergus; Quigley, Eamonn Mm; Maher, Michael M

    2012-10-01

    Many patients with functional gastrointestinal disorders (FGIDs) rank sensations of bloating and distension among their most debilitating symptoms. Previous studies that have examined intestinal gas volume (IGV) in patients with FGIDs have employed a variety of invasive and imaging techniques. These studies are limited by small numbers and have shown conflicting results. The aim of our study was to estimate, using CT of the abdomen and pelvis (CTAP), IGV in patients attending a FGID clinic and to compare IGV in patients with and without FGID. All CTAP examinations (n = 312) performed on patients (n = 207) attending a specialized FGID clinic over a 10-year period were included in this study. Patients were classified into one of 3 groups according to the established clinical grading system: organic gastrointestinal disorder (OGID, ie, patients with an organic non-functional disorder, n = 84), FGID (n = 36) or organic and functional gastrointestinal disorder (OFGID, ie, patients with both an organic and a functional disorder, n = 87). Two independent readers blinded to the diagnostic group calculated IGV using threshold-based 3D region growing with OsiriX. Median IGVs for the FGID, OGID, and OFGID groups were 197.6, 220.6 and 155.0 mL, respectively. Stepwise linear regression revealed age at study, gender, and calculated body mass index to predict log IGV (r² = 0.116). Age correlated positively with IGV in OGID (Spearman's ρ = 0.253, P = 0.02), but this correlation was non-significant in the other groups. Although bloating is a classic symptom in FGID patients, IGV may not be increased compared with OGID and OFGID patients.

  19. Guaranteed and computable bounds of the limit load for variational problems with linear growth energy functionals

    Haslinger, Jaroslav; Repin, S.; Sysala, Stanislav

    2016-01-01

    Vol. 61, No. 5 (2016), pp. 527-564. ISSN 0862-7940. R&D Projects: GA MŠk LQ1602. Institutional support: RVO:68145535. Keywords: functionals with linear growth; limit load; truncation method; perfect plasticity. Subject RIV: BA - General Mathematics. Impact factor: 0.618, year: 2016. http://link.springer.com/article/10.1007/s10492-016-0146-6

  20. Efficient Server-Aided Secure Two-Party Function Evaluation with Applications to Genomic Computation

    2016-07-14

    ... sense that they can be used to securely evaluate arbitrary functions and offer attractive performance compared to the state of the art. We apply the ... seconds. The performance favorably compares to the state of the art (as detailed in section 7), in some cases achieving orders of magnitude ...

  1. Computational modeling of the structure-function relationship in human placental terminal villi.

    Plitman, Mayo R; Olsthoorn, Jason; Charnock-Jones, David Stephen; Burton, Graham James; Oyen, Michelle Lynn

    2016-01-01

    Placental oxygen transport takes place at the final branches of the villous tree and is dictated by the relative arrangement of the maternal and fetal circulations. Modeling techniques have failed to accurately assess the structure-function relationship in the terminal villi due to their geometrical complexity. Three-dimensional blood flow and oxygen transport were modeled in four terminal villi reconstructed from confocal image stacks. The blood flow was analyzed along the center lines of capillaries...

  2. Response functions for computing absorbed dose to skeletal tissues from photon irradiation-an update

    Johnson, Perry B; Bahadori, Amir A [Nuclear and Radiological Engineering, University of Florida, Gainesville, FL 32611 (United States); Eckerman, Keith F [Life Sciences Division, Oak Ridge National Laboratory, Oak Ridge, TN 37831 (United States); Lee, Choonsik [Radiation Epidemiology Branch, National Cancer Institute, Bethesda, MD 20892 (United States); Bolch, Wesley E, E-mail: wbolch@ufl.edu [Nuclear and Radiological/Biomedical Engineering, University of Florida, Gainesville, FL 32611 (United States)

    2011-04-21

    A comprehensive set of photon fluence-to-dose response functions (DRFs) is presented for two radiosensitive skeletal tissues, active and total shallow marrow, within 15 and 32 bone sites, respectively, of the ICRP reference adult male. The functions were developed using fractional skeletal masses and associated electron-absorbed fractions as reported for the UF hybrid adult male phantom, which in turn is based upon micro-CT images of trabecular spongiosa taken from a 40-year-old male cadaver. The new DRFs expand upon both the original set of seven functions produced in 1985 and a 2007 update calculated under the assumption of secondary electron escape from spongiosa. In this study, it is assumed that photon irradiation of the skeleton will yield charged particle equilibrium across all spongiosa regions at energies exceeding 200 keV. Kerma coefficients for active marrow, inactive marrow, trabecular bone and spongiosa at higher energies are calculated using the DRF algorithm, setting the electron-absorbed fraction for self-irradiation to unity. By comparing kerma coefficients and DRF functions, dose enhancement factors and mass energy-absorption coefficient (MEAC) ratios for active marrow to spongiosa were derived. These MEAC ratios compared well with those provided by the NIST Physical Reference Data Library (mean difference of 0.8%), and the dose enhancement factors for active marrow compared favorably with values calculated in the well-known study published by King and Spiers (1985 Br. J. Radiol. 58 345-56) (mean absolute difference of 1.9 percentage points). Additionally, dose enhancement factors for active marrow were shown to correlate well with the shallow marrow volume fraction (R{sup 2} = 0.91). Dose enhancement factors for the total shallow marrow were also calculated for 32 bone sites, representing the first such derivation for this target tissue.

  3. Comparison of high resolution computed tomography and pulmonary function tests in diagnosis of mild emphysema

    Kuwano, Kazuyoshi; Matsuba, Kenichi; Ikeda, Togo

    1989-01-01

    To assess the ability of high resolution CT scan and pulmonary function tests in detecting and grading mild emphysema, we correlated the high resolution CT scan and pulmonary function tests with the pathologic grade of emphysema and the destructive index of lung specimens from 42 patients undergoing thoracotomy for solitary pulmonary nodules. Using the high resolution CT scan, we could identify the pathologic grade of mild and moderate emphysema. By measuring diffusing capacity per unit alveolar gas volume (DLco/VA), it seemed possible to detect the mildest degree of alveolar destruction assessed by the destructive index, which was not detected by high resolution CT scan. The likely reason is that high resolution CT reveals emphysema only as the air space enlargement that follows destruction of alveolar walls, whereas the destruction itself is detectable by measuring DLco/VA. We conclude that it is possible to detect mild emphysema using the combination of high resolution CT scan and pulmonary function tests. (author)

  4. A computational platform to maintain and migrate manual functional annotations for BioCyc databases.

    Walsh, Jesse R; Sen, Taner Z; Dickerson, Julie A

    2014-10-12

    BioCyc databases are an important resource for information on biological pathways and genomic data. Such databases represent the accumulation of biological data, some of which has been manually curated from the literature. An essential feature of these databases is continuing data integration as new knowledge is discovered. As functional annotations are improved, scalable methods are needed for curators to manage annotations without detailed knowledge of the specific design of the BioCyc database. We have developed CycTools, a software tool which allows curators to maintain functional annotations in a model organism database. This tool builds on existing software to simplify and improve the import of user-provided annotation data into BioCyc databases. Additionally, CycTools automatically resolves synonyms and alternate identifiers contained within the database into the appropriate internal identifiers. Automating steps in the manual data entry process can improve curation efforts for major biological databases. The functionality of CycTools is demonstrated by transferring GO term annotations from MaizeCyc to matching proteins in CornCyc, both maize metabolic pathway databases available at MaizeGDB, and by creating strain-specific databases for metabolic engineering.

  5. Can computer assistance improve the clinical and functional scores in total knee arthroplasty?

    Hernández-Vaquero, Daniel; Suarez-Vazquez, Abelardo; Iglesias-Fernandez, Susana

    2011-12-01

    Surgical navigation in TKA facilitates better alignment; however, it is unclear whether improved alignment alters clinical evolution and midterm and long-term complication rates. We determined the alignment differences between patients with standard, manual, jig-based TKAs and patients with navigation-based TKAs, and whether any differences would modify function, implant survival, and/or complications. We retrospectively reviewed 97 patients (100 TKAs) undergoing TKAs for minimal preoperative deformities. Fifty TKAs were performed with an image-free surgical navigation system and the other 50 with a standard technique. We compared femoral angle (FA), tibial angle (TA), and femorotibial angle (FTA) and determined whether any differences altered clinical or functional scores, as measured by the Knee Society Score (KSS), or complications. Seventy-three patients (75 TKAs) had a minimum followup of 8 years (mean, 8.3 years; range, 8-9.1 years). All patients included in the surgical navigation group had a FTA between 177° and 182º. We found no differences in the KSS or implant survival between the two groups and no differences in complication rates, although more complications occurred in the standard technique group (seven compared with two in the surgical navigation group). In the midterm, we found no difference in functional and clinical scores or implant survival between TKAs performed with and without the assistance of a navigation system. Level II, therapeutic study. See the Guidelines online for a complete description of levels of evidence.

  6. Assessment of left atrial volume and function: a comparative study between echocardiography, magnetic resonance imaging and multi slice computed tomography.

    Kühl, J Tobias; Lønborg, Jacob; Fuchs, Andreas; Andersen, Mads J; Vejlstrup, Niels; Kelbæk, Henning; Engstrøm, Thomas; Møller, Jacob E; Kofoed, Klaus F

    2012-06-01

    Measurement of left atrial (LA) maximal volume (LA(max)) using two-dimensional transthoracic echocardiography (TTE) provides prognostic information in several cardiac diseases. However, the relationship between LA(max) and LA function is poorly understood, and TTE is less well suited for measuring dynamic LA volume changes. Conversely, cardiac magnetic resonance imaging (CMR) and multi-slice computed tomography (MSCT) appear more appropriate for such measures. We sought to determine the relationship between LA size assessed with TTE and LA size and function assessed with CMR and MSCT. Fifty-four patients were examined 3 months post myocardial infarction with echocardiography, CMR and MSCT. Left atrial volumes and LA reservoir function were assessed by TTE. LA time-volume curves were determined, and LA reservoir function (cyclic change and fractional change), passive emptying function (reservoir volume) and pump function (left atrial ejection fraction, LAEF) were derived using CMR and MSCT. Left atrial fractional change and LAEF determined with CMR and MSCT were unrelated to LA(max) enlargement by echocardiography (P = NS). There was overall good agreement between CMR and MSCT, with a small to moderate bias in LA(max) (4.9 ± 10.4 ml), cyclic change (3.1 ± 9.1 ml) and reservoir volume (3.4 ± 9.1 ml). TTE underestimated LA(max) by up to 32% compared with CMR and MSCT, and fractional change and LAEF were not significantly related to LA(max) measured by TTE. TTE thus systematically underestimated LA volumes, whereas there was good agreement between MSCT and CMR for volumetric and functional properties.
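
    The "bias, 95% limits of agreement" figures used in this and the kidney-volumetry abstract above are standard Bland-Altman statistics: the bias is the mean paired difference and the limits are bias ± 1.96 times the standard deviation of the differences. A small sketch with made-up paired volumes:

        import numpy as np

        def bland_altman(a, b):
            """Bias and 95% limits of agreement between paired measurements."""
            diff = np.asarray(a) - np.asarray(b)
            bias = diff.mean()
            half = 1.96 * diff.std(ddof=1)
            return bias, (bias - half, bias + half)

        cmr = np.array([92.0, 110.5, 78.2, 130.1, 101.7])    # hypothetical LA volumes (mL)
        msct = np.array([88.3, 104.9, 75.0, 122.6, 96.4])
        bias, loa = bland_altman(cmr, msct)
        print(f"bias = {bias:.1f} mL, 95% LoA = {loa[0]:.1f} to {loa[1]:.1f} mL")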

  7. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation: Functional modules, F9-F11

    NONE

    1997-03-01

    This Manual represents Revision 5 of the user documentation for the modular code system referred to as SCALE. The history of the SCALE code system dates back to 1969 when the current Computational Physics and Engineering Division at Oak Ridge National Laboratory (ORNL) began providing the transportation package certification staff at the U.S. Atomic Energy Commission with computational support in the use of the new KENO code for performing criticality safety assessments with the statistical Monte Carlo method. From 1969 to 1976 the certification staff relied on the ORNL staff to assist them in the correct use of codes and data for criticality, shielding, and heat transfer analyses of transportation packages. However, the certification staff learned that, with only occasional use of the codes, it was difficult to become proficient in performing the calculations often needed for an independent safety review. Thus, shortly after the move of the certification staff to the U.S. Nuclear Regulatory Commission (NRC), the NRC staff proposed the development of an easy-to-use analysis system that provided the technical capabilities of the individual modules with which they were familiar. With this proposal, the concept of the Standardized Computer Analyses for Licensing Evaluation (SCALE) code system was born. This volume consists of the section of the manual dealing with three of the functional modules in the code. Those are the Morse-SGC for the SCALE system, Heating 7.2, and KENO V.a. The manual describes the latest released versions of the codes.

  8. Functional physiology of the human terminal antrum defined by high-resolution electrical mapping and computational modeling.

    Berry, Rachel; Miyagawa, Taimei; Paskaranandavadivel, Niranchan; Du, Peng; Angeli, Timothy R; Trew, Mark L; Windsor, John A; Imai, Yohsuke; O'Grady, Gregory; Cheng, Leo K

    2016-11-01

    High-resolution (HR) mapping has been used to study gastric slow-wave activation; however, the specific characteristics of antral electrophysiology remain poorly defined. This study applied HR mapping and computational modeling to define functional human antral physiology. HR mapping was performed in 10 subjects using flexible electrode arrays (128-192 electrodes; 16-24 cm²) arranged from the pylorus to mid-corpus. Anatomical registration was by photographs and anatomical landmarks. Slow-wave parameters were computed, and the resultant data were incorporated into a computational fluid dynamics (CFD) model of gastric flow to calculate the impact on gastric mixing. In all subjects, extracellular mapping demonstrated normal aboral slow-wave propagation and a region of increased amplitude and velocity in the prepyloric antrum. On average, the high-velocity region commenced 28 mm proximal to the pylorus, and activation ceased 6 mm from the pylorus. Within this region, velocity increased 0.2 mm/s per mm of tissue, from a mean of 3.3 ± 0.1 mm/s to 7.5 ± 0.6 mm/s. These results indicate that human terminal antral contraction is controlled by a short region of rapid high-amplitude slow-wave activity. Distal antral wave acceleration plays a major role in antral flow and mixing, increasing particle strain and trituration. Copyright © 2016 the American Physiological Society.
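
    The velocity gradient reported here is, in essence, a least-squares slope of local slow-wave velocity against distance from the pylorus. A minimal sketch of that calculation in Python, with hypothetical per-electrode values chosen only to illustrate the arithmetic (they are not data from the study):

    ```python
    import numpy as np

    # Hypothetical per-electrode measurements: distance from the pylorus (mm)
    # and local slow-wave velocity (mm/s), of the kind HR mapping provides.
    distance_mm = np.array([28.0, 24.0, 20.0, 16.0, 12.0, 8.0])
    velocity_mm_s = np.array([3.4, 4.1, 5.0, 5.8, 6.7, 7.4])

    # The least-squares slope gives the change in velocity per mm of tissue;
    # its negation is the acceleration approaching the pylorus.
    slope, intercept = np.polyfit(distance_mm, velocity_mm_s, 1)
    print(f"velocity gradient: {-slope:.2f} mm/s per mm toward the pylorus")
    ```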

  9. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation: Functional modules, F9-F11

    1997-03-01

    This Manual represents Revision 5 of the user documentation for the modular code system referred to as SCALE. The history of the SCALE code system dates back to 1969 when the current Computational Physics and Engineering Division at Oak Ridge National Laboratory (ORNL) began providing the transportation package certification staff at the U.S. Atomic Energy Commission with computational support in the use of the new KENO code for performing criticality safety assessments with the statistical Monte Carlo method. From 1969 to 1976 the certification staff relied on the ORNL staff to assist them in the correct use of codes and data for criticality, shielding, and heat transfer analyses of transportation packages. However, the certification staff learned that, with only occasional use of the codes, it was difficult to become proficient in performing the calculations often needed for an independent safety review. Thus, shortly after the move of the certification staff to the U.S. Nuclear Regulatory Commission (NRC), the NRC staff proposed the development of an easy-to-use analysis system that provided the technical capabilities of the individual modules with which they were familiar. With this proposal, the concept of the Standardized Computer Analyses for Licensing Evaluation (SCALE) code system was born. This volume consists of the section of the manual dealing with three of the functional modules in the code. Those are the Morse-SGC for the SCALE system, Heating 7.2, and KENO V.a. The manual describes the latest released versions of the codes

  10. Cat Swarm Optimization Based Functional Link Artificial Neural Network Filter for Gaussian Noise Removal from Computed Tomography Images

    M. Kumar

    2016-01-01

    Gaussian noise is one of the dominant noise types degrading the quality of acquired computed tomography (CT) image data. It creates difficulties in pathological identification and in the diagnosis of disease. Gaussian noise elimination is therefore desirable to improve the clarity of a CT image for clinical, diagnostic, and postprocessing applications. This paper proposes an evolutionary nonlinear adaptive filter approach, using a Cat Swarm Functional Link Artificial Neural Network (CS-FLANN), to remove the unwanted noise. The structure of the proposed filter is based on the Functional Link Artificial Neural Network (FLANN), and Cat Swarm Optimization (CSO) is utilized to select the optimum weights of the neural network filter. The proposed filter has been compared with existing linear filters, such as the mean filter and the adaptive Wiener filter. Performance indices, such as the peak signal-to-noise ratio (PSNR), have been computed for the quantitative analysis of the proposed filter. The experimental evaluation established the superiority of the proposed filtering technique over existing methods.
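
    The PSNR quoted as the performance index is a standard quantity. A minimal sketch of how it could be computed for a filtered image, assuming 8-bit pixel data (the helper function and the simulated images are illustrative, not the paper's test set):

    ```python
    import numpy as np

    def psnr(reference: np.ndarray, filtered: np.ndarray, peak: float = 255.0) -> float:
        """Peak signal-to-noise ratio (dB) between a reference image and a filtered one."""
        mse = np.mean((reference.astype(np.float64) - filtered.astype(np.float64)) ** 2)
        if mse == 0:
            return float("inf")  # identical images
        return 10.0 * np.log10(peak ** 2 / mse)

    # Illustrative use: additive Gaussian noise on a synthetic "CT slice".
    rng = np.random.default_rng(0)
    clean = rng.integers(0, 256, size=(64, 64)).astype(np.float64)
    noisy = clean + rng.normal(0.0, 10.0, size=clean.shape)
    print(f"PSNR of the noisy input: {psnr(clean, noisy):.2f} dB")
    ```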

  11. A computer-assisted test for the electrophysiological and psychophysical measurement of dynamic visual function based on motion contrast.

    Wist, E R; Ehrenstein, W H; Schrauf, M; Schraus, M

    1998-03-13

    A new test is described that allows for electrophysiological and psychophysical measurement of visual function based on motion contrast. In a computer-generated random-dot display, completely camouflaged Landolt rings become visible only when dots within the target area are moved briefly while those of the background remain stationary. Thus, detection of contours and of the location of the gap in the ring relies on motion contrast (form-from-motion) instead of luminance contrast. A standard version of this test has been used to assess visual performance in relation to age, in screening professional groups (truck drivers) and in clinical groups (glaucoma patients). Aside from this standard version, the computer program easily allows for various modifications. These include the option of a synchronizing trigger signal to allow for recording of time-locked motion-onset visual-evoked responses, the reversal of target and background motion, and the displacement of random-dot targets across stationary backgrounds. In all instances, task difficulty is manipulated by changing the percentage of moving dots within the target (or background), as sketched below. The present test offers a short, convenient method to probe dynamic visual functions relying on suprathreshold motion-contrast stimuli and complements other routine tests of form, contrast, depth, and color vision.
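
    The form-from-motion principle is easy to sketch: between two frames, only dots inside the target region are displaced, so each frame alone is uniform noise and the target is visible only from the frame-to-frame motion. A minimal illustration with a hypothetical circular target rather than a Landolt ring (all parameters are assumptions, not the published test's values):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n_dots, size = 2000, 256
    frame1 = rng.uniform(0, size, (n_dots, 2))        # frame 1: uniform random dots

    # Hypothetical circular target region; only (some of) its dots move.
    center, radius = np.array([128.0, 128.0]), 40.0
    shift = np.array([2.0, 0.0])                      # displacement between frames
    in_target = np.linalg.norm(frame1 - center, axis=1) < radius

    pct_moving = 1.0                                  # task difficulty: fraction of target dots displaced
    move = in_target & (rng.uniform(size=n_dots) < pct_moving)

    frame2 = frame1.copy()
    frame2[move] = (frame2[move] + shift) % size      # wrap-around keeps dot density constant
    # Rendering frame1 and frame2 in alternation reveals the target by motion contrast only.
    ```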

  12. A framework for the computer-aided planning and optimisation of manufacturing processes for components with functional graded properties

    Biermann, D.; Gausemeier, J.; Heim, H.-P.; Hess, S.; Petersen, M.; Ries, A.; Wagner, T.

    2014-05-01

    In this contribution a framework for the computer-aided planning and optimisation of functionally graded components is presented. The framework is divided into three modules - the "Component Description", the "Expert System" for the synthesis of several process chains and the "Modelling and Process Chain Optimisation". The Component Description module enhances a standard computer-aided design (CAD) model with a voxel-based representation of the graded properties. The Expert System synthesises process steps stored in the knowledge base to generate several alternative process chains. Each process chain is capable of producing components according to the enhanced CAD model and usually consists of a sequence of heating, cooling, and forming processes. The dependencies between the component and the applied manufacturing processes, as well as between the processes themselves, need to be considered. The Expert System utilises an ontology for that purpose. The ontology represents all dependencies in a structured way and connects the information of the knowledge base via relations. The third module performs the evaluation of the generated process chains. To accomplish this, the parameters of each process are optimised with respect to the component specification, and the result of the best parameterisation is used as the representative value. Finally, the process chain that is capable of manufacturing a functionally graded component optimally with respect to the property distributions of the component description is presented by means of a dedicated specification technique.

  13. Metabolic engineering of deinococcus radiodurans based on computational analysis and functional genomics

    Edwards, Jeremy, S.

    2005-02-02

    The objective of our work is to develop novel computational tools to analyze the Deinococcus radiodurans DNA repair pathways and the influence of the metabolic flux distribution on DNA repair. These tools will be applied to provide insights for metabolic engineering of strains capable of growing under nutrient-poor conditions similar to those found in mixed contaminant sites of interest to the DOE. Over the entire grant period we accomplished all our specific aims and were also able to pursue new directions of research. Below, I will list the major accomplishments over the previous 3 years. (1) Performed Monte Carlo simulations of RecA-mediated pairing of homologous DNA molecules. (2) Developed a statistical approach to study the gene expression data from D. radiodurans; we have been studying the data from John Battista's group. (3) Developed an expression profiling technology to generate very accurate and precise expression data; we followed up on results from John Battista's group using this approach. (4) Developed and put online a database for metabolic reconstructions. (5) Developed and applied new Monte Carlo algorithms that are optimized for studying biological systems. (6) Developed a flux balance model for the D. radiodurans metabolic network.
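
    A flux balance model of the kind mentioned in item (6) reduces to a linear program: a steady-state mass balance S·v = 0, bounds on the fluxes, and a linear objective such as biomass production. A minimal sketch with a toy three-reaction network (the stoichiometry is purely illustrative, not the D. radiodurans reconstruction):

    ```python
    import numpy as np
    from scipy.optimize import linprog

    # Toy network: uptake -> conversion -> biomass. Rows are metabolites,
    # columns are reactions; S @ v = 0 enforces steady state.
    S = np.array([
        [1, -1,  0],   # metabolite A: produced by uptake, consumed by conversion
        [0,  1, -1],   # metabolite B: produced by conversion, consumed by biomass
    ])
    bounds = [(0, 10), (0, 10), (0, 10)]   # flux bounds; uptake capped at 10 units
    c = np.zeros(3)
    c[2] = -1.0                            # linprog minimizes, so negate biomass flux

    res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
    print("optimal biomass flux:", -res.fun, "| flux distribution:", res.x)
    ```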

  14. Study of EBSD Experiment Parameters Influence on Computation of Polycrystalline Pole Figures and Orientation Distribution Function

    Antonova Anastasia O.

    2016-01-01

    A mathematical model of a polycrystalline specimen and of the EBSD experiment is proposed. The scanning step and the threshold disorientation angle are considered as the measurement parameters. To study the impact of these parameters, the pole figures and orientation distribution function of the model specimen are compared with the corresponding ones calculated from the model EBSD measurements. A real EBSD experiment was also performed, and the results of the model experiment correlate with those observed in the real EBSD data. The most significant results are summarised in this work.

  15. A Mobile Computing Solution for Collecting Functional Analysis Data on a Pocket PC

    Jackson, James; Dixon, Mark R

    2007-01-01

    The present paper provides a task analysis for creating a computerized data-collection system using a Pocket PC and Microsoft Visual Basic. With Visual Basic software and any handheld device running the Windows Mobile operating system, this task analysis will allow behavior analysts to program and customize their own functional analysis data-collection system. The program will allow the user to select the type of behavior to be recorded, choose between interval and frequency data collection, and summarize data for graphing and analysis. We also provide suggestions for customizing the data-collection system for idiosyncratic research and clinical needs. PMID:17624078

  16. Functional and performance requirements of the next NOAA-Kansas City computer system

    Mosher, F. R.

    1985-01-01

    The development of the Advanced Weather Interactive Processing System for the 1990's (AWIPS-90) will result in more timely and accurate forecasts with improved cost effectiveness. As part of the AWIPS-90 initiative, the National Meteorological Center (NMC), the National Severe Storms Forecast Center (NSSFC), and the National Hurricane Center (NHC) are to receive upgrades of interactive processing systems. This National Center Upgrade program will support the specialized inter-center communications, data acquisition, and processing needs of these centers. The missions, current capabilities and general functional requirements for the upgrade to the NSSFC are addressed. System capabilities are discussed along with the requirements for the upgraded system.

  17. Quantitative computed tomography of pulmonary emphysema and ventricular function in chronic obstructive pulmonary disease patients with pulmonary hypertension

    Huang, Yu San; Jaw, Fu Shan [Institute of Biomedical Engineering, College of Medicine and College of Engineering, National Taiwan University, Taipei (China); Chen, Jo Yu; Tai, Mei Hwa [Dept. of Medical Imaging, National Taiwan University Hospital and National Taiwan University College of Medicine, Taipei (China); Hsu, Hsao Hsun [Dept. of Surgery, National Taiwan University Hospital and National Taiwan University College of Medicine, Taipei (China)

    2014-12-15

    This study strived to evaluate the relationship between degree of pulmonary emphysema and cardiac ventricular function in chronic obstructive pulmonary disease (COPD) patients with pulmonary hypertension (PH) using electrocardiographic-gated multidetector computed tomography (CT). Lung transplantation candidates with the diagnosis of COPD and PH were chosen for the study population, and a total of 15 patients were included. The extent of emphysema is defined as the percentage of voxels below -910 Hounsfield units in the lung windows in whole lung CT without intravenous contrast. Heart function parameters were measured by electrocardiographic-gated CT angiography. Linear regression analysis was conducted to examine the associations between percent emphysema and heart function indicators. Significant correlations were found between percent emphysema and right ventricular (RV) measurements, including RV end-diastolic volume (R2 = 0.340, p = 0.023), RV stroke volume (R2 = 0.406, p = 0.011), and RV cardiac output (R2 = 0.382, p = 0.014); the correlations between percent emphysema and left ventricular function indicators were not observed. The study revealed that percent emphysema is correlated with RV dysfunction among COPD patients with PH. Based on our findings, percent emphysema can be considered for use as an indicator to predict the severity of right ventricular dysfunction among COPD patients.
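
    The percent-emphysema metric used in this and the companion records is simply the fraction of lung voxels below the -910 HU cutoff. A minimal sketch, assuming a crude attenuation-range lung mask (real pipelines segment the lungs explicitly before thresholding):

    ```python
    import numpy as np

    def percent_emphysema(hu: np.ndarray, threshold: float = -910.0,
                          lung_range: tuple = (-1000.0, -500.0)) -> float:
        """Percentage of lung voxels below an attenuation threshold.

        `hu` is a CT volume in Hounsfield units. The lung mask here is a crude
        attenuation-range mask, used only for illustration.
        """
        lung = (hu >= lung_range[0]) & (hu <= lung_range[1])
        if not lung.any():
            return 0.0
        return 100.0 * float(np.mean(hu[lung] < threshold))
    ```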

  18. Quantitative computed tomography of pulmonary emphysema and ventricular function in chronic obstructive pulmonary disease patients with pulmonary hypertension.

    Huang, Yu-Sen; Hsu, Hsao-Hsun; Chen, Jo-Yu; Tai, Mei-Hwa; Jaw, Fu-Shan; Chang, Yeun-Chung

    2014-01-01

    This study strived to evaluate the relationship between degree of pulmonary emphysema and cardiac ventricular function in chronic obstructive pulmonary disease (COPD) patients with pulmonary hypertension (PH) using electrocardiographic-gated multidetector computed tomography (CT). Lung transplantation candidates with the diagnosis of COPD and PH were chosen for the study population, and a total of 15 patients were included. The extent of emphysema is defined as the percentage of voxels below -910 Hounsfield units in the lung windows in whole lung CT without intravenous contrast. Heart function parameters were measured by electrocardiographic-gated CT angiography. Linear regression analysis was conducted to examine the associations between percent emphysema and heart function indicators. Significant correlations were found between percent emphysema and right ventricular (RV) measurements, including RV end-diastolic volume (R(2) = 0.340, p = 0.023), RV stroke volume (R(2) = 0.406, p = 0.011), and RV cardiac output (R(2) = 0.382, p = 0.014); the correlations between percent emphysema and left ventricular function indicators were not observed. The study revealed that percent emphysema is correlated with RV dysfunction among COPD patients with PH. Based on our findings, percent emphysema can be considered for use as an indicator to predict the severity of right ventricular dysfunction among COPD patients.

  19. Functional Relevance of Coronary Artery Disease by Cardiac Magnetic Resonance and Cardiac Computed Tomography: Myocardial Perfusion and Fractional Flow Reserve

    Gianluca Pontone

    2015-01-01

    Coronary artery disease (CAD) is one of the leading causes of morbidity and mortality, and it is responsible for an increasing resource burden. The identification of patients at high risk for adverse events is crucial to select those who will receive the greatest benefit from revascularization. To this aim, several non-invasive functional imaging modalities are usually used as gatekeepers to invasive coronary angiography, but the diagnostic yield of elective invasive coronary angiography remains unfortunately low. Stress myocardial perfusion imaging by cardiac magnetic resonance (stress-CMR) has emerged as an accurate technique for diagnosis and prognostic stratification of patients with known or suspected CAD, thanks to its high spatial and temporal resolution, absence of ionizing radiation, and multiparametric value, including the assessment of cardiac anatomy, function, and viability. On the other side, cardiac computed tomography (CCT) has emerged as a unique technique providing coronary artery anatomy and, more recently, with the introduction of stress-CCT and noninvasive fractional flow reserve (FFR-CT), the functional relevance of CAD in a single-shot scan. The current review evaluates the technical aspects and clinical experience of stress-CMR and CCT in the evaluation of the functional relevance of CAD, discussing the strengths and weaknesses of each approach.

  20. Quantitative computed tomography of pulmonary emphysema and ventricular function in chronic obstructive pulmonary disease patients with pulmonary hypertension

    Huang, Yu San; Jaw, Fu Shan; Chen, Jo Yu; Tai, Mei Hwa; Hsu, Hsao Hsun

    2014-01-01

    This study strived to evaluate the relationship between degree of pulmonary emphysema and cardiac ventricular function in chronic obstructive pulmonary disease (COPD) patients with pulmonary hypertension (PH) using electrocardiographic-gated multidetector computed tomography (CT). Lung transplantation candidates with the diagnosis of COPD and PH were chosen for the study population, and a total of 15 patients were included. The extent of emphysema is defined as the percentage of voxels below -910 Hounsfield units in the lung windows in whole lung CT without intravenous contrast. Heart function parameters were measured by electrocardiographic-gated CT angiography. Linear regression analysis was conducted to examine the associations between percent emphysema and heart function indicators. Significant correlations were found between percent emphysema and right ventricular (RV) measurements, including RV end-diastolic volume (R2 = 0.340, p = 0.023), RV stroke volume (R2 = 0.406, p = 0.011), and RV cardiac output (R2 = 0.382, p = 0.014); the correlations between percent emphysema and left ventricular function indicators were not observed. The study revealed that percent emphysema is correlated with RV dysfunction among COPD patients with PH. Based on our findings, percent emphysema can be considered for use as an indicator to predict the severity of right ventricular dysfunction among COPD patients.

  1. A computational approach for functional mapping of quantitative trait loci that regulate thermal performance curves.

    John Stephen Yap

    2007-06-01

    Whether and how the thermal reaction norm is under genetic control is fundamental to understanding the mechanistic basis of adaptation to novel thermal environments. However, the genetic study of the thermal reaction norm is difficult because it is often expressed as a continuous function or curve. Here we derive a statistical model for dissecting thermal performance curves into individual quantitative trait loci (QTL) with the aid of a genetic linkage map. The model is constructed within the maximum likelihood context and implemented with the EM algorithm. It integrates the biological principle of responses to temperature into a framework for genetic mapping through rigorous mathematical functions established to describe the pattern and shape of thermal reaction norms. The biological advantages of the model lie in the decomposition of the genetic causes for the thermal reaction norm into biologically interpretable modes, such as hotter-colder, faster-slower and generalist-specialist, as well as the formulation of a series of hypotheses at the interface between genetic actions/interactions and temperature-dependent sensitivity. The model is also advantageous statistically because the precision of parameter estimation and the power of QTL detection can be increased by modeling the mean-covariance structure with a small set of parameters. The results from simulation studies suggest that the model displays favorable statistical properties and can be robust in practical genetic applications. The model provides a conceptual platform for testing many ecologically relevant hypotheses regarding organismic adaptation within the Eco-Devo paradigm.

  2. IRFK2D: a computer program for simulating intrinsic random functions of order k

    Pardo-Igúzquiza, Eulogio; Dowd, Peter A.

    2003-07-01

    IRFK2D is an ANSI Fortran-77 program that generates realizations of an intrinsic random function of order k (with k equal to 0, 1 or 2) with a permissible polynomial generalized covariance model. The realizations may be non-conditional or conditioned to the experimental data. The turning bands method is used to generate realizations in 2D and 3D from simulations of an intrinsic random function of order k along lines that span the 2D or 3D space. The program generates two output files: the first contains the simulated values, and the second contains the experimental generalized variogram for different directions together with the theoretical model. The experimental variogram is calculated from the simulated values, while the theoretical variogram is the specified generalized covariance model. The generalized variogram is used to assess the quality of the simulation, as measured by the extent to which the generalized covariance is reproduced by the simulation. The examples given in this paper indicate that IRFK2D is an efficient implementation of the methodology.
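
    The quality check described here compares a variogram estimated from the simulated values against the specified model. As a simplified illustration for the ordinary (k = 0) case, the classical semivariogram estimator could be written as follows (the function name and lag-binning scheme are assumptions, not the IRFK2D implementation):

    ```python
    import numpy as np

    def experimental_variogram(coords: np.ndarray, values: np.ndarray,
                               lags: np.ndarray, tol: float) -> np.ndarray:
        """Classical semivariogram estimator: gamma(h) = mean squared increment / 2.

        `coords` is (n, 2) or (n, 3), `values` is (n,), `lags` are lag-distance
        centres with half-width `tol`. O(n^2) memory; fine for small n.
        """
        d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
        sq = (values[:, None] - values[None, :]) ** 2
        gamma = np.full(len(lags), np.nan)
        for i, h in enumerate(lags):
            mask = (d > max(h - tol, 0.0)) & (d <= h + tol)
            if mask.any():
                gamma[i] = 0.5 * sq[mask].mean()
        return gamma
    ```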

  3. Comparison of measured and computed phase functions of individual tropospheric ice crystals

    Stegmann, Patrick G.; Tropea, Cameron; Järvinen, Emma; Schnaiter, Martin

    2016-07-01

    Airplanes passing through the incuda (Latin: anvil) regions of tropical cumulonimbus clouds are at risk of suffering an engine power-loss event and engine damage due to ice ingestion (Mason et al., 2006 [1]). Research in this field relies on optical measurement methods to characterize ice crystals; however, the design and implementation of such methods presently suffer from the lack of reliable and efficient means of predicting the light scattering from ice crystals. The nascent discipline of direct measurement of phase functions of ice crystals, in conjunction with particle imaging and forward modelling through geometrical-optics and transition-matrix codes, for the first time allows us to obtain a deeper understanding of the optical properties of real tropospheric ice crystals. In this manuscript, a sample phase function obtained via the Particle Habit Imaging and Polar Scattering (PHIPS) probe during a measurement campaign in flight over Brazil is compared to three different light scattering codes. These include a newly developed first-order geometrical optics code taking into account the influence of the Gaussian beam illumination used in the PHIPS device, as well as the reference ray tracing code of Macke and the T-matrix code of Kahnert.

  4. Computational Approaches Reveal New Insights into Regulation and Function of Non; coding RNAs and their Targets

    Alam, Tanvir

    2016-11-28

    Regulation and function of protein-coding genes are increasingly well understood, but no comparable evidence exists for non-coding RNA (ncRNA) genes, which appear to be more numerous than protein-coding genes. We developed a novel machine-learning model to distinguish promoters of long ncRNA (lncRNA) genes from those of protein-coding genes. This represents the first attempt to make this distinction based on properties of the associated gene promoters. From our analyses, several transcription factors (TFs), which are known to be regulated by lncRNAs, also emerged as potential global regulators of lncRNAs, suggesting that lncRNAs and TFs may participate in a bidirectional feedback regulatory network. Our results also raise the possibility that, due to the historical dependence on protein-coding genes in defining the chromatin states of active promoters, an adjustment of these chromatin signature profiles to incorporate lncRNAs is warranted in the future. Secondly, we developed a novel method to infer functions for lncRNA and microRNA (miRNA) transcripts based on their transcriptional regulatory networks in 119 tissues and 177 primary cells of human. This method for the first time combines information on the cell/tissue-specific expression of a transcript and on the TFs and transcription co-factors (TcoFs) that control activation of that transcript. Transcripts were annotated using statistically enriched GO terms, pathways and diseases across cells/tissues, and an associated knowledge base (FARNA) was developed. FARNA, having the most comprehensive function annotation of the considered ncRNAs across the widest spectrum of cells/tissues, has the potential to contribute to our understanding of ncRNA roles and their regulatory mechanisms in human. Thirdly, we developed a novel machine-learning model to identify the LD motif (a protein interaction motif) of paxillin, a ncRNA target that is involved in cell motility and cancer metastasis. Our recognition model identified new proteins not

  5. Comparison of x ray computed tomography number to proton relative linear stopping power conversion functions using a standard phantom.

    Moyers, M F

    2014-06-01

    Adequate evaluation of the results from multi-institutional trials involving light ion beam treatments requires consideration of the planning margins applied to both targets and organs at risk. A major uncertainty that affects the size of these margins is the conversion of x ray computed tomography numbers (XCTNs) to relative linear stopping powers (RLSPs). Various facilities engaged in multi-institutional clinical trials involving proton beams have been applying significantly different margins in their patient planning. This study was performed to determine the variance in the conversion functions used at proton facilities in the U.S.A. wishing to participate in National Cancer Institute sponsored clinical trials. A simplified method of determining the conversion function was developed using a standard phantom containing only water and aluminum. The new method was based on the premise that all scanners have their XCTNs for air and water calibrated daily to constant values but that the XCTNs for high density/high atomic number materials are variable with different scanning conditions. The standard phantom was taken to 10 different proton facilities and scanned with the local protocols resulting in 14 derived conversion functions which were compared to the conversion functions used at the local facilities. For tissues within ±300 XCTN of water, all facility functions produced converted RLSP values within ±6% of the values produced by the standard function and within 8% of the values from any other facility's function. For XCTNs corresponding to lung tissue, converted RLSP values differed by as great as ±8% from the standard and up to 16% from the values of other facilities. For XCTNs corresponding to low-density immobilization foam, the maximum to minimum values differed by as much as 40%. The new method greatly simplifies determination of the conversion function, reduces ambiguity, and in the future could promote standardization between facilities. Although it
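
    In practice, a conversion function of this kind is usually stored as a piecewise-linear table of (XCTN, RLSP) calibration points and applied by interpolation. A minimal sketch; the anchor values below are purely illustrative and are not the standard-phantom calibration derived in this study:

    ```python
    import numpy as np

    # Hypothetical calibration anchors (XCTN, RLSP): air, lung, water, dense material.
    xctn_points = np.array([-1000.0, -700.0, 0.0, 1500.0])
    rlsp_points = np.array([0.001, 0.30, 1.00, 1.85])

    def xctn_to_rlsp(xctn: np.ndarray) -> np.ndarray:
        """Piecewise-linear conversion of CT numbers to relative linear stopping powers.

        np.interp clamps values outside the calibrated range.
        """
        return np.interp(xctn, xctn_points, rlsp_points)

    print(xctn_to_rlsp(np.array([-950.0, -300.0, 40.0])))
    ```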

  6. Machine Learning Classification to Identify the Stage of Brain-Computer Interface Therapy for Stroke Rehabilitation Using Functional Connectivity

    Rosaleena Mohanty

    2018-05-01

    Interventional therapy using brain-computer interface (BCI) technology has shown promise in facilitating motor recovery in stroke survivors; however, the impact of this form of intervention on functional networks outside of the motor network specifically is not well understood. Here, we investigated resting-state functional connectivity (rs-FC) in stroke participants undergoing BCI therapy across stages, namely pre- and post-intervention, to identify discriminative functional changes using a machine learning classifier, with the goal of categorizing participants into one of the two therapy stages. Twenty chronic stroke participants with persistent upper-extremity motor impairment received neuromodulatory training using a closed-loop neurofeedback BCI device, and resting-state functional MRI (rs-fMRI) scans were collected at four time points: pre-, mid-, post-, and 1 month post-therapy. To evaluate the peak effects of this intervention, rs-FC was analyzed from two specific stages, namely pre- and post-therapy. In total, 236 seeds spanning both motor and non-motor regions of the brain were computed at each stage. A univariate feature selection was applied to reduce the number of features, followed by a principal component-based data transformation used by a linear binary support vector machine (SVM) classifier to classify each participant into a therapy stage. The SVM classifier achieved a cross-validation accuracy of 92.5% using a leave-one-out method. Outside of the motor network, seeds from the fronto-parietal task control, default mode, subcortical, and visual networks emerged as important contributors to the classification. Furthermore, a higher number of functional changes were observed to strengthen from the pre- to post-therapy stage than to weaken, both of which involved motor and non-motor regions of the brain. These findings may provide new evidence to support the potential clinical utility of BCI therapy as a form of stroke rehabilitation.
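
    The pipeline described, univariate feature selection, then a principal-component transformation, then a linear binary SVM evaluated with leave-one-out cross-validation, maps directly onto a scikit-learn pipeline. A minimal sketch with random stand-in data (array shapes and hyperparameters are assumptions, not the study's settings):

    ```python
    import numpy as np
    from sklearn.pipeline import Pipeline
    from sklearn.feature_selection import SelectKBest, f_classif
    from sklearn.decomposition import PCA
    from sklearn.svm import LinearSVC
    from sklearn.model_selection import LeaveOneOut, cross_val_score

    rng = np.random.default_rng(0)
    X = rng.normal(size=(20, 236))     # 20 participants x 236 seed-based rs-FC features
    y = np.repeat([0, 1], 10)          # 0 = pre-therapy, 1 = post-therapy

    clf = Pipeline([
        ("select", SelectKBest(f_classif, k=50)),  # univariate feature selection
        ("pca", PCA(n_components=10)),             # principal-component transformation
        ("svm", LinearSVC(C=1.0, dual=False)),     # linear binary SVM
    ])
    acc = cross_val_score(clf, X, y, cv=LeaveOneOut()).mean()
    print(f"leave-one-out accuracy: {acc:.3f}")    # ~chance on random data, by design
    ```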

  7. Technical and functional analysis of Spanish windmills: 3D modeling, computational-fluid-dynamics simulation and finite-element analysis

    Rojas-Sola, José Ignacio; Bouza-Rodríguez, José Benito; Menéndez-Díaz, Agustín

    2016-01-01

    Highlights: • Technical and functional analysis of the two typologies of windmills in Spain. • Spatial distribution of velocities and pressures by computational-fluid dynamics (CFD). • Finite-element analysis (FEA) of the rotors of these two types of windmills. • Validation of the operative functionality of these windmills. - Abstract: A detailed study has been made of the two typologies of windmills in Spain, specifically the rectangular-bladed type, represented by the windmill ‘Sardinero’, located near the town of Campo de Criptana (Ciudad Real province, Spain), and the type with triangular sails (lateens), represented by the windmill ‘San Francisco’, in the town of Vejer de la Frontera (Cádiz province, Spain). For this, an ad hoc research methodology has been applied on the basis of three aspects: three-dimensional geometric modeling, analysis by computational-fluid dynamics (CFD), and finite-element analysis (FEA). The results found with the CFD technique show the correct functioning of the two windmills in relation to the spatial distribution of the wind velocities and pressures to which each is normally exposed (4–7 m/s in the case of ‘Sardinero’, and 5–11 m/s for ‘San Francisco’), thereby validating the operative functionality of both types. In addition, as a result of the FEA, the spatial distribution of stresses on the rotor has revealed that the greatest concentrations occur in the teeth of the head wheel in ‘Sardinero’, reaching a value of 12 MPa, and at the base of the masts in the case of ‘San Francisco’, with a value of 24 MPa. The analysis also shows that simple, effective reinforcement designs for the masts absorb a great concentration of stresses that would otherwise cause breakage. Furthermore, it was confirmed that the oak wood from which the rotors were made functioned properly, as the windmill never exceeded the maximum admissible working stress, demonstrating the effectiveness of the materials.

  8. A Computational Model for the Automatic Diagnosis of Attention Deficit Hyperactivity Disorder Based on Functional Brain Volume

    Lirong Tan

    2017-09-01

    In this paper, we investigated the problem of computer-aided diagnosis of Attention Deficit Hyperactivity Disorder (ADHD) using machine learning techniques. With the ADHD-200 dataset, we developed a Support Vector Machine (SVM) model to classify ADHD patients from typically developing controls (TDCs), using the regional brain volumes as predictors. Conventionally, the volume of a brain region was considered to be an anatomical feature and quantified using structural magnetic resonance images. One major contribution of the present study is that we proposed to measure the regional brain volumes using fMRI images. Brain volumes measured from fMRI images were denoted as functional volumes, which quantified the volumes of brain regions that were actually functioning during fMRI imaging. We compared the predictive power of functional volumes with that of regional brain volumes measured from anatomical images, which were denoted as anatomical volumes. The former demonstrated higher discriminative power than the latter for the classification of ADHD patients vs. TDCs. Combined with our two-step feature selection approach, which integrated prior knowledge with the recursive feature elimination (RFE) algorithm, our SVM classification model combining functional volumes and demographic characteristics achieved a balanced accuracy of 67.7%, which was 16.1% higher than that of a relevant model published previously in the work of Sato et al. Furthermore, our classifier highlighted 10 brain regions that were most discriminative in distinguishing between ADHD patients and TDCs. These 10 regions were mainly located in the occipital lobe, cerebellum posterior lobe, parietal lobe, frontal lobe, and temporal lobe. Our present study using functional images will likely provide new perspectives on the brain regions affected by ADHD.

  9. Reference absolute and indexed values for left and right ventricular volume, function and mass from cardiac computed tomography

    Stojanovska, Jadranka; Prasitdumrong, Hutsaya; Patel, Smita; Sundaram, Baskaran; Gross, Barry H.; Yilmaz, Zeynep N.; Kazerooni, Ella A.

    2014-01-01

    Left ventricular (LV) and right ventricular (RV) volumetric and functional parameters are important biomarkers for morbidity and mortality in patients with heart failure. The aim was to retrospectively determine reference mean values of LV and RV volume, function and mass, normalised by age, gender and body surface area (BSA), from retrospectively electrocardiographically gated 64-slice cardiac computed tomography (CCT) using automated analysis software in healthy adults. The study was approved by the institutional review board with a waiver of informed consent. Seventy-four healthy subjects (49% female; mean age 49.6 ± 11 years) free of hypertension and hypercholesterolaemia with a normal CCT formed the study population. Analyses of LV and RV volume (end-diastolic, end-systolic and stroke volumes), function (ejection fraction), LV mass and inter-rater reproducibility were performed with commercially available analysis software capable of automated contour detection. General linear model analysis was performed to assess statistical significance by age group after adjustment for gender and BSA. Bland–Altman analysis assessed the inter-rater agreement. The reference range for LV and RV volume, function, and LV mass was normalised to age, gender and BSA. Statistically significant differences were noted between genders in both LV mass and RV volume (P-value < 0.0001). Age, in concert with gender, was associated with significant differences in RV end-diastolic volume and LV ejection fraction (P-values 0.027 and 0.03). Bland–Altman analysis showed acceptable limits of agreement (±1.5% for ejection fraction) without systematic error. LV and RV volume, function and mass normalised to age, gender and BSA can be reported from CCT datasets, providing additional information important for patient management.
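
    The inter-rater agreement analysis referred to here is the standard Bland-Altman computation of bias and 95% limits of agreement. A minimal sketch (the paired ejection-fraction readings below are invented for illustration):

    ```python
    import numpy as np

    def bland_altman(a: np.ndarray, b: np.ndarray):
        """Bias and 95% limits of agreement between two raters or methods."""
        diff = a - b
        bias = diff.mean()
        sd = diff.std(ddof=1)
        return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

    # Illustrative: two raters' ejection-fraction readings (%) for the same scans.
    rater1 = np.array([62.0, 58.5, 64.2, 60.1, 57.8])
    rater2 = np.array([61.2, 59.0, 63.5, 60.9, 58.4])
    bias, (lo, hi) = bland_altman(rater1, rater2)
    print(f"bias = {bias:.2f}%, limits of agreement = [{lo:.2f}%, {hi:.2f}%]")
    ```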

  10. Time-dependent density-functional theory in massively parallel computer architectures: the OCTOPUS project.

    Andrade, Xavier; Alberdi-Rodriguez, Joseba; Strubbe, David A; Oliveira, Micael J T; Nogueira, Fernando; Castro, Alberto; Muguerza, Javier; Arruabarrena, Agustin; Louie, Steven G; Aspuru-Guzik, Alán; Rubio, Angel; Marques, Miguel A L

    2012-06-13

    Octopus is a general-purpose density-functional theory (DFT) code, with a particular emphasis on the time-dependent version of DFT (TDDFT). In this paper we present the ongoing efforts to achieve the parallelization of octopus. We focus on the real-time variant of TDDFT, where the time-dependent Kohn-Sham equations are directly propagated in time. This approach has great potential for execution in massively parallel systems such as modern supercomputers with thousands of processors and graphics processing units (GPUs). For harvesting the potential of conventional supercomputers, the main strategy is a multi-level parallelization scheme that combines the inherent scalability of real-time TDDFT with a real-space grid domain-partitioning approach. A scalable Poisson solver is critical for the efficiency of this scheme. For GPUs, we show how using blocks of Kohn-Sham states provides the required level of data parallelism and that this strategy is also applicable for code optimization on standard processors. Our results show that real-time TDDFT, as implemented in octopus, can be the method of choice for studying the excited states of large molecular systems in modern parallel architectures.

  11. Correlation of grading of pulmonary emphysema by computed tomography to pulmonary function

    Yamagishi, Masahiko; Mori, Masaki; Hirai, Hideyuki; Mori, Yuji; Koba, Hiroyuki; Suzuki, Akira

    1988-01-01

    We studied the CT findings of 17 emphysema patients with special reference to the extent of emphysematous changes. Characteristic CT findings were low-attenuation areas (LAAs) and vascular abnormalities, which produced various appearances on CT. To assess the extent of emphysematous changes, we classified the CT images into four grades based on the distribution and size of the LAAs. As the grade progressed, the distribution and size of LAAs became wider and larger, and vascular abnormalities became clearly evident. Although this CT grading is a semiquantitative method, it is simple to use and gives information on the approximate extent of disease. This CT grading was also compared with pulmonary function. The RV/TLC and expiratory flow showed a tendency to be impaired in Grade IV, and the diffusion capacity was impaired in parallel with the CT grade. CT is able to demonstrate the presence and distribution of LAAs noninvasively. It is therefore considered that CT is useful for the clinical diagnosis and the assessment of the extent of pulmonary emphysema. (author)

  12. Aspects of Text Mining From Computational Semiotics to Systemic Functional Hypertexts

    Alexander Mehler

    2001-05-01

    The significance of natural language texts as the prime information structure for the management and dissemination of knowledge in organisations is still increasing. Making relevant documents available for varying tasks in different contexts is of primary importance for efficient task completion. Meeting this demand requires content-based processing of texts, which makes it possible to reconstruct or, if necessary, to explore the relationship of task, context and document. Text mining is a technology that is suitable for solving problems of this kind. In the following, semiotic aspects of text mining are investigated. Based on the primary object of text mining - natural language lexis - the specific complexity of this class of signs is outlined and requirements for the implementation of text mining procedures are derived. This is done with reference to text linkage, introduced as a special task in text mining. Text linkage refers to the exploration of implicit, content-based relations of texts (and their annotation as typed links) in corpora possibly organised as hypertexts. In this context, the term systemic functional hypertext is introduced, which distinguishes genre and register layers for the management of links in a poly-level hypertext system.

  13. Extended Nijboer-Zernike approach for the computation of optical point-spread functions.

    Janssen, Augustus J E M

    2002-05-01

    New Bessel-series representations for the calculation of the diffraction integral are presented yielding the point-spread function of the optical system, as occurs in the Nijboer-Zernike theory of aberrations. In this analysis one can allow an arbitrary aberration and a defocus part. The representations are presented in full detail for the cases of coma and astigmatism. The analysis leads to stably converging results in the case of large aberration or defocus values, while the applicability of the original Nijboer-Zernike theory is limited mainly to wave-front deviations well below the value of one wavelength. Because of its intrinsic speed, the analysis is well suited to supplement or to replace numerical calculations that are currently used in the fields of (scanning) microscopy, lithography, and astronomy. In a companion paper [J. Opt. Soc. Am. A 19, 860 (2002)], physical interpretations and applications in a lithographic context are presented, a convergence analysis is given, and a comparison is made with results obtained by using a numerical package.
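
    For orientation, the quantity at the heart of this analysis is the through-focus diffraction integral associated with a single Zernike aberration term, as it is commonly written in the extended Nijboer-Zernike literature (the notation here is a sketch for context; the paper's own conventions are authoritative):

    ```latex
    % For a pupil term R_n^{|m|}(\rho) e^{i m \theta} and defocus parameter f,
    % the point-spread function is assembled from the integrals
    \[
      V_n^m(r, f) \;=\; \int_0^1 \exp\!\left(i f \rho^2\right)\,
      R_n^{|m|}(\rho)\, J_{|m|}(2\pi r \rho)\, \rho \,\mathrm{d}\rho ,
    \]
    % which the cited analysis expands into stably convergent Bessel series,
    % valid even for large aberration or defocus values.
    ```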

  14. Chemical, computational and functional insights into the chemical stability of the Hedgehog pathway inhibitor GANT61.

    Calcaterra, Andrea; Iovine, Valentina; Botta, Bruno; Quaglio, Deborah; D'Acquarica, Ilaria; Ciogli, Alessia; Iazzetti, Antonia; Alfonsi, Romina; Lospinoso Severini, Ludovica; Infante, Paola; Di Marcotullio, Lucia; Mori, Mattia; Ghirga, Francesca

    2018-12-01

    This work aims at elucidating the mechanism and kinetics of hydrolysis of GANT61, the first and most-widely used inhibitor of the Hedgehog (Hh) signalling pathway that targets Glioma-associated oncogene homologue (Gli) proteins, and at confirming the chemical nature of its bioactive form. GANT61 is poorly stable under physiological conditions and rapidly hydrolyses into an aldehyde species (GANT61-A), which is devoid of the biological activity against Hh signalling, and a diamine derivative (GANT61-D), which has shown inhibition of Gli-mediated transcription. Here, we combined chemical synthesis, NMR spectroscopy, analytical studies, molecular modelling and functional cell assays to characterise the GANT61 hydrolysis pathway. Our results show that GANT61-D is the bioactive form of GANT61 in NIH3T3 Shh-Light II cells and SuFu -/- mouse embryonic fibroblasts, and clarify the structural requirements for GANT61-D binding to Gli1. This study paves the way to the design of GANT61 derivatives with improved potency and chemical stability.

  15. Time-dependent density-functional theory in massively parallel computer architectures: the octopus project

    Andrade, Xavier; Alberdi-Rodriguez, Joseba; Strubbe, David A.; Oliveira, Micael J. T.; Nogueira, Fernando; Castro, Alberto; Muguerza, Javier; Arruabarrena, Agustin; Louie, Steven G.; Aspuru-Guzik, Alán; Rubio, Angel; Marques, Miguel A. L.

    2012-06-01

    Octopus is a general-purpose density-functional theory (DFT) code, with a particular emphasis on the time-dependent version of DFT (TDDFT). In this paper we present the ongoing efforts to achieve the parallelization of octopus. We focus on the real-time variant of TDDFT, where the time-dependent Kohn-Sham equations are directly propagated in time. This approach has great potential for execution in massively parallel systems such as modern supercomputers with thousands of processors and graphics processing units (GPUs). For harvesting the potential of conventional supercomputers, the main strategy is a multi-level parallelization scheme that combines the inherent scalability of real-time TDDFT with a real-space grid domain-partitioning approach. A scalable Poisson solver is critical for the efficiency of this scheme. For GPUs, we show how using blocks of Kohn-Sham states provides the required level of data parallelism and that this strategy is also applicable for code optimization on standard processors. Our results show that real-time TDDFT, as implemented in octopus, can be the method of choice for studying the excited states of large molecular systems in modern parallel architectures.

  16. Time-dependent density-functional theory in massively parallel computer architectures: the octopus project

    Andrade, Xavier; Aspuru-Guzik, Alán; Alberdi-Rodriguez, Joseba; Rubio, Angel; Strubbe, David A; Louie, Steven G; Oliveira, Micael J T; Nogueira, Fernando; Castro, Alberto; Muguerza, Javier; Arruabarrena, Agustin; Marques, Miguel A L

    2012-01-01

    Octopus is a general-purpose density-functional theory (DFT) code, with a particular emphasis on the time-dependent version of DFT (TDDFT). In this paper we present the ongoing efforts to achieve the parallelization of octopus. We focus on the real-time variant of TDDFT, where the time-dependent Kohn-Sham equations are directly propagated in time. This approach has great potential for execution in massively parallel systems such as modern supercomputers with thousands of processors and graphics processing units (GPUs). For harvesting the potential of conventional supercomputers, the main strategy is a multi-level parallelization scheme that combines the inherent scalability of real-time TDDFT with a real-space grid domain-partitioning approach. A scalable Poisson solver is critical for the efficiency of this scheme. For GPUs, we show how using blocks of Kohn-Sham states provides the required level of data parallelism and that this strategy is also applicable for code optimization on standard processors. Our results show that real-time TDDFT, as implemented in octopus, can be the method of choice for studying the excited states of large molecular systems in modern parallel architectures. (topical review)

  17. A Computational Analysis of the Function of Three Inhibitory Cell Types in Contextual Visual Processing

    Jung H. Lee

    2017-04-01

    Most cortical inhibitory cell types exclusively express one of three genes: parvalbumin, somatostatin and 5HT3a. We conjecture that these three inhibitory neuron types possess distinct roles in contextual visual processing, based on two observations. First, they have distinctive synaptic sources and targets over different spatial extents and from different areas. Second, the visual responses of cortical neurons are affected not only by local cues, but also by visual context. We use modeling to relate structural information to function in primary visual cortex (V1) of the mouse, and investigate the role of these cell types in contextual visual processing. Our findings are three-fold. First, the inhibition mediated by parvalbumin-positive (PV) cells mediates local processing and could underlie their role in boundary detection. Second, the inhibition mediated by somatostatin-positive (SST) cells facilitates longer-range spatial competition among receptive fields. Third, non-specific top-down modulation of interneurons expressing vasoactive intestinal polypeptide (VIP), a subclass of 5HT3a neurons, can selectively enhance V1 responses.

  18. Nuclear event time histories and computed site transfer functions for locations in the Los Angeles region

    Rogers, A.M.; Covington, P.A.; Park, R.B.; Borcherdt, R.D.; Perkins, D.M.

    1980-01-01

    This report presents a collection of Nevada Test Site (NTS) nuclear explosion recordings obtained at sites in the greater Los Angeles, Calif., region. The report includes ground velocity time histories as well as derived site transfer functions. These data have been collected as part of a study to evaluate the validity of using low-level ground motions to predict the frequency-dependent response of a site during an earthquake. For this study 19 nuclear events were recorded at 98 separate locations. Some of these sites recorded more than one of the nuclear explosions, and, consequently, there are a total of 159 three-component station records. The locations of all the recording sites are shown in figures 1–5, and the station coordinates and abbreviations are given in table 1. The station addresses are listed in table 2, and the nuclear explosions that were recorded are listed in table 3. The recording sites were chosen on the basis of three criteria: (1) that the underlying geological conditions were representative of conditions over significant areas of the region, (2) that the site was the location of a strong-motion recording of the 1971 San Fernando earthquake, or (3) that more complete geographical coverage was required in that location.

  19. Translation, Validation, and Reliability of the Dutch Late-Life Function and Disability Instrument Computer Adaptive Test.

    Arensman, Remco M; Pisters, Martijn F; de Man-van Ginkel, Janneke M; Schuurmans, Marieke J; Jette, Alan M; de Bie, Rob A

    2016-09-01

    Adequate and user-friendly instruments for assessing physical function and disability in older adults are vital for estimating and predicting health care needs in clinical practice. The Late-Life Function and Disability Instrument Computer Adaptive Test (LLFDI-CAT) is a promising instrument for assessing physical function and disability in gerontology research and clinical practice. The aims of this study were: (1) to translate the LLFDI-CAT into Dutch and (2) to investigate its validity and reliability in a sample of community-dwelling, Dutch-speaking older adults. For the assessment of validity of the LLFDI-CAT, a cross-sectional design was used. To assess reliability, measurement with the LLFDI-CAT was repeated in the same sample. The item bank of the LLFDI-CAT was translated with a forward-backward procedure. A sample of 54 older adults completed the LLFDI-CAT, the World Health Organization Disability Assessment Schedule 2.0, the RAND 36-Item Short-Form Health Survey physical functioning scale (10 items), and the 10-Meter Walk Test. The LLFDI-CAT was repeated after 2 to 8 days (mean = 4.5 days). Pearson's r and the intraclass correlation coefficient (ICC [2,1]) were calculated to assess validity, group-level reliability, and participant-level reliability. A correlation of .74 between the LLFDI-CAT function scale and the RAND 36-Item Short-Form Health Survey physical functioning scale (10 items) was found. The correlations of the LLFDI-CAT disability scale with the World Health Organization Disability Assessment Schedule 2.0 and the 10-Meter Walk Test were -.57 and -.53, respectively. The ICC (2,1) of the LLFDI-CAT function scale was .84, with a group-level reliability score of .85. The ICC (2,1) of the LLFDI-CAT disability scale was .76, with a group-level reliability score of .81. The high percentage of women in the study and the exclusion of older adults with recent joint replacement or hospitalization limit the generalizability of the results. The Dutch LLFDI
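
    The ICC(2,1) reported here is the Shrout and Fleiss two-way random-effects, absolute-agreement, single-measurement coefficient, computable directly from the two-way ANOVA mean squares. A minimal sketch (the test-retest matrix is invented for illustration):

    ```python
    import numpy as np

    def icc_2_1(x: np.ndarray) -> float:
        """Shrout & Fleiss ICC(2,1) for an (n subjects x k measurements) matrix."""
        n, k = x.shape
        grand = x.mean()
        ms_r = k * ((x.mean(axis=1) - grand) ** 2).sum() / (n - 1)  # subjects
        ms_c = n * ((x.mean(axis=0) - grand) ** 2).sum() / (k - 1)  # measurements
        ss_e = ((x - grand) ** 2).sum() - (n - 1) * ms_r - (k - 1) * ms_c
        ms_e = ss_e / ((n - 1) * (k - 1))
        return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)

    # Illustrative test-retest scores for four subjects measured twice.
    scores = np.array([[41.0, 43.0], [55.0, 56.0], [30.0, 33.0], [62.0, 60.0]])
    print(f"ICC(2,1) = {icc_2_1(scores):.3f}")
    ```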

  20. Anxiety, Family Functioning and Neuroendocrine Biomarkers in Obese Children

    Inês Pinto; Simon Wilkinson; Daniel Virella; Marta Alves; Conceição Calhau; Rui Coelho

    2017-01-01

    The project was supported by the Research Support Scheme of the FMUP/doctoral program, grant no PEst-OE/SAU/UI0038/2011 and by FCT, SFRH/SINTD/60115/2009, FSE-UE. Introduction: This observational study explores potential links between obese children’s cortisol, and parental mental state, family functioning, and the children’s symptoms of anxiety and depression. Material and Methods: A non-random sample of 104 obese children (55 boys), mean age 10.9 years (standard deviation 1.76), was recr...

  1. Improving executive functioning in children with ADHD: training multiple executive functions within the context of a computer game. a randomized double-blind placebo controlled trial.

    Sebastiaan Dovis

    Executive function (EF) training interventions aimed at ADHD-symptom reduction have yielded mixed results. Generally, these interventions focus on training a single cognitive domain (e.g., working memory [WM], inhibition, or cognitive flexibility). However, evidence suggests that most children with ADHD show deficits on multiple EFs, and that these EFs are largely related to different brain regions. Therefore, training multiple EFs might be a potentially more effective strategy to reduce EF-related ADHD symptoms. Eighty-nine children with a clinical diagnosis of ADHD (aged 8-12) were randomized to either a full-active condition where visuospatial WM, inhibition and cognitive flexibility were trained, a partially-active condition where inhibition and cognitive flexibility were trained and the WM-training task was presented in placebo mode, or a full placebo condition. Short-term and long-term (3-month) effects of this gamified, 25-session, home-based computer training were evaluated on multiple outcome domains. During training, compliance was high (only 3% failed to meet compliance criteria). After training, only children in the full-active condition showed improvement on measures of visuospatial short-term memory (STM) and WM. Inhibitory performance and interference control improved only in the full-active and the partially-active conditions. No Treatment-condition x Time interactions were found for cognitive flexibility, verbal WM, complex reasoning, or for any parent-, teacher-, or child-rated ADHD behaviors, EF behaviors, motivational behaviors, or general problem behaviors. Nonetheless, almost all measures showed main Time effects, including the teacher ratings. Improvements in inhibition and in visuospatial STM and WM were specifically related to the type of treatment received. However, transfer to untrained EFs and behaviors was mostly nonspecific (i.e., only interference control improved exclusively in the two EF-training conditions). As such

  2. CDF-XL: computing cumulative distribution functions of reaction time data in Excel.

    Houghton, George; Grange, James A

    2011-12-01

    In experimental psychology, central tendencies of reaction time (RT) distributions are used to compare different experimental conditions. This emphasis on the central tendency ignores additional information that may be derived from the RT distribution itself. One method for analysing RT distributions is to construct cumulative distribution frequency plots (CDFs; Ratcliff, Psychological Bulletin 86:446-461, 1979). However, this method is difficult to implement in widely available software, severely restricting its use. In this report, we present an Excel-based program, CDF-XL, for constructing and analysing CDFs, with the aim of making such techniques more readily accessible to researchers, including students (CDF-XL can be downloaded free of charge from the Psychonomic Society's online archive). CDF-XL functions as an Excel workbook and starts from the raw experimental data, organised into three columns (Subject, Condition, and RT) on an Input Data worksheet (a point-and-click utility is provided for achieving this format from a broader data set). No further preprocessing or sorting of the data is required. With one click of a button, CDF-XL will generate two forms of cumulative analysis: (1) "standard" CDFs, based on percentiles of participant RT distributions (by condition), and (2) a related analysis employing the participant means of rank-ordered RT bins. Both analyses involve partitioning the data in similar ways, but the first uses a "median"-type measure at the participant level, while the latter uses the mean. The results are presented in three formats: (i) by participants, suitable for entry into further statistical analysis; (ii) grand means by condition; and (iii) completed CDF plots in Excel charts.
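
    The "standard" CDF analysis described, per-participant percentiles averaged over participants (often called vincentizing), is easy to sketch outside Excel. A minimal Python version (the function name, quantile grid, and simulated RTs are assumptions, not CDF-XL's internals):

    ```python
    import numpy as np

    def vincentized_cdf(rts_by_subject, quantiles=np.arange(0.1, 1.0, 0.1)):
        """Per-subject RT quantiles for one condition, averaged over subjects
        (Ratcliff, 1979). `rts_by_subject` is a list of 1-D arrays of RTs."""
        per_subject = np.array([np.quantile(r, quantiles) for r in rts_by_subject])
        return quantiles, per_subject.mean(axis=0)

    # Illustrative use with simulated, positively skewed RTs for three participants.
    rng = np.random.default_rng(2)
    rts = [300 + rng.gamma(shape=2.0, scale=80.0, size=100) for _ in range(3)]
    q, group_cdf = vincentized_cdf(rts)
    for p, rt in zip(q, group_cdf):
        print(f"{p:.0%} quantile: {rt:.0f} ms")
    ```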

  3. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation: Functional modules F1-F8

    1997-03-01

    This Manual represents Revision 5 of the user documentation for the modular code system referred to as SCALE. The history of the SCALE code system dates back to 1969 when the current Computational Physics and Engineering Division at Oak Ridge National Laboratory (ORNL) began providing the transportation package certification staff at the U.S. Atomic Energy Commission with computational support in the use of the new KENO code for performing criticality safety assessments with the statistical Monte Carlo method. From 1969 to 1976 the certification staff relied on the ORNL staff to assist them in the correct use of codes and data for criticality, shielding, and heat transfer analyses of transportation packages. However, the certification staff learned that, with only occasional use of the codes, it was difficult to become proficient in performing the calculations often needed for an independent safety review. Thus, shortly after the move of the certification staff to the U.S. Nuclear Regulatory Commission (NRC), the NRC staff proposed the development of an easy-to-use analysis system that provided the technical capabilities of the individual modules with which they were familiar. With this proposal, the concept of the Standardized Computer Analyses for Licensing Evaluation (SCALE) code system was born. This volume consists of the section of the manual dealing with eight of the functional modules in the code. Those are: BONAMI - resonance self-shielding by the Bondarenko method; NITAWL-II - SCALE system module for performing resonance shielding and working library production; XSDRNPM - a one-dimensional discrete-ordinates code for transport analysis; XSDOSE - a module for calculating fluxes and dose rates at points outside a shield; KENO IV/S - an improved monte carlo criticality program; COUPLE; ORIGEN-S - SCALE system module to calculate fuel depletion, actinide transmutation, fission product buildup and decay, and associated radiation source terms; ICE

  4. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation: Functional modules F1-F8

    NONE

    1997-03-01

    This Manual represents Revision 5 of the user documentation for the modular code system referred to as SCALE. The history of the SCALE code system dates back to 1969 when the current Computational Physics and Engineering Division at Oak Ridge National Laboratory (ORNL) began providing the transportation package certification staff at the U.S. Atomic Energy Commission with computational support in the use of the new KENO code for performing criticality safety assessments with the statistical Monte Carlo method. From 1969 to 1976 the certification staff relied on the ORNL staff to assist them in the correct use of codes and data for criticality, shielding, and heat transfer analyses of transportation packages. However, the certification staff learned that, with only occasional use of the codes, it was difficult to become proficient in performing the calculations often needed for an independent safety review. Thus, shortly after the move of the certification staff to the U.S. Nuclear Regulatory Commission (NRC), the NRC staff proposed the development of an easy-to-use analysis system that provided the technical capabilities of the individual modules with which they were familiar. With this proposal, the concept of the Standardized Computer Analyses for Licensing Evaluation (SCALE) code system was born. This volume consists of the section of the manual dealing with eight of the functional modules in the code. Those are: BONAMI - resonance self-shielding by the Bondarenko method; NITAWL-II - SCALE system module for performing resonance shielding and working library production; XSDRNPM - a one-dimensional discrete-ordinates code for transport analysis; XSDOSE - a module for calculating fluxes and dose rates at points outside a shield; KENO IV/S - an improved Monte Carlo criticality program; COUPLE; ORIGEN-S - SCALE system module to calculate fuel depletion, actinide transmutation, fission product buildup and decay, and associated radiation source terms; ICE.

  5. Spatial Computation

    2003-12-01

    Computation and today's microprocessors with the approach to operating system architecture, and the controversy between microkernels and monolithic kernels... Both Spatial Computation and microkernels break a relatively monolithic architecture into individual lightweight pieces, well specialized... for their particular functionality. Spatial Computation removes global signals and control, in the same way microkernels remove the global address

  6. Accuracy of the microcanonical Lanczos method to compute real-frequency dynamical spectral functions of quantum models at finite temperatures

    Okamoto, Satoshi; Alvarez, Gonzalo; Dagotto, Elbio; Tohyama, Takami

    2018-04-01

    We examine the accuracy of the microcanonical Lanczos method (MCLM) developed by Long et al. [Phys. Rev. B 68, 235106 (2003), 10.1103/PhysRevB.68.235106] to compute dynamical spectral functions of interacting quantum models at finite temperatures. The MCLM is based on the microcanonical ensemble, which becomes exact in the thermodynamic limit. To apply the microcanonical ensemble at a fixed temperature, one has to find energy eigenstates with the energy eigenvalue corresponding to the internal energy in the canonical ensemble. Here, we propose to use thermal pure quantum state methods by Sugiura and Shimizu [Phys. Rev. Lett. 111, 010401 (2013), 10.1103/PhysRevLett.111.010401] to obtain the internal energy. After obtaining the energy eigenstates using the Lanczos diagonalization method, dynamical quantities are computed via a continued fraction expansion, a standard procedure for Lanczos-based numerical methods. Using one-dimensional antiferromagnetic Heisenberg chains with S = 1/2, we demonstrate that the proposed procedure is reasonably accurate, even for relatively small systems.
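
    The continued-fraction step mentioned above is compact enough to sketch. The code below is an illustration under simplifying assumptions (a small dense Hermitian matrix, no ground-state energy shift), not the authors' implementation: it tridiagonalizes H from a start vector v and evaluates S(ω) = -(1/π) Im ⟨v|(ω + iη - H)⁻¹|v⟩ from the Lanczos coefficients.

```python
# Sketch: Lanczos tridiagonalization plus continued-fraction evaluation of
# S(w) = -(1/pi) Im <v| (w + i*eta - H)^(-1) |v> for a Hermitian matrix H.
import numpy as np

def lanczos_coeffs(H, v, m=200):
    a, b = [], []
    q_prev = np.zeros_like(v)
    q = v / np.linalg.norm(v)
    beta = 0.0
    for _ in range(m):
        w = H @ q - beta * q_prev
        alpha = np.vdot(q, w).real
        w = w - alpha * q
        beta = np.linalg.norm(w)
        a.append(alpha)
        b.append(beta)
        if beta < 1e-12:              # invariant subspace reached
            break
        q_prev, q = q, w / beta
    return np.asarray(a), np.asarray(b)

def spectral_function(H, v, omegas, eta=0.05, m=200):
    a, b = lanczos_coeffs(H, v, m)
    norm2 = np.vdot(v, v).real
    S = np.empty(len(omegas))
    for i, w in enumerate(omegas):
        g = 0.0 + 0.0j
        for n in range(len(a) - 1, -1, -1):   # evaluate the continued fraction
            g = 1.0 / (w + 1j * eta - a[n] - b[n] ** 2 * g)
        S[i] = -norm2 * g.imag / np.pi
    return S
```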

  7. Accuracy of the microcanonical Lanczos method to compute real-frequency dynamical spectral functions of quantum models at finite temperatures.

    Okamoto, Satoshi; Alvarez, Gonzalo; Dagotto, Elbio; Tohyama, Takami

    2018-04-01

    We examine the accuracy of the microcanonical Lanczos method (MCLM) developed by Long et al. [Phys. Rev. B 68, 235106 (2003), 10.1103/PhysRevB.68.235106] to compute dynamical spectral functions of interacting quantum models at finite temperatures. The MCLM is based on the microcanonical ensemble, which becomes exact in the thermodynamic limit. To apply the microcanonical ensemble at a fixed temperature, one has to find energy eigenstates with the energy eigenvalue corresponding to the internal energy in the canonical ensemble. Here, we propose to use thermal pure quantum state methods by Sugiura and Shimizu [Phys. Rev. Lett. 111, 010401 (2013), 10.1103/PhysRevLett.111.010401] to obtain the internal energy. After obtaining the energy eigenstates using the Lanczos diagonalization method, dynamical quantities are computed via a continued fraction expansion, a standard procedure for Lanczos-based numerical methods. Using one-dimensional antiferromagnetic Heisenberg chains with S=1/2, we demonstrate that the proposed procedure is reasonably accurate, even for relatively small systems.

  8. Sensitivity to Social Contingency in Adults with High-Functioning Autism during Computer-Mediated Embodied Interaction.

    Zapata-Fonseca, Leonardo; Froese, Tom; Schilbach, Leonhard; Vogeley, Kai; Timmermans, Bert

    2018-02-08

    Autism Spectrum Disorder (ASD) can be understood as a social interaction disorder. This makes the emerging "second-person approach" to social cognition a more promising framework for studying ASD than classical approaches focusing on mindreading capacities in detached, observer-based arrangements. According to the second-person approach, embodied, perceptual, and embedded or interactive capabilities are also required for understanding others, and these are hypothesized to be compromised in ASD. We therefore recorded the dynamics of real-time sensorimotor interaction in pairs of control participants and participants with High-Functioning Autism (HFA), using the minimalistic human-computer interface paradigm known as "perceptual crossing" (PC). We investigated whether HFA is associated with impaired detection of social contingency, i.e., a reduced sensitivity to the other's responsiveness to one's own behavior. Surprisingly, our analysis reveals that, at least under the conditions of this highly simplified, computer-mediated, embodied form of social interaction, people with HFA perform as well as controls. This finding supports the increasing use of virtual reality interfaces for helping people with ASD to better compensate for their social disabilities. Further dynamical analyses are necessary for a better understanding of the mechanisms leading to these somewhat surprising results.

  9. Assessment of left ventricular function using 201Tl electrocardiogram-gated myocardial single photon emission computed tomography

    Nishikubo, Naotsugu; Tamai, Hiroyuki

    2013-01-01

    Advances in computed tomography (CT) technology make it possible to obtain left ventricular wall motion using 3D reconstruction. In this study, we compared the images obtained from CT and 201Tl electrocardiogram (ECG)-gated single photon emission computed tomography (SPECT). In 20 patients with ischemic heart disease, we performed 201Tl ECG-gated SPECT (GE Healthcare Millennium VG) and ECG-gated CT (Philips Medical Systems Brilliance iCT) to evaluate left ventricular wall motion during the resting phase. In SPECT, left ventricular images were reconstructed using quantitative gated SPECT (QGS) software. In CT, the images were reconstructed using Virtual Place (AZE Software). The left ventricle was classified into five regions (anterior, lateral, inferior, septal, and apical). The amplitude of the wall motion was classified into five grades according to the AHA classification. The values of the wall motion were checked separately by two radiographers. Assessment of left ventricular myocardial wall movement using the three-dimensional movie display of ECG-gated myocardial SPECT data agreed with the evaluation by cardiac CT, corresponding in wall motion for 88 of all 100 segments. SPECT analysis is thus as quantitative as CT for the evaluation of left ventricular wall motion. (author)

  10. Computing Wigner distributions and time correlation functions using the quantum thermal bath method: application to proton transfer spectroscopy.

    Basire, Marie; Borgis, Daniel; Vuilleumier, Rodolphe

    2013-08-14

    Langevin dynamics coupled to a quantum thermal bath (QTB) allows for the inclusion of vibrational quantum effects in molecular dynamics simulations at virtually no additional computer cost. We investigate here the ability of the QTB method to reproduce the quantum Wigner distribution of a variety of model potentials, designed to assess the performances and limits of the method. We further compute the infrared spectrum of a multidimensional model of proton transfer in the gas phase and in solution, using classical trajectories sampled initially from the Wigner distribution. It is shown that for this type of system, involving large anharmonicities and strong nonlinear coupling to the environment, the quantum thermal bath is able to sample the Wigner distribution satisfactorily and to account for both zero-point energy and tunneling effects. It leads to quantum time correlation functions with the correct short-time behavior and the correct associated spectral frequencies, but which are slightly overdamped. This is attributed to the classical propagation approximation rather than to the generation of the quantized initial conditions themselves.
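
    The key ingredient of a QTB is colored Gaussian noise whose power spectrum follows the quantum fluctuation-dissipation function θ(ω, T) = (ħω/2) coth(ħω/2k_BT). A schematic generator in reduced units is shown below; it is a hedged illustration of the idea, not the implementation used in the paper.

```python
# Schematic QTB noise generator: white Gaussian noise is filtered in Fourier
# space so its spectrum follows theta(w, T) = (hbar*w/2) / tanh(hbar*w/(2*kB*T)).
# Reduced units (hbar = kB = 1 by default); illustrative only.
import numpy as np

def qtb_noise(n_steps, dt, temperature, hbar=1.0, kB=1.0, seed=None):
    rng = np.random.default_rng(seed)
    w = 2 * np.pi * np.fft.rfftfreq(n_steps, d=dt)   # angular frequencies
    theta = np.empty_like(w)
    theta[0] = kB * temperature                      # classical w -> 0 limit
    x = hbar * w[1:] / (2 * kB * temperature)
    theta[1:] = (hbar * w[1:] / 2) / np.tanh(x)
    white = np.fft.rfft(rng.standard_normal(n_steps))
    return np.fft.irfft(white * np.sqrt(theta / (kB * temperature)), n=n_steps)
```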

  11. Impaired Flexible Reward-Based Decision-Making in Binge Eating Disorder: Evidence from Computational Modeling and Functional Neuroimaging.

    Reiter, Andrea M F; Heinze, Hans-Jochen; Schlagenhauf, Florian; Deserno, Lorenz

    2017-02-01

    Despite its clinical relevance and its recent recognition as a diagnostic category in the DSM-5, binge eating disorder (BED) has rarely been investigated from a cognitive neuroscientific perspective targeting a more precise neurocognitive profiling of the disorder. BED patients suffer from a lack of behavioral control during recurrent binge eating episodes and thus fail to adapt their behavior in the face of negative consequences, e.g., a high risk for obesity. To examine impairments in flexible reward-based decision-making, we exposed BED patients (n=22) and matched healthy individuals (n=22) to a reward-guided decision-making task during functional magnetic resonance imaging (fMRI). Performing fMRI analysis informed via computational modeling of choice behavior, we were able to identify specific signatures of altered decision-making in BED. On the behavioral level, we observed impaired behavioral adaptation in BED, which was due to enhanced switching behavior, a putative deficit in striking a balance between exploration and exploitation appropriately. This was accompanied by diminished activation related to exploratory decisions in the anterior insula/ventro-lateral prefrontal cortex. Moreover, although so-called model-free reward prediction errors remained intact, representation of ventro-medial prefrontal learning signatures, incorporating inference on unchosen options, was reduced in BED, which was associated with successful decision-making in the task. On the basis of a computational psychiatry account, the presented findings contribute to defining a neurocognitive phenotype of BED.
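
    The "model-free reward prediction errors" named above refer to the standard reinforcement-learning account of such tasks: option values are updated by a prediction error, and choices follow a softmax rule whose temperature governs the exploration-exploitation balance. The sketch below is a generic member of that model class with conventional parameter names (alpha, beta), not the specific model fitted in the study.

```python
# Generic model-free learner with softmax choice rule; 'delta' is the reward
# prediction error. Parameters alpha (learning rate) and beta (inverse
# temperature) are conventional names, not values from the study.
import numpy as np

def simulate_agent(rewards, alpha=0.3, beta=3.0, seed=None):
    """rewards: array of shape (n_trials, n_options) with payoffs per option."""
    rng = np.random.default_rng(seed)
    n_trials, n_options = rewards.shape
    Q = np.zeros(n_options)
    choices, deltas = [], []
    for t in range(n_trials):
        z = np.exp(beta * (Q - Q.max()))     # numerically stable softmax
        p = z / z.sum()
        c = rng.choice(n_options, p=p)
        delta = rewards[t, c] - Q[c]         # model-free prediction error
        Q[c] += alpha * delta
        choices.append(c)
        deltas.append(delta)
    return np.asarray(choices), np.asarray(deltas)
```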

  12. Sensitivity to Social Contingency in Adults with High-Functioning Autism during Computer-Mediated Embodied Interaction

    Leonardo Zapata-Fonseca

    2018-02-01

    Autism Spectrum Disorder (ASD) can be understood as a social interaction disorder. This makes the emerging “second-person approach” to social cognition a more promising framework for studying ASD than classical approaches focusing on mindreading capacities in detached, observer-based arrangements. According to the second-person approach, embodied, perceptual, and embedded or interactive capabilities are also required for understanding others, and these are hypothesized to be compromised in ASD. We therefore recorded the dynamics of real-time sensorimotor interaction in pairs of control participants and participants with High-Functioning Autism (HFA), using the minimalistic human-computer interface paradigm known as “perceptual crossing” (PC). We investigated whether HFA is associated with impaired detection of social contingency, i.e., a reduced sensitivity to the other’s responsiveness to one’s own behavior. Surprisingly, our analysis reveals that, at least under the conditions of this highly simplified, computer-mediated, embodied form of social interaction, people with HFA perform as well as controls. This finding supports the increasing use of virtual reality interfaces for helping people with ASD to better compensate for their social disabilities. Further dynamical analyses are necessary for a better understanding of the mechanisms leading to these somewhat surprising results.

  13. Fast neutron detection with germanium detectors: computation of response functions for the 692 keV inelastic scattering peak

    Fehrenbacher, G.; Meckbach, R.; Paretzke, H.G.

    1996-01-01

    The dependence of the shape of the right-sided broadening of the inelastic scattering peak at 692 keV, in the pulse-height distribution measured with a Ge detector in fast neutron fields, on the energy of the incident neutrons has been analyzed. A model incorporating the processes contributing to the energy deposition that engenders the peak, including the partitioning of the energy deposition by the Ge recoils, was developed. With a Monte Carlo code based on this model, the detector response associated with this peak was computed and compared with results of measurements with quasi-monoenergetic neutrons for energies between 0.88 and 2.1 MeV. A set of 80 response functions for neutron energies in the range from the reaction threshold at 0.7 to 6 MeV was computed, which will serve as a starting point for methods aimed at obtaining information on the spectral distribution of fast neutron fields in this energy range from measurements with a Ge detector. (orig.)

  14. Computer Simulation Tests of Feedback Error Learning Controller with IDM and ISM for Functional Electrical Stimulation in Wrist Joint Control

    Takashi Watanabe

    2010-01-01

    Feedforward control would be useful for hybrid Functional Electrical Stimulation (FES) systems using powered orthotic devices. In this paper, a Feedback Error Learning (FEL) controller for FES (FEL-FES controller) was examined using an inverse statics model (ISM) with an inverse dynamics model (IDM) to realize a feedforward FES controller. For FES application, the ISM was trained offline using training data obtained by PID control of very slow movements. Computer simulation tests in controlling wrist joint movements showed that the ISM performed properly in positioning tasks and that IDM learning was improved by using the ISM, increasing the output power ratio of the feedforward controller. The simple ISM learning method and the FEL-FES controller using the ISM would be useful in controlling the musculoskeletal system, which has nonlinear characteristics in response to electrical stimulation, and are therefore expected to be useful in hybrid FES systems using powered orthotic devices.
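
    The FEL scheme itself can be stated in a few lines: a feedback controller (here a PID) stabilizes the plant, and its output doubles as the training error for the feedforward inverse model, so that the feedforward term gradually takes over. The sketch below is a generic single-input illustration with a placeholder plant and a linear inverse model, not the wrist-joint controller of the paper.

```python
# Generic feedback-error-learning loop: command = feedforward (learned inverse
# model) + PID feedback, with the feedback term used as the training signal.
# 'plant' is a placeholder callable; the linear model is a simplification.
import numpy as np

def fel_control(reference, plant, dt=0.01, kp=5.0, ki=1.0, kd=0.1, eta=1e-3):
    w = np.zeros(3)                        # weights of a linear inverse model
    integ, prev_err, y = 0.0, 0.0, 0.0
    outputs = []
    for k in range(1, len(reference) - 1):
        r = reference[k]
        r_dot = (reference[k + 1] - reference[k - 1]) / (2 * dt)
        x = np.array([r, r_dot, 1.0])      # features of the inverse model
        u_ff = w @ x                       # feedforward command
        err = r - y
        integ += err * dt
        u_fb = kp * err + ki * integ + kd * (err - prev_err) / dt
        prev_err = err
        y = plant(u_ff + u_fb)             # apply the total command
        w += eta * u_fb * x                # feedback error trains the model
        outputs.append(y)
    return np.asarray(outputs), w
```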

  15. The investigation of brain-computer interface for motor imagery and execution using functional near-infrared spectroscopy

    Zhang, Zhen; Jiao, Xuejun; Xu, Fengang; Jiang, Jin; Yang, Hanjun; Cao, Yong; Fu, Jiahao

    2017-01-01

    Functional near-infrared spectroscopy (fNIRS), which can measure cortical hemoglobin activity, has been widely adopted in brain-computer interfaces (BCI). To explore the feasibility of recognizing motor imagery (MI) and motor execution (ME) in the same motion, we measured changes of oxygenated hemoglobin (HBO) and deoxygenated hemoglobin (HBR) over the prefrontal cortex (PFC) and motor cortex (MC) while 15 subjects performed hand extension and finger tapping tasks. The mean, slope, quadratic coefficient, and approximate entropy features were extracted from HBO as the input to a support vector machine (SVM). For the four-class fNIRS-BCI classifiers, we achieved 87.65% and 87.58% classification accuracy for the hand extension and finger tapping tasks, respectively. In conclusion, fNIRS-BCI can effectively recognize MI and ME in the same motion.
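
    The pipeline described, per-trial features extracted from the HBO time course and fed to an SVM, maps directly onto standard tooling. A hedged sketch follows; it uses three of the four features (approximate entropy is omitted for brevity) and assumed array shapes, and is not the authors' code.

```python
# Sketch of the feature extraction + SVM pipeline; hbo is assumed to have
# shape (n_trials, n_samples) for one channel, labels the task classes.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

def hbo_features(hbo):
    t = np.arange(hbo.shape[1])
    feats = []
    for trial in hbo:
        slope = np.polyfit(t, trial, 1)[0]       # linear trend
        quad = np.polyfit(t, trial, 2)[0]        # quadratic coefficient
        feats.append([trial.mean(), slope, quad])
    return np.asarray(feats)

# X = hbo_features(hbo)
# print(cross_val_score(SVC(kernel="rbf"), X, labels, cv=5).mean())
```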

  16. On The Computation Of The Best-fit Okada-type Tsunami Source

    Miranda, J. M. A.; Luis, J. M. F.; Baptista, M. A.

    2017-12-01

    The forward simulation of earthquake-induced tsunamis usually assumes that the initial sea surface elevation mimics the co-seismic deformation of the ocean bottom described by a simple "Okada-type" source (a rectangular fault with constant slip in a homogeneous elastic half-space). This approach is highly effective, particularly in far-field conditions. With this assumption, and given a set of tsunami waveforms recorded by deep-sea pressure sensors and/or coastal tide stations, it is possible to deduce the set of parameters of the Okada-type solution that best fits the sea-level observations. To do this, we build a "space of possible tsunami sources" (solution space). Each solution consists of a combination of parameters: earthquake magnitude, length, width, slip, depth, and angles (strike, rake, and dip). To constrain the number of possible solutions, we use the earthquake parameters defined by seismology and establish a range of possible values for each parameter. We select the "best Okada source" by comparing the results of direct tsunami modeling across the solution space of tsunami sources. However, direct tsunami modeling is a time-consuming process for the whole solution space. To overcome this problem, we use a precomputed database of Empirical Green Functions to compute the tsunami waveforms resulting from unit water sources and search for the one that best matches the observations. In this study, we use as a test case the Solomon Islands tsunami of 6 February 2013, caused by a magnitude 8.0 earthquake. The "best Okada" source is the solution that best matches the tsunami recorded at six DART stations in the area. We discuss the differences between the initial seismic solution and the final one obtained from tsunami data. This publication received funding from FCT project UID/GEO/50019/2013 - Instituto Dom Luiz.
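
    With the Green-function database in hand, the search itself reduces to scoring every candidate source by the misfit between its precomputed synthetics and the records, and keeping the minimum. The sketch below is a schematic version with assumed array shapes and an assumed RMS misfit; the actual workflow is more elaborate.

```python
# Schematic best-fit search over a precomputed database: synthetics[i, s, t]
# is the waveform of candidate source i at station s; observed[s, t] is the
# recorded tsunami. Shapes and the RMS misfit are illustrative assumptions.
import numpy as np

def best_fit_source(synthetics, observed):
    residual = synthetics - observed[None, :, :]
    misfit = np.sqrt((residual ** 2).mean(axis=(1, 2)))   # RMS per candidate
    best = int(np.argmin(misfit))
    return best, misfit[best]

# best_idx, rms = best_fit_source(synthetics, observed)
# magnitude, length, width, slip, depth, strike, rake, dip = solution_space[best_idx]
```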

  17. Computation of a numerically satisfactory pair of solutions of the differential equation for conical functions of non-negative integer orders

    T.M. Dunster (Mark); A. Gil (Amparo); J. Segura (Javier); N.M. Temme (Nico)

    2014-01-01

    We consider the problem of computing a numerically satisfactory pair of solutions of the differential equation for Legendre functions of non-negative integer order $\mu$ and degree $-\frac{1}{2}+i\tau$, where $\tau$ is a non-negative real parameter. Solutions of this equation are the conical functions

  18. Studies on the zeros of Bessel functions and methods for their computation: 3. Some new works on monotonicity, convexity, and other properties

    Kerimov, M. K.

    2016-12-01

    This paper continues the study of real zeros of Bessel functions begun in the previous parts of this work (see M. K. Kerimov, Comput. Math. Math. Phys. 54 (9), 1337-1388 (2014); 56 (7), 1175-1208 (2016)). Some new results regarding the monotonicity, convexity, concavity, and other properties of zeros are described. Additionally, the zeros of q-Bessel functions are investigated.

  19. Global and regional left ventricular function: a comparison between gated SPECT, 2D echocardiography and multi-slice computed tomography

    Henneman, Maureen M.; Bax, Jeroen J.; Holman, Eduard R.; Schuijf, Joanne D.; Jukema, J.W.; Wall, Ernst E. van der; Stokkel, Marcel P.M.; Lamb, Hildo J.; Roos, Albert de

    2006-01-01

    Global and regional left ventricular (LV) function are important indicators of the cardiac status in patients with coronary artery disease (CAD). Therapy and prognosis are to a large extent dependent on LV function. Multi-slice computed tomography (MSCT) has already earned its place as an imaging modality for non-invasive assessment of the coronary arteries, but since retrospective gating to the patient's ECG is performed, information on LV function can be derived. In 49 patients with known or suspected CAD, coronary angiography with MSCT imaging was performed, in addition to gated SPECT and 2D echocardiography. LV end-diastolic and LV end-systolic volumes and LV ejection fraction were analysed with dedicated software (CMR Analytical Software System, Medis, Leiden, The Netherlands for MSCT; gated SPECT by QGS, Cedars-Sinai Medical Center, Los Angeles, CA, USA), and by the biplane Simpson's rule for 2D echocardiography. Regional wall motion was evaluated according to a 17-segment model and a three-point score system. Correlations were fairly good between gated SPECT and MSCT (LVEDV: r=0.65; LVESV: r=0.63; LVEF: r=0.60), and excellent between 2D echocardiography and MSCT (LVEDV: r=0.92; LVESV: r=0.93; LVEF: r=0.80). Agreement for regional wall motion was 95% (κ=0.66) between gated SPECT and MSCT, and 96% (κ=0.73) between 2D echocardiography and MSCT. Global and regional LV function and LV volumes can be adequately assessed with MSCT. Correlations with 2D echocardiography are stronger than with gated SPECT. (orig.)

  20. Development of computer-based function to estimate radioactive source term by coupling atmospheric model with monitoring data

    Akiko, Furuno; Hideyuki, Kitabata

    2003-01-01

    The importance of computer-based decision support systems for local and regional scale accidents has been recognized by many countries through the experience of the accidental atmospheric release of radionuclides at Chernobyl in 1986 in the former Soviet Union. The recent increase of nuclear power plants in the Asian region also necessitates an emergency response system for Japan to predict the long-range atmospheric dispersion of radionuclides due to an overseas accident. On the basis of this background, WSPEEDI (Worldwide version of System for Prediction of Environmental Emergency Dose Information) at the Japan Atomic Energy Research Institute is being developed to forecast long-range atmospheric dispersion of radionuclides during a nuclear emergency. Although the source condition is a critical parameter for accurate prediction, it can rarely be acquired in the early stage of an overseas accident. Thus, we have been developing a computer-based function to estimate the radioactive source term, e.g. the release point, time and amount, as a part of WSPEEDI. This function consists of atmospheric transport simulations and statistical analysis for the prediction and monitoring of air dose rates. Atmospheric transport simulations are carried out for a matrix of possible release points in Eastern Asia and possible release times. The simulation results for air dose rates are compared with monitoring data, and the best-fitted release condition is defined as the source term. This paper describes the source term estimation method and its application to Eastern Asia. The latest version of WSPEEDI accommodates the following two models: an atmospheric meteorological model, MM5, and a particle random walk model, GEARN. MM5 is a non-hydrostatic meteorological model developed by the Pennsylvania State University and the National Center for Atmospheric Research (NCAR). MM5 physically calculates more than 40 meteorological parameters with high resolution in time and space based on

  1. Optimizing Functional Outcomes in Mandibular Condyle Reconstruction With the Free Fibula Flap Using Computer-Aided Design and Manufacturing Technology.

    Lee, Z-Hye; Avraham, Tomer; Monaco, Casian; Patel, Ashish A; Hirsch, David L; Levine, Jamie P

    2018-05-01

    Mandibular defects involving the condyle represent a complex reconstructive challenge for restoring proper function of the temporomandibular joint (TMJ) because it requires precise bone graft alignment for full restoration of joint function. The use of computer-aided design and manufacturing (CAD/CAM) technology can aid in accurate reconstruction of mandibular condyle defects with a vascularized free fibula flap without the need for additional adjuncts. The purpose of this study was to analyze clinical and functional outcomes after reconstruction of mandibular condyle defects using only a free fibula graft with the help of virtual surgery techniques. A retrospective review was performed to identify all patients who underwent mandibular reconstruction with only a free fibula flap without any TMJ adjuncts after a total condylectomy. Three-dimensional modeling software was used to plan and execute reconstruction for all patients. From 2009 through 2014, 14 patients underwent reconstruction of mandibular defects involving the condyle with the aid of virtual surgery technology. The average age was 38.7 years (range, 11 to 77 yr). The average follow-up period was 2.6 years (range, 0.8 to 4.2 yr). Flap survival was 100% (N = 14). All patients reported improved facial symmetry, adequate jaw opening, and normal dental occlusion. In addition, they achieved good functional outcomes, including normal intelligible speech and the tolerance of a regular diet with solid foods. Maximal interincisal opening range for all patients was 25 to 38 mm with no lateral deviation or subjective joint pain. No patient had progressive joint hypomobility or condylar migration. One patient had ankylosis, which required release. TMJ reconstruction poses considerable challenges in bone graft alignment for full restoration of joint function. The use of CAD/CAM technology can aid in accurate reconstruction of mandibular condyle defects with a vascularized free fibula flap through precise

  2. Noninvasive evaluation of global and regional left ventricular function using computed tomography and magnetic resonance imaging: a meta-analysis

    Kaniewska, Malwina; Schuetz, Georg M.; Willun, Steffen; Dewey, Marc; Schlattmann, Peter

    2017-01-01

    To compare the diagnostic accuracy of computed tomography (CT) in the assessment of global and regional left ventricular (LV) function with magnetic resonance imaging (MRI). MEDLINE, EMBASE and ISI Web of Science were systematically reviewed. Evaluation included: ejection fraction (EF), end-diastolic volume (EDV), end-systolic volume (ESV), stroke volume (SV) and left ventricular mass (LVM). Differences between modalities were analysed using limits of agreement (LoA). Publication bias was measured by Egger's regression test. Heterogeneity was evaluated using Cochran's Q test and the Higgins I² statistic. In the presence of heterogeneity the DerSimonian-Laird method was used for estimation of heterogeneity variance. Fifty-three studies including 1,814 patients were identified. The mean difference between CT and MRI was -0.56 % (LoA, -11.6-10.5 %) for EF, 2.62 ml (-34.1-39.3 ml) for EDV and 1.61 ml (-22.4-25.7 ml) for ESV, 3.21 ml (-21.8-28.3 ml) for SV and 0.13 g (-28.2-28.4 g) for LVM. CT detected wall motion abnormalities on a per-segment basis with 90 % sensitivity and 97 % specificity. CT is accurate for assessing global LV function parameters but the limits of agreement versus MRI are moderately wide, while wall motion deficits are detected with high accuracy. (orig.)
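
    Both statistics named here are short textbook computations: Bland-Altman limits of agreement summarize the paired CT-MRI differences, and the DerSimonian-Laird estimator supplies the between-study variance for random-effects pooling. The sketch below implements the textbook formulas and is not the authors' analysis code.

```python
# Textbook formulas used in such meta-analyses: Bland-Altman limits of
# agreement and the DerSimonian-Laird estimate of between-study variance.
import numpy as np

def limits_of_agreement(ct, mri):
    d = np.asarray(ct, float) - np.asarray(mri, float)
    m, s = d.mean(), d.std(ddof=1)
    return m, (m - 1.96 * s, m + 1.96 * s)      # mean difference and LoA

def dersimonian_laird(effects, variances):
    y = np.asarray(effects, float)
    v = np.asarray(variances, float)
    w = 1.0 / v
    mu_fixed = (w * y).sum() / w.sum()
    Q = (w * (y - mu_fixed) ** 2).sum()         # Cochran's Q
    c = w.sum() - (w ** 2).sum() / w.sum()
    tau2 = max(0.0, (Q - (len(y) - 1)) / c)     # between-study variance
    w_star = 1.0 / (v + tau2)
    return (w_star * y).sum() / w_star.sum(), tau2
```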

  3. Functional analysis of rare variants in mismatch repair proteins augments results from computation-based predictive methods

    Arora, Sanjeevani; Huwe, Peter J.; Sikder, Rahmat; Shah, Manali; Browne, Amanda J.; Lesh, Randy; Nicolas, Emmanuelle; Deshpande, Sanat; Hall, Michael J.; Dunbrack, Roland L.; Golemis, Erica A.

    2017-01-01

    The cancer-predisposing Lynch Syndrome (LS) arises from germline mutations in DNA mismatch repair (MMR) genes, predominantly MLH1, MSH2, MSH6, and PMS2. A major challenge for clinical diagnosis of LS is the frequent identification of variants of uncertain significance (VUS) in these genes, as it is often difficult to determine variant pathogenicity, particularly for missense variants. Generic programs such as SIFT and PolyPhen-2, and MMR gene-specific programs such as PON-MMR and MAPP-MMR, are often used to predict deleterious or neutral effects of VUS in MMR genes. We evaluated the performance of multiple predictive programs in the context of functional biologic data for 15 VUS in MLH1, MSH2, and PMS2. Using cell line models, we characterized VUS predicted to range from neutral to pathogenic on mRNA and protein expression, basal cellular viability, viability following treatment with a panel of DNA-damaging agents, and functionality in DNA damage response (DDR) signaling, benchmarking to wild-type MMR proteins. Our results suggest that the MMR gene-specific classifiers do not always align with the experimental phenotypes related to DDR. Our study highlights the importance of complementary experimental and computational assessment to develop future predictors for the assessment of VUS. PMID:28494185

  4. Changes in functional brain organization and behavioral correlations after rehabilitative therapy using a brain-computer interface

    Brittany Mei Young

    2014-07-01

    Full Text Available This study aims to examine the changes in task-related brain activity induced by rehabilitative therapy using brain-computer interface (BCI technologies and whether these changes are relevant to functional gains achieved through the use of these therapies. Stroke patients with persistent upper-extremity motor deficits received interventional rehabilitation therapy using a closed-loop neurofeedback BCI device (n=8 or no therapy (n=6. Behavioral assessments using the Stroke Impact Scale, the Action Research Arm Test, and the Nine-Hole Peg Test as well as task-based fMRI scans were conducted before, during, after, and one month after therapy administration or at analogous intervals in the absence of therapy. Laterality Index (LI during finger tapping of each hand were calculated for each time point and assessed for correlation with behavioral outcomes. Brain activity during finger tapping of each hand shifted over the course of BCI therapy but not in the absence of therapy to greater involvement of the non-lesioned hemisphere (and lesser involvement of the stroke-lesioned hemisphere as measured by LI. Moreover, changes from baseline LI values during finger tapping of the impaired hand were correlated with gains in both objective and subjective behavioral measures. These findings suggest that the administration of interventional BCI therapy can induce differential changes in brain activity patterns between the lesioned and nonlesioned hemisphere and that these brain changes are associated with changes in specific motor functions.

  5. Analysis of Future Vehicle Energy Demand in China Based on a Gompertz Function Method and Computable General Equilibrium Model

    Tian Wu

    2014-11-01

    This paper presents a model for the projection of Chinese vehicle stocks and road vehicle energy demand through 2050 based on low-, medium-, and high-growth scenarios. To derive a gross domestic product (GDP)-dependent Gompertz function, Chinese GDP is estimated using a recursive dynamic Computable General Equilibrium (CGE) model. The Gompertz function is estimated using historical data on vehicle development trends in North America, the Pacific Rim, and Europe to overcome the problem of insufficient long-running data on Chinese vehicle ownership. Results indicate that the projected vehicle stock for 2050 is 300, 455, and 463 million for the low-, medium-, and high-growth scenarios, respectively. Furthermore, the growth in China's vehicle stock will pass the inflection point of the Gompertz curve by 2020 but will not reach saturation during the period 2014-2050. Of the major road vehicle categories, cars are the largest energy consumers, followed by trucks and buses. Growth in Chinese vehicle demand is primarily determined by per capita GDP. Vehicle saturation levels solely influence the shape of the Gompertz curve, and population growth only weakly affects vehicle demand. Projected total energy consumption of road vehicles in 2050 is 380, 575, and 586 million tonnes of oil equivalent for the respective scenarios.
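
    A GDP-dependent Gompertz ownership curve has the double-exponential form V(g) = γ·exp(α·exp(β·g)), where γ is the saturation level, g is per capita GDP, and α, β < 0. Fitting it to historical data is a one-call exercise; the sketch below uses illustrative numbers, not the paper's data or estimates.

```python
# Fitting a GDP-dependent Gompertz curve V(g) = gamma*exp(alpha*exp(beta*g));
# gamma is the saturation level, g per capita GDP. All values are illustrative.
import numpy as np
from scipy.optimize import curve_fit

def gompertz(g, gamma, alpha, beta):
    return gamma * np.exp(alpha * np.exp(beta * g))

gdp = np.array([2.0, 4.0, 8.0, 16.0, 32.0])        # per capita GDP, 10^3 USD
veh = np.array([20.0, 60.0, 150.0, 320.0, 480.0])  # vehicles per 1000 people

popt, _ = curve_fit(gompertz, gdp, veh, p0=(600.0, -5.0, -0.1))
print(dict(zip(("gamma", "alpha", "beta"), popt)))
```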

  6. Noninvasive evaluation of global and regional left ventricular function using computed tomography and magnetic resonance imaging: a meta-analysis

    Kaniewska, Malwina; Schuetz, Georg M.; Willun, Steffen; Dewey, Marc [Charite - Universitaetsmedizin Berlin, Department of Radiology, Berlin (Germany); Schlattmann, Peter [Jena University Hospital, Department of Medical Statistics, Informatics and Documentation, Jena (Germany)

    2017-04-15

    To compare the diagnostic accuracy of computed tomography (CT) in the assessment of global and regional left ventricular (LV) function with magnetic resonance imaging (MRI). MEDLINE, EMBASE and ISI Web of Science were systematically reviewed. Evaluation included: ejection fraction (EF), end-diastolic volume (EDV), end-systolic volume (ESV), stroke volume (SV) and left ventricular mass (LVM). Differences between modalities were analysed using limits of agreement (LoA). Publication bias was measured by Egger's regression test. Heterogeneity was evaluated using Cochran's Q test and the Higgins I² statistic. In the presence of heterogeneity the DerSimonian-Laird method was used for estimation of heterogeneity variance. Fifty-three studies including 1,814 patients were identified. The mean difference between CT and MRI was -0.56 % (LoA, -11.6-10.5 %) for EF, 2.62 ml (-34.1-39.3 ml) for EDV and 1.61 ml (-22.4-25.7 ml) for ESV, 3.21 ml (-21.8-28.3 ml) for SV and 0.13 g (-28.2-28.4 g) for LVM. CT detected wall motion abnormalities on a per-segment basis with 90 % sensitivity and 97 % specificity. CT is accurate for assessing global LV function parameters but the limits of agreement versus MRI are moderately wide, while wall motion deficits are detected with high accuracy. (orig.)

  7. Proteins of unknown function in the Protein Data Bank (PDB): an inventory of true uncharacterized proteins and computational tools for their analysis.

    Nadzirin, Nurul; Firdaus-Raih, Mohd

    2012-10-08

    Proteins of uncharacterized function form a large part of many currently available biological databases, and this situation exists even in the Protein Data Bank (PDB). Our analysis of recent PDB data revealed that only 42.53% of the PDB entries categorized under "unknown function" (1,084 coordinate files) are true examples of proteins of unknown function at this point in time. The remaining 1,465 entries annotated as such appear amenable to re-assessment of their annotations, based on the availability of direct functional characterization experiments for the protein itself, or for homologous sequences or structures, thus enabling computational function inference.

  8. Scatter kernel estimation with an edge-spread function method for cone-beam computed tomography imaging

    Li Heng; Mohan, Radhe; Zhu, X Ronald

    2008-01-01

    The clinical applications of kilovoltage x-ray cone-beam computed tomography (CBCT) have been compromised by the limited quality of CBCT images, which typically is due to a substantial scatter component in the projection data. In this paper, we describe an experimental method of deriving the scatter kernel of a CBCT imaging system. The estimated scatter kernel can be used to remove the scatter component from the CBCT projection images, thus improving the quality of the reconstructed image. The scattered radiation was approximated as depth-dependent, pencil-beam kernels, which were derived using an edge-spread function (ESF) method. The ESF geometry was achieved with a half-beam block created by a 3 mm thick lead sheet placed on a stack of slab solid-water phantoms. Measurements for ten water-equivalent thicknesses (WET) ranging from 0 cm to 41 cm were taken with (half-blocked) and without (unblocked) the lead sheet, and corresponding pencil-beam scatter kernels or point-spread functions (PSFs) were then derived without assuming any empirical trial function. The derived scatter kernels were verified with phantom studies. Scatter correction was then incorporated into the reconstruction process to improve image quality. For a 32 cm diameter cylinder phantom, the flatness of the reconstructed image was improved from 22% to 5%. When the method was applied to CBCT images for patients undergoing image-guided therapy of the pelvis and lung, the variation in selected regions of interest (ROIs) was reduced from >300 HU to <100 HU. We conclude that the scatter reduction technique utilizing the scatter kernel effectively suppresses the artifact caused by scatter in CBCT.
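
    The ESF-to-kernel step is conceptually simple: differentiating the measured edge-spread profile yields the line-spread function, whose normalized form serves as a scatter kernel for that water-equivalent thickness. A schematic one-dimensional version follows; the paper's actual derivation (two-dimensional, depth-dependent pencil-beam kernels) is more involved.

```python
# Schematic 1-D version of the ESF-based kernel derivation: the derivative of
# the edge-spread profile gives the line-spread function, normalized to unit
# area. The real procedure is 2-D and depth-dependent.
import numpy as np

def kernel_from_esf(esf, pixel_pitch=1.0):
    lsf = np.gradient(np.asarray(esf, float), pixel_pitch)
    lsf = np.clip(lsf, 0.0, None)      # suppress noise-induced negative lobes
    return lsf / lsf.sum()             # unit-area scatter kernel

# Scatter estimate for one detector row, then corrected projection:
# scatter = np.convolve(row, kernel, mode="same"); primary = row - scatter
```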

  9. Computation-Facilitated Assignment of Function in the Enolase Superfamily: A Regiochemically Distinct Galactarate Dehydratase from Oceanobacillus iheyensis†

    Rakus, John F.; Kalyanaraman, Chakrapani; Fedorov, Alexander A.; Fedorov, Elena V.; Mills-Groninger, Fiona P.; Toro, Rafael; Bonanno, Jeffrey; Bain, Kevin; Sauder, J. Michael; Burley, Stephen K.; Almo, Steven C.; Jacobson, Matthew P.; Gerlt, John A.

    2009-01-01

    The structure of an uncharacterized member of the enolase superfamily from Oceanobacillus iheyensis (GI: 23100298; IMG locus tag Ob2843; PDB Code 2OQY) was determined by the New York SGX Research Center for Structural Genomics (NYSGXRC). The structure contained two Mg²⁺ ions located 10.4 Å from one another, with one located in the canonical position in the (β/α)₇β-barrel domain (although the ligand at the end of the fifth β-strand is His, unprecedented in structurally characterized members of the superfamily); the second is located in a novel site within the capping domain. In silico docking of a library of mono- and diacid sugars to the active site predicted a diacid sugar as a likely substrate. Activity screening of a physical library of acid sugars identified galactarate as the substrate (kcat = 6.8 s⁻¹, KM = 620 μM; kcat/KM = 1.1 × 10⁴ M⁻¹ s⁻¹), allowing functional assignment of Ob2843 as galactarate dehydratase (GalrD-II). The structure of a complex of the catalytically impaired Y90F mutant with Mg²⁺ and galactarate allowed identification of a Tyr 164-Arg 162 dyad as the base that initiates the reaction by abstraction of the α-proton and Tyr 90 as the acid that facilitates departure of the β-OH leaving group. The enzyme product is 2-keto-3-deoxy-D-threo-4,5-dihydroxyadipate, the enantiomer of the product obtained in the GalrD reaction catalyzed by a previously characterized bifunctional L-talarate/galactarate dehydratase (TalrD/GalrD). On the basis of the different active site structures and different regiochemistries, we recognize that these functions represent an example of apparent, not actual, convergent evolution of function. The structure of GalrD-II and its active site architecture allow identification of the seventh functionally and structurally characterized subgroup in the enolase superfamily. This study provides an additional example that an integrated sequence/structure-based strategy employing computational approaches is a viable

  10. COGNITIVE COMPUTER GRAPHICS AS A MEANS OF "SOFT" MODELING IN PROBLEMS OF RESTORATION OF FUNCTIONS OF TWO VARIABLES

    A.N. Khomchenko

    2016-08-01

    The paper considers the problem of bicubic interpolation on finite elements of the serendipity family. Using cognitive-graphical analysis, the rigid model of Ergatoudis, Irons and Zenkevich (1968) is compared with alternative models obtained by three methods: direct geometric design, weighted averaging of the basis polynomials, and systematic generation of bases (an advanced Taylor procedure). The emphasis is placed on the phenomenon of "gravitational repulsion" (the Zenkevich paradox). The causes of physically inadequate spectra of nodal loads on serendipity elements of higher orders are investigated. Soft modeling allows many serendipity elements of bicubic interpolation to be built without even knowing the exact form of the rigid model. Different interpretations of the integral characteristics of the basis polynomials are offered: geometrical, physical, and probabilistic. A soft model in the theory of interpolation of functions of two variables is one amenable to change through the choice of basis. Such changes are excluded in the family of Lagrangian finite elements of higher orders (hard modeling). Standard models of the serendipity family (Zenkevich) were also rigid. It was found that the "responsibility" for the rigidity of serendipity models rests with ruled surfaces of zero Gaussian curvature (conoids) that predominate in the base set. Cognitive portraits of the zero lines of standard serendipity surfaces suggest that, to "soften" a serendipity model, conoids should be replaced by surfaces of alternating Gaussian curvature. The article presents alternative (soft) bases for serendipity models. The work is devoted to solving scientific and technological problems aimed at the creation, dissemination and use of cognitive computer graphics in teaching and learning. The results are of interest to students of the specialties "Computer Science and Information Technologies", "System Analysis", and "Software Engineering", as well as

  11. A comparative approach for the investigation of biological information processing: An examination of the structure and function of computer hard drives and DNA

    D'Onofrio, David J; An, Gary

    2010-01-01

    Background: The robust storage, updating and utilization of information are necessary for the maintenance and perpetuation of dynamic systems. These systems can exist as constructs of metal-oxide semiconductors and silicon, as in a digital computer, or in the "wetware" of organic compounds, proteins and nucleic acids that make up biological organisms. We propose that there are essential functional properties of centralized information-processing systems; for digital computers these pr...

  12. Singular value decomposition for photon-processing nuclear imaging systems and applications for reconstruction and computing null functions

    Jha, Abhinav K.; Barrett, Harrison H.; Frey, Eric C.; Clarkson, Eric; Caucci, Luca; Kupinski, Matthew A.

    2015-09-01

    Recent advances in technology are enabling a new class of nuclear imaging systems consisting of detectors that use real-time maximum-likelihood (ML) methods to estimate the interaction position, deposited energy, and other attributes of each photon-interaction event and store these attributes in a list format. This class of systems, which we refer to as photon-processing (PP) nuclear imaging systems, can be described by a fundamentally different mathematical imaging operator that allows processing of the continuous-valued photon attributes on a per-photon basis. Unlike conventional photon-counting (PC) systems that bin the data into images, PP systems do not have any binning-related information loss. Mathematically, while PC systems have an infinite-dimensional null space due to dimensionality considerations, PP systems do not necessarily suffer from this issue. Therefore, PP systems have the potential to provide improved performance in comparison to PC systems. To study these advantages, we propose a framework to perform the singular-value decomposition (SVD) of the PP imaging operator. We use this framework to perform the SVD of operators that describe a general two-dimensional (2D) planar linear shift-invariant (LSIV) PP system and a hypothetical continuously rotating 2D single-photon emission computed tomography (SPECT) PP system. We then discuss two applications of the SVD framework. The first application is to decompose the object being imaged by the PP imaging system into measurement and null components. We compare these components to the measurement and null components obtained with PC systems. In the process, we also present a procedure to compute the null functions for a PC system. The second application is designing analytical reconstruction algorithms for PP systems. The proposed analytical approach exploits the fact that PP systems acquire data in a continuous domain to estimate a continuous object function. The approach is parallelizable and
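
    The measurement/null decomposition described above follows directly from the SVD: the measurement component of an object f is its projection onto the right singular vectors with nonzero singular values, and the null component is the remainder, which the system maps to zero. The dense-matrix sketch below only illustrates the linear algebra; real photon-processing operators are continuous and never formed explicitly.

```python
# Dense-matrix illustration of the measurement/null decomposition via SVD.
import numpy as np

def measurement_null_split(H, f, tol=1e-10):
    U, s, Vt = np.linalg.svd(H, full_matrices=True)
    r = int((s > tol * s.max()).sum())       # numerical rank
    V = Vt.T
    f_meas = V[:, :r] @ (V[:, :r].T @ f)     # component visible to the system
    f_null = f - f_meas                      # component invisible to the system
    return f_meas, f_null

H = np.random.default_rng(0).normal(size=(8, 32))   # toy imaging operator
f = np.random.default_rng(1).normal(size=32)        # toy object
f_meas, f_null = measurement_null_split(H, f)
assert np.allclose(H @ f_null, 0.0, atol=1e-8)      # null component maps to ~0
```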

  13. Singular value decomposition for photon-processing nuclear imaging systems and applications for reconstruction and computing null functions.

    Jha, Abhinav K; Barrett, Harrison H; Frey, Eric C; Clarkson, Eric; Caucci, Luca; Kupinski, Matthew A

    2015-09-21

    Recent advances in technology are enabling a new class of nuclear imaging systems consisting of detectors that use real-time maximum-likelihood (ML) methods to estimate the interaction position, deposited energy, and other attributes of each photon-interaction event and store these attributes in a list format. This class of systems, which we refer to as photon-processing (PP) nuclear imaging systems, can be described by a fundamentally different mathematical imaging operator that allows processing of the continuous-valued photon attributes on a per-photon basis. Unlike conventional photon-counting (PC) systems that bin the data into images, PP systems do not have any binning-related information loss. Mathematically, while PC systems have an infinite-dimensional null space due to dimensionality considerations, PP systems do not necessarily suffer from this issue. Therefore, PP systems have the potential to provide improved performance in comparison to PC systems. To study these advantages, we propose a framework to perform the singular-value decomposition (SVD) of the PP imaging operator. We use this framework to perform the SVD of operators that describe a general two-dimensional (2D) planar linear shift-invariant (LSIV) PP system and a hypothetical continuously rotating 2D single-photon emission computed tomography (SPECT) PP system. We then discuss two applications of the SVD framework. The first application is to decompose the object being imaged by the PP imaging system into measurement and null components. We compare these components to the measurement and null components obtained with PC systems. In the process, we also present a procedure to compute the null functions for a PC system. The second application is designing analytical reconstruction algorithms for PP systems. The proposed analytical approach exploits the fact that PP systems acquire data in a continuous domain to estimate a continuous object function. The approach is parallelizable and

  14. Report on evaluation of research and development of superhigh-function electronic computers; Chokoseino denshi keisanki no kenkyu kaihatsu ni kansuru hyoka hokokusho

    NONE

    1973-02-20

    Described herein is the development of super-high-function electronic computers. This project was implemented as a 6-year joint effort of government, industry and academia, beginning in FY 1966, with the objective of developing standard large-size computers comparable with the world's highest-function machines by the beginning of the 1970s. The computers developed by this project met almost all of the specifications of the world's representative large-size commercial computers, partly surpassing them. In particular, the integration of the virtual memory, buffer memory and multi-processor functions, considered to be the central technical features of the computers of the next generation, into one system was a concept unique to Japan, not seen in other countries. Other developments considered to have great ripple effects are the LSIs, and the techniques for utilizing and mounting them and for improving their reliability. The development of magnetic discs is another notable result for the peripheral devices. Development of the input/output devices was started to support inputting, outputting and reading Chinese characters, which are characteristic of Japanese. The software developed has sufficient functions for common use and is considered to be a world-leading large-size operating system, although its evaluation largely awaits actual operational results. (NEDO)

  15. Functional imaging using computer methods to compare the effect of salbutamol and ipratropium bromide in patient-specific airway models of COPD

    De Backer LA

    2011-11-01

    LA De Backer1, WG Vos2, R Salgado3, JW De Backer2, A Devolder1, SL Verhulst1, R Claes1, PR Germonpré1, WA De Backer1; 1Department of Respiratory Medicine, 2FluidDA, 3Department of Radiology, Antwerp University Hospital, Antwerp, Belgium. Background: Salbutamol and ipratropium bromide improve lung function in patients with chronic obstructive pulmonary disease (COPD). However, their bronchodilating effect has not yet been compared in the central and distal airways. Functional imaging using computational fluid dynamics offers the possibility of making such a comparison. The objective of this study was to assess the effects of salbutamol and ipratropium bromide on the geometry and computational fluid dynamics-based resistance of the central and distal airways. Methods: Five patients with Global Initiative for Chronic Obstructive Lung Disease Stage III COPD were randomized to a single dose of salbutamol or ipratropium bromide in a crossover manner with a 1-week interval between treatments. Patients underwent lung function testing and a multislice computed tomography scan of the thorax that was used for functional imaging. Two hours after dosing, the patients again underwent lung function tests and repeat computed tomography. Results: Lung function parameters, including forced expiratory volume in 1 second, vital capacity, overall airway resistance, and specific airway resistance, changed significantly after administration of each product. On functional imaging, the bronchodilating effect was greater in the distal airways, with a corresponding drop in airway resistance, compared with the central airways. Salbutamol and ipratropium bromide were equally effective at first glance when looking at lung function tests, but when viewed in more detail with functional imaging, hyporesponsiveness could be shown for salbutamol in one patient. Salbutamol was more effective in the other patients. Conclusion: This pilot study gives an innovative insight into the modes of

  16. Assessment of global left ventricular function with dual-source computed tomography in patients with valvular heart disease

    Bak, So Hyeon; Jeon, Hae Jeong (Dept. of Radiology, Konkuk Univ. Hospital, Konkuk Univ. School of Medicine, Seoul (Korea, Republic of)); Ko, Sung Min (Dept. of Radiology, Konkuk Univ. Hospital, Konkuk Univ. School of Medicine, Seoul (Korea, Republic of); Research Inst. of Medical Science, Konkuk Univ. Hospital, Konkuk Univ. School of Medicine, Seoul (Korea, Republic of)), Email: 20070437@kuh.ac.kr; Yang, Hyun Suk; Hwang, Hweung Kon (Dept. of Cardiology, Konkuk Univ. Hospital, Konkuk Univ. School of Medicine, Seoul (Korea, Republic of)); Song, Meong Gun (Dept. of Thoracic Surgery, Konkuk Univ. Hospital, Konkuk Univ. School of Medicine, Seoul (Korea, Republic of))

    2012-04-15

    Background: Left ventricular (LV) function is a vital parameter for prognosis, therapy guidance, and follow-up of cardiovascular disease. Dual-source computed tomography (DSCT) provides an accurate analysis of global LV function. Purpose: To assess the performance of DSCT in the determination of global LV functional parameters in comparison with cardiovascular magnetic resonance (CMR) and two-dimensional transthoracic echocardiography (2D-TTE) in patients with valvular heart disease (VHD). Material and Methods: A total of 111 patients (58 men, mean age 49.9 years) with known VHD who underwent DSCT, 2D-TTE, and CMR within a period of 2 weeks before undergoing valve surgery were included in this study. LV end-systolic volume (ESV), end-diastolic volume (EDV), stroke volume (SV), and ejection fraction (EF) were calculated by DSCT using the threshold-based technique, by 2D-TTE using a modified Simpson's method, and by CMR using Simpson's method. Agreement for parameters of LV global function was determined with Pearson's correlation coefficient (r) and Bland-Altman analysis. All the DSCT and CMR data-sets were assessed independently by two readers. Results: Fifty of the 111 patients had aortic VHD, 29 patients had mitral VHD, and 32 patients had mixed aortic and mitral VHD. An excellent inter-observer agreement was seen for the assessment of global LV function using DSCT (r = 0.910-0.983) and CMR (r = 0.854-0.965). An excellent or good correlation (r = 0.93, 0.95, 0.87, and 0.71, respectively, P < 0.001) was noted between the DSCT and 2D-TTE values for EDV, ESV, SV, and EF. EDV (33.7 mL, P < 0.001), ESV (12.1 mL, P < 0.001), SV (21.2 mL, P < 0.001), and EF (1.6%, P = 0.019) were significantly overestimated by DSCT when compared with 2D-TTE. An excellent correlation (r = 0.96, 0.97, 0.91, and 0.94, respectively, P < 0.001) between DSCT and CMR was seen in the evaluation of EDV, ESV, SV, and EF. EDV (15.9 mL, P < 0.001), ESV (7.3 mL, P < 0.001), and SV

  17. Assessment of global left ventricular function with dual-source computed tomography in patients with valvular heart disease

    Bak, So Hyeon; Jeon, Hae Jeong; Ko, Sung Min; Yang, Hyun Suk; Hwang, Hweung Kon; Song, Meong Gun

    2012-01-01

    Background: Left ventricular (LV) function is a vital parameter for prognosis, therapy guidance, and follow-up of cardiovascular disease. Dual-source computed tomography (DSCT) provides an accurate analysis of global LV function. Purpose: To assess the performance of DSCT in the determination of global LV functional parameters in comparison with cardiovascular magnetic resonance (CMR) and two-dimensional transthoracic echocardiography (2D-TTE) in patients with valvular heart disease (VHD). Material and Methods: A total of 111 patients (58 men, mean age 49.9 years) with known VHD who underwent DSCT, 2D-TTE, and CMR within a period of 2 weeks before undergoing valve surgery were included in this study. LV end-systolic volume (ESV), end-diastolic volume (EDV), stroke volume (SV), and ejection fraction (EF) were calculated by DSCT using the threshold-based technique, by 2D-TTE using a modified Simpson's method, and by CMR using Simpson's method. Agreement for parameters of LV global function was determined with Pearson's correlation coefficient (r) and Bland-Altman analysis. All the DSCT and CMR data-sets were assessed independently by two readers. Results: Fifty of the 111 patients had aortic VHD, 29 patients had mitral VHD, and 32 patients had mixed aortic and mitral VHD. An excellent inter-observer agreement was seen for the assessment of global LV function using DSCT (r = 0.910-0.983) and CMR (r = 0.854-0.965). An excellent or good correlation (r = 0.93, 0.95, 0.87, and 0.71, respectively, P < 0.001) was noted between the DSCT and 2D-TTE values for EDV, ESV, SV, and EF. EDV (33.7 mL, P < 0.001), ESV (12.1 mL, P < 0.001), SV (21.2 mL, P < 0.001), and EF (1.6%, P = 0.019) were significantly overestimated by DSCT when compared with 2D-TTE. An excellent correlation (r = 0.96, 0.97, 0.91, and 0.94, respectively, P < 0.001) between DSCT and CMR was seen in the evaluation of EDV, ESV, SV, and EF. EDV (15.9 mL, P < 0.001), ESV (7.3 mL, P < 0.001), and SV (8.5 mL, P < 0

  18. BOLD-Independent Computational Entropy Assesses Functional Donut-Like Structures in Brain fMRI Images.

    Peters, James F; Ramanna, Sheela; Tozzi, Arturo; İnan, Ebubekir

    2017-01-01

    We introduce a novel method for measuring the information level in fMRI (functional Magnetic Resonance Imaging) neural data sets, based on image subdivision into small polygons equipped with different entropic content. We show how this method, called maximal nucleus clustering (MNC), is a novel, fast and inexpensive image-analysis technique, independent of the standard blood-oxygen-level-dependent (BOLD) signals. MNC facilitates the objective detection of hidden temporal patterns of entropy/information in zones of fMRI images generally not taken into account from the subjective standpoint of the observer. This approach befits the geometric character of fMRI. The main purpose of this study is to provide a computable framework for fMRI that not only facilitates analyses but also provides an easily decipherable visualization of structures. This framework commands attention because it is easily implemented using conventional software systems. In order to evaluate the potential applications of MNC, we looked for the presence of a fourth dimension's distinctive hallmarks in a temporal sequence of 2D images taken during spontaneous brain activity. Indeed, recent findings suggest that several brain activities, such as mind-wandering and memory retrieval, might take place in the functional space of a four-dimensional hypersphere, which is a double donut-like structure undetectable in the usual three dimensions. We found that the Rényi entropy is higher in MNC areas than in the surrounding ones, and that these temporal patterns closely resemble the trajectories predicted by the possible presence of a hypersphere in the brain.
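
    The Rényi entropy of order α for a discrete distribution p is H_α = (1/(1 − α)) log Σᵢ pᵢ^α, recovering the Shannon entropy as α → 1. Applied patch by patch to an image, it yields the kind of regional entropy comparison described above; the sketch below uses illustrative histogram binning and patch sizes, not the MNC procedure itself.

```python
# Renyi entropy per image patch; binning and patch size are illustrative
# choices, not those of the maximal-nucleus-clustering method.
import numpy as np

def renyi_entropy(values, alpha=2.0, bins=32):
    counts, _ = np.histogram(values, bins=bins)
    p = counts[counts > 0] / counts.sum()
    if alpha == 1.0:                       # Shannon limit
        return float(-(p * np.log(p)).sum())
    return float(np.log((p ** alpha).sum()) / (1.0 - alpha))

def patch_entropies(img, patch=8, alpha=2.0):
    h, w = img.shape
    return np.array([[renyi_entropy(img[i:i + patch, j:j + patch], alpha)
                      for j in range(0, w - patch + 1, patch)]
                     for i in range(0, h - patch + 1, patch)])
```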

  19. Renal cortical volume measured using automatic contouring software for computed tomography and its relationship with BMI, age and renal function

    Muto, Natalia Sayuri; Kamishima, Tamotsu; Harris, Ardene A.; Kato, Fumi; Onodera, Yuya; Terae, Satoshi; Shirato, Hiroki

    2011-01-01

    Purpose: To evaluate the relationship between renal cortical volume, measured by automatic contouring software, and body mass index (BMI), age, and renal function. Materials and methods: The study was performed in accordance with the institutional guidelines at our hospital. Sixty-four patients (34 men, 30 women), aged 19 to 79 years, had their CT scans for diagnosis or follow-up of hepatocellular carcinoma retrospectively examined on a computer workstation using software that automatically contours the renal cortex and the renal parenchyma. Body mass index and estimated glomerular filtration rate (eGFR) were calculated from the collected data. Statistical analysis was done using Student's t-test, multiple regression analysis, and the intraclass correlation coefficient (ICC). Results: The ICCs for total renal and renal cortical volumes were 0.98 and 0.99, respectively. Renal volume measurements yielded a mean cortical volume of 105.8 cm³ ± 28.4 SD, a mean total volume of 153 cm³ ± 39 SD, and a mean medullary volume of 47.8 cm³ ± 19.5 SD. The correlations between body weight/height/BMI and both total renal and cortical volumes were r = 0.6, 0.6, and 0.4, respectively (p < 0.05), while the correlation between renal cortex and age was r = -0.3 (p < 0.05). eGFR correlated with renal cortical volume (r = 0.6, p < 0.05). Conclusion: This study demonstrated that renal cortical volume had a moderate positive relationship with BMI, a moderate negative relationship with age, and a strong positive relationship with renal function, and it provides a new method to routinely produce volumetric assessments of the kidney.
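
    The abstract does not state which ICC variant was used, so as a hedged sketch the following Python computes ICC(3,1) (two-way mixed model, consistency, single measurement) from the standard ANOVA mean squares; the repeated-volume values are illustrative, not study data.

```python
import numpy as np

def icc_3_1(ratings):
    """ICC(3,1): two-way mixed model, consistency, single measurement.

    ratings: (n_subjects, k_measurements) array, e.g. one renal
    cortical volume per row, measured k times.
    """
    x = np.asarray(ratings, dtype=float)
    n, k = x.shape
    grand = x.mean()
    ss_rows = k * np.sum((x.mean(axis=1) - grand) ** 2)  # between subjects
    ss_cols = n * np.sum((x.mean(axis=0) - grand) ** 2)  # between measurements
    ss_err = np.sum((x - grand) ** 2) - ss_rows - ss_cols
    ms_rows = ss_rows / (n - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err)

# Illustrative repeated cortical-volume measurements (cm^3), two runs each.
vols = np.array([[101.2, 102.0], [88.5, 89.1], [130.4, 129.7], [95.0, 96.2]])
print(f"ICC(3,1) = {icc_3_1(vols):.3f}")
```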

  20. Computed tomography airway lumen volumetry in patients with acromegaly: Association with growth hormone levels and lung function.

    Camilo, Gustavo Bittencourt; Carvalho, Alysson Roncally Silva; Guimarães, Alan Ranieri Medeiros; Kasuki, Leandro; Gadelha, Mônica Roberto; Mogami, Roberto; de Melo, Pedro Lopes; Lopes, Agnaldo José

    2017-10-01

    The segmentation and skeletonisation of images via computed tomography (CT) airway lumen volumetry provide a new perspective regarding the incorporation of this technique into medical practice. Our aim was to quantify morphological changes in the large airways of patients with acromegaly through CT and, secondarily, to correlate these findings with hormone levels and pulmonary function testing (PFT) parameters. This was a cross-sectional study in which 28 non-smoking patients with acromegaly and 15 control subjects underwent CT analysis of airway lumen volumetry with subsequent image segmentation and skeletonisation. Moreover, all participants underwent PFT. Compared with the controls, patients with acromegaly presented larger diameters of the trachea, right main bronchus, and left main bronchus. The patients with acromegaly also showed a higher tracheal sinuosity index (the deviation of a line from the shortest path, calculated by dividing the total length by the shortest possible path) than the controls [1.06 (1.02-1.09) vs. 1.03 (1.02-1.04), P = 0.04], and tracheal stenosis was observed in 25% of these individuals. The tracheal area was correlated with the levels of growth hormone (r_s = 0.45, P = 0.02) and insulin-like growth factor type I (r_s = 0.38, P = 0.04). The ratio between the forced expiratory flow and forced inspiratory flow at 50% of the forced vital capacity was correlated with the tracheal area (r_s = 0.36, P = 0.02) and Δ tracheal diameters (r_s = 0.58, P volumetry, hormone levels and functional parameters of large airway obstruction. © 2017 The Royal Australian and New Zealand College of Radiologists.
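
    The sinuosity index defined above (total centerline length divided by the shortest path between its endpoints) is straightforward to compute once a skeletonised centerline is available. A minimal Python sketch, assuming the centerline has already been extracted as ordered coordinates (the segmentation and skeletonisation steps are outside this fragment):

```python
import numpy as np

def sinuosity_index(centerline):
    """Total arc length of an ordered centerline divided by the
    straight-line (shortest) distance between its endpoints.

    centerline: (N, 3) array of ordered centerline coordinates in mm.
    A perfectly straight trachea gives 1.0; any deviation raises it.
    """
    pts = np.asarray(centerline, dtype=float)
    arc = np.sum(np.linalg.norm(np.diff(pts, axis=0), axis=1))  # path length
    chord = np.linalg.norm(pts[-1] - pts[0])                    # endpoint distance
    return arc / chord

# Illustrative gently curved centerline (synthetic, not patient data).
t = np.linspace(0.0, 100.0, 200)
line = np.stack([2.0 * np.sin(t / 20.0), np.zeros_like(t), t], axis=1)
print(f"sinuosity = {sinuosity_index(line):.3f}")
```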